When one says “imperialism,” what image first comes to mind, the one that truly represents the practice? Is it the enslaved African, the poor soul who is subjugated, treated as a beast, and physically tortured? Is it the Trail of Tears, the infamous forced migration of Native Americans ordered by the United States government? The word “empire” seems to have taken on a negative, almost sinister meaning in recent years, particularly in popular media. Ask any child about empires, and they will go on about the evil, planet-destroying Darth Vader and his army of Stormtroopers, or about giant alien motherships descending upon Washington DC and destroying all signs of life. To the modern citizen, then, “imperialism” entails destruction, domination, and overall evil. It becomes necessary to look closer and examine more thoroughly the phenomenon that seems to have produced this attitude.

The discovery of the American continents, followed centuries later by the Industrial Revolution, sparked a desire in European nations to expand and conquer. This expansion began with the aforementioned Americas, but as those colonies gained independence, European powers were already moving on to places such as Africa and the Philippines. Soon enough, almost the entire globe seemed to be either an imperial nation or a colony of one. In many of these colonies, the subjugated peoples faced hardships such as slavery, mass death from disease or violence, and forced cultural change. While these negative effects are impossible to ignore, it must be noted that Western imperialism has also improved other parts of the world, the parts in which a genuine synthesis of cultures and exchange of ideas took place. Western imperialism, while causing strife for the subjugated, has led to global improvements, such as the growth of trade and wealth, technological progress, medical advances, and increasing…