Colonialism Definition:
Noun: the policy or practice of acquiring full or partial political control over another country, occupying it with settlers, and exploiting it economically.
Colonialism is the control that a country or government holds over territory and people in a foreign land. England colonized many areas of the world, holding colonies in India, Ireland and parts of North America. Spain and France also had colonies in the Americas, and the Dutch colonized parts of the globe as well. Colonialism has existed at one time or another on almost every continent.
Colonialism served as a method of expanding a country's ownership of land, resources and economic power. Some countries pursued it simply to get their hands on the material resources of the territories they colonized. Only a handful of countries escaped European colonialism entirely. Japan and Korea successfully fended off European domination, thanks in part to their strength and diplomacy, and perhaps their distance from Europe. Thailand was spared when the British and French Empires decided to let it remain independent as a buffer between their possessions, unlike Indonesia, which was colonized by the Dutch and had to declare independence itself. Japan, however, colonized Korea during its early-20th-century imperial period and occupied Thailand during the Second World War.
The main topic I will be discussing is French colonialism in Vietnam. By the late 1880s France controlled Vietnam and Cambodia, which together were known as French Indochina. Indochina soon became one of France's most profitable colonial possessions, part of an empire that spanned northern and western Africa as well as islands in the Caribbean and the Pacific. French imperialists claimed it was their responsibility to colonize undeveloped regions in Africa and Asia and to introduce modern political ideas, social reforms, industrial methods and new technologies. In reality, it was a sham. The real