May 2011
Sociological Imagination
Racism in America
Is America in a Post-Racial Era? What sort of impact does racism have on our society? In America, it is well known that we finally have our first black president; it is also generally agreed that racism is unacceptable in society, and most of us would consider ourselves equal to one another regardless of race. Of course, some people are still racist, and the idea that such people will ever disappear entirely is hard to believe. Racism is the belief that races have distinctive cultural characteristics determined by hereditary factors, and that this endows some races with an intrinsic superiority over others. It can also mean abusive or aggressive behavior toward members of another race on the basis of such a belief. In short, racism means holding negative and condescending views of others based on their race.

Some of the most infamous acts of racism in the United States occurred in the 1800s and 1900s, involving discrimination against Native Americans and African Americans. In the 1800s, Americans believed that Native Americans should be removed from their land or forced to assimilate into American society. The many Native Americans who chose not to assimilate were forced off their land onto "reservations" so they would remain separate from society. "The concept became policy in 1825, with the creation of an Indian Country between the Red and Missouri Rivers… followed by the Removal Act of 1830, leading to the relocation of many eastern tribes. Continuing non-Indian expansion, however, caused the so-called 'permanent' Indian Territory to dwindle in size" (Waldman, Carl). The Removal Act sought to force Indians from their land and segregate them from one another: "The difference was that instead of one large Indian Country, lands were divided up piecemeal, with tribes confined to separate parcels with specific boundaries" (Waldman, Carl). And because of