To start off, one of the first things that changed in Africa after colonialism was religion. The Europeans came in and all but forced a new religion onto the African people. The religion they wanted everyone to be a part of was Christianity. They even built churches on African soil. If they gave the …
It shouldn't have messed with the Africans' identity; it should have helped them. But when something you have known your whole life goes away, you don't care how good the next thing is; you want back what you loved. It didn't help that the Europeans took land and gave the natives no say in their own laws. Even if the Europeans believed they were doing the right thing in general, there were far better ways to go about it. They could have peacefully asked to come into the land, and they didn't have to bluntly say that the African gods weren't real. The way the Europeans introduced Christianity warped what it really was. It made Christianity seem like a forceful religion, when really it is not at all.