Feminism in the U.S. is the movement and set of ideologies advocating equal political, economic, cultural, and social rights for women.
I think many people misunderstand what feminism actually is: some believe it means women seeking to rise in power above men, when in fact it is about gender equality, whether you are a man or a woman. When someone makes a negative comment about feminism, they are supporting sexism. Stratification of men and women should not be condoned or tolerated; it is sexist. Whether you are a man or a woman, we should all be equal.
There isn't just one type of feminism; there is a variety of feminist groups, including girlie feminists, third-wave feminists, pro-sex feminists, and so on. All of these groups aim to address the different types of discrimination that women, and sometimes men, face.