The role of women in society has shifted as prevailing orthodoxies have changed along with the cultural landscape, further shaping this often misunderstood topic. With gender equality ever present in today's society, something that could not always be said, we feel now is a fitting time to investigate which factors have ultimately sculpted popular thought on this sensitive subject.
The role of women in society has been greatly overlooked in recent decades. Women are now gaining more visibility, but in earlier days they were rarely seen in the workplace. They were regarded as mothers caring for children and handling household duties such as cooking and cleaning. Gradually, their role changed as they began to voice their opinions.
Throughout history, the roles of men and women have been dictated by gender. Traditionally, women in America were limited in their roles. They were seen as existing only to bear children and manage household tasks such as cooking and cleaning, while their husbands went out to provide for the family. A married woman always took her husband's status.

Even though more and more women are succeeding in life, American culture still defines women as unequal. Society has set men and women apart by labeling them: men have always been portrayed as the dominant sex, strong and aggressive, while women have been cast as weak. The culture suggests that women cannot perform jobs such as policing, firefighting, or running corporate establishments, and it views women in these roles negatively because such jobs carry an image of male dominance and masculinity.