Throughout most of history we have lived in a patriarchal, male-dominated society; only in the last fifty years have women gained rights equal to men's. But is this arrangement right? Should men hold the dominant roles, with women submitting first to their fathers and later to their husbands, or are equal rights the way toward progress and advancement?
Common belief holds that women have often been excluded, marginalized, or silenced in the Christian tradition from its inception. From St. Paul's first-century teaching that "women are to remain silent in church" to the more recent Southern Baptist censure of female pastors, many women have found a less than welcoming presence in Christianity, particularly in leadership positions. Fortunately, recent historical scholarship has uncovered a more varied role for women in the Church, from antiquity through the Middle Ages and throughout the nineteenth and early twentieth centuries. Theological studies have worked to present a more accurate view of Christ's teachings and treatment of women, and have sought to separate patriarchal cultural constraints from the Gospel message and vision of the Church for all genders and races. This listing of websites includes both traditional and reconstructed views of women in the Christian tradition, from the Bible through the present day.
Christianity arose at the intersection of two patriarchal cultures, the Jewish and the Roman. In Jewish culture, men made all the decisions. Women were valued primarily for producing heirs for men. A woman depended on men for her livelihood throughout her life, first on her father, then on her husband, and finally on her sons. As a result, dowries were serious financial transactions, and widowhood without the support of a son meant financial disaster.
Roman culture divided privileges and responsibilities between men and women. Men made the major decisions regarding possessions and business. Women had some authority over the