Gender Roles
Gender roles are the behaviors and expectations that society assigns to men and women based on their gender. They have a particularly strong influence on relationships between men and women.
Gender roles have been changing in Western society in recent decades, and generally have become more flexible. However, traditional gender roles still have some influence.
For example, it used to be expected that men would experiment sexually before marriage, but that women would not. Women who went against this expectation were considered "loose" or "fallen," while men who went against it were considered less than manly.
Years ago, another expectation was that women would get married and stay home to raise a family, while men would go out to work to support the family.
If a woman chose to have a career instead, she was considered "barren" or "lacking in maternal instinct," and her partner was often considered inadequate, since it was assumed he was not a "good provider."
-Philita J.