Question:

Does society teach gender roles or is it biology?



I was just wondering: do men and women act the way they do because of their biology, or is it what we learn from our parents and society that makes us act the way we do? By that I mean traditional gender roles, like certain things men are supposed to do and certain things women are supposed to do.


2 ANSWERS


  1. I think it's a combination of both.  One study I learned about in psychology gave baby monkeys (who hadn't experienced any sort of social training) toys to choose from, and most of the male monkeys chose the toy truck, while the females picked the doll.  Also, in terms of the fight-or-flight response, females show an additional response called "tend and befriend," in which they try to make allies.  From a biological perspective, many aspects of gender roles make sense: on average, males tend to be stronger and more physical, so it makes sense that they would fight, whereas females, who are for the most part less physically strong, would find it more advantageous to gain allies and resolve conflicts through less physical means.

    On the other hand, many aspects of gender are purely societal. This can be seen in the differences in gender roles across societies and cultures: if gender roles were entirely biological, they would be the same regardless of where you were. Since they aren't, there is clearly a social aspect to them as well.


  2. Biology all the way honey!!!

