I know this isn't the typical post for here, but I wasn't sure what other community could really answer this question. Do you guys know any good, credible resources, sites, or books on gender roles and how they affect men and women in society? Specifically, how they perpetuate the ideas that men should be the breadwinners and women should cook and clean, etc. I was in an argument with my brother today, and now I want to form a well-researched opinion so he can't discount it as just emotional or whatever. I want facts to back up my opinion so he'll have to take it seriously.