So, the forum title's not very descriptive, but it was the shortest way I could think of to sum things up. To expand a bit: I recently saw a post on Tumblr (Please keep your collective groaning to a dull roar.) insisting that the phrase "Life is unfair, get used to it" and the philosophy behind it are a product of patriarchal society - that it's a philosophy men espouse, which is why it's used and believed. A matriarchal society would say "Life is unfair, so we should make it fair." The implication there is pretty clear - men are concerned with power and lack sympathy, so they don't care if life is unfair, while women are somehow inherently more empathetic and compassionate, and would build a better, more nurturing society overall.
Now, to be clear, I don't AGREE with that. In addition to being pretty sexist towards both genders, it's more of that awful "Women are great and perfect and men are pigs" crap. But if that post and its 20,000-something notes are anything to go by, some people obviously do agree with it.
So that got me thinking - do you think society would be much different if it were, historically speaking, dominated by women instead of men? If the gender roles were reversed? If so, how? I'm curious to see some discussion on this.
TLDR: How do you think society would be different if it were dominated by women? Would it be different at all?