I was thinking: do we women give Western men throughout history enough credit for taking care of women?
Worldwide there is a history of men dominating women, and that has been true of the Western world as well. BUT here in America and most of Europe, men have protected women against most abuses, i.e. it's NOT OK to rape women. Women have had decent housing most of the time, running water was provided relatively quickly, efficient household appliances were invented, and sanitary options were made available so women don't have to be like those in Africa trying to create makeshift sanitary napkins, etc. I look at the Middle East and Africa and I have to say, maybe we don't credit men in America enough? Without the laws they upheld we never could have even had a successful feminist movement. Some men are creeps, but maybe most Western men are OK.
What do you think?