It seems that society, or patriarchy, has the goal of keeping women from direct power by encouraging them to marry, manipulate, and be financially dependent on a man. Religious and governmental forces seem to reinforce these agendas. Now that women own property, vote, and work, we are less reliant on gaining power and status indirectly through a man. But there is still heavy messaging that keeps women believing traditional roles are good for them and everybody else. What is the best way for women to gain better and broader access to direct power?