I've been talking with girls I know who consider themselves feminist.
I used to think I was pretty independent, emancipated, and probably even a little feminist. Until I talked to them. To them, everything that requires a woman to be beautiful seems evil. Everything that doesn't promote total peace amongst women seems evil.
So how about this? I like looking at beautiful men, so I'm not going to comment on men wanting to see naked women. And if women want to make a living that way, then that's their choice, and I won't tell them not to. I do think women are equal to men in intelligence... I just don't see the point of fighting the human urge to look at each other or to show off our bodies.
So I guess I never was and never will be a feminist?