As a woman, I have always felt that deep down America is still very sexist. Women in sports are still mostly invisible, and the same goes for women CEOs, women in politics, women's pay, and so on. I'm not Black, though, so I don't have that experience. Anyway, in your opinion, which do you get discriminated against more for?