I mean, this is from a black perspective here, but I was just wondering about this. It's like there's virtually nothing in the textbooks about black people outside of slavery and the civil rights movement. Even with world history, the only things we learned about were Europe and Asia. It's probably just that I went to a crappy school lol.
But I notice it in other things. It's just like--"you have no history. You come from nothing, you were never anything, and the only thing that anyone who looks like you has ever done in history is be a slave and fight for equality. Nothing good comes from Africa." I even asked my history teacher about it and he just said "nothing was going on there." Even now, the only things you ever see about Africa are war, poverty and starvation. Those are all things we should be concerned about, but there ARE cities there, with high-rise buildings, businesses, etc. So how come we never see THAT on TV? Outside of South Africa, I mean...?
Do you think that if the history of African civilizations was taught along with the other stuff from Europe, Asia, India, and Egypt, black people (in the US) would have more self-awareness? Would it assuage the feeling of inherent failure and uselessness if we recognized our place in history and saw that, despite popular opinion, we really ARE capable of creating, organizing and leading our own societies?
What do you think?