I was educated in England, and history was taught in such a way that I grew up believing that all the conquering England had done, of both neighbouring and distant countries, was a good thing because of the "civilization" it brought to those places. The reality is that once-rich communities and cultures were destroyed, people were turned into slaves, and their lands and wealth were stolen. Children in Wales who were caught speaking Welsh had a token hung on a rope around their necks, much as the Nazis marked out the Jews. Why aren't we taught this in English schools? Why wasn't I taught that the English blew Indians from the mouths of cannons for refusing to bite gun cartridges greased with pork fat, a cynical and gross violation of their sacred beliefs? Why are the English so proud of their heritage, when in fact it is a heritage of cruelty, greed and oppression? The only reason they advanced as they did was on the back of stolen wealth. It is time English children were told the truth.