When white people become the minority in America, what will happen to this nation? I do not mean to be racist in any way (I am not even white), but it seems that majority-white countries are among the most peaceful and wealthy. Do we really want a United States that isn't majority white?
And again, I am not even white, so don't call me out for being racist. I just feel that for this nation to succeed, there needs to be a strong majority of one race.