The USA has fought a series of wars in foreign countries and has done terrible things in the past, yet war guilt never seems to be an issue; the country never expresses remorse. Germany openly discusses its war guilt over WW2, so why doesn't the USA? (The USA is not better, despite what some uneducated Americans like to believe.) Is it because the USA is so diverse that there is always someone else within the country to blame?