I ask because I have learned about several things that make me question whether Germany has really changed its ways since the days of Hitler's dictatorship. To state it frankly, it seems as if Germany is so dead set on not being an evil, Hitleresque country that it ironically takes a dictatorial approach to making the German population submissive, 'tolerant' and 'worldly'. I will just point out several instances, and allow you all to contribute to or refute my curiosity.
In Germany it is illegal to publicly advocate for Hitler, the n**i party, or anything related to that era in German history. Even games like 'Wolfenstein', which depict n**i events in WW2, are banned from the country. This is an obvious sign of a restriction of speech (as opposed to America's freedom of speech). What are your thoughts?
The German government is currently forcing (repeat, forcing) certain cities to purchase solar panels in an effort to make those cities green. Going green is good, but is forcing something on people pushing the limits of acceptable government behavior?
German citizens are now at the 'forefront' of combating human rights violations, being among the first and most outspoken on issues such as the current Tibet-China storm. I understand a change of viewpoint following the dismantling of Hitler's regime, but is this fervor something the Germans legitimately feel, or is it a facade put on to sever all ties between the current Germany and n**i Germany? I know it sounds cynical and a little presumptuous, but I can't help getting the vibe that they haven't really changed since WW2 and do these things to sort of.....cloak their true intentions. Please, what are your thoughts?