Some Americans do have better teeth; others have far worse. We English receive free dental care, while roughly twenty percent of American adults have no dental insurance at all.
Why do American visitors to this site say we have bad teeth? Do they see it on TV and assume it's true?
(The same applies to their view of their own nation's teeth.)
I guess my question is "How could a whole nation allow itself to continually believe everything it sees on TV?" e.g.:
Weapons of mass destruction.
The war's not about oil.
Guns don't kill people. People do! (I don't believe Americans are simply evil, yet they have one of the highest murder rates in the developed world.)
Bush won in Florida. (How was that allowed to happen, in the US of all places?!)
Oh yeah, everyone hates Americans. (Not true.)
Black Americans are treated the same as everyone else.
Americans are intrinsically better people.
Evil is innate.
We attend church every week or more. (When surveyed respondents were actually observed, most were found to be lying; they later said it was because churchgoing was the American way.)