It seems to me that a couple hundred years ago, the people who formed the United States had a different idea. If you wanted something, you worked for it. If someone helped you out, you were embarrassed if you did not do something for them in turn. When your neighbor's barn burned down, you helped him rebuild it, and then he came and helped you gather in your crops in the fall.
Those who fought for America did it to earn their own self-respect, not to look for a free ride.
Now I watch the politics of today and see that many people believe the government's job is to take care of the American people. Whether the aim is to make us dependent on the government or to give handouts in exchange for something in return, I don't know, but watching all those with their hands out waiting for a free ride sickens me.
I think it may be important to let history teach us what America was founded on before we choose someone who will just give us a handout instead of a useful hand.
I'd love to hear your thoughts.