I have done a little study of the culture of Native Americans. It is my deepest desire to get back to a point where we can live the way they did. Humans have a tendency to try to dominate nature; what if the best way is to let nature take its course and do everything possible not to inhibit it? Many people claim that I am just going through the typical adolescent period of "finding myself" (I'm 16), but I do not see it that way. I can feel, deep within my being, that the way we are living will lead to no good, not only for the environment but for the human species as well. After all, we are just another species on this planet, and it is our duty to live symbiotically with the rest of the species that have, in many respects, sustained us for millennia. I am not religious in the typical sense, but what if Europeans sentenced themselves to future doom when they destroyed a civilization that was at peace with nature? This may be a rant, but I sincerely wish that things could go back to the way they were before the natural ways of life were corrupted. Does anyone think that this is even possible anymore? Opinions/feedback please.