The media isn't portraying teachers very well these days, but really, the media doesn't portray anyone or anything well anymore.
I've been thinking about a teaching or nursing career for several years now, but each time I lean more toward teaching.
The idea of working with little children (I'm thinking about starting in elementary teaching) sounds like a great pleasure to me. Then maybe, when I'm older, high schoolers.
In this day and age, a lot of children are downright brats, and it seems respect for teachers has fallen well below what it was ten, fifteen, even twenty years ago.
The pay (from what I understand) is horrible in many places.
But still... our new President should change that.
What are your views on the teaching field?
As a teacher, would you recommend it?