I won't lie and whine about how unhappy I am, because I'm in a pretty enviable set-up right now. I'm going to college and studying to be a professor. The career has a lot of things going for it--you get to use your noggin, work with really bright kids, and the pay is pretty decent.
The problem is that I don't feel like the job is IMPORTANT or USEFUL. The area I study doesn't deal with real-world problems; think more along the lines of writing whole books dedicated to "does that banana represent a p***s?" and "was Dracula actually g*y?". I truly, truly feel that I would be doing my fellow man more good by mowing lawns. Having a job that helps others has always been a big priority of mine, and I'm worried that this one doesn't.
So, your thoughts? Is doing work that's meaningful to you vital to being happy with your job, or is it enough for the job to pay the bills and be enjoyable?