Question:

Which is more important: a college degree or work experience?

4 ANSWERS


  1. I think it depends on the job or career. Some jobs want experience and some want degrees. I personally think experience is a lot better, because a degree means you have the education, but not the experience to draw on when making a decision.


  2. It depends on what it is needed for. The answer is both, but if it had to be one, I would choose a college degree for the better-paying jobs. I could get a cashier job anywhere with some experience, but I could not get a job as a therapist or an engineer without a degree in that field.

    Good question.

  3. A college degree, normally. You can have all the knowledge and experience in the world, but a lot of employers won't even look at you twice unless you have a degree on your resume that proves what you know. At the same time, some people won't hire you if you have a degree but no experience whatsoever. I personally think the best thing to have is a college education from a school that includes hands-on experience as part of its curriculum. The best of both worlds!

  4. I would have to say a college degree. At the very least, your paycheck will reflect it.
