Question:

When did health care begin?

What is the history of health care in America?

2 ANSWERS


  1. America as in the USA? You don't have any, as far as I know.


  2. Before doctors were licensed, health care could be provided by anyone.  Many older women treated ailments because they had seen or survived most of what people suffered from.  They had learned herb lore, practical first aid, and treatments from others.

    England began licensing doctors in the late 1500s.  This meant that doctors had to attend university, serve an apprenticeship (now called an internship), and then be examined by their peers.  Those who practiced without a license could be jailed (I had an ancestor who was).

    Doctors were still in such short supply that the wise woman and the midwife continued to practice throughout the colonial era and into the western expansion era.  

    Today's wise women still step in and give medical care when a doctor is not available: they are licensed nurse practitioners and physician assistants.  Oh, and midwives also go to school and get licensed.

    Health care has remained in the private sector (not publicly funded) throughout American history.
