Read ALL of this before you answer, plz.
I'm not saying you don't care, and I KNOW you've spent years and years training just to help people and cure their diseases, but what really goes through your mind when you have to tell a kid "you have cancer," or tell their parents they could die, or "you'll never be the same again"? I know there are some really, really nice doctors who do care, but I also know there are some who don't. It just depends on who it is.

A family member of mine was raped by a doctor because there were no nurses in the room, and obviously nobody checked to see if there were. She is fine now and she sued the doctor, but the doctor got his job back. I have a male doctor, and I can't even go into the room alone without one of my parents because I'm scared.
I just want to know what you think of this, and what you think of your patients. All answers are appreciated, and PLZ no rude answers.
btw, I'm NOT calling you a bad doctor, so don't say I am.