I had a couple of encounters this week in which family members told me outright that doctors care more about money and prestige than about taking care of patients. I told them that, based on my experience as a physician, the overwhelming majority of physicians love what they do and care deeply about their patients.
They both went on a rant about how docs are aloof to the problems of common folks. Why does the general public have that kind of sentiment about us? I don't get it, because most of the docs I've worked with are genuine people who care about the well-being of their patients.