There is a lot of anti-doctor sentiment in the media, whether direct (doctors make too much, doctors don’t care about patients, etc.) or indirect (nurses provide the same level of care, etc.), but seemingly far fewer pieces supporting doctors or criticizing similarly competitive fields. I understand that there will always be people who criticize those in society who make more money or (debatably) garner more respect. I also understand that taking good care of patients is far more important to practicing physicians than dedicating time to defending the profession in the media.
But it seems absurd to me that there is so much less criticism of corporate lawyers, investment bankers, and consultants, careers many doctors could have pursued instead (especially those who went to a competitive undergrad). The sole purpose of those corporate jobs is to make money (and they usually make more, sometimes much more, than physicians), and those industries have a historical disregard for the lower and middle classes that is ripe for being ripped apart in the media. The main purpose of being a physician, on the other hand, is to treat patients and better society. Yet despite this (Occupy Wall Street aside, which was definitely significant), anti-doctor pieces seem to far outnumber pieces criticizing other competitive professions.
My question is twofold:
1) Why do doctors get so much flak in the media, especially when compared to other similarly competitive professions, and
2) How can this be changed?
Disclaimer: I’m applying to med school. My only real knowledge of the medical profession in society comes from having several family members as attending physicians.