On some level I think there is a motivation to "do it for the patients." Last year I remember calculating that I needed to get something like a 15% on my physiology/histology final to hit the 75% overall passing mark for the class. This is a pass/fail school. I probably could have skipped the last three weeks of class and labs, not studied at all, and still gotten far better than a 15% on that final. I would, however, have felt guilty about doing so because of the material I would have been choosing not to learn. Cardiac phys/histo, in case anyone was curious. I even remember telling a friend that if I hadn't studied it, I'd feel like I would kill my future patients with cardiac problems.
I think there's enough redundancy built into the system that this isn't a serious concern. (Unless a person is actually grossly incompetent...) Between the redundancy within a class and the redundancy from year to year, like another poster said, most of us end up learning the "bread and butter" facts.
And even failing that, once we get into residency, we'll pretty much be spoon-fed the meds they want us to prescribe for particular situations, etc. Then there's the see one, do one, teach one model. In short, there's plenty of supervision going on as we build up experience. We all end up knowing how to treat the basic stuff.
BUT... I think where the effort does come into play is with the non-"bread and butter" stuff. Like another poster mentioned, it's that stuff that separates the docs from the other healthcare providers. And it's that stuff that the "slacker" doc might not get.
And so the patient's rare disease goes undiagnosed for a long time. (Totally rare occurrence, right? 🙄) Anything more complicated or subtle (not necessarily even rare) risks getting glossed over by the doc who diagnoses based on a few "rules of thumb" learned mostly on the job, rather than thinking things through at a "deeper" level (i.e., the level we're learning things at in our basic science years).
It sort of depends on the specialty, though. Some specialties seem quite amenable to a "learn on the job" mentality. Derm? Seems heavily based on pattern recognition... I'm not sure how much of the pre-clinical stuff one would really need to retain to be a great dermatologist. (I mean, do you really need to think through the pathophys, or is it more about visually recognizing patterns?) Whereas for something like internal medicine, with its huge ddx, a strong pre-clinical base of knowledge seems considerably more important.
Anyway, to answer the OP... Am I doing it for the patients?
Yes, sure. But I think I would put in the same effort for any other job, because I want to feel competent and do well in my work. I'm not a super outcomes-based person, which is a weird thing to say as a "professional school student". For me, it's more about wanting to understand (along with hating the feeling of not understanding things and not being able to solve problems).
The thing about med school (or professional schools in general) is... it churns out professionals. So in that sense, none of us are irreplaceable. If I'm not around to be that ER doc (or whatever), another doc punches in and does the job. Since I'm not providing a "unique" service or contribution to any given patient, I can't honestly claim that I'm in this totally "for" that patient.
I don't know if that made any sense, but I'm tired and it will have to suffice.
😴