How well do tests predict functional knowledge in medicine?


So I recently overheard another student say that while they stink at UWorld and boards, they're much better in clinical practice and basically know their stuff. What do y'all think?

 
Depends. Some people are not good at taking tests due to difficulty understanding what the question writer was asking, over- or under-thinking responses, not identifying clanging questions, etc. However, others are not good at tests because they lack key clinical knowledge. Likewise, some individuals may have great bedside manner, instincts, and physical exam skills with an excellent fund of knowledge to back it up, and some don't. Personally, I think it's best to have both: a good fund of knowledge and excellent clinical exam and reasoning skills.
 
  • Like
Reactions: 1 user
I never understood the whole "I'm not a good test-taker" argument. What part aren't you good at, the part where you have to use your knowledge to show what you know?
 
  • Like
Reactions: 30 users
I never understood the whole "I'm not a good test-taker" argument. What part aren't you good at, the part where you have to use your knowledge to show what you know?
Savage
 
  • Like
Reactions: 15 users
So I recently overheard another student say that while they stink at UWorld and boards, they're much better in clinical practice and basically know their stuff. What do y'all think?

Based on the way you've phrased the question, I'm sure you already know. Also, M3 isn't a good place to gauge your clinical skills or whether you know your stuff. There are enough useless people in every M3 class who focus on anticipating what's next rather than learning (the kind who pull up UpToDate on their phone before anyone else can) and who practice reading their notes to shine relative to others, when the whole point of M3 should be learning without being afraid of making mistakes.
 
  • Like
Reactions: 2 users
Based on the way you've phrased the question, I'm sure you already know. Also, M3 isn't a good place to gauge your clinical skills or whether you know your stuff. There are enough useless people in every M3 class who focus on anticipating what's next rather than learning (the kind who pull up UpToDate on their phone before anyone else can) and who practice reading their notes to shine relative to others, when the whole point of M3 should be learning without being afraid of making mistakes.
Does this work to improve grades?
 
  • Like
Reactions: 1 user
Based on the way you've phrased the question, I'm sure you already know. Also, M3 isn't a good place to gauge your clinical skills or whether you know your stuff. There are enough useless people in every M3 class who focus on anticipating what's next rather than learning (the kind who pull up UpToDate on their phone before anyone else can) and who practice reading their notes to shine relative to others, when the whole point of M3 should be learning without being afraid of making mistakes.
My opinion is definitely in line with the above savage poster (aside from those with legit performance anxiety), but I just wanted to hear out others. From my own experience, people who make this claim also tend not to prove themselves so knowledgeable when prompted in a "clinical setting." But yeah, M3 is pretty friggin' annoying with all these kids who know exactly how to play the game to shine in front of the attendings and residents.
 
I never understood the whole "I'm not a good test-taker" argument. What part aren't you good at, the part where you have to use your knowledge to show what you know?

Well, basically, yeah? You're also missing the second part of your sentence - the other part they might not be good at...

But seriously, a lot of data show that testing doesn't have great correlation to, well, much...other than other testing outcomes. There is a reason why there is an entire industry around testing; test-taking strategies are just as important as actual knowledge of the material. How many times have you read a stem wrong, or been tripped up by a double negative, etc., but actually knew the answer? Funnily enough, those who do well on tests, when surveyed, think that tests are more important.

From what I've seen, the MCAT is somewhat predictive of Step 1, and Step scores are fairly strongly correlated with the other Steps and shelf exams. Also, written evals do a poor job of matching up with testing:
Does the subjective evaluation of medical student surgical knowledge correlate with written and oral exam performance? - "There was poor correlation between the subjective perception and objective measures of surgical knowledge (Table 1)"
 
I feel like the only people making these arguments are people who scored low on their exams.
 
  • Like
Reactions: 3 users
Well, basically, yeah? You're also missing the second part of your sentence - the other part they might not be good at...

But seriously, a lot of data show that testing doesn't have great correlation to, well, much...other than other testing outcomes. There is a reason why there is an entire industry around testing; test-taking strategies are just as important as actual knowledge of the material. How many times have you read a stem wrong, or been tripped up by a double negative, etc., but actually knew the answer? Funnily enough, those who do well on tests, when surveyed, think that tests are more important.

From what I've seen, the MCAT is somewhat predictive of Step 1, and Step scores are fairly strongly correlated with the other Steps and shelf exams. Also, written evals do a poor job of matching up with testing:
Does the subjective evaluation of medical student surgical knowledge correlate with written and oral exam performance? - "There was poor correlation between the subjective perception and objective measures of surgical knowledge (Table 1)"
http://www.stat.columbia.edu/~gelma...ing_the_Relationships_Between_USMLE.98203.pdf
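
To put a rough number on "fairly strongly correlated," here's a minimal sketch of computing Pearson's r - the paired scores below are made up purely for illustration, not real data:

# Minimal sketch: what a "fairly strong" score-to-score correlation looks like.
# All numbers are hypothetical, invented for illustration only.
from statistics import correlation  # Pearson's r; Python 3.10+

step1 = [212, 251, 238, 225, 244, 219, 232, 256, 228, 240]  # made-up Step 1 scores
step2 = [220, 258, 240, 231, 252, 230, 235, 262, 226, 249]  # made-up Step 2 CK scores

print(f"Pearson r = {correlation(step1, step2):.2f}")  # about 0.96 for these invented numbers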
 
  • Like
Reactions: 1 user
Does this work to improve grades?

It can, because it can make you look smarter than you actually are, which convinces some people to give you stronger evals.

One of my biggest gripes with third year was that it's biased towards people who pick things up quickly and are anal-retentive. It takes me longer to get how things work (the EMR, what attendings want, who to ask for what, etc.), but I'm always willing to improve and I try to maintain a good attitude. I think it's fair that people who pick things up faster get better evals, because chances are they'll pick things up faster in residency. I just wish they also gave us some time to learn at our own pace, where we didn't have to worry about evaluations but rather whether or not we achieved competencies (good notes, well-performed physical exams, crisp presentations), and where slower students were more accepted. Fourth year is better in that way, and if you apply yourself you'll stand out a bit because people will be wondering why you're not checked out; for me, I plan to squeeze every last opportunity to see patients out of it.
 
  • Like
Reactions: 3 users
So I recently overheard another student say that while they stink at UWorld and boards, they're much better in clinical practice and basically know their stuff. What do y'all think?

In all honesty, nothing is useless. You must work hard all the way through training and your career. Some people have their focus waver early and do worse on Step 1, but then work hard for the remainder of their careers and do well. Others work hard on their tests, then get lazy and run into problems. Others work hard all the way through, and they thrive. Still others skate by, people learn not to depend on them, and they end up having problems.

tl;dr: Work hard no matter where you are or what you are doing. Making excuses for poor performance in medicine is considered a felony in the eyes of many in the profession and, in the end, will hurt you far more than help you.
 
  • Like
Reactions: 1 user
It can, because it can make you look smarter than you actually are, which convinces some people to give you stronger evals.

One of my biggest gripes with third year was that it's biased towards people who pick things up quickly and are anal-retentive. It takes me longer to get how things work (the EMR, what attendings want, who to ask for what, etc.), but I'm always willing to improve and I try to maintain a good attitude. I think it's fair that people who pick things up faster get better evals, because chances are they'll pick things up faster in residency. I just wish they also gave us some time to learn at our own pace, where we didn't have to worry about evaluations but rather whether or not we achieved competencies (good notes, well-performed physical exams, crisp presentations), and where slower students were more accepted. Fourth year is better in that way, and if you apply yourself you'll stand out a bit because people will be wondering why you're not checked out; for me, I plan to squeeze every last opportunity to see patients out of it.

I only work with residents and not med students at the moment, so it's not something I need to worry about, but I've really never figured out what I want to see from students or how to grade them most fairly. They all come in knowing basically nothing; some learn more than others, and some know more but aren't as proficient at the "doing" parts of the rotation. Some are super helpful and try hard but never quite 'get it'. I'm colored by my own awful experience as an M3. I mostly wish I could just focus on narrative evals for pass/fail clinical grades, but schools that actually do that are few and far between.
 
  • Like
Reactions: 3 users
I never understood the whole "I'm not a good test-taker" argument. What part aren't you good at, the part where you have to use your knowledge to show what you know?

I think it's that 60-70% of what is asked on standardized medical tests is stuff you rarely, if ever, see in clinical practice, so being good at it doesn't have as much clinical relevance as you might expect. In actual practice it's much better to be able to efficiently handle the common things you see over and over again, whereas you can always look up the less common stuff if you need to.

Another issue is that standardized medical tests assess your knowledge broadly, whereas clinical practice tends to be narrow, and you get a ton of practice in your specialty during residency. So what's the difference between a cardiologist who got a 257 on Step 1 and one who got a 230? The one who got a 257 probably knows more about peds, psych, and OB/GYN than the one who got a 230. That's not really relevant in clinical practice, as both probably know everything they need to know about cardio. So that's what it means when people say they aren't good test takers but are good clinically.
 
  • Like
Reactions: 1 user
I only work with residents and not med students at the moment, so it's not something I need to worry about, but I've really never figured out what I want to see from students or how to grade them most fairly. They all come in knowing basically nothing; some learn more than others, and some know more but aren't as proficient at the "doing" parts of the rotation. Some are super helpful and try hard but never quite 'get it'. I'm colored by my own awful experience as an M3. I mostly wish I could just focus on narrative evals for pass/fail clinical grades, but schools that actually do that are few and far between.

The AAMC Core Entrustable Professional Activities for Entering Residency are a good starting point. They lay out core competencies that 3rd and 4th year medical students should be learning, which include the ability to interview, examine, work up, and present a patient, both in oral and written form. They also include being able to work as a team and being able to do basic procedures like starting IVs, putting in catheters, and so on. This is what 3rd and 4th year are designed to teach medical students, what residency programs expect them to know on day 1, and what they should be assessed on. Many medical students do not realize this and focus their clinical efforts either on surviving or on other things which, while still possessing some value, are not the most valuable things they could be doing.

Literally the most valuable skill set a medical student can have is the ability to interview and examine a patient, come up with a reasonable assessment and plan, including further workup as necessary, then communicate their findings and assessment to other medical professionals, both orally and in writing. This process is the backbone of every medical specialty. If you can do this effectively and accurately as a medical student, you’re ahead of many interns in my opinion. This is what you should assess medical students on and how you should give them feedback for improvement.
 
  • Like
Reactions: 2 users
My opinion is definitely in line with the above savage poster (aside from those with legit performance anxiety), but I just wanted to hear out others. From my own experience, people who make this claim also tend not to prove themselves so knowledgeable when prompted in a "clinical setting." But yeah, M3 is pretty friggin' annoying with all these kids who know exactly how to play the game to shine in front of the attendings and residents.

I'm one of those people who did relatively poorly on boards and tests but did really well in the clinical years (per shelf scores and attending evals). Imo there are three possible explanations when someone is legitimately better in the clinic than during the first two years.

The first is learning style. Some people are such strong kinesthetic learners that they struggle to learn anything they can't do "hands on." They're the people who do mediocre on written tests and then blow everyone away on the anatomy practicals. This is sometimes the case, but even those people should be able to do decently on written tests if they implement the proper study habits (which not everyone does).

The second is the type of material being learned. As someone pointed out, during the first two years there's a TON of minutiae that you have to learn for boards, much of which is going to be 100% irrelevant to you once you finish boards. While there is still minutiae in the clinic, most of it is far more relevant to the cases you're seeing, and it's easier to understand why it's important to know. For example, I couldn't care less that the translocation that creates the Philadelphia chromosome produces a BCR-ABL fusion with tyrosine kinase activity that may cause a patient to develop ALL, but I care very much that I should be careful starting certain antibiotics (like sulfas) in a patient currently taking lamotrigine who has a history of rash, because I don't want them developing SJS. If you look at my grades from the first two years, every section I'd do terribly on the first test (which was heavy on the basic sciences) and then do extremely well on the second test (aka our "clinical medicine" test), which left me with mostly low B's and C's. Same thing with my Step breakdowns (I'd do great on the clinical portions but pretty poorly on the other sections). For someone like me, it's pretty easy to look at which sections I dominated in the pre-clinical years and expect me to perform much better on actual rotations, which I did.

The third point, which I think is the most commonly applicable to students who do poorly on tests but well in the clinical setting, is actually gathering information. Boards do not test this at all. You get a question stem with everything already laid out; you just have to be able to interpret it and pick the best of 5 options. That's completely different from giving someone a patient whose CC is "headache" and telling them to figure out what's wrong and develop a treatment plan. I think this is the area where people who are mediocre or poor at testing can improve the most and shine in the clinics, because a 260+ Step 1 score doesn't mean jack if a student can't get a decent history from their patients. For me, this was the thing that really made me stand out on clinical rotations more than anything else. I'd catch things in my patient interviews that even the attendings sometimes missed (usually because I got more time with the patients, but sometimes because I legit asked a question that no one else had thought of yet). Being able to take a history, perform an accurate physical, make a diagnosis, and develop a treatment plan (all from scratch) takes a completely different skill set than just reading a question and picking the right answer from 5 choices. Some people are really good at it, some just aren't, and I don't think Step scores are necessarily a decent predictor of how well a person can actually function in the clinic.

Keep in mind, I'm not saying that those who do poorly on boards are going to magically be rock stars on the wards, or that doing exceptionally well during the early years won't give people a leg up later on. I'm just pointing out that the skills needed to do well on tests are only one part of what actually goes into making a good physician. So yes, there are people who do far better during the clinical years than the pre-clinical ones, because their strong points go beyond regurgitating information (which is what boards most heavily test). Of course, there are also people who are just trying to make themselves feel better about being poor students in general, and I don't think you can really determine who falls into which of those two categories without seeing how they function in a clinical setting.
 
  • Like
Reactions: 5 users
Do real patients spit out 5-7 answer choices after I do my H&P?
 
  • Like
Reactions: 5 users
Yes, but it isn't really super common. Some people are better than others at getting a good H&P and knowing what's relevant/serious and what isn't. This can drastically shape your differential.
 
  • Like
Reactions: 1 user
And frequently involves an organ system that has nothing to do with what's actually wrong or some super rare disease that doesn't exist...

A more realistic Step 2 would have you click through a series of boxes after every multiple choice question.


Are you SURE the answer isn't "D: Pheochromocytoma?" [yes/no]

[no]

But the patient read about the disease online and the symptoms seem to fit. Are you still sure? [yes/no]

[no]

How long have you been a doctor? Have you actually seen and treated a patient with "D: Pheochromocytoma?" [yes/no]

[yes]

Are you sure you're not dismissing this patient's concern for having "D: Pheochromocytoma" due to racial bias? [yes/no]

[yes]

Yeah? Can you prove it? [essay prompt]

[types essay]

Well, either way the patient is really anxious. They took some Xanax from a friend and it made them feel better. How about Rx'ing some for her? [yes/no]

[no]

Do you actually care about this patient's anxiety? How can you be so heartless? [essay prompt]

[types essay]

The National Board of Medical Examiners would like to inform you that you are cold and heartless and didn't care about this patient at all.




/which I guess isn't that different from the end of Step 3 or the current vignette question sets, but for realism, you need it after EVERY question... :shifty:
 
  • Like
Reactions: 11 users
Ooooh, another idea:

During timed question blocks, you suddenly get a pop up that a patient has walked in and urgently needs FMLA paperwork, and you have to complete a 4 page PDF before being able to return to your questions, with the clock running the whole time.

/You're all lucky I have no desire to ever move back to Philly, otherwise I'd start working my way up the ladder at the NBME right now.
 
  • Like
Reactions: 12 users
I think the info gathering and presenting is a great argument, but to suggest somebody would know a diagnosis in the clinical setting when they can't identify it when it's literally in their face is just asinine.
 
  • Like
Reactions: 1 user
Most of the attending physicians I have worked with basically admit that they've forgotten 85% of what was on the boards and focused hard on their specialty. Boards and clinical practice do have some correlation, most likely, as specialty boards tend to cover knowledge one actually uses in practice and pass rates correlate well with Step performance.
 
  • Like
Reactions: 1 user
Most of the attending physicians I have worked with basically admit that they've forgotten 85% of what was on the boards and focused hard on their specialty. Boards and clinical practice do have some correlation, most likely, as specialty boards tend to cover knowledge one actually uses in practice and pass rates correlate well with Step performance.

The ABPN exam wasn't the worst thing I've ever taken, but it made me appreciate how much better written the Steps' questions are by comparison.

Plus I took it the year that half the exam was about dissociative fugue for some reason.
 
  • Like
Reactions: 1 user
The ABPN exam wasn't the worst thing I've ever taken, but it made me appreciate how much better written the Steps' questions are by comparison.

Plus I took it the year that half the exam was about dissociative fugue for some reason.
Step 1 was the most well written, fair, and objective exam I've ever taken in my life. Everything else is a comparative disappointment, like when you have a meal prepared perfectly at one restaurant and then can never enjoy the same dish elsewhere because it just doesn't measure up.
 
Step 1 was the most well written, fair, and objective exam I've ever taken in my life. Everything else is a comparative disappointment, like when you have a meal prepared perfectly at one restaurant and then can never enjoy the same dish elsewhere because it just doesn't measure up.
Even compared to the MCAT? I thought that was a fair exam.
 
  • Like
Reactions: 1 user
Step 1 was the most well written, fair, and objective exam I've ever taken in my life. Everything else is a comparative disappointment, like when you have a meal prepared perfectly at one restaurant and then can never enjoy the same dish elsewhere because it just doesn't measure up.
I disagree on the basis that the misery of step 1 could never be comparable to delicious food.
 
  • Like
Reactions: 1 user
People keep saying that Step scores don't correlate with how good a doctor you become. But do MS3 grades do any better a job of that? To me, MS3 grades were a crapshoot, mostly dictated by how much you could stand out, not necessarily how competent you were.
 
  • Like
Reactions: 1 user
I think the info gathering and presenting is a great argument, but to suggest somebody would know a diagnosis in the clinical setting when they can't identify it when it's literally in their face is just asinine.
I think motivation is a factor. Some people simply can't get motivated to study ultra-hard for tests. Plus, a lot of boards and exams are about memorizing buzzwords, etc.
I have seen some students drastically change their motivation level when they hit the clinical floors. As such, they absorb every bit of relevant info despite still not knowing their enzymes and receptors :)
 
People keep saying that Step scores don't correlate with how good a doctor you become. But do MS3 grades do any better a job of that? To me, MS3 grades were a crapshoot, mostly dictated by how much you could stand out, not necessarily how competent you were.

The dean of clinical affairs at my med school loved to bring up a study showing that clinical evals were the best predictor of resident performance (basically, clinical evals as a med student correlated with clinical evals as a resident, however important you consider those).

She was also the lead author on the study, so I took anything she told me with a massive grain of salt. I will admit there's at least a kernel of truth there, though. People like me, who were apathetic, burned-out M3-4s but much better residents, are more likely to be the exception than the rule.
 
  • Like
Reactions: 1 user
People keep saying that Step scores don't correlate with how good a doctor you become. But do MS3 grades do any better a job of that? To me, MS3 grades were a crapshoot, mostly dictated by how much you could stand out, not necessarily how competent you were.
Have the MS3 see a patient with multiple complaints and a long list of pathologies. If they come out with a reasonable plan, they'll be a competent physician. Evals are horse ****.
 
Even compared to the MCAT? I thought that was a fair exam.
Way better than the MCAT, if only because the preparation leading into the exam is pretty standardized. The MCAT is so broad and its questions can be approached in so many more ways, whereas the Step exams are much, much more straightforward.
 
  • Like
Reactions: 2 users
Way better than the MCAT, if only because the preparation leading into the exam is pretty standardized. The MCAT is so broad and its questions can be approached in so many more ways, whereas the Step exams are much, much more straightforward.
Speak for yourself; my professors make it their mission to include stuff that is only found on subspecialty boards.
 
  • Like
Reactions: 3 users
Speak for yourself; my professors make it their mission to include stuff that is only found on subspecialty boards.
I suppose your problem was listening to your professors to begin with; I ignored them for the most part and went 100% for the boards from day 1.
 
  • Like
Reactions: 4 users
I never understood the whole "I'm not a good test-taker" argument. What part aren't you good at, the part where you have to use your knowledge to show what you know?
This clip comes to mind: [embedded video clip]
 
  • Like
Reactions: 2 users
It can, because it can make you look smarter than you actually are, which convinces some people to give you stronger evals.

One of my biggest gripes with third year was that it's biased towards people who pick things up quickly and are anal-retentive. It takes me longer to get how things work (the EMR, what attendings want, who to ask for what, etc.), but I'm always willing to improve and I try to maintain a good attitude. I think it's fair that people who pick things up faster get better evals, because chances are they'll pick things up faster in residency. I just wish they also gave us some time to learn at our own pace, where we didn't have to worry about evaluations but rather whether or not we achieved competencies (good notes, well-performed physical exams, crisp presentations), and where slower students were more accepted. Fourth year is better in that way, and if you apply yourself you'll stand out a bit because people will be wondering why you're not checked out; for me, I plan to squeeze every last opportunity to see patients out of it.
You mean, like... intelligent people???
 
  • Like
Reactions: 2 users
I thought it was a wonderful challenge. Like, surviving that exam is one of the greatest accomplishments of my life.
If I can make it through biochem review I’ll feel a lot better lol
 
  • Like
Reactions: 1 user
For me at least, it's not usually the question stem/diagnosis I have trouble with, it's the answers. Unless it's something I literally didn't study or see in clinic, I can usually figure out the diagnosis, and in a clinical setting I would be able to sit down and tell you what labs and imaging I want to order, what I think is wrong, what I want to rule out, how we should approach treatment, etc. But on exams the question is usually what to do next, and you have to pick ONE thing when in reality you would probably be doing multiple answer choices at once. Maybe I'm overthinking the answer choices, idk. And then of course the rote memorization questions along the lines of "which of the following 5 similar-sounding enzymes is deficient in this incredibly rare disease" are where I really struggle, and that's a lot of what M1/M2 and Step 1 are, which is what I performed worse on (I've been average/above average on most shelf exams).
Enzyme questions really aren't that common on Step 1 anymore, so it's not a valid excuse. I don't really know what to make of the rest of your statement. I guess it would also depend on how poorly a person does on Step 1. I can see the argument made for a 230 or 240, but a 200? No way.

Say two students are equal in their information-gathering ability - the student with more knowledge, as indicated by higher board scores, should be able to formulate a better differential without the help of outside resources that just list differentials for you. This is the knowledge I'm referring to. When a student is prompted with questions from the attending without the ability to Google, the student with the higher boards should know more. In my experience this has been the case. If this is true, how does it not translate to better clinical knowledge overall? And I'm not talking about specialized docs here, just general stuff.
 
  • Like
Reactions: 1 user
It's easier to write less g-loaded tests. The MCAT, LSAT, and GRE all take a higher level of thinking to absolutely dominate, and to write good questions for. Step 1 is largely about having adequate reasoning coupled with a Herculean work ethic and memory.
 
Have the MS3 see a patient with multiple complaints and a long list of pathologies. If they come out with a reasonable plan, they'll be a competent physician. Evals are horse ****.
1. COPD
-appreciate pulm recs

2. ESRD
-appreciate renal recs

3. T2DM
-appreciate endocrine recs

4. tummyache
-appreciate GI recs

5. stubbed toe
-RPR r/o tabes dorsalis
-outpt ortho f/u

6. excessive complaints
-dispo to inpt psych
 
  • Like
Reactions: 4 users
People keep saying that Step scores don't correlate with how good a doctor you become. But do MS3 grades do any better a job of that? To me, MS3 grades were a crapshoot, mostly dictated by how much you could stand out, not necessarily how competent you were.

I don't think you'll get a straight answer on this because of how inconsistently 3rd year grades are determined. At my school, shelf exams count for 100% of the grade, and preceptor evals only count towards the dean's letter (unless you do something egregious enough to warrant a failure). At other schools, evals may count for 50% or more of the clinical grade, with shelf exams having minimal impact. So using grades as a predictor is a poor metric, imo.
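
As a minimal sketch (with made-up numbers) of why that inconsistency matters, the same two students can swap rank depending on how a school weights shelf scores vs. evals:

# Hypothetical illustration: two students under two different weighting schemes.
shelf = {"A": 0.90, "B": 0.75}  # made-up shelf exam percentages
evals = {"A": 0.70, "B": 0.95}  # made-up preceptor eval scores

for w in (1.0, 0.5):  # school 1: 100% shelf; school 2: 50/50 shelf and evals
    grades = {s: w * shelf[s] + (1 - w) * evals[s] for s in shelf}
    print(f"shelf weight {w:.0%}: {grades}")
# 100% shelf ranks student A higher; a 50/50 split ranks student B higher.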

If you only look at shelf scores, you run into the same problems as using only Step 1. At the same time, I don't think attending evals are a very good metric either, as different attendings have different standards and consistency becomes an issue.

Personally, I'd like to see better implementation and standardization for Step 2 CS, as well as actual scores, since it's the only consistent metric that measures students' ability to take a history, gather info, and create a treatment plan. No, it might not be relevant to every field, but I think it would probably be a better metric for clinical performance than the others.
 
  • Like
Reactions: 1 user
One thing I just realized - students entering med school are much more prepared for preclinicals than they are for clinicals. Every student has been studying and taking written tests for years before med school; very few have real clinical experience. Given this, I think preclinicals and M4 should hold the most weight.
 
  • Like
Reactions: 1 user
One thing I just realized - students entering med school are much more prepared for preclinicals than they are for clinicals. Every student has been studying and taking written tests for years before med school; very few have real clinical experience. Given this, I think preclinicals and M4 should hold the most weight.
Very few medical students have held real jobs before medical school starts. I assume that also plays a role here.
 
  • Like
Reactions: 1 user
Very few medical students have held real jobs before medical school starts. I assume that also plays a role here.
DING DING DING! This is the saddest part about talking to my classmates. Digging proverbial ditches before school would not only make a lot of these people do exponentially better during the clinical years, but would also make them much less insufferable.
 
  • Like
Reactions: 2 users
Say two students are equal in their information-gathering ability - the student with more knowledge, as indicated by higher board scores, should be able to formulate a better differential without the help of outside resources that just list differentials for you. This is the knowledge I'm referring to. When a student is prompted with questions from the attending without the ability to Google, the student with the higher boards should know more. In my experience this has been the case. If this is true, how does it not translate to better clinical knowledge overall? And I'm not talking about specialized docs here, just general stuff.

Just my experience, but some of the people who get the highest scores aren't the ones best able to apply knowledge. As soon as you throw them a patient/disease presenting atypically, they're lost; it doesn't fit what First Aid said, and that's what they memorized. I think that's why it's good to have a mix of different things being looked at: evals for things like OSCEs, tests to gauge fundamental concepts, and clinical evals. They all have drawbacks, but using them together at least helps sort out the picture a bit.

DING DING DING! This is the saddest part about talking to my classmates. Digging proverbial ditches before school would not only make a lot of these people do exponentially better during the clinical years, but would also make them much less insufferable.

That's one thing I love about my class: we only have a couple of people who didn't have some sort of real-world job. The average age of our class was north of 26, which I think really helped with that.
 
  • Like
Reactions: 1 user