Are we about to witness a return to high Step 1 fail rates?

I agree; I'm not sure why we normalized studying full time for two months for a test after two full years of med school, when we were already taking tests during those two years to assess our knowledge.

To the people saying there is a significant knowledge gap for those who aren't able to pass: draw the TCA cycle out for me. We forget these small details anyway. We learn the info only to pass Step 1, then it's gone, but the cool thing is you can look it up if you need it.
Step 1 was far more about conceptual thinking than gross knowledge. It is third-order processing, and honestly it was the best and most fair exam I've ever taken. Everything I didn't know, I knew I'd seen at some point, and it was my fault for not studying harder. The foundation of Step 1 has been enormously helpful to my practice and ability to understand information presented in widely varying clinical studies. It's a good test because it evaluates a strong foundation. Many of the errors I see on the part of NPs are related to Step 1 material.

 
  • Like
Reactions: 13 users
No, I'm studying for it now.
A word to the wise, don’t memorize the TCA. That won’t be on the test.

People like to pretend Step 1 is esoteric basic science factoids unrelated to the day to day of medical practice. Couldn’t be further from the truth.
Step 1 was far more about conceptual thinking than gross knowledge. It is third-order processing, and honestly it was the best and most fair exam I've ever taken. Everything I didn't know, I knew I'd seen at some point, and it was my fault for not studying harder. The foundation of Step 1 has been enormously helpful to my practice and ability to understand information presented in widely varying clinical studies. It's a good test because it evaluates a strong foundation. Many of the errors I see on the part of NPs are related to Step 1 material.
This. Step 1 level info is foundational for day to day clinical practice.
 
  • Like
Reactions: 14 users
A word to the wise, don’t memorize the TCA. That won’t be on the test.
Haha, I know, but that was the first thing that came to mind, and it's the stereotypical thing premeds and med students complain about learning and then forgetting multiple times. But specific steps and cofactors are important.
 
  • Like
Reactions: 1 user
Step 1 was far more about conceptual thinking than gross knowledge. It is third-order processing, and honestly it was the best and most fair exam I've ever taken. Everything I didn't know, I knew I'd seen at some point, and it was my fault for not studying harder. The foundation of Step 1 has been enormously helpful to my practice and ability to understand information presented in widely varying clinical studies. It's a good test because it evaluates a strong foundation. Many of the errors I see on the part of NPs are related to Step 1 material.
The conceptual reasoning isn't the issue. It's the sheer brute-force memorization of facts that's the problem, and a major reason people wanted Step 1 to go P/F rather than spending hours using Anki.
 
  • Like
  • Dislike
Reactions: 2 users
Make the questions easier???? We're not in grade school anymore! The whole point of the exam is to assess competence. Nothing more.

What I'm saying is: stop asking us to diagnose mitochondrial disease or cri du chat syndrome or to differentiate between papillary and medullary thyroid disease lol. How does that help anybody? Just test us on high-yield stuff like heart failure or causes of respiratory failure; this is supposed to be a basic competency test. Step 1 also used to resemble the old NBMEs, with short paragraphs instead of long, intricate IQ-test-type questions.
 
  • Dislike
Reactions: 1 user
I believe if people study seriously for classes, passing Step 1 comfortably shouldn’t be difficult.

I think something else to consider is that in addition to score creep, there's also resource creep, similar to the MCAT. Before, there were primarily textbooks and lectures, but now you've got such highly streamlined outside materials. Cost aside, watching BnB, Pathoma, or Sketchy at 1.5x-2x speed is just much more efficient, especially because they cater specifically to the boards.
 
  • Like
Reactions: 2 users
Dare I say the reason people may be failing is that it is pass/fail and there is no longer a drive to study hard and squeeze out every last possible point to secure your spot in residency. A "Cs get degrees" mentality doesn't tend to build a strong desire to work hard for As, and it puts individuals far closer to the failure threshold.
Basically. As soon as it became P/F, I got emails from our med students asking me how to get a baller Step 2 score. Since PDs use these scores to filter applicants, I can't really blame them.
 
  • Like
Reactions: 1 user
The conceptual reasoning isn't the issue. It's the sheer brute-force memorization of facts that's the problem, and a major reason people wanted Step 1 to go P/F rather than spending hours using Anki.

What I'm saying is: stop asking us to diagnose mitochondrial disease or cri du chat syndrome or to differentiate between papillary and medullary thyroid disease lol. How does that help anybody? Just test us on high-yield stuff like heart failure or causes of respiratory failure; this is supposed to be a basic competency test. Step 1 also used to resemble the old NBMEs, with short paragraphs instead of long, intricate IQ-test-type questions.
I don't get the argument here. We are in a rigorous field. I think we should have rigorous exams. If the concern is that the test was not originally designed to be difficult and test minutiae and should be scrapped for a norm-referenced test, I think that ship has sailed and is unrealistic. I think it is more appropriate for it to be a test of competency than comparison anyway.

We should expect physicians to have learned the difference between medullary and papillary thyroid cancer. Throw in anaplastic thyroid cancer for good measure. I'm as subspecialized as they come (in non-thyroid matters) but when I get a consult for a patient with thyroid cancer with spine mets, I need to know the details. The way I approach a patient with each one is very different.

Furthermore this is fundamentally what makes us different from midlevels. 99% of these consults come from NPs who page me and say "hi I have a patient with thyroid cancer and spinal cord compression."

What kind? What treatment is he on and what is he eligible for? Has he had radiation? If so what modality? What's his functional status? Does he have MEN2a/b? Is he going to die in the next 10 days? Is a cervical corpectomy in his best interest based on his systemic disease burden?

And 98% of the time, the oncology NP's response is "I don't know, my attending just wanted me to consult you. It's all in the chart." The other 1% of the time it turns out the patient actually had parathyroid cancer.

But I bet the NP can still tell you about the "high-yield stuff" like heart failure or causes of respiratory failure.
 
  • Like
  • Love
Reactions: 22 users
I don't get the argument here. We are in a rigorous field. I think we should have rigorous exams. If the concern is that the test was not originally designed to be difficult and test minutiae and should be scrapped for a norm-referenced test, I think that ship has sailed and is unrealistic. I think it is more appropriate for it to be a test of competency than comparison anyway.

We should expect physicians to have learned the difference between medullary and papillary thyroid cancer. Throw in anaplastic thyroid cancer for good measure. I'm as subspecialized as they come (in non-thyroid matters) but when I get a consult for a patient with thyroid cancer with spine mets, I need to know the details. The way I approach a patient with each one is very different.

Furthermore this is fundamentally what makes us different from midlevels. 99% of these consults come from NPs who page me and say "hi I have a patient with thyroid cancer and spinal cord compression."

What kind? What treatment is he on and what is he eligible for? Has he had radiation? If so what modality? What's his functional status? Does he have MEN2a/b? Is he going to die in the next 10 days? Is a cervical corpectomy in his best interest based on his systemic disease burden?

And 98% of the time, the oncology NP's response is "I don't know, my attending just wanted me to consult you. It's all in the chart." The other 1% of the time it turns out the patient actually had parathyroid cancer.

But I bet the NP can still tell you about the "high-yield stuff" like heart failure or causes of respiratory failure.
The other thing is that the "minutiae" actually are tested because they demonstrate prototypical concepts that allow one to build a strong foundation for future study. If you don't understand the prototypical disease processes and meds, how are you going to understand the unusual cases? Not a month goes by that I don't get some zebra.
 
  • Like
Reactions: 3 users
I don't get the argument here. We are in a rigorous field. I think we should have rigorous exams. If the concern is that the test was not originally designed to be difficult and test minutiae and should be scrapped for a norm-referenced test, I think that ship has sailed and is unrealistic. I think it is more appropriate for it to be a test of competency than comparison anyway.

We should expect physicians to have learned the difference between medullary and papillary thyroid cancer. Throw in anaplastic thyroid cancer for good measure. I'm as subspecialized as they come (in non-thyroid matters) but when I get a consult for a patient with thyroid cancer with spine mets, I need to know the details. The way I approach a patient with each one is very different.

Furthermore this is fundamentally what makes us different from midlevels. 99% of these consults come from NPs who page me and say "hi I have a patient with thyroid cancer and spinal cord compression."

What kind? What treatment is he on and what is he eligible for? Has he had radiation? If so what modality? What's his functional status? Does he have MEN2a/b? Is he going to die in the next 10 days? Is a cervical corpectomy in his best interest based on his systemic disease burden?

And 98% of the time, the oncology NP's response is "I don't know, my attending just wanted me to consult you. It's all in the chart." The other 1% of the time it turns out the patient actually had parathyroid cancer.

But I bet the NP can still tell you about the "high-yield stuff" like heart failure or causes of respiratory failure.
I agree that this is something doctors should know, but in year 2 of 7? To be honest, I think expecting the knowledge level of a practicing NP at Step 1 (of 3) is almost funny, considering most schools don't let us treat a patient yet.
 
  • Dislike
Reactions: 1 user
It's not just the minimum knowledge. The NBME has added much more obscure content and severely ramped up the difficulty of questions to address the score creep that came as a direct consequence of PDs abusing Step 1. That's a horrible thing. A pass threshold in the 190s on a much more difficult Step 1 is arguably comparable to a 220-230 on the original Step 1 a few decades ago. Combine that with the absurdly high standard errors of Step 1 when it was scored, and it'd suggest people scoring in the low 200s on practice tests are at risk of failing. Doctors back then (or even one decade ago) would fail Step 1 in its current form and in no way are they incompetent or incapable of practicing medicine
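To put a rough number on the measurement-error point: here is a minimal sketch, assuming the observed score is the true score plus normally distributed error with an SEM of about 6 points (an assumption in the ballpark of published Step 1 figures) and a pass threshold of 196 (the "190s" threshold referenced above). Under those assumptions, an examinee whose true ability sits in the low 200s has a non-trivial chance of an observed fail.

```python
from math import erf, sqrt

def fail_probability(true_score: float, pass_score: float = 196, sem: float = 6.0) -> float:
    """P(observed < pass_score) if observed = true_score + N(0, sem^2)."""
    z = (pass_score - true_score) / sem
    return 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF evaluated at z

# Hypothetical "true" ability levels near the cutoff
for score in (198, 202, 206, 210, 220):
    print(f"true score {score}: ~{fail_probability(score):.0%} chance of an observed fail")
```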
"Doctors back then (or even one decade ago) would fail Step 1 in its current form and in no way are they incompetent or incapable of practicing medicine".....This is totally not true, and quite frankly, head shaking that you would make that statement.
 
  • Like
  • Love
Reactions: 2 users
I don't get the argument here. We are in a rigorous field. I think we should have rigorous exams. If the concern is that the test was not originally designed to be difficult and test minutiae and should be scrapped for a norm-referenced test, I think that ship has sailed and is unrealistic. I think it is more appropriate for it to be a test of competency than comparison anyway.
I'm saying Step 1 was originally designed to be difficult, and it was artificially made much more difficult for no reason other than to respond to the score creep. Step 1 was already hard in the 1990s and 2000s. There was zero reason to make the exam magnitudes harder.
 
"Doctors back then (or even one decade ago) would fail Step 1 in its current form and in no way are they incompetent or incapable of practicing medicine".....This is totally not true, and quite frankly, head shaking that you would make that statement.
Have you looked at any comments from attendings who are alarmed by the increasing craziness of Step 1 in recent years? Seriously, even the size of FA has increased many times over, and that's not just memorizing more facts. The questions are orders of magnitude harder compared to old exam forms, and there is literally no reason for that to have happened other than the score creep from Step 1 mania.
 
  • Like
  • Dislike
Reactions: 1 users
I'm saying Step 1 was originally designed to be difficult, and it was artificially made much more difficult for no reason other than to respond to the score creep. Step 1 was already hard in the 1990s and 2000s. There was zero reason to make the exam magnitudes harder.
I think that ship has sailed. It may have been considered difficult but if it were reset to its original difficulty today, I'm sure we would all consider it a joke.

I get that not everyone agrees and some people fail but I really don't think it's difficult to pass Step 1 as it stands (or stood when I took it scored). A test that 5% of people fail is probably not unnecessarily difficult.

More importantly I don't think that making the test easier as a matter of principle and reverence for the exam's original design is in our field's best interest, especially with the rise of midlevels who claim similar scope.
 
  • Like
Reactions: 4 users
Furthermore this is fundamentally what makes us different from midlevels. 99% of these consults come from NPs who page me and say "hi I have a patient with thyroid cancer and spinal cord compression."
Midlevels will badly fail Step 3, which is the one exam that's completely resistant to the utter craziness that plagues Steps 1 and 2 (and Step 3 is also the easiest of the three). According to the residents who have taken it, Step 3 is still a hard exam. Steps 1 and 2 are still hard exams even in their original forms. There is no reason to make Step 1 much, much harder just to prove superiority over midlevels, who would decisively fail all of the Steps badly.
 
I think that ship has sailed. It may have been considered difficult but if it were reset to its original difficulty today, I'm sure we would all consider it a joke.

I get that not everyone agrees and some people fail but I really don't think it's difficult to pass Step 1 as it stands (or stood when I took it scored). A test that 5% of people fail is probably not unnecessarily difficult.
I don’t think it’s hard to pass Step 1 either. But I still maintain comparatively Step 1 now is significantly harder than Step 1 in the 1990s, with harder questions and higher pass thresholds. I just don’t think they are necessary and we’d be better off just reverting all Steps back to 1990s levels.

But if that’s impossible to go back, well… let’s just hope the fail rates won’t be catastrophic. 5% of US MD students failing Step 1 is already a problem, and if the fail rates rise to double digits, something has gone very wrong
 
What I'm saying is: stop asking us to diagnose mitochondrial disease or cri du chat syndrome or to differentiate between papillary and medullary thyroid disease lol. How does that help anybody? Just test us on high-yield stuff like heart failure or causes of respiratory failure; this is supposed to be a basic competency test. Step 1 also used to resemble the old NBMEs, with short paragraphs instead of long, intricate IQ-test-type questions.

Why do I need to differentiate between a really benign slow growing cancer and a super aggressive one? Hrmmmmmmmm.....

I think this is a good example of why we need the cutoff higher, if anything. Weed out the weak. 190 is too high? Come on, guys, you can't get higher than 50-60% on a test you have two years to prepare for? Pathetic.
 
  • Like
  • Love
  • Haha
Reactions: 14 users
Why do I need to differentiate between a really benign slow growing cancer and a super aggressive one? Hrmmmmmmmm.....

I think this is a good example of why we need the cutoff higher, if anything. Weed out the weak. 190 is too high? Come on, guys, you can't get higher than 50-60% on a test you have two years to prepare for? Pathetic.
Condescension aside, I don't think it's a bad idea to sympathize with those struggling with Step 1 despite taking the exam seriously. And I'm the guy aggressively pushing for 1-year preclinical/3-year clinical curricula everywhere, with attendings and education leaders telling me that such proposals will lead to disastrous outcomes for a lot of US med students.
 
  • Like
Reactions: 1 users
Midlevels will badly fail Step 3, which is the one exam that's completely resistant to the utter craziness that plagues Steps 1 and 2 (and Step 3 is also the easiest of the three). According to the residents who have taken it, Step 3 is still a hard exam. Steps 1 and 2 are still hard exams even in their original forms. There is no reason to make Step 1 much, much harder just to prove superiority over midlevels, who would decisively fail all of the Steps badly.
Step 3 is silly. The hardest part is figuring out how to use the computer. It should not be the gatekeeping exam. It is also not the time to go back and test basic pathology and pathophysiology—it's a "step-"wise process

I agree that this is something doctors should know, but in year 2 of 7? To be honest, I think expecting the knowledge level of a practicing NP at Step 1 (of 3) is almost funny, considering most schools don't let us treat a patient yet.
Yes. I'm not saying you should know how to manage patients with thyroid cancer or answer any of the questions I posed, but that's how we learn medicine; it starts with the fundamentals of pathology and physiology

I'll take that bet any day.
Sadly true for most, but there are plenty who run around in ICUs alone overnight managing this stuff
 
  • Like
Reactions: 1 user
Wouldn't more people failing ensure that Step 1 is still seen as somewhat valuable? It sucks for those who don't pass, but if even people from Top 20s aren't passing at the rates they used to, a pass still means something. I also see this eventually impacting Step 2 scores, since I've heard from most upperclassmen that their Step 1 score correlated pretty well with their Step 2 score.


That person isn't a med student; they were posting on behalf of their significant other, and many details in that thread seem wonky because of it. Am I throwing out everything said in that thread, or this one? No, but to me top-tier schools have Step 1 pass rates of 98-99%; hell, my school's only a T30 and ours is 99%. Are top-tier schools actually seeing 10% fail rates?
 
This is one way to solve the residency shortage problem

Race to the bottom seems like the solution

Condescension aside, I don’t think it’s a bad idea to sympathize with those struggling in Step 1 even despite taking the exam seriously. And i’m the guy aggressively pushing for 1 yr preclinical/3 yr clinical curricula everywhere with attendings and education leaders telling me that such proposals will lead to disastrous outcomes for a lot of US med students.

I don't. I really don't. Because in the end you're talking about people who are going to struggle on Step 2, Step 3, board certification, etc. Better now than six years down the line, when they're $400K in debt and can't secure a residency spot that they don't deserve anyway. I see enough terrible care from doctors of all specialties as it is and don't see how loosening the standards would be a good thing for patients or for our profession. Yes, I know test scores don't necessarily make good doctors, but at least they ensure a minimum competency, which I already think is set low.
 
  • Like
  • Hmm
Reactions: 8 users
I also find it interesting how much easier it is to pass COMLEX. I passed a practice exam very comfortably at the beginning of dedicated. That same week I got a 50% on an NBME. Passing both of these tests allows someone to become a doctor.
 
I don’t think it’s hard to pass Step 1 either. But I still maintain comparatively Step 1 now is significantly harder than Step 1 in the 1990s, with harder questions and higher pass thresholds. I just don’t think they are necessary and we’d be better off just reverting all Steps back to 1990s levels.

But if that’s impossible to go back, well… let’s just hope the fail rates won’t be catastrophic. 5% of US MD students failing Step 1 is already a problem, and if the fail rates rise to double digits, something has gone very wrong
I believe you that it is harder. I don't think that's a bad thing. My understanding of your argument is that it's a bad thing because it happened for a reason unrelated to the fundamental purpose of the exam, so it should be changed back now that the score is irrelevant. I just don't see how that serves any purpose other than pure academic indulgence. I do not think it's in the best interest of our profession.

I also don't think it's a problem that 5% of people fail a test that is not difficult to pass. Either they should prepare better or they may not have what it takes to be a good doctor (and if it turns out they did have what it takes, oh well; they should have passed the exam). If more people suddenly start failing because they are preparing inadequately for an examination of competency, that's their fault.

We should seek opportunities to demonstrate that we as a profession are highly educated and have high standards. In today's anti-scientific, "do your own research" world we should not be lowering the bar.
 
  • Like
Reactions: 2 users
The vast majority of Step 1 is basic pharmacology and pathophysiology. The amount of minutiae is greatly overhyped, typically by bitter preclinical students who do not know what's actually clinically relevant.

I know multiple people who scored over 240 and never memorized any biochem, gene names, chromosome numbers, or second-messenger stuff. It's not that hard.

If someone can't at least pass Step 1, they just don't have the basic science knowledge to progress in training. It makes me wonder what they've even been doing for two years, because it sure as hell ain't been learning anything about medicine.
 
  • Like
Reactions: 7 users
I think we also have to be careful about jumping to conclusions re causation. I mentioned the Covid restrictions and 2 years of zoom lectures as one other possible explanation. Another potential issue is the rapid expansion of medical school slots and the possibility that more students who previously wouldn’t have even been admitted are now taking the step exams. Presumably if you open a bottom of the barrel school, your student body would likely have a higher failure rate than more competitive programs. Some of these issues may be further driving up the failures beyond people simply not studying hard enough.

I do think we should resist the urge to dumb things down. Getting into medical school does not entitle one to be a physician, merely grants the opportunity. If anything, higher failure rates will help with the low supply of residency slots and maybe weed people out before they’ve sunk a full 4 years of tuition in the endeavor.

I actually do use a lot of Step 1-level fundamentals in day-to-day practice. I may not be drawing out the Krebs cycle, but thinking through basic fundamentals has led me to a number of unusual diagnoses, at least for an ENT. Found a colon cancer a couple weeks ago, and no, it wasn't with a misplaced scope! Just good old basic fundamentals, many of which are absolutely classic Step 1 fodder.

Medicine itself is getting tougher and tougher. Our response needs to be rising to the challenge rather than lowering our standards.
 
  • Like
Reactions: 6 users
An interesting discussion.

In general, the fail rate on S1 has been around 5-6% of first time MD students. We have clearly seen a slow increase in S1 means, without a big change in the fail rate -- due to slow increases in the minimum passing score.

The USMLE used to claim that scores were equivalent across years -- that a 200 today was equivalent to a 200 fifteen years ago. They no longer claim this; they now state that scores more than 3-4 years apart shouldn't be compared. The USMLE is not transparent about what the scaled scores represent. Has the relationship between a score and the percentage of questions correct remained the same? This would be the simplest explanation, but it isn't commented upon in their documentation. They do state that to pass currently you need to get approximately 60% of the questions correct -- and this is independent of the Step -- so I think it's unlikely that the scaled score <=> percent correct relationship has remained stable over time.

The thought that equivalent scores over time represent equivalent performance is hard to believe. The exam has changed dramatically over time -- in content and in style -- such that it's impossible to compare how someone who passed the exam years ago would do with today's exam. Are today's students really that much smarter than those 15 years ago that they all would pass now? Also seems suspect.

Theoretically, the minimum passing score is set by some panel of experts. A common method is the Angoff process, but there are others. As the content has changed and as time has moved on, it's also possible that experts expect more from students, and hence the true minimum pass level may have increased -- whether that's good (holding physicians to a higher standard) or not (needlessly preventing some from becoming physicians) is a matter of debate.
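For readers who haven't seen it, a minimal sketch of the basic (unmodified) Angoff calculation, using entirely hypothetical panelist ratings: each panelist estimates, for each item, the probability that a minimally competent examinee answers it correctly; the ratings are averaged per item and summed to give the raw cut score.

```python
# Basic Angoff standard setting: hypothetical ratings from 3 panelists on 5 items.
# Each value = estimated probability a "minimally competent" examinee gets the item right.
ratings = [
    [0.70, 0.55, 0.80, 0.60, 0.45],  # panelist 1
    [0.65, 0.60, 0.75, 0.55, 0.50],  # panelist 2
    [0.75, 0.50, 0.85, 0.65, 0.40],  # panelist 3
]

n_items = len(ratings[0])
# Average the panelists' estimates per item, then sum over items.
item_means = [sum(r[i] for r in ratings) / len(ratings) for i in range(n_items)]
raw_cut = sum(item_means)              # expected number correct for a borderline examinee
percent_cut = 100 * raw_cut / n_items  # as a percent-correct threshold

print(f"raw cut score: {raw_cut:.2f} / {n_items} ({percent_cut:.0f}% correct)")
```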

Ultimately, before we panic about this, let's see what the fail rate is. I've seen plenty of predictions on SDN that never came to fruition. Right now this is all rumor and hearsay. If the fail rate has increased, then it raises interesting questions as to whether students should just study harder and do better, or whether the pass level is too high. Politically, I agree that lowering the passing score seems untenable -- especially as those with previously-failing-but-now-passing scores would be very bitter.

No matter what is done, the pass line is arbitrary. The person who scores a barely passing score and the one who gets one more question incorrect and fails -- their performances are not significantly different. That's the nature of any cutoff.

The comment above about COMLEX and USMLE is very interesting, I've pointed it out before. If you look at any study comparing USMLE to COMLEX performance, the regression line always shows that a minimally passing COMLEX score is a failing USMLE score. Which is "correct" is also a matter of debate.
 
Last edited:
  • Like
Reactions: 5 users
I think we also have to be careful about jumping to conclusions re causation. I mentioned the Covid restrictions and 2 years of zoom lectures as one other possible explanation. Another potential issue is the rapid expansion of medical school slots and the possibility that more students who previously wouldn’t have even been admitted are now taking the step exams. Presumably if you open a bottom of the barrel school, your student body would likely have a higher failure rate than more competitive programs. Some of these issues may be further driving up the failures beyond people simply not studying hard enough.

I do think we should resist the urge to dumb things down. Getting into medical school does not entitle one to be a physician, merely grants the opportunity. If anything, higher failure rates will help with the low supply of residency slots and maybe weed people out before they’ve sunk a full 4 years of tuition in the endeavor.

I actually do use a lot of step 1 level fundamentals in day to day practice. I may not be drawing out the kreb’s cycle, but thinking through basic fundamentals has led me to a number of unusual diagnoses, at least for an ent. Found a colon cancer a couple weeks ago, and no it wasn’t with a misplaced scope! Just good old basic fundamentals, many of which are absolutely classic step 1 fodder.

Medicine itself is getting tougher and tougher. Our response needs to be rising to the challenge rather than lowering our standards.
And historically (meaning the last 20-ish years) the pass rate was still pretty high. I was a pretty sub-par medical student and I still passed with a 20 point cushion.

If I were to bet on it, I think it's a combination of COVID learning AND the test going to pass/fail. COVID can't explain it all, as there are lots of very good resources students can use to do very well on the boards; however, there are lots of students who do best with in-person lectures where they can ask questions. But moving to pass/fail likely meant people weren't hitting those resources as hard as they were previously.
 
  • Like
Reactions: 2 users
An interesting discussion.

In general, the fail rate on S1 has been around 5-6% of first time MD students. We have clearly seen a slow increase in S1 means, without a big change in the fail rate -- due to slow increases in the minimum passing score.

The USMLE used to claim that scores were equivalent across years -- that a 200 today was equivalent to a 200 fifteen years ago. They no longer claim this; they now state that scores more than 3-4 years apart shouldn't be compared. The USMLE is not transparent about what the scaled scores represent. Has the relationship between a score and the percentage of questions correct remained the same? This would be the simplest explanation, but it isn't commented upon in their documentation. They do state that to pass currently you need to get approximately 60% of the questions correct -- and this is independent of the Step -- so I think it's unlikely that the scaled score <=> percent correct relationship has remained stable over time.

The thought that equivalent scores over time represent equivalent performance is hard to believe. The exam has changed dramatically over time -- in content and in style -- such that it's impossible to compare how someone who passed the exam years ago would do with today's exam. Are today's students really that much smarter than those 15 years ago that they all would pass now? Also seems suspect.

Theoretically, the minimum passing score is set by some panel of experts. A common method is the Angoff process, but there are others. As the content has changed and as time has moved on, it's also possible that experts expect more from students, and hence the true minimum pass level may have increased -- whether that's good (holding physicians to a higher standard) or not (needlessly preventing some from becoming physicians) is a matter of debate.

Ultimately, before we panic about this, let's see what the fail rate is. I've seen plenty of predictions on SDN that never came to fruition. Right now this is all rumor and hearsay. If the fail rate has increased, then it raises interesting questions as to whether students should just study harder and do better, or whether the pass level is too high. Politically, I agree that lowering the passing score seems untenable -- especially as those with previously-failing-but-now-passing scores would be very bitter.

No matter what is done, the pass line is arbitrary. The person who scores a barely passing score and the one who gets one more question incorrect and fails -- their performances are not significantly different. That's the nature of any cutoff.

The comment above about COMLEX and USMLE is very interesting, I've pointed it out below. If you look at any study comparing USMLE to COMLEX performance, the regression line always shows that a minimally passing COMLEX score is a failing USMLE score. Which is "correct" is also a matter of debate.
And historically (meaning the last 20-ish years) the pass rate was still pretty high. I was a pretty sub-par medical student and I still passed with a 20 point cushion.

If I were to bet on it, I think it's a combination of COVID learning AND the test going to pass/fail. COVID can't explain it all, as there are lots of very good resources students can use to do very well on the boards; however, there are lots of students who do best with in-person lectures where they can ask questions. But moving to pass/fail likely meant people weren't hitting those resources as hard as they were previously.
Anecdotally, there has also been rampant cheating in preclinicals at several med schools with which I have personal/professional connections, due to improper proctoring secondary to COVID. So there could be an element of people taking the exam who never should have been allowed to get that far.
 
  • Like
Reactions: 2 users
Well, I just finished the exam today, and I'm not entirely sure that I passed it. If more people really are failing it this year, they're probably like me and didn't study very much for it. Most of the advice I've received about P/F Step 1 is that you can easily pass it if you just pay attention in class and do some basic studying. That turned out to be completely false lmao. I think people who took Step 1 when it was scored don't even realize in hindsight just how much they studied for it. "Just passing" definitely isn't easy by any means. Whether it should be easy or not is a whole different debate.

And as much as I hate to say this, because I know it's just going to convince people to trash DOs even more, I have to agree with the comments above about COMLEX vs Step. I was passing COMLEX practice exams quite easily even before dedicated, but by the end of dedicated (admittedly I didn't do much studying during it) I was only averaging around 60% on UWorld practice questions, and the actual exam was slightly harder than UWorld on top of that. COMLEX is just so much easier, which came as a shock to me because I thought that if I was passing COMLEX practice exams so effortlessly, I should be able to do the same on Step 1. Turns out that isn't the case at all.
 
The comment above about COMLEX and USMLE is very interesting, I've pointed it out below. If you look at any study comparing USMLE to COMLEX performance, the regression line always shows that a minimally passing COMLEX score is a failing USMLE score. Which is "correct" is also a matter of debate.
This may be a lazy observation, but my assumption is that the people who make these tests have decided that a 93-96% pass rate is ideal. It has nothing to do with being competent or having a certain level of comprehension of the material. PDs never looked at COMLEX the way they looked at Step, so that arms race never happened. People try harder on Step, so the threshold to pass has to go up. Also, I'm aware the populations are different, but I have no idea how to factor that in. Step has Harvard grads and Caribbean grads; COMLEX is only DOs.
 
How bad is a Step 1 failure for residency?

Could a Harvard grad who fails the first time still match at MGH, BWH, or BIDMC?

(For IM)
 
The whole point of the exam is to assess competence
Seriously???

The test is full of tricky questions dealing with minutiae almost no practicing physician uses after Step 1
 
  • Like
  • Dislike
Reactions: 2 users
At my school we certainly didn't ignore preclinicals, but our admin used P/F boards as an excuse to cut our dedicated time in half despite the student body begging them not to, and we received very little in the way of help with preparation compared to prior classes. I suspect other schools have enacted similar policies based on what I'm reading online.
This is the comment I’m looking for.

The school I'm matriculating to has said they are possibly planning on halving Step 1 dedicated and transferring the time over to Step 2 dedicated due to P/F.

I would much rather take a couple weeks off our 12-week surgery rotation … but that's why they don't pay me the big bucks.
 
  • Like
Reactions: 1 user
The fact that PDs abused Step 1 because they have no answer to the overapplication problem doesn’t justify keeping abnormally high Step 1 pass thresholds. Use the Steps for their intended purpose: to assess basic medical competency of doctors in training and nothing else. To have a Step 1 pass threshold in the 190s is completely absurd.
Is the 190s high or low?
 
There's no public data I can back this up with, but...

I've now heard from multiple students at local med schools, and also seen a bunch of mentions on med school forums, that people in the first Pass/Fail cohorts are failing CBSEs and Step 1 itself at unprecedented rates.

This is probably to be expected, given there were 30 years of score creep followed by the test losing most of its importance essentially overnight. People in MS1-MS2 are probably studying for the test like their counterparts of the year 2000 instead of 2020, which is a problem since the pass mark moved up ~15 points between these eras.

Interesting to note, when Step 1 first rolled out it had a ~15% failure rate, several fold higher than the 3-5% of recent years. With that historical context, I think the NBME is very unlikely to drop the pass threshold down even if we start to see many, many more retakes.

For any current students, what's the situation at your school? Is this as widespread a problem as it's starting to seem to me?
Our failure rate hasn't gone up this year, but the number of students extending dedicated has gone up significantly. The student affairs people have been hearing about unprecedented failure rates at other schools, including some really heavy hitters (think T5).

From my perspective, the most likely explanation for 80% of this phenomenon is just that people took their eye off the ball when the test went P/F. The next class will see this train wreck and make different choices. After about three cohorts we'll have a good idea of what the "new normal" looks like. Until then, everyone just needs to take some deep breaths.
 
  • Like
Reactions: 7 users
Have you looked at any comments from attendings who are alarmed by the increasing craziness of Step 1 in recent years? Seriously, even the size of FA has increased many times over, and that's not just memorizing more facts. The questions are orders of magnitude harder compared to old exam forms, and there is literally no reason for that to have happened other than the score creep from Step 1 mania.
if you say so........
 
An interesting discussion.

In general, the fail rate on S1 has been around 5-6% of first time MD students. We have clearly seen a slow increase in S1 means, without a big change in the fail rate -- due to slow increases in the minimum passing score.

The USMLE used to claim that scores were equivalent across years -- that a 200 today was equivalent to a 200 fifteen years ago. They no longer claim this; they now state that scores more than 3-4 years apart shouldn't be compared. The USMLE is not transparent about what the scaled scores represent. Has the relationship between a score and the percentage of questions correct remained the same? This would be the simplest explanation, but it isn't commented upon in their documentation. They do state that to pass currently you need to get approximately 60% of the questions correct -- and this is independent of the Step -- so I think it's unlikely that the scaled score <=> percent correct relationship has remained stable over time.

The thought that equivalent scores over time represent equivalent performance is hard to believe. The exam has changed dramatically over time -- in content and in style -- such that it's impossible to compare how someone who passed the exam years ago would do with today's exam. Are today's students really that much smarter than those 15 years ago that they all would pass now? Also seems suspect.

Theoretically, the minimum passing score is set by some panel of experts. A common method is the Angoff process, but there are others. As the content has changed and as time has moved on, it's also possible that experts expect more from students, and hence the true minimum pass level may have increased -- whether that's good (holding physicians to a higher standard) or not (needlessly preventing some from becoming physicians) is a matter of debate.

Ultimately, before we panic about this, let's see what the fail rate is. I've seen plenty of predictions on SDN that never came to fruition. Right now this is all rumor and hearsay. If the fail rate has increased, then it raises interesting questions as to whether students should just study harder and do better, or whether the pass level is too high. Politically, I agree that lowering the passing score seems untenable -- especially as those with previously-failing-but-now-passing scores would be very bitter.

No matter what is done, the pass line is arbitrary. The person who scores a barely passing score and the one who gets one more question incorrect and fails -- their performances are not significantly different. That's the nature of any cutoff.

The comment above about COMLEX and USMLE is very interesting, I've pointed it out below. If you look at any study comparing USMLE to COMLEX performance, the regression line always shows that a minimally passing COMLEX score is a failing USMLE score. Which is "correct" is also a matter of debate.
True, but don't most PDs only look at the USMLE anyway, so it doesn't really matter if the COMLEX exam is easier?
 
This may be a lazy observation, but my assumption is that the people who make these tests have decided that a 93-96% pass rate is ideal. It has nothing to do with being competent or having a certain level of comprehension of the material. PDs never looked at COMLEX the way they looked at Step, so that arms race never happened. People try harder on Step, so the threshold to pass has to go up. Also, I'm aware the populations are different, but I have no idea how to factor that in. Step has Harvard grads and Caribbean grads; COMLEX is only DOs.
Agree with most of this. But the score averages have always been based on US and Canadian MDs.
 
  • Like
Reactions: 1 user
True, but don't most PDs only look at the USMLE anyway, so it doesn't really matter if the COMLEX exam is easier?
If you're not trying to do something competitive, passing it would allow you to become a doctor, even if you would have failed Step 1.
 
  • Like
Reactions: 1 user
How bad is a Step 1 failure for residency?

Could a Harvard grad who fails the first time still match at MGH, BWH, or BIDMC?

(For IM)
Step failure is arguably one of the biggest red flags out there. It would probably mean the Harvard grad isn’t sticking around Harvard for residency unless they managed to ingratiate themselves to key players in the department who would overlook their step failure.

Many competitive programs won’t even review apps from people who fail a step.
 
  • Like
  • Love
Reactions: 7 users
Anecdotally, there has also been rampant cheating in preclinicals at several med schools with which I have personal/professional connections, due to improper proctoring secondary to COVID. So there could be an element of people taking the exam who never should have been allowed to get that far.
Excellent point. The cheating thing is potentially a big part of it too, especially if this meant people didn’t really learn the material the first time.
 
  • Like
Reactions: 1 users
Our failure rate hasn't gone up this year, but the number of students extending dedicated has gone up significantly. The student affairs people have been hearing about unprecedented failure rates at other schools, including some really heavy hitters (think T5).

From my perspective, the most likely explanation for 80% of this phenomenon is just that people took their eye off the ball when the test went P/F. The next class will see this train wreck and make different choices. After about three cohorts we'll have a good idea of what the "new normal" looks like. Until then, everyone just needs to take some deep breaths.
I wonder if it may be that people switched their studying back towards school curriculums.

My friend circle at a heavy-hitter school was a bunch of folks who all scored top-percentile MCATs, and all of us ended up with >250 on Step 1 as well. Most people in this group had been grinding tens of thousands of Anki flashcards, reading First Aid, and doing extra question banks (like Kaplan and Amboss) throughout preclinical. One other friend and I liked some of the mini-lecture resources like Pathoma and Boards & Beyond, but we didn't do any practice banks or flashcards, and we both did well on all our in-house testing.

The school requires students to sit for a CBSE at the end of preclinicals. The friends grinding Step 1-specific materials all hit very high CBSE scores, equivalent to 240s-250s, right out of the gate. My similarly minded friend and I bombed it - 210s equivalent - and suffered a lot more during our dedicated to catch up. I can also state from some internally published data that this school, with one of the highest MCAT medians and strongest reputations, had a class-average Step 1 score of only 235 the year I enrolled.

All that to say, I think we could see this phenomenon even with identically bright cohorts who are identically studious. They might just be studying what their professors are teaching instead of USMLE prep.
 
Last edited:
  • Like
Reactions: 1 user
The USMLE used to claim that scores were equivalent across years -- that a 200 today was equivalent to a 200 fifteen years ago. They no longer claim this, now state that scores more than 3-4 years apart shouldn't be compared.
I think anyone who dug up expired practice forms and also used current ones could tell you they are not comparable at all.

In fact, by comparing the raw score to scaled score conversions between years, you can see the difficulty escalating rapidly.

Example with real numbers based on NBME Form 18:
In 2015-2016, 90% correct equated to a 260.
In 2018-2019, 90% correct equated to a 248.

That's a 12 point change in 3 years for exactly the same performance. I can see why they had to stop saying scores could compare across more than a short time.
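A minimal sketch of the comparison being described, using only the two 90%-correct data points quoted above; the other rows are hypothetical placeholders, included just to show how the drift would be tabulated across a form's published conversion table:

```python
# Percent-correct -> scaled-score conversion for the same practice form in two windows.
# Only the 90% entries come from the post above; the rest are hypothetical placeholders.
conversion_2015_16 = {95: 268, 90: 260, 85: 251, 80: 242}
conversion_2018_19 = {95: 257, 90: 248, 85: 240, 80: 231}

for pct in sorted(conversion_2015_16, reverse=True):
    old, new = conversion_2015_16[pct], conversion_2018_19[pct]
    print(f"{pct}% correct: {old} -> {new} "
          f"({old - new} scaled points lower for the same performance)")
```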
 
  • Like
Reactions: 1 users
I think anyone who dug up expired practice forms and also used current ones could tell you they are not comparable at all.

In fact, by comparing the raw score to scaled score conversions between years, you can see the difficulty escalating rapidly.

Example with real numbers based on NBME Form 18:
In 2015-2016, 90% correct equated to a 260.
In 2018-2019, 90% correct equated to a 248.

That's a 12 point change in 3 years for exactly the same performance. I can see why they had to stop saying scores could compare across more than a short time.
Interesting. Does that hold true across other newer forms as well? I know the old forms varied in difficulty, so sometimes an easier form required a higher percent correct for the same scaled score than a harder one.
 
  • Like
Reactions: 1 users
I'll take that bet any day.
I'll double down on that bet. My N+1 is when I heard a cardiology NP ask why a patient with SIADH who had hyponatremia was being treated with furosemide. The foundational training in medicine and nursing is just not comparable.
 
  • Like
Reactions: 1 users
Interesting. Does that hold true across other newer forms as well? I know the old forms varied in difficulty, so sometimes an easier form required a higher percent correct for the same scaled score than a harder one.
Yeah, it's true that the percent-to-scaled conversion varies by form, but when you compare a given form against itself in prior years, it was always the case that the scale got more punishing.
 
  • Like
Reactions: 1 user