New Charting Outcomes 2014

Just because it wasn't designed for that purpose doesn't mean it can't be used for it. The exam does give a score, so clearly its creators believe it can differentiate candidates, which is exactly what is needed when you're filtering out tons of great candidates for limited spots. In any case, it isn't the NBME's place to tell the individual programs how to select their candidates.

I can say that I felt it was a fair exam, and the smartest people that I know did very well on it. If you are one of those people who don't traditionally do well on standardized tests, you should know that and put more time and effort into it. If, as you say, a "reasonably smart person" can pull a 240 with the appropriate preparation, what does that say about your friend who failed it twice? He either wasn't "reasonably smart" or wasn't willing to prepare appropriately. And anyway, there are always outliers.
Yes, but correlating USMLE scores with selection of future residents is the issue. The purpose of the exam is for qualifying for a medical license only. Period. The dentistry profession has also realized this and recently converted the NBDE exams to Pass/Fail. Those exams also apply to residencies - Orthodontics, Periodontics, etc. - and even they don't see value in correlating a numerical score on their board exams with residency performance: http://www.studentdoctor.net/2012/0...toral-dental-applicants-passfail-nbde-part-i/

Unless residency is only taking multiple choice exams (which it isn't - not even close), correlating your Step 1 score with supposed future residency performance is silly. There are SO MANY other competencies that you have to have to be a successful resident that just cannot be tested in a multiple choice exam.

That being said, I do think GutOnc's example is a huge outlier, but in life there will always be outliers.
 
Yes, but correlating USMLE scores with selection of future residents is the issue. The purpose of the exam is for qualifying for a medical license only. Period.

That's an odd lead-off statement considering that's exactly what I addressed in my post. How USMLE scores are used is up to the individual programs. Not all of them put much weight on exam scores. Others do. It's up to them. Eliminating any metric (and especially the only objective metric) just weakens the selection process.

The dentistry profession has also realized this and recently converted the NBDE exams to Pass/Fail. Those exams also apply to residencies - Orthodontics, Periodontics, etc. - and even they don't see value in correlating a numerical score on their board exams with residency performance: http://www.studentdoctor.net/2012/0...toral-dental-applicants-passfail-nbde-part-i/

Again, it isn't the licensing authority's place to tell individual programs how to select their candidates. There are a lot of things dentists do that we don't do. I don't understand why you think they should be the model for us to follow.

Unless residency is only taking multiple choice exams (which it isn't - not even close), correlating your Step 1 score with supposed future residency performance is silly. There are SO MANY other competencies that you have to have to be a successful resident that just cannot be tested in a multiple choice exam.

Correct, but residents do need to pass their specialty boards which are multiple choice, and in some programs they need to achieve a certain score on their in-service (which are multiple choice as well), so test-taking ability is at least one important metric - and one that can currently be addressed by using a USMLE cutoff score.

Obviously this is just one of many competencies that a resident must attain, but it's the one that is most easily addressed; the others are much harder to screen for. LORs and rotation evals are so subjective, and all of them end up sounding the same. You can't really use grades because each school's grading system is different. Should we select candidates based on the name of the school they went to? I'm curious: what would your alternative be?
 
The USMLE is a licensing exam. The test writers have even said that it is ONLY a licensing exam. It isn't the state medical licensing board that is making the case.
Do you really think that Medicine is that "special" that the rules and trends that apply to other professions don't apply to us? Medicine is not, and does not deserve, its own special bubble just because you think so.

Again, doing well in residency encompasses more than just taking a standardized multiple choice exam. There are many residents (many of the IMG variety) who are fantastic at taking multiple choice standardized exams but absolutely suck at day-to-day clinical practice - obtaining a clinical history, knowing what to order, etc. Once you get above a certain score on an exam, you've met competency. Period.

Newsflash - many residency programs already do screen candidates based on the medical school they go to. They realize that achieving an "Honors" at Wash U is different than getting an "Honors" at Mercer. People often know the letter writers at certain institutions, so naturally those letters carry more weight.
 
The USMLE is a licensing exam. The test writers have even said that it is ONLY a licensing exam. It isn't the state medical licensing board that is making the case.

GutOnc pulled the quote right off of their website. But whether it's the NBME or the test-writers themselves making the case, it doesn't really change anything. Either way, the exam gives an objective metric to differentiate candidates based on their test-taking ability.

Do you really think that Medicine is that "special" that the rules and trends that apply to other professions don't apply to us? Medicine is not, and does not deserve, its own special bubble just because you think so.

You're really reaching here, DV. Just because one profession does something doesn't mean they're right and that all others should follow suit. If you believe otherwise, why don't we just have one medical board over all healthcare professions? Many NP programs have moved to an online curriculum - should we be doing that too? It isn't that we're "special," it's that we're different.

Engineering has a great example of this kind of back-and-forth with their PE exam: they were initially pass/fail, then moved to giving a score, then went back to pass/fail, then back to giving a score. It's an evolving train of thought. It isn't a right and wrong kind of argument.

Again, doing well in residency encompasses more than just taking a standardized multiple choice exam. There are many residents (many of the IMG variety) who are fantastic at taking multiple choice standardized exams but absolutely suck at day-to-day clinical practice - obtaining a clinical history, knowing what to order, etc. Once you get above a certain score on an exam, you've met competency. Period.

At least, you've met their metric of competency. Maybe not to the level expected of individual specialties or programs.

Newsflash - many residency programs already do screen candidates based on the medical school they go to. They realize that achieving an "Honors" at Wash U is different than getting an "Honors" at Mercer. People often know the letter writers at certain institutions, so naturally those letters carry more weight.

Not exactly late-breaking news. My question was how would YOU select candidates?
 
Newsflash: The test writers who write the USMLE are on the NBME. Of all people who would know what exactly the test measures, it would be the people who actually write it. I don't see how that is a difficult concept for you to grasp.

You realize that for the longest time the basic sciences were taught to dental students and medical students together in the same room, right? In fact, Harvard still does this, with HMS med students and Harvard dental students learning the basic sciences together. You can parse it by saying we're "different", but that's obviously not the case, as medicine doesn't work in a vacuum. Unlike other professions, medicine purposefully doesn't look to others to see if they're doing it better so we can adopt the good parts. It's why the federal govt. has now said enough is enough, and the medical profession doesn't get a monopoly over giving medical care to the nation - see: NPs and PAs.

Again with respect to the "level expected of individual specialties or programs" - do you actually think that Internal Medicine has changed so much that the score demanded now is different from what it was a few years ago? What about Radiology? Programs demand a certain score not because it's actually needed for a certain specialty, but because they're responding to the level of competition, pure and simple. A specialty being competitive doesn't mean that a higher Step score is required to be successful in that specialty - esp. when it can be gamed - e.g. IMGs taking months to years to study to blow out the Step.
 
Newsflash: The test writers who write the USMLE are on the NBME.

Once again, this isn't something we didn't already know. The point is that the USMLE/NBME's official institutional statement may differ from what the individual test-writers feel or what the individual state authorities feel. That doesn't really change the discussion here.

Of all people who would know what exactly the test measures, it would be the people who actually write it. I don't see how that is a difficult concept for you to grasp.

Because it's simply not true - at least not the way that you're extrapolating it. The question-writers write questions to assess medical competency. Clearly that's what those questions are doing - no argument there. If the question-writers knew that the exam would be used to differentiate candidates for residency selection, would they change those questions? I doubt it. And if they did, how would they do it?

Statisticians then use those questions to come up with a score that measures how well candidates perform on those same questions. If you're looking to differentiate candidates, why wouldn't you use a score based on how they perform on questions of medical competency? Just because the USMLE specified some arbitrary cutoff to say someone is minimally competent doesn't mean you can't take it one step further and say these people perform better than their peers - especially when they give you a metric to do just that.

The USMLE doesn't need to tell us how to think for ourselves. If an exam gives a standardized score to different examinees, you can use it to differentiate those applicants. Whether you actually should is a judgment call, and one left up to the individual PDs. So in reality, it's the PDs (not the question-writers) who are actually in the best position to evaluate whether the exam measures the qualities they're looking for in their program.

You realize that for the longest time the basic sciences were taught to dental students and medical students together in the same room, right? In fact, Harvard still does this, with HMS med students and Harvard dental students learning the basic sciences together. You can parse it by saying we're "different", but that's obviously not the case, as medicine doesn't work in a vacuum. Unlike other professions, medicine purposefully doesn't look to others to see if they're doing it better so we can adopt the good parts. It's why the federal govt. has now said enough is enough, and the medical profession doesn't get a monopoly over giving medical care to the nation - see: NPs and PAs.

No argument with anything you've said here. I just don't see how dentists "are doing it better."

Again with respect to the "level expected of individual specialties or programs" - do you actually think that Internal Medicine has changed so much that the score demanded now is different from what it was a few years ago? What about Radiology?

The USMLE seems to think so, because they've increased the passing score almost every year.

Programs demand a certain score not because it's actually needed for a certain specialty, but because they're responding to the level of competition, pure and simple.

Yep, and same thing happened with the MCAT - another exam that institutions use to differentiate candidates.

A specialty being competitive doesn't mean that a higher Step score is required to be successful in that specialty - esp. when it can be gamed - e.g. IMGs taking months to years to study to blow out the Step.

Right, but IMGs don't perform as well as USMGs in the match, probably because PDs already know this. Either way, it goes to show that these scores aren't everything, otherwise IMGs would dominate the top sub-specialty programs.
 