Just replying to the idea of a specialty-specific exam. I've thought about this a lot, and I keep coming to the conclusion that it would be much better, for a number of reasons:
I think specialty-specific exams, paired with either a scored or a P/F USMLE, are certainly something to think about. In fact, perhaps the "best" way to move this idea forward is to create the specialty-specific exam first. If everyone applying to IM took some IM-specific exam, IM programs might start weighing the USMLE less and this new exam more, and then making the USMLE P/F would ultimately be no big deal. But then the specialty exam becomes another "high-stakes choke point." I suppose you could allow people to take it multiple times (like the MCAT), but that raises the problem of scoring in the 60th percentile, retaking, and getting the 40th percentile -- though I guess that's the student's problem, and programs can do whatever they want with multiple scores. It would certainly increase costs to students, who would now need to take yet another exam (and it might encourage schools to create yet more "dedicated study time").
You could even have the surgical exam include knot tying and the like (although this would probably make the exam even more expensive). Each field could decide what content it wanted to cover.
1) Renewed investment by students in their medical education, rather than attempts to hack a high-stakes exam.
Sure, although then students will "hack" the specialty exams.
2) An exam that is potentially better at comparing applicants than the POS that is Step 1, where a 250 and a 235 are not significantly different. We instinctively think those scores are in entirely different leagues, but the NBME data say the difference does not reach the threshold of statistical significance.
I think this depends on your definition of "statistical significance." If you use the research publication cutoff of p = 0.05, then perhaps (I haven't looked at the numbers myself). But I don't need 95% certainty -- I'm fine with much less. And I'm usually not comparing just two applicants -- I think that students who score 250 in general have better knowledge (as measured on an MCQ test) than those with a 235.
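To make the disagreement concrete: whether a 250 and a 235 are "significantly different" depends on the exam's standard error of measurement (SEM) and the certainty threshold you demand. Here is a minimal sketch of the arithmetic; the SEM of 6 is an assumed, illustrative value, not an official NBME figure:

```python
import math

def score_difference_z(score_a, score_b, sem):
    """Z-statistic for the difference between two examinees' scores,
    treating each observed score as true score + N(0, sem^2) noise."""
    sed = math.sqrt(2) * sem  # standard error of the difference of two scores
    return abs(score_a - score_b) / sed

def two_tailed_p(z):
    """Two-tailed p-value under the standard normal distribution."""
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

# sem=6 is a hypothetical value chosen for illustration
z = score_difference_z(250, 235, sem=6)
p = two_tailed_p(z)
print(f"z = {z:.2f}, two-tailed p = {p:.3f}")  # z ~ 1.77, p ~ 0.077
```

Under this assumed SEM, a 15-point gap misses the p = 0.05 cutoff but would clear a looser one like p = 0.10 -- which is exactly the point above: the "not significant" claim hinges on insisting on 95% certainty.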
3) Incoming interns/residents with better field-relevant knowledge. Imagine a world where students put the same effort into studying what we want them to know, rather than a lot of basic science material that -- while foundationally important -- may largely be less relevant to their field.
I think this is the best of your arguments. Those fields that want lots of basic science can test for it.
I'd like to see the USMLE continue to report detailed scores to students and their schools. The one issue I can see with field-specific exams is that the delay in taking them doesn't leave much time to prepare backup plans. Giving people and their schools scaled data would help someone in the 10th percentile nationally realize that it may be risky to apply only to a highly competitive field. Having national comparative data would help schools properly advise their students.
This is a real potential problem. Score a 198 on Step 1, and you're basically not going to be a neurosurgeon. Make Steps 1 and 2 P/F, and you might be deep into your 4th year before you score poorly on the NS-specific exam, and by then it's much harder to change paths.
I think residency programs could also adopt other objective screening methods. There has been some flirtation with telephone-based behavioral question screens that are scored by an independent company, with the scores given to programs. This is pretty standard in corporate interviews, and in the HR literature it has proven highly predictive of job performance. Add something like that to a field-specific exam and you've got some powerful objective data.
Be careful what you wish for. The AAMC is testing a standardized video interview. So far, only EM is using it. Three video-recorded questions, each 5 minutes long (or something like that), each scored on a 1-to-5 scale, for a total score of 3 to 15. Plenty of posts here on SDN from people with AOA, straight honors, 260+ Steps, and then a 6 on their SVI. Any high-stakes eval at the end of your educational path could completely derail your application.
Why not something like how EM uses SLOEs, only we do it for every field? That way we'd have evidence that a student can function appropriately in the environment of their chosen specialty.
We've tried in IM. Some IM department letters are helpful. Most are not.