ACGME Merger and USMLE/COMLEX


kaj (Full Member, 10+ Year Member) · Joined: Aug 13, 2013 · Messages: 55 · Reaction score: 2
For those of us medical students graduating in 2020, the year the ACGME merger is supposed to be complete and we apply to a single system of residencies, which exam are we supposed to take? USMLE or COMLEX? My thinking is, COMLEX will be phased out since DO students will be applying to joint ACGME/AOA residencies and the majority of students will be MD, so they'll want to see USMLE scores...

Thanks in advance!

 
The COMLEX will not die. You see how much cash the NBOME pulls in with that? Money is the name of the game.
 
Take the USMLE.
 
According to the AOA site, many PDs will have guidelines to translate COMLEX performance into USMLE terms, but I would still take both exams.
 
According to the AOA site, many PDs will have guidelines to translate COMLEX performance into USMLE terms, but I would still take both exams.

Unfortunately, they have them but don't care to use them. Far too many students and residents mention how the cutoff for certain residency programs can be at the 15th percentile for the USMLE, but the 50th percentile for the COMLEX.
 
Unfortunately, they have them but don't care to use them. Far too many students and residents mention how the cutoff for certain residency programs can be at the 15th percentile for the USMLE, but the 50th percentile for the COMLEX.

Unrelated, but although screening and discrimination have been the status quo for DO students in GME, it still baffles me to no freaking end how the reliance on and culture of standardized testing can be so unceremoniously discarded for some nebulous ideations of prestige. Is this real life? Did we not, for all this time, quantify potential to succeed with the SATs and the MCAT (mostly) regardless of academic background? Passing over a highly qualified DO student for a categorically less qualified MD student drives me insane.
 
Unrelated, but although screening and discrimination have been the status quo for DO students in GME, it still baffles me to no freaking end how the reliance on and culture of standardized testing can be so unceremoniously discarded for some nebulous ideations of prestige. Is this real life? Did we not, for all this time, quantify potential to succeed with the SATs and the MCAT (mostly) regardless of academic background? Passing over a highly qualified DO student for a categorically less qualified MD student drives me insane.
Yup. It's not fair and it's not likely to get fixed any time soon, and even less likely now that the merger is happening.
That's why a highly-qualified DO student should take the USMLE. While some programs will still not accept a DO student with a high USMLE score, it might open up some doors that would otherwise be closed.
 
Yup. It's not fair and it's not likely to get fixed any time soon, and even less likely now that the merger is happening.
That's why a highly-qualified DO student should take the USMLE. While some programs will still not accept a DO student with a high USMLE score, it might open up some doors that would otherwise be closed.
I wish there weren't a conflict of interest from the NBOME, so DO students could just take a version of the USMLE that has additional OMM question sections. We could even call it USMLE+.
 
Technically the ACGME has no rules about the COMLEX and USMLE. It's the actual residency programs that decide which exam to look at.
 
For those of us medical students graduating in 2020, the year the ACGME merger is supposed to be complete and we apply to a single system of residencies, which exam are we supposed to take? USMLE or COMLEX? My thinking is, COMLEX will be phased out since DO students will be applying to joint ACGME/AOA residencies and the majority of students will be MD, so they'll want to see USMLE scores...

Thanks in advance!

The COMLEX and USMLE are the licensing exams for your respective degree, either DO or MD. Regardless of residency changes, you will still have to take and pass the exam for your degree. I don't think this is going to change any time soon.

It's always been a good idea to take the USMLE as a DO student.
 
The COMLEX is going to be the licensing exam for DOs for the foreseeable future. Probably the unforeseeable future, too. No way out of it.

I do think that as the merger continues, your actual score on COMLEX is going to become less important and how you do on the USMLE will be more important. All programs will be required to accept the COMLEX, but that doesn't mean they have to like it, or that they'll learn what a good score on it means.
 
You have to take the COMLEX to get your DO degree. If you are aiming for a better specialty, or a better program, you'll need the USMLE as well.

It will take some years to find out how many [former] ACGME residencies will accept COMLEX only.

I predict that in time, COMLEX will evolve into an exit exam for the COMs, and that everyone will simply take USMLE. I'll long be Professor Emeritus by that point though.


For those of us medical students graduating in 2020, the year the ACGME merger is supposed to be complete and we apply to a single system of residencies, which exam are we supposed to take? USMLE or COMLEX? My thinking is, COMLEX will be phased out since DO students will be applying to joint ACGME/AOA residencies and the majority of students will be MD, so they'll want to see USMLE scores...

Thanks in advance!
 
COMLEX is going to stay for a long-ass time. It is a money maker, plain and simple. They won't let that revenue stream dry up.
 
Unrelated, but although screening and discrimination have been the status quo for DO students in GME, it still baffles me to no freaking end how the reliance on and culture of standardized testing can be so unceremoniously discarded for some nebulous ideations of prestige. Is this real life? Did we not, for all this time, quantify potential to succeed with the SATs and the MCAT (mostly) regardless of academic background? Passing over a highly qualified DO student for a categorically less qualified MD student drives me insane.

Have you taken the COMPOOP yet? Genuinely curious--not trying to be condescending.

It is a wretched, wretched exam. Spelling errors. I had a question on my exam that didn't even provide answer choices. I had a question that had an answer choice repeated twice word for word.

On top of that, the questions are absolutely ridiculous. A very large portion of my micro was based on bioterrorism--I literally laughed out loud during the middle of my exam at some of the ridiculous questions.

The timing during the exam is ridiculous. You only get one bathroom break for the entire exam where they actually pause the clock for you. You get about 11 fewer seconds per question than the USMLE gives you.

There's a reason why PDs of AOA programs take audition rotations so seriously--because they personally know how much of a joke the COMPOOP is.

I didn't believe the hype behind the COMPOOP until I actually took it. I was one of the lucky ones who somehow ended up with a good score...but I could have easily ended up with a much worse score.

It is truly an awful thing and I'm glad that PDs do not take it seriously.
 
The COMLEX will disappear when the NBOME loses the floppy disks that have the questions (and the WordStar program used to type them).
 
The COMLEX will disappear when the NBOME loses the floppy disks that have the questions (and the WordStar program used to type them).
I believe they use a mimeograph to create hard copies should the software fail.
 
Estranging a highly qualified DO student for a categorically less qualified MD student drives me insane.

I'd be with you, if you could determine "qualified" by test scores only. You can't. We had a visiting DO fourth year who had never done inpatient OB rounds, because her "clerkship" preceptor didn't do them, and she just followed him all day. You want to see fear in a PD's eyes? Tell her that one of her entering interns will need extra catching up to even reach competence in things you expect Day 1 interns to be able to do. You can talk about suboptimal allopathic rotations all you want, and they do exist, but at the end of the day there are experiences you can reasonably expect 99% of new MDs to have. I think when you can say the same about new DO grads, then you will see that discrimination subside.
 
I'd be with you, if you could determine "qualified" by test scores only. You can't. We had a visiting DO fourth year who had never done inpatient OB rounds, because her "clerkship" preceptor didn't do them, and she just followed him all day. You want to see fear in a PD's eyes? Tell her that one of her entering interns will need extra catching up to even reach competence in things you expect Day 1 interns to be able to do. You can talk about suboptimal allopathic rotations all you want, and they do exist, but at the end of the day there are experiences you can reasonably expect 99% of new MDs to have. I think when you can say the same about new DO grads, then you will see that discrimination subside.
So is the USMLE a fair, objective, and reliable measure of a student's competency or is it not?
 
So is the USMLE a fair, objective measure of a student's competency or is it not?

Competency at the material it tests? Sure! Competency at everything that comes with being a resident? Of course not; that's why rank lists aren't simply an Excel file sorted by Step scores.

My point was, you can't call one person "categorically less qualified" based on only one component of what is being evaluated. Someone with a 260 who doesn't know how to check a fundal height isn't automatically "more qualified" than someone with a 210 who does.
 
Competency at the material it tests? Sure! Competency at everything that comes with being a resident? Of course not; that's why rank lists aren't simply an Excel file sorted by Step scores.

My point was, you can't call one person "categorically less qualified" based on only one component of what is being evaluated. Someone with a 260 who doesn't know how to check a fundal height isn't automatically "more qualified" than someone with a 210 who does.
Is there evidence that DO students are worse at checking fundal height?
 
Is there evidence that DO students are worse at checking fundal height?

...What? You're going to have to work on not being so literal. Life can't be boiled down to "But is there evidence?"

In any case, I was referring to the example I gave, in which a fourth-year student had to get a primer on OB rounding because she did not do that during her clerkship. Unfortunately, that contributed to my program being reluctant to take DOs. My point was simply to respond to the statement that an applicant with a higher test score is "more qualified."
 
Here is the thing that makes me pull my hair out when I see that new DO schools are opening or being planned. The weaknesses in clinical education are never, ever considered, only the notion that more DOs is somehow good. Talk about not being able to see past one's nose!!!

I'd be with you, if you could determine "qualified" by test scores only. You can't. We had a visiting DO fourth year who had never done inpatient OB rounds, because her "clerkship" preceptor didn't do them, and she just followed him all day. You want to see fear in a PD's eyes? Tell her that one of her entering interns will need extra catching up to even reach competence in things you expect Day 1 interns to be able to do. You can talk about suboptimal allopathic rotations all you want, and they do exist, but at the end of the day there are experiences you can reasonably expect 99% of new MDs to have. I think when you can say the same about new DO grads, then you will see that discrimination subside.
 
I'd be with you, if you could determine "qualified" by test scores only. You can't. We had a visiting DO fourth year who had never done inpatient OB rounds, because her "clerkship" preceptor didn't do them, and she just followed him all day. You want to see fear in a PD's eyes? Tell her that one of her entering interns will need extra catching up to even reach competence in things you expect Day 1 interns to be able to do. You can talk about suboptimal allopathic rotations all you want, and they do exist, but at the end of the day there are experiences you can reasonably expect 99% of new MDs to have. I think when you can say the same about new DO grads, then you will see that discrimination subside.
Thank you for the clarification. I was definitely being somewhat idealistic about standardized testing due to my own inexperience with the goings-on of GME. I stand humbled and corrected.
 
Thank you for the clarification. I was definitely being somewhat idealistic about standardized testing due to my own inexperience with the goings-on of GME. I stand humbled and corrected.

It's more complex than I ever imagined before becoming involved with it. There is a significant human aspect, as algorithmic as it may seem. So an "unknown" aspect (in this case, clinical training) can make people really cautious/wary. My own motivation in learning about the setup at different schools is that I think we are missing out on some strong potential applicants because of this caution. Unfortunately, I haven't been involved in selecting visiting students, or the initial filtering of ERAS applications, for a couple of cycles due to other obligations.
 
So is the USMLE a fair, objective, and reliable measure of a student's competency or is it not?

Is there evidence that DO students are worse at checking fundal height?

You really need to learn to shut up and listen. I know you think you are being clever but you absolutely aren't.

PS I had to think for a second what fundal height meant... oh OB, such a distant memory
 
You really need to learn to shut up and listen.
Calm down there, buddy. If something bothers you that much, you have the option to ignore it. Or to refute it.
 
Competency at the material it tests? Sure! Competency at everything that comes with being a resident? Of course not; that's why rank lists aren't simply an Excel file sorted by Step scores.

My point was, you can't call one person "categorically less qualified" based on only one component of what is being evaluated.
This part is a fair point.
...What? You're going to have to work on not being so literal. Life can't be boiled down to "But is there evidence?"

In any case, I was referring to the example I gave, in which a fourth-year student had to get a primer on OB rounding because she did not do that during her clerkship. Unfortunately, that contributed to my program being reluctant to take DOs. My point was simply to respond to the statement that an applicant with a higher test score is "more qualified."
Whether there is evidence of a pattern is important, especially if the alleged pattern is used to treat one group of people differently than another. It's unfortunate that you had a negative experience with that one student, though.
 
This part is a fair point.

Whether there is evidence of a pattern is important, especially if the alleged pattern is used to treat one group of people differently than another. It's unfortunate that you had a negative experience with that one student, though.

I think your attempt at claiming that 3rd- and 4th-year training at some of the lower-tier schools is high quality is just confusing. We know they're obviously not good enough, and we know that they either need to be improved or their schools need to be ended, end of story. It's why programs like Idaho COM should be outright stopped.
 
I think your attempt at claiming that 3rd- and 4th-year training at some of the lower-tier schools is high quality is just confusing. We know they're obviously not good enough, and we know that they either need to be improved or their schools need to be ended, end of story. It's why programs like Idaho COM should be outright stopped.
It's unlikely that I or anyone here on the forum can honestly attest to having a deep understanding of the quality of rotations at all of the newer DO schools. That's why it's important to gather evidence about differences in outcomes rather than rely on anecdote. Of course there can and should be improvements in quality control, but simply assuming that all DO schools (or even all new DO schools) do not adequately prepare their graduates is lazy and should be discouraged wherever possible.
 
It's unlikely that I or anyone here on the forum can honestly attest to having a deep understanding of the quality of rotations at all of the newer DO schools. That's why it's important to gather evidence about differences in outcomes rather than rely on anecdote. Of course there can and should be improvements in quality control, but simply assuming that all DO schools (or even all new DO schools) do not adequately prepare their graduates is lazy and should be discouraged wherever possible.

It doesn't take all of the DO schools being bad to tarnish the opinion of PDs. It takes a few bad programs existing. The quality control is simply higher on the LCME side in that respect.
 
Here's a question for you, Guh. If you were chatting with a friend and they said "I went to Home Depot once and got terrible service. I only go to Lowe's now," would your response be "What is the evidence that all Lowe's stores have better service than all Home Depots?" And then say it's lazy of them not to investigate that and just go off their personal experience? Because that is the impression you are giving now. You are trying to be rigid and quantitative about something that by nature is qualitative. My experience was unfortunate, yes. It may not be representative of the entirety of clinical training across all med schools. You don't have to tell me that, it's common sense. But what matters is the way that it affected the perception of everyone involved. Choosing residents affects the next X years of a program's life in a very real way- a perceived negative doesn't have to have a list of associated studies in order to have an effect. That is how the world works.
 
Here's a question for you, Guh. If you were chatting with a friend and they said "I went to Home Depot once and got terrible service. I only go to Lowe's now," would your response be "What is the evidence that all Lowe's stores have better service than all Home Depots?" And then say it's lazy of them not to investigate that and just go off their personal experience? Because that is the impression you are giving now. You are trying to be rigid and quantitative about something that by nature is qualitative. My experience was unfortunate, yes. It may not be representative of the entirety of clinical training across all med schools. You don't have to tell me that, it's common sense. But what matters is the way that it affected the perception of everyone involved. Choosing residents affects the next X years of a program's life in a very real way- a perceived negative doesn't have to have a list of associated studies in order to have an effect. That is how the world works.

To add to this: PDs and residency programs have no obligation or need to do this rigorous quantitative assessment. Why are you putting the onus of proof on them? It is the DO schools that need to prove to PDs and residency programs that they are adequately training their students and that their product doesn't suck. So far they haven't done this because they're getting $$$ either way and the existence of AOA residencies made it unnecessary for them to have to prove that they produce a good product since these students had exclusive residencies waiting for them either way.

Do car companies require that the government prove to them that their car is dangerous or do the car companies have to prove to the government that their product is safe?


 
To add to this: PDs and residency programs have no obligation or need to do this rigorous quantitative assessment. Why are you putting the onus of proof on them? It is the DO schools that need to prove to PDs and residency programs that they are adequately training their students and that their product doesn't suck. So far they haven't done this because they're getting $$$ either way and the existence of AOA residencies made it unnecessary for them to have to prove that they produce a good product since these students had exclusive residencies waiting for them either way.

Do car companies require that the government prove to them that their car is dangerous or do the car companies have to prove to the government that their product is safe?



Which is fundamentally my problem with expansion. It reflects badly on the DO schools that do teach their students well and actually invest in making sure their education is substantive.
 
I feel like this conversation is a bit lopsided in that people feel the need to blame only the schools that are expanding. What about the current schools that are also contributing to this problem? There are schools with class sizes in the 300s sending their students several states away. How do those schools expect to maintain quality control of those sites when the clinical director is in Missouri and the core site is in Florida? Are all students getting the vast majority of their rotations at sites with residency programs, or are a certain 10-20% left out? There are schools that have been around for 10-20 years and yet their rotation quality is all over the place. This problem has been here long before the school expansions, and COCA hasn't done anything about it.

Cut down class sizes, make faculty do more research for funding, bring rotation sites closer to the mother ship, have a clinical director willing to bust some heads when something goes really wrong, give your preceptors and hospitals some benefits for being associated with the school, and generate more GME. Quality will skyrocket if all these things happen.
 
I feel like this conversation is a bit lopsided in that people feel the need to blame only the schools that are expanding. What about the current schools that are also contributing to this problem? There are schools with class sizes in the 300s sending their students several states away. How do those schools expect to maintain quality control of those sites when the clinical director is in Missouri and the core site is in Florida? Are all students getting the vast majority of their rotations at sites with residency programs, or are a certain 10-20% left out? There are schools that have been around for 10-20 years and yet their rotation quality is all over the place. This problem has been here long before the school expansions, and COCA hasn't done anything about it.

Cut down class sizes, make faculty do more research for funding, bring rotation sites closer to the mother ship, have a clinical director willing to bust some heads when something goes really wrong, give your preceptors and hospitals some benefits for being associated with the school, and generate more GME. Quality will skyrocket if all these things happen.

Some schools have begun to improve and change. No matter which way you look at it, it's going to be an uphill battle to bring DO schools up from primarily educational institutes to more rounded research schools. KCU has invested in building up the school's research creds, and we have faculty who do individual research here and on other campuses. We've got a pretty strong horse in ALS research, however, and we have some big wigs here.

Regarding reducing class size and improving rotation sites: honestly, at KCU some of our best rotation sites are the out-of-state ones, so it's not inherently true that being shipped off means your education is going to be poor, or even that it's a poor decision to move. In either case, it seems like most of us are ending up pretty decent residents and/or doctors and are pretty well represented in local university-based residencies.

Personally, I think the solution is more state and federal recognition of COMs for providing physicians in their local area, like how KCU, despite only having about 70 KS/MO students, provides more physicians for the area than the other two medical schools. As such, I think both the states and the federal government should be upping funding for schools that specifically impact their local area. But that's my view on the matter.

* Also, slowing down and reevaluating who we're allowing to open DO schools is worth considering. We're honestly getting close to being like the Psy.D in the psychology world, where the degree spans a duality of programs: some more like PhDs (or in our case MDs) and some that are questionable.
 
Does anyone have a list of DO schools with known issues in terms of rotation quality? Of course I have seen mentions of one or two here and there, but does anyone who is more in the know than me (most of you) have a list off the top of your head?
 
Some schools have begun to improve and change. No matter which way you look at it, it's going to be an uphill battle to bring DO schools up from primarily educational institutes to more rounded research schools. KCU has invested in building up the school's research creds, and we have faculty who do individual research here and on other campuses. We've got a pretty strong horse in ALS research, however, and we have some big wigs here.

Regarding reducing class size and improving rotation sites: honestly, at KCU some of our best rotation sites are the out-of-state ones, so it's not inherently true that being shipped off means your education is going to be poor, or even that it's a poor decision to move. In either case, it seems like most of us are ending up pretty decent residents and/or doctors and are pretty well represented in local university-based residencies.

In either case, I'd like to see more state and federal recognition of COMs for providing physicians in their local area, like how KCU, despite only having about 70 KS/MO students, provides more physicians for the area than the other two medical schools. As such, I think both the states and the federal government should be upping funding for us and other schools that specifically impact their local area. But that's my view on the matter.

I don't mean to say all far-away sites are poor. However, if there is a malignant site, or an issue that has been persistent across multiple students, it cannot be addressed by the clinical director, who is most likely at the main school. Ensuring students learn as equally as possible is much easier if rotations are closer.

I also agree with the paragraph below. In general, if the school is providing a service to the community, it should get more funding. KCU is also doing a good job with research funding, I totally agree. However, in general most schools don't push faculty to do research, which limits funding (it is part of the reason why tuition is high).
 
I don't mean to say all far-away sites are poor. However, if there is a malignant site, or an issue that has been persistent across multiple students, it cannot be addressed by the clinical director, who is most likely at the main school. Ensuring students learn as equally as possible is much easier if rotations are closer.

I also agree with the paragraph below. In general, if the school is providing a service to the community, it should get more funding. KCU is also doing a good job with research funding, I totally agree. However, in general most schools don't push faculty to do research, which limits funding (it is part of the reason why tuition is high).

In either case, I'm waiting for the LCME to take over and for the same thing as the merger to happen, i.e., where DOs still get to maintain OMM and the philosophy, and everyone is happy.
 
Some schools have begun to improve and change. No matter which way you look at it, it's going to be an uphill battle to bring DO schools up from primarily educational institutes to more rounded research schools. KCU has invested in building up the school's research creds, and we have faculty who do individual research here and on other campuses. We've got a pretty strong horse in ALS research, however, and we have some big wigs here.

Regarding reducing class size and improving rotation sites: honestly, at KCU some of our best rotation sites are the out-of-state ones, so it's not inherently true that being shipped off means your education is going to be poor, or even that it's a poor decision to move. In either case, it seems like most of us are ending up pretty decent residents and/or doctors and are pretty well represented in local university-based residencies.

Personally, I think the solution is more state and federal recognition of COMs for providing physicians in their local area, like how KCU, despite only having about 70 KS/MO students, provides more physicians for the area than the other two medical schools. As such, I think both the states and the federal government should be upping funding for schools that specifically impact their local area. But that's my view on the matter.

* Also, slowing down and reevaluating who we're allowing to open DO schools is worth considering. We're honestly getting close to being like the Psy.D in the psychology world, where the degree spans a duality of programs: some more like PhDs (or in our case MDs) and some that are questionable.
Just curious: do you know whether, for the purposes of reviewing candidates for GME, PDs evaluate each school in a vacuum or in the context of all the subpar DO schools?
 
Just curious: do you know whether, for the purposes of reviewing candidates for GME, PDs evaluate each school in a vacuum or in the context of all the subpar DO schools?

Most DO applicants are evaluated the same. Honestly, PDs outside the Midwest don't know DO schools well enough. However, the top 5 and the state schools are generally at least known by most programs, as they have likely had a rotator or a resident from those schools.
 
Just curious: do you know whether, for the purposes of reviewing candidates for GME, PDs evaluate each school in a vacuum or in the context of all the subpar DO schools?

It depends. If a program receives lots of applicants from regional DO schools and routinely has to settle for IMGs/FMGs the PD may look into how to more thoroughly evaluate DO candidates in order to improve the caliber of his residents. On the other hand the PD of a competitive program that can easily fill with average or above average US MDs may have no interest in figuring out the intricacies of evaluating DO candidates and has the luxury of trashing all of their applications. Of course there is a grey area where PDs may only consider DOs who rotated at the program or are recommended by certain colleagues, or who come from schools that former residents graduated from, etc.
 
Here's a question for you, Guh. If you were chatting with a friend and they said "I went to Home Depot once and got terrible service. I only go to Lowe's now," would your response be "What is the evidence that all Lowe's stores have better service than all Home Depots?" And then say it's lazy of them not to investigate that and just go off their personal experience? Because that is the impression you are giving now. You are trying to be rigid and quantitative about something that by nature is qualitative. My experience was unfortunate, yes. It may not be representative of the entirety of clinical training across all med schools. You don't have to tell me that, it's common sense. But what matters is the way that it affected the perception of everyone involved. Choosing residents affects the next X years of a program's life in a very real way- a perceived negative doesn't have to have a list of associated studies in order to have an effect. That is how the world works.
I see what you're saying but using your analogy I would recommend that my friend at least try a different Home Depot if one opened up in the area the following year. If he still refused to try any Home Depot ever again (even if the staff received their customer service training elsewhere), I would argue that his attitude is irrational. Leaving the analogy, making this decision is definitely important so ideally any limitations are considered as carefully as possible in order to ensure that the best candidates are selected.
To add to this: PDs and residency programs have no obligation or need to do this rigorous quantitative assessment. Why are you putting the burden of proof on them? It is the DO schools that need to prove to PDs and residency programs that they are adequately training their students and that their product doesn't suck. So far they haven't done this, because they're getting $$$ either way, and the existence of AOA residencies made it unnecessary for them to prove that they produce a good product, since those students had exclusive residencies waiting for them regardless.

Do car companies require that the government prove to them that their car is dangerous or do the car companies have to prove to the government that their product is safe?


Sent from my iPhone using SDN mobile app
Residency programs can use whatever means they want to select their classes. That doesn't mean it's a good idea to throw out a large number of applications that could include some excellent candidates.
And if you don't think that the thousands of practicing DOs in all specialties over the past several decades prove that "their product doesn't suck," then we are clearly not going to be able to have a rational discussion on the matter.
And my opinion about government consumer safety regulations is probably not relevant.
 
It depends. If a program receives lots of applicants from regional DO schools and routinely has to settle for IMGs/FMGs, the PD may look into how to more thoroughly evaluate DO candidates in order to improve the caliber of his residents. On the other hand, the PD of a competitive program that can easily fill with average or above-average US MDs may have no interest in figuring out the intricacies of evaluating DO candidates and has the luxury of trashing all of their applications. Of course, there is a grey area where PDs may only consider DOs who rotated at the program, who are recommended by certain colleagues, who come from schools that former residents graduated from, etc.
I 100% respect where you are coming from, and I agree that the current GME status quo is the path of least resistance and greatest convenience for PDs, and obviously PDs have total say over how they want to recruit residents.

My contention is that this approach is hardly patient-focused, and that this point alone morally argues against the en bloc dismissal of an entire applicant pool that has historically demonstrated clinical and USMLE performance comparable to that of its counterparts.

Using your own example, let's assume a program with 8 spots that screens out DOs can fill those spots with 5 above-average MDs and 3 average MDs. But is it not fair to argue that the same program, without DO screening, could fill them with 4 above-average MDs and 4 above-average DOs? Does this not lead to a higher caliber of residents overall?
 
Most of my classmates who didn't sit for the USMLE but scored high on the COMLEX now regret not taking the USMLE. The classmates who scored poorly on the COMLEX and opted out of the USMLE chose well.
 
  • Like
Reactions: 1 users
I 100% respect where you are coming from, and I agree that the current GME status quo is the path of least resistance and greatest convenience for PDs, and obviously PDs have total say over how they want to recruit residents.

My contention is that this approach is hardly patient-focused, and that this point alone morally argues against the en bloc dismissal of an entire applicant pool that has historically demonstrated clinical and USMLE performance comparable to that of its counterparts.

Using your own example, let's assume a program with 8 spots that screens out DOs can fill those spots with 5 above-average MDs and 3 average MDs. But is it not fair to argue that the same program, without DO screening, could fill them with 4 above-average MDs and 4 above-average DOs? Does this not lead to a higher caliber of residents overall?
The places that are entirely screening out DOs for the most part can fill all of their spots with stellar MDs.
 
  • Like
Reactions: 1 users
The places that are entirely screening out DOs for the most part can fill all of their spots with stellar MDs.
Then what I hear must come from people speaking in extremes, citing MD applicants with 220s who outmatch DO applicants with 240s.
 
The places that are entirely screening out DOs for the most part can fill all of their spots with stellar MDs.

My program actually isn't that strong a draw. Hence my interest in expanding our "reach" to try and avail ourselves of the strongest applicants possible.

My contention is that this approach is hardly patient-focused, and that this point alone morally compels against en bloc dismissal of an entire applicant pool which historically demonstrates comparable clinical and USMLE performance with their counterparts.

The issue with thinking this way is that the entire process assumes some competence at patient care. Resident selection really comes down to: what will this person be like to work with and teach for X years? In the entire time I've been in GME, I have never once heard "but [insert criteria here] will be better for the patients" used to differentiate candidates. Maybe others have, but I haven't. Not once, ever.

It's not that we don't care about patients; of course we do. But assuring good patient care is part of my job once I get whomever we get. Barring egregious issues, which most people won't have, patient care is not used as a differentiator before that point. Instead, it's things like: how well will this person catch on to resident life (clinical performance)? How will they get along with current staff and residents (LORs, interviews)? How likely are they to struggle with standardized testing and get the program flagged by the powers that be (scores)? We even think, how will having this person affect future recruiting? (Example: if all the residents come from the same state, applicants from other states are less likely to apply.) That's why all the requests for evidence re: patient care and outcomes kind of miss the mark. It's not a front-end criterion.
 
  • Like
Reactions: 3 users
My program actually isn't that strong a draw. Hence my interest in expanding our "reach" to try and avail ourselves of the strongest applicants possible.



The issue with thinking this way is that the entire process assumes some competence at patient care. Resident selection really comes down to: what will this person be like to work with and teach for X years? In the entire time I've been in GME, I have never once heard "but [insert criteria here] will be better for the patients" used to differentiate candidates. Maybe others have, but I haven't. Not once, ever.

It's not that we don't care about patients; of course we do. But assuring good patient care is part of my job once I get whomever we get. Barring egregious issues, which most people won't have, patient care is not used as a differentiator before that point. Instead, it's things like: how well will this person catch on to resident life (clinical performance)? How will they get along with current staff and residents (LORs, interviews)? How likely are they to struggle with standardized testing and get the program flagged by the powers that be (scores)? We even think, how will having this person affect future recruiting? (Example: if all the residents come from the same state, applicants from other states are less likely to apply.) That's why all the requests for evidence re: patient care and outcomes kind of miss the mark. It's not a front-end criterion.
Very enlightening, thank you for the clarification, 22031 Alum!

Of the factors you've delineated above:
  • How well will this person catch on to resident life (clinical performance)?
  • How will they get along with current staff and residents (LORs, interviews)?
  • How likely are they to struggle with standardized testing and get the program flagged by the powers that be (scores)?
  • How will having this person affect future recruiting?
Which ones primarily illuminate the reasoning behind choosing MDs over DOs? To my best understanding, DO-screening PDs see DO students as less able to perform clinically and more likely to struggle on standardized testing? Did I get that right?
 