Med School Tiers?

I see this thread has been resuscitated so I'll try to add a follow-up question since I've been curious about this.

Do higher GPA and MCAT scores on the MSAR correlate with a 'better' education? If tier is imaginary, then why the heck do the same people in this thread recommend that high-stat individuals apply to high-stat schools when they could easily get into mid-stat schools that offer the SAME quality of education and chance at competitive residencies? If MD is MD no matter what, then why are some schools more competitive than others? If it truly doesn't matter, then EVERYONE's goal should be to get in somewhere and go to the cheapest place, or the place with your favorite weather or student body or location or whatever.

Someone explain this to me.

The other posters are flat-out incorrect. Here is data showing the correlation between GPA/MCAT and education quality.

The other posters are flat-out incorrect. Here is data showing the correlation between GPA/MCAT and education quality.

Thank you for that, this is EXACTLY what I'm referring to.

Then again, maybe it's not the school itself but rather a function of the students themselves; do the schools make these students score well or do these students score well because the schools accepted high-stat students?
 
Thank you for that, this is EXACTLY what I'm referring to.

Then again, maybe it's not the school itself but rather a function of the students themselves; do the schools make these students score well or do these students score well because the schools accepted high-stat students?

Exactly. Correlation isn't causation. But I do think there is something happening in the education -> score direction. If you look at the Step1 in the T25, there is a very obvious positive linear relationship. But the difference between ~40 and ~80 doesn't seem to matter.
 
than top schools effectively give you a 'better' education.

You asked if having a higher MCAT and GPA meant a better education. The answer is no. Harvard could have a 3.3/500 average and their match list would look the exact same. Just because the top schools can open every door doesn’t mean they have a “better education.” You need to define what you mean with that term.

Here is data showing the correlation between GPA/MCAT and education quality.

There isn’t a single metric in that graph that measures education quality.
 
You asked if having a higher MCAT and GPA meant a better education. The answer is no. Harvard could have a 3.3/500 average and their match list would look the exact same. Just because the top schools can open every door doesn’t mean they have a “better education.” You need to define what you mean with that term.



There isn’t a single metric in that graph that measures education quality.

US News ranking and Step1 are as close of a proxy to education quality as you’ll get. The ranking is a third based on quality assessment by peers and PDs, and the step exam is a measure of medical mastery (the goal of a medical education).
 
US News ranking and Step1 are as close of a proxy to education quality as you’ll get.

And they aren’t even close. Both are terrible metrics.

The ranking is a third based on quality assessment by peers and PDs,
And PDs like brand names. I would argue that this metric is better than either Step 1 or USNews by a landslide but still comes up far short of supporting the argument that these schools somehow give a different, higher quality education than other schools.
the step exam is a measure of medical mastery (the goal of a medical education).

Lol, no it’s not. It’s an arms race to see who can memorize the most minutiae. The step 1 exam is to determine a baseline of competence for those entering the clinical education portion of their training.

Giving the schools credit for their students scoring high on Step 1 is like giving the Yankees credit for Aaron Judge hitting 50 home runs.
 
Giving the schools credit for their students scoring high on Step 1 is like giving the Yankees credit for Aaron Judge hitting 50 home runs.
But you can give them credit for drafting him! (I am sure there is a potential analogy involving Fresno State in there, as well).
 
@STLouisVuittonDon

There's low correlation between MCAT or Step 1 scores and success in the clinical years of medical school (which is really the closest metric we have to describing a good physician).

If the goal of medical school education was to create good physicians, and step 1 was an accurate measure of a good education, you would expect a strong correlation between Step scores and Clinical Success.

Source of Step1 Statement: UC Davis' video on why MMI is best for predicting success in clinical years. They go over the fact that MCAT and Step1 are not great predictors of clinical year success.
 
And they aren’t even close. Both are terrible metrics.


And PDs like brand names. I would argue that this metric is better than either Step 1 or USNews by a landslide but still comes up far short of supporting the argument that these schools somehow give a different, higher quality education than other schools.


Lol, no it’s not. It’s an arms race to see who can memorize the most minutiae. The step 1 exam is to determine a baseline of competence for those entering the clinical education portion of their training.

Giving the schools credit for their students scoring high on Step 1 is like giving the Yankees credit for Aaron Judge hitting 50 home runs.

It's easy to claim that T5 schools and Wayne State give the same education when you throw out all the metrics. If all the step exams aren't measures of training, then why do we use them? And for PD rankings, why do brand names exist in the first place (did we all just randomly decide to say that HMS is the best)? And yes, I would certainly give the Yankees credit for Aaron Judge's success. You do realize that these people are coached and mentored for more time than they actually play the game.
 
It's easy to claim that T5 schools and Wayne State give the same education when you throw out all the metrics. If all the step exams aren't measures of training, then why do we use them? And for PD rankings, why do brand names exist in the first place (did we all just randomly decide to say that HMS is the best)? And yes, I would certainly give the Yankees credit for Aaron Judge's success. You do realize that these people are coached and mentored for more time than they actually play the game.
LOL
 
If all the step exams aren't measures of training, then why do we use them?

Because they determine a base level of competence necessary to practice medicine... they have nothing to do with quality of training. You can take a student and give him UFAP + Anki and stick him in a closet for two years and he could come out and dominate Step. That’s why even people from brand new DO schools can, and do, ace Step. It has nothing to do with the school and everything to do with the cohort of students that attend these schools.

And yes, I would certainly give the Yankees credit for Aaron Judge's success. You do realize that these people are coached and mentored for more time than they actually play the game.

Lol, you’re kidding right?
 
US News ranking and Step1 are as close of a proxy to education quality as you’ll get. The ranking is a third based on quality assessment by peers and PDs, and the step exam is a measure of medical mastery (the goal of a medical education).
No, it's not. It's a competency exam, period.

And PDs like brand names. I would argue that this metric is better than either Step 1 or USNews by a landslide but still comes up far short of supporting the argument that these schools somehow give a different, higher quality education than other schools.

It's not as cut and dried as that. PDs like med schools that are known feeder programs, in the same way that med schools like UG feeder schools. Their graduates are known quantities. Then there's also the inbreeding factor. One SDN resident quipped that there's more inbreeding at Brown than in an Alabama trailer park.
 
Anatomy, you act like there is only one Step exam when there are 3, which, when looked at together, do test clinical skills. Goro, yes, it’s a competency exam, that’s exactly right! And the point of a medical education is to train competent physicians; the more competent, the better.
 
Anatomy, you act like there is only one Step exam when there are 3, which, when looked at together, do test clinical skills. Goro, yes, it’s a competency exam, that’s exactly right! And the point of a medical education is to train competent physicians; the more competent, the better.
What you are losing sight of is that a high Step I score doesn't mean that you're more competent. Medical education doesn't end with Step I. Your mastery comes out of residency.
 
So you're saying someone who graduates with a 205 Step score is equally as competent of a physician as someone with a 260 Step score? I think every PD in the US would disagree with you on that, else they wouldn't use that as one of their main metrics for seeing if someone is qualified for their program. And I completely agree mastery comes out of residency and fellowships. But medical schools are designed to build a platform for that in general medical competency and that's what the USMLEs test
 
So you're saying someone who graduates with a 205 Step score is equally as competent of a physician as someone with a 260 Step score? I think every PD in the US would disagree with you on that, else they wouldn't use that as one of their main metrics for seeing if someone is qualified for their program. And I completely agree mastery comes out of residency and fellowships. But medical schools are designed to build a platform for that in general medical competency and that's what the USMLEs test
I don't know what it is with SDN posters and their love of extrapolating other peoples' words. Nowhere did anyone insinuate a radical step score difference and state that they are the same. Back to your "T5" love affair, those schools rank high for RESEARCH FUNDING. Last time I checked, research does not correlate with being an excellent clinician. I'd be willing to bet a considerable amount of students from unranked MDs outscore Harvard grads, but luckily the Harvard kids have the prestige to carry them into comfy residencies.
 
I don't know what it is with SDN posters and their love of extrapolating other peoples' words. Nowhere did anyone insinuate a radical step score difference and state that they are the same. Back to your "T5" love affair, those schools rank high for RESEARCH FUNDING. Last time I checked, research does not correlate with being an excellent clinician. I'd be willing to bet a considerable amount of students from unranked MDs outscore Harvard grads, but luckily the Harvard kids have the prestige to carry them into comfy residencies.

Yes they did insinuate that. They said that a high step score doesn't imply more competency. So I am questioning if they truly believe that by giving a tangible example. They are also saying that Step scores and USNWR are not related to clinical education at all, which is absolutely preposterous. And yeah the T5 rank high for research funding, AND peer rank, AND program director rank, AND faculty:student ratio, and many others metrics. You're implying that the ranking is just a proxy for research $$ which isn't true at all.
 
They are saying that Step scores and USNWR are not related to clinical education at all, which is absolutely preposterous. And yeah the T5 rank high for research funding, AND peer rank, AND program director rank, AND faculty:student ratio, and many others metrics. You're implying that the ranking is just a proxy for research $$ which isn't true at all.
I would say they are correct. Again, many of those kids who end up at those places have their career goal set on academic medicine. I would bet my life that there would be no difference between an ER staffed with docs from "T5" schools and an ER staffed with other MDs or DOs. The most incompetent ER physician at a hospital I worked at, according to at least a few other doctors I worked with, was from a "T10" school ... meanwhile one of the best was a brand-new attending... who just happened to be a DO. This is an individual example, but it goes to show clinical education is only as good as the individual you are educating.
 
I would say they are correct. Again, many of those kids who end up at those places have their career goal set on academic medicine. I would bet my life that there would be no difference between an ER staffed with docs from "T5" schools and an ER staffed with other MDs or DOs. The most incompetent ER physician at a hospital I worked at, according to at least a few other doctors I worked with, was from a "T10" school ... meanwhile one of the best was a brand-new attending... who just happened to be a DO. This is an individual example, but it goes to show clinical education is only as good as the individual you are educating.

I don't know why you're suggesting that we look at ER staffing to judge competency, but I'm absolutely positive that the powerhouse hospitals would have better physicians. The same way that the powerhouse hospitals/medical schools are ranked the best on health metrics, death rates, patient satisfaction, etc. How's that for competency and quality? If we're going to throw that out too, then I've had enough SDN for one day. National competency exams, peer/residency rankings, patient satisfaction, health quality metrics, all point to the higher-ranked schools. The goal of a medical school is to teach students to be competent, successful clinicians, and all of these metrics get at that in some way. When they ALL point to the same places, it shows that there is a clear and tangible discrepancy between institutions. To pretend like there isn't is absolute lunacy.
 
I don't know why you're suggesting that we look at ER staffing to judge competency, but I'm absolutely positive that the powerhouse hospitals would have better physicians. The same way that the powerhouse hospitals/medical schools are ranked the best on health metrics, death rates, patient satisfaction, etc. How's that for competency and quality? If we're going to throw that out too, then I've had enough SDN for one day. National competency exams, peer/residency rankings, patient satisfaction, health quality metrics, all point to the higher-ranked schools. The goal of a medical school is to teach students to be competent, successful clinicians, and all of these metrics get at that in some way. When they ALL point to the same places, it shows that there is a clear and tangible discrepancy between institutions. To pretend like there isn't is absolute lunacy.

Joke's on you, I work directly with a bunch of doctors at a major powerhouse hospital on the west coast (I work with about 15 clinicians to help organize collaborations/do lab work/hear juicy gossip about other departments). The majority of the physicians hate it here and say half the clinicians are completely useless/incompetent. No competent physicians want to come here because it's located in a very expensive area and they'd be "living like a peasant even if they made 400k a year." (actual quote from a physician I work with)

Combine that with a ton of other problems at powerhouse hospitals like high rates of patient and staff infections (Our website's home page has a "Days since last CLABSI" and "Days since last HAC" counter and I don't think either have ever gone over 5), really high death rates because our patients are generally the sickest, and horrendous patient satisfaction (can't be satisfied if you're dead).

Overall, being a powerhouse hospital just means you do a bunch of research and have cool toys while maintaining a semi-decent facade that you have your **** together.

Don't even get me started on the amount of bureaucratic crap there is because the hospital is afraid to get sued.

Also, our residents and fellows hate it here.

Edit: I'm actually wrong, our days since last HAC counter is currently sitting pretty at 6.
 
So you're saying someone who graduates with a 205 Step score is equally as competent of a physician as someone with a 260 Step score? I think every PD in the US would disagree with you on that, else they wouldn't use that as one of their main metrics for seeing if someone is qualified for their program.

Because I’m reality there isn’t a difference between a 205 and a 250 competency wise. That isn’t why the step exam even exists. PDs use it as a metric because it’s a very easy, standardized way to weed through the massive amounts of applicants they get.

They are also saying that Step scores and USNWR are not related to clinical education at all, which is absolutely preposterous

Because they aren’t.... USNWR literally has nothing to do with it at all, and there are a number of low tier schools with higher Step averages than the top schools. I remind you the original argument was that a higher MCAT and GPA correlated to a better education.

I'm absolutely positive that the powerhouse hospitals would have better physicians

Now you are talking about a different topic entirely, and I also don’t really agree with this statement. It’s far too broad of a generalization.

patient satisfaction,
Patient satisfaction is the worst metric to base anything off of. Have you worked in a clinical setting before? The PS scores are purely based on who gave in to the most patient demands. There have been studies that have even shown a potential correlation between PS and worse outcomes.

Are you really a medical student like your profile says? Because you talk about all this the way a pre-med would. I’ve literally never heard a medical student talk about this the way you are.
 
Yes they did insinuate that. They said that a high step score doesn't imply more competency. So I am questioning if they truly believe that by giving a tangible example. They are also saying that Step scores and USNWR are not related to clinical education at all, which is absolutely preposterous. And yeah the T5 rank high for research funding, AND peer rank, AND program director rank, AND faculty:student ratio, and many others metrics. You're implying that the ranking is just a proxy for research $$ which isn't true at all.

I mean, when for years PDs and peers are told a program is really good, of course it influences their future ranks, meaning current ranking could well be based on old performance. There is a reason nobody is assigning an academic level of rigor to rankings.

Also, you're just not correct about outcomes. The top hospitals by no means always have the top patient experience scores (and there is a robust literature about what HCAHPS even mean anyway), and many major hospital systems get dinged by Medicare for being below the safety threshold (Stanford and Brigham are two of those hospitals - full list of 751 centers on Medicare.gov). Admittedly you're not going to a community hospital for open heart surgery necessarily, but top 5 is by no means better than some other academic institution.

Also also, as is the case with hospitals, different schools really are good for different things. Seems silly to think that a single ranking could possibly inform every possible outcome metric of interest. Basically, if you want the med school with the most public prestige because you hope to jump to VC or something, sure, USNews is probably mirroring that.

Ultimately I think these rankings are a self-fulfilling prophecy. I do think more multifaceted students go to "top" schools, which in turn does probably confer some benefit to the training environment. But blanketing top schools or hospital with praise just doesn't accurately represent reality in my opinion.
 
"patient satisfaction" aka did he give me my dilaudid :angelic:
 
Joke's on you, I work directly with a bunch of doctors at a major powerhouse hospital on the west coast (I work with about 15 clinicians to help organize collaborations/do lab work/hear juicy gossip about other departments). The majority of the physicians hate it here and say half the clinicians are completely useless/incompetent. No competent physicians want to come here because it's located in a very expensive area and they'd be "living like a peasant even if they made 400k a year." (actual quote from a physician I work with)

Combine that with a ton of other problems at powerhouse hospitals like high rates of patient and staff infections (Our website's home page has a "Days since last CLABSI" and "Days since last HAC" counter and I don't think either have ever gone over 5), really high death rates because our patients are generally the sickest, and horrendous patient satisfaction (can't be satisfied if you're dead).

Overall, being a powerhouse hospital just means you do a bunch of research and have cool toys while maintaining a semi-decent facade that you have your **** together.

Don't even get me started on the amount of bureaucratic crap there is because the hospital is afraid to get sued.

Also, our residents and fellows hate it here.

Edit: I'm actually wrong, our days since last HAC counter is currently sitting pretty at 6.

You sound like a premed who's worked in one hospital and is trying to use the experience card, and it's really cringey. Doctors complain about their jobs regardless of where they work; that's just part of the territory. It doesn't speak to the caliber of the institution at all. The Powerhouse hospitals (Mayo, CC, MGH, Hopkins) objectively have better patient outcomes than the other hospitals and that is why they are ranked the top hospitals in the country. It is NOT due to their research funding, go look at the methodology yourself. And yes, hospitals have safety tracking protocols (at least good ones do), and I don't see how that adds to your argument at all. Look at the nationally recognized health metrics and then look at the top hospitals for each one; it's always saturated with the traditional names. I don't see why people find this so hard to believe. Everyone is pretending that all these metrics and rankings are just made up and arbitrary, and that's just completely false.
 
Because I’m reality there isn’t a difference between a 205 and a 250 competency wise. That isn’t why the step exam even exists. PDs use it as a metric because it’s a very easy, standardized way to weed through the massive amounts of applicants they get.



Because they aren’t.... USNWR literally has nothing to do with it at all, and there are a number of low tier schools with higher Step averages than the top schools. I remind you the original argument was that a higher MCAT and GPA correlated to a better education.



Now you are talking about a different topic entirely, and I also don’t really agree with this statement. It’s far too broad of a generalization.


Patient satisfaction is the worst metric to base anything off of. Have you worked in a clinical setting before? The PS scores are purely based on who gave in to the most patient demands. There have been studies that have even shown a potential correlation between PS and worse outcomes.

Are you really a medical student like your profile says? Because you talk about all this the way a pre-med would. I’ve literally never heard a medical student talk about this the way you are.

If 205 and 250 were the same, then why would PDs be using that metric to weed them out? PDs want the best possible applicants, and if 205 and 250 made no difference, then after a few years of taking 250s and realizing they weren't special, you'd think they'd stop using that as a metric. I don't think you understand correlation at all. Just because a lower tier school has a higher average than a higher tier school doesn't negate the correlation. The p-value is <0.001 there. I agree that PS or any of these metrics in and of themselves aren't perfect. But they all point to the same groups of institutions. But if you think patient outcomes, clinical competency scores, residency director ratings, etc. are all useless indicators of quality of an institution, then I guess I'm just crazy.
 
You sound like a premed who's worked in one hospital and is trying to use the experience card, and it's really cringey. Doctors complain about their jobs regardless of where they work; that's just part of the territory. It doesn't speak to the caliber of the institution at all. The Powerhouse hospitals (Mayo, CC, MGH, Hopkins) objectively have better patient outcomes than the other hospitals and that is why they are ranked the top hospitals in the country. It is NOT due to their research funding, go look at the methodology yourself. And yes, hospitals have safety tracking protocols (at least good ones do), and I don't see how that adds to your argument at all. Look at the nationally recognized health metrics and then look at the top hospitals for each one; it's always saturated with the traditional names. I don't see why people find this so hard to believe. Everyone is pretending that all these metrics and rankings are just made up and arbitrary, and that's just completely false.

And you sound like a premed who's worked in no hospitals and is trying to extrapolate to all of them.

I'm not using my own experience either. Everything I'm telling you is directly from physicians with a bit of creative commentary on the side.

Clearly you're a glutton for being proven wrong. Here's the US news world rankings for medical schools:

https://www.usnews.com/best-graduat...rch-rankings?int=8de407&int=b3b50a&int=b14409

Here's the Truven's list for hospitals with the best patient outcomes and lowest cost:

https://truvenhealth.com/Portals/0/..._18725_0318_100_Top_Hospitals_Study_final.pdf

Please note: Of all teaching hospitals that were ranked as a top 100 hospital, very few belong to top medical schools. The list is also dominated by small-medium community hospitals.
 
The Powerhouse hospitals (Mayo, CC, MGH, Hopkins) objectively have better patient outcomes than the other hospitals
Lol this is completely false... as seen in the citation above. It is clear you really don't know what you are talking about if you are making arguments that have easily googled proof against them.

then why would PDs be using that metric to weed them out?

Because it is literally the ONLY objective measurement with which to compare candidates. Everything else on a residency app is completely subjective.

But if you think patient outcomes, clinical competency scores, residency director ratings, etc. are all useless indicators of quality of an institution, then I guess I'm just crazy

1. What do patient outcomes have to do in the slightest with medical students? Nothing.
2. Clinical competency scores, what are you even talking about? Step 2? People from the top 5 medical schools aren't getting magical clinical rotations that give their students a better education than, say U Minnesota.
3. We have already discussed PD surveys, and I have agreed that this is the closest metric we have to even give insight into what schools may be better but there is so much subjectivity to that process that it can't be used as a reliable indicator. There was a DO school that came in ahead of 23 MD schools.
 
If 205 and 250 were the same, then why would PDs be using that metric to weed them out? PDs want the best possible applicants, and if 205 and 250 made no difference, then after a few years of taking 250s and realizing they weren't special, you'd think they'd stop using that as a metric. I don't think you understand correlation at all. Just because a lower tier school has a higher average than a higher tier school doesn't negate the correlation. The p-value is <0.001 there. I agree that PS or any of these metrics in and of themselves aren't perfect. But they all point to the same groups of institutions. But if you think patient outcomes, clinical competency scores, residency director ratings, etc. are all useless indicators of quality of an institution, then I guess I'm just crazy.
Both the student with the 205 and 250 are incompetent to practice medicine (aka why they still have 3+ more years of training after they graduate) so arguing who is more competent is pointless and stupid.

Being an orthopedic surgeon doesn't mean you're more competent as a physician than a family med doc or a pediatrician.

Programs use scores to separate applicants not on the basis of their competency to practice medicine, but because when you get 500 applications for 6 slots, the easiest way to cut down that list is to pick the people with higher scores on a standardized test
 
And you sound like a premed who's worked in no hospitals and is trying to extrapolate to all of them.

I'm not using my own experience either. Everything I'm telling you is directly from physicians with a bit of creative commentary on the side.

Clearly you're a glutton for being proven wrong. Here's the US news world rankings for medical schools:

https://www.usnews.com/best-graduat...rch-rankings?int=8de407&int=b3b50a&int=b14409

Here's the Truven's list for hospitals with the best patient outcomes and lowest cost:

https://truvenhealth.com/Portals/0/..._18725_0318_100_Top_Hospitals_Study_final.pdf

Please note: Of all teaching hospitals that were ranked as a top 100 hospital, very few belong to top medical schools. The list is also dominated by small-medium community hospitals.

Your hospital ranking is based in part on profits, which I think is a little ridiculous. I think the USNews ranking of hospitals is more objective and outcomes based, and there you'll see what I'm talking about: https://health.usnews.com/health-care/best-hospitals/articles/best-hospitals-honor-roll-and-overview. I don't understand the mental gymnastics of saying that PDs are justified in using Step1 as a metric to differentiate applicants while also saying that the scores are meaningless and a 205 == 250. @anatomy, I do think that the rotations of the large teaching hospitals are better than the rotations at the smaller, more rural institutions. Seeing an extremely diverse set of conditions and treating wide ranges of acuity is, I think, great for learning/education.
 
Your hospital ranking is based in part on profits, which I think is a little ridiculous. I think the USNews ranking of hospitals is more objective and outcomes based, and there you'll see what I'm talking about: https://health.usnews.com/health-care/best-hospitals/articles/best-hospitals-honor-roll-and-overview. I don't understand the mental gymnastics of saying that PDs are justified in using Step1 as a metric to differentiate applicants while also saying that the scores are meaningless and a 205 == 250. @anatomy, I do think that the rotations of the large teaching hospitals are better than the rotations at the smaller, more rural institutions. Seeing an extremely diverse set of conditions and treating wide ranges of acuity is, I think, great for learning/education.

Jesus Christ, if there's one thing US News is not, it's objective or outcomes based (unless by outcomes you mean leaving the hospital alive, which accounts for 37.5% of the rating).

Truven actually uses objective measures whose methodology is clearly stated and can be repeated by anyone with the same results.

First: That metric shows that the hospital is not bleeding money. Here's an interesting fact for you: Hospitals that go out of business don't do a very good job of treating patients.
Second: Way to name the 10th out of 11 metrics, it literally makes up 10% of the rating criteria.
Third: One of the US News' "Objective Metrics" is expert opinions. That makes up a whopping 27.5% of the hospital's rating. Please explain to me how that's a good way of rating something.
Fourth: US News recently dropped 2/6 patient safety measures, meaning their results are even less meaningful in terms of patient safety (which they only weight at 5% anyway). One of the metrics they dropped was healthcare quality. You're right, it is outcomes-based.
Fifth: Their major metric was survival. That's a pretty low bar. This measure was also barely risk-adjusted and doesn't take into account the patient's condition upon leaving the hospital.

Just save face, it's clear you have no idea what you're talking about and are just spouting out buzzwords with no background research.

I'm free all day though, so if you would like to continue to be argumentative, I can shoot you down with facts for as long as you'd like.

Edit: US News's methodology, showing they are neither "objective" nor "outcomes-based": https://health.usnews.com/health-ca...es/faq-how-and-why-we-rank-and-rate-hospitals

Edit 2: According to US News, Stanford Hospital, which is currently being fined for having low patient safety, is apparently the 9th best hospital in the country.
 

Well, I don't have all day to go back and forth lol. I'm actually working in the trauma unit today and trying to save lives (a worthless metric though?). I trust expert opinions, after all, that's what medicine is: going to experts and getting their opinion on what to do when something is wrong with the body.
 
There's a reason why medical educators in these fora call USN&WR "US Snooze and Worst Report". Again, I have to reiterate that the only people who take it seriously are ignorant pre-meds and med school Deans. PDs do not.
 
Well, I don't have all day to go back and forth lol. I'm actually working in the trauma unit today and trying to save lives (a worthless metric though?). I trust expert opinions, after all, that's what medicine is: going to experts and getting their opinion on what to do when something is wrong with the body.

Whatever you say, buddy. You shouldn't have spent those 30 minutes making stupid arguments when there were trauma patients to save. Shame on you.

I don't go to my doctor for his opinion on what to do when something is wrong. I go to my doctor for his research-based knowledge and to receive medicine that research has shown objectively works. If you wanted an opinion on what to do for your health, maybe you should check out naturopathy.

Edit: Also, you must work at a really low-ranked hospital if they're unable to save lives without the help of one premed.
 
So...

Should I just go to the cheapest MD medical school? Does it matter if I go to Columbia or Creighton or UMASS in terms of matching to general surgery?

This thread makes me doubt my aspiration to reach for top 20s; I'm not particularly enthralled by research and I don't want to take on student debt. Does this make the elite schools horse****??
 
All else being equal, a brand name is more beneficial than a non-brand name, but it really depends on what you want to do. If you are just interested in matching into GS, you can do that from anywhere. The advantage of brand-name schools is that you are generally surrounded by more people doing high-quality research, so by proxy you can be more productive, and recommendations from big names in the field certainly help. In short, there is some advantage, but depending on your career goals it may not be worth the hassle/money. If your goal is to be an excellent clinician and work privately, you can do that coming from anywhere. The Krebs cycle is still the Krebs cycle wherever you get your degree.
 

Do your best, apply everywhere, and go to whichever school feels right.
 
Everywhere matches general surgery. The only question would be if you care if you do general surgery at a massive academic center or at a smaller hospital.
How could someone know which they prefer?

Anyway, I'm premed, so this is probably premature.
 

Brand name matters when applying for residency. It doesn't mean a better education or that someday you will be a better clinician coming from Columbia compared to Creighton, but the matching ceiling from Columbia is higher.

How could someone know which they prefer?

Anyway, I'm premed, so this is probably premature.

That's the point: you usually don't know. The differences between residencies, however, generally have less to do with quality of training and more to do with training emphasis. My suggestion is to go to the best school you can, for the matching potential; from there you can pick the career path you want with greater ease.
 
All things being equal, I would always tell people to go to the highest-ranked place. Columbia will make it easier to match into just about any general surgery residency program. My cousin went to a "low tier" MD school, killed the board exams, and interviewed at a lot of top general surgery places; however, he told me it took a lot of networking, away rotations, and even family connections to piece together certain aspects of his application's success. If he had gone to Columbia and scored even 20 percentile points lower, he would have had no problem whatsoever.
 
Go to the ‘best’ school? Didn't we just determine that ranking has nothing to do with the goodness of a school? :bang:
 
We just determined that ranking has nothing to do with quality of education, competence as a physician, or outcomes for patients.

Ranking absolutely has to do with ability to acquire a competitive residency.
 
So what's a top-tier residency? Does THAT affect these things?

:corny:
 

I can clearly see you’re trying to make a point, but let’s work this out in a thought experiment.

If we went to two different undergraduate institutions and both took general physics, would you claim to know more physics than me if your university was better than mine? Would you say that you were more competent in physics or that you were more prepared to answer physics problems?
 
Did you know that only 5% of Columbia grads go into Gen Surg?

My school: 2%
U MA: 5%
Drexel: 5%
U Miami: 7%
Jefferson: 7%
Marshall: 7%
Mercer: 8%
For Creighton, it's 9%

I swear there was one school that had some 15% go into Gen Surg.
 