Why rank med schools?

  • Thread starter: LoveBeingHuman:)

LoveBeingHuman:)

What's the point? It's not like any one med school produces better doctors. I feel like it does more harm than good.

 
Prestige. Funding. More research opportunities. Relationships with better hospitals. So on and so forth.

Rankings might not be perfect, but there's definitely a point. What's the point in grades? To demonstrate your knowledge in comparison with other students. Same thing with medical schools.
 
We humans are very status-conscious. We order ourselves by education, money, race, gender, profession, and every other differentiator we can find. Makes sense that we'd rank our med schools as another way to stratify ourselves.

I order all of my medical schools based on how stylish their logos are.
 
What's your top 10?

1.) Alexander American School of Medicine in the country of Guyana

Top 10 In USA:

Alabama College of Osteopathic Medicine (That bling)
Yale Medical School - (The serpent reminds me of Slytherin)
West Virginia School of Osteopathic Medicine (The serpent reminds me of Slytherin)
Harvard Medical School - (Reminds me of Gryffindor)
Georgetown University School of Medicine (That eagle reminds me of 'Merica)
Arkansas University of Osteopathic Medicine (no particular reason)
Michigan State University of Osteopathic Medicine (like the spartan helmet)
Drexel University of Medicine (Fierce)
Loma Linda School of Medicine (interesting phrase in their logo)
Pritzker School of Medicine (Fierce)
 
What's the point? It's not like any one med school produces better doctors. I feel like it does more harm than good.

Rankings don’t totally capture this (no rank system is perfect), but there’s certainly a difference in the quality of the training and the caliber of students.

Now, will you see that difference between the 10th and 11th ranked schools? Probably not. But if you were to look at the 10th vs. the 40th, or the 40th vs. the 100th, then you would certainly see a difference.

This is especially noticeable in high caliber fields. Say neurosurgery for example. Take a med student from UCSF which has one of the best NSG programs in the nation. They see thousands of ultra-complex patients a year, and their doctors are nationally recognized. Then take a student from your local private med school. They have a teaching hospital, but the patients are generally run of the mill and their doctors (while excellent) are not leaders in the field.

Both students COULD grow up to be excellent doctors. But if I were a residency director, I'd bet ON AVERAGE that the UCSF student would be better prepared for residency (all else being equal).

Additionally, NIH funding has a significant impact on rank. And training at a place where NIH funding is through the roof means many more opportunities to be exposed to people/research that’s on the leading edge of medicine.
 
This is especially noticeable in high caliber fields. Say neurosurgery for example. Take a med student from UCSF which has one of the best NSG programs in the nation. They see thousands of ultra-complex patients a year, and their doctors are nationally recognized. Then take a student from your local private med school. They have a teaching hospital, but the patients are generally run of the mill and their doctors (while excellent) are not leaders in the field.

I am not disagreeing with your assessment of UCSF versus a local private medical school for neurosurgery, but your example actually underscores the folly of the most popular ranking system. To quote Malcolm Gladwell: "A ranking can be heterogeneous, in other words, as long as it doesn't try to be too comprehensive. And it can be comprehensive as long as it doesn't try to measure things that are heterogeneous."

In other words, it is perfectly fine to say that UCSF would rank above most other institutions in terms of its neurosurgery program and the level of preparedness it gives to medical students who are interested in the field. The problem comes when you try to extrapolate a linear ranking derived from a limited set of criteria to entire institutions. Comparing Wash U to Marshall may seem intuitively simple, but it's actually very challenging if you posit that the former exists to crank out leaders in academic medicine and the latter exists to crank out providers for central Appalachia. How can you rank two entities that are completely divergent (i.e. heterogeneous) in their missions? Which one is better, the apple or the Brussels sprout?
 
Rankings are of interest only to pre-meds and medical school deans.

I like to think of med schools grouped by class, like battleships and cruisers were. The USN did it by gun size; I do it by median stats. Hence my use of the term Drexel/Albany class vs. Keck class. I mean, are you really going to say that, among the Ohio-class SSBN submarines, the Michigan is better than the Florida?
 
I've never understood the hate on US News for their lists for colleges or med schools. There was a demand they tapped, they didn't just decide to do it for fun.

And I think it's more useful than people realize. Say you're a highly competitive applicant interested in an academic career and you want to apply to places where the students will be similar to you and the research resources will be great. Without using reputations or research funding or lists built on those (like US News') how are you going to know where to apply?
 
I order all of my medical schools based on how stylish their logos are.

Gtown has a super fancy logo


At what rank does ranking stop mattering? Like, if you go to a top 50 med school, is every rank after 50 just relatively the same?
 
I've never understood the hate on US News for their lists for colleges or med schools. There was a demand they tapped, they didn't just decide to do it for fun.

The exact same argument could be made for the world's heroin producers.

efle said:
And I think it's more useful than people realize. Say you're a highly competitive applicant interested in an academic career and you want to apply to places where the students will be similar to you and the research resources will be great. Without using reputations or research funding or lists built on those (like US News') how are you going to know where to apply?

If you want an academic career, you typically have some inkling of the research field that you wish to target. Then you have to figure out the landscape of that field, and you do that by working in a lab, talking to people, going to meetings, finding out who's got R01's, who's got big labs, who's cranking out the hot papers in the field at that time, and who has a strong track record of mentoring. The labs one finds might be at Harvard or they might be at UT Memphis. You don't know until you actually do the work to find out, and things like the USNWR are of zero help in that process.

If you have no idea which field you want and simply want to hedge your bets by going to a place with maximum NIH funding, you can easily find that straight from the source.
 
I've never understood the hate on US News for their lists for colleges or med schools. There was a demand they tapped, they didn't just decide to do it for fun.

And I think it's more useful than people realize. Say you're a highly competitive applicant interested in an academic career and you want to apply to places where the students will be similar to you and the research resources will be great. Without using reputations or research funding or lists built on those (like US News') how are you going to know where to apply?
The problem isn't the rankings per se. It's the fact that people assume there's a 1:1 correlation between the USNews research rank and the quality of education, which does a disservice to many aspiring physicians. I suspect some of the hate comes from those with inferiority complexes, too, but I have no data to back that up.
 
Gtown has a super fancy logo


At what rank does ranking stop mattering? Like, if you go to a top 50 med school, is every rank after 50 just relatively the same?

Gtown is a good one.

I think tiers do matter, but so do your preferences. If you had to choose between a rank 30 and a rank 50, and one is in-state while the other is halfway across the nation, then I'd say the difference in rank isn't worth the extra stretch.

It also depends on what you're looking for; some schools are ranked differently based on their research opportunities as opposed to other factors.
 
They don't need USNWR to tell them which the top schools are.

PDs have better criteria to judge by... they've seen their grads, and they also know their faculty.

What's a PD?
 
By the way, if you're interested in some of the negative effects of rankings, there is a nice piece in the current issue of the New England Journal about how a school's ranking affects who goes there and whether that has an impact on diversity in medicine.
 
The exact same argument could be made for the world's heroin producers.



If you want an academic career, you typically have some inkling of the research field that you wish to target. Then you have to figure out the landscape of that field, and you do that by working in a lab, talking to people, going to meetings, finding out who's got R01's, who's got big labs, who's cranking out the hot papers in the field at that time, and who has a strong track record of mentoring. The labs one finds might be at Harvard or they might be at UT Memphis. You don't know until you actually do the work to find out, and things like the USNWR are of zero help in that process.

If you have no idea which field you want and simply want to hedge your bets by going to a place with maximum NIH funding, you can easily find that straight from the source.
Do you not think it's a valid argument? I don't think anyone blames Anheuser-Busch or grocery stores for alcoholism in St. Louis. It's not like US News went out and invented the idea of Ivy League prestige and selectivity when it started making college profiles.

Maybe my class is an exception, but people that are dead set on a specific research area for their career as a junior/senior in undergrad are few and far between in my experience. General NIH funding values, reputation, and academic metrics are all valid things for an applicant to want to consider, and all US News does is put all that info into a table for you.
 
By the way, if you're interested in some of the negative effects of rankings, there is a nice piece in the current issue of the New England Journal about how a school's ranking affects who goes there and whether that has an impact on diversity in medicine.
Are you talking about the merit aid shaming piece from the deans of Harvard, Hopkins and Stanford?
 
Do you not think it's a valid argument? I don't think anyone blames Anheuser-Busch or grocery stores for alcoholism in St. Louis. It's not like US News went out and invented the idea of Ivy League prestige and selectivity when it started making college profiles.

Nowadays, US News is even quite different from Ivy League prestige. Many non-Ivies are at the top of the list, including Stanford, Cal Tech, MIT, UChicago, etc. But I think people still care more about the Ivy League prestige than US News rankings.

Maybe my class is an exception, but people that are dead set on a specific research area for their career as a junior/senior in undergrad are few and far between in my experience. General NIH funding values, reputation, and academic metrics are all valid things for an applicant to want to consider, and all US News does is put all that info into a table for you.

If you're going full academic like MD/PhD, most people will have an idea of what field they want to do their PhD in, just as with any PhD program. You're super naive if you come into med school as an MD/PhD candidate without having any inkling of what field you want to pursue (assuming you even get into an MD/PhD program with that outlook).
 
Are you talking about the merit aid shaming piece from the deans of Harvard, Hopkins and Stanford?

I'm talking about the "benefits packages" used to entice top applicants who tend to be disproportionately from SES advantaged backgrounds.
 
Nowadays, US News is even quite different from Ivy League prestige. Many non-Ivies are at the top of the list, including Stanford, Cal Tech, MIT, UChicago, etc. But I think people still care more about the Ivy League prestige than US News rankings.



If you're going full academic like MD/PhD, most people will have an idea of what field they want to do their PhD in, just as with any PhD program. You're super naive if you come into med school as an MD/PhD candidate without having any inkling of what field you want to pursue (assuming you even get into an MD/PhD program with that outlook).
Idk I don't think going to Stanford or MIT loses you out on any prestige points. US News captures via survey the way institutions view each other, not the other way around.

I'm talking MD applicants who suspect they'll want to match academic residencies in major medical centers, I def understand MSTP is a different ball game.

I'm talking about the "benefits packages" used to entice top applicants who tend to be disproportionately from SES advantaged backgrounds.
Yeah so same article. I find it funny the criticism is coming from three schools who don't need merit aid at all - Harvard is never going to struggle to matriculate best-and-brightest students, even with a need-only policy. Put them in the position of WashU and let's see if they hold fast
 
Yeah so same article. I find it funny the criticism is coming from three schools who don't need merit aid at all - Harvard is never going to struggle to matriculate best-and-brightest students, even with a need-only policy. Put them in the position of WashU and let's see if they hold fast

Okay, but do you say the same about Stanford and Hopkins? Stanford has only recently been on the level they are on and they definitely lose students to other top schools (and a lot of that is probably due to East vs. West coast preferences). While the source of the criticism might itself be a valid point for criticism, I think the point holds very well.
 
Okay, but do you say the same about Stanford and Hopkins? Stanford has only recently been on the level they are on and they definitely lose students to other top schools (and a lot of that is probably due to East vs. West coast preferences). While the source of the criticism might itself be a valid point for criticism, I think the point holds very well.
I don't think they struggle nearly as much as places like WashU or Feinberg or Vandy, despite already being need-only. I totally agree with their point, no doubt a need-only policy would shift the class composition a little away from the wealthiest at a place like WashU. I just don't think they'd really be cool with a 20% yield rate and big drop in numbers if they were facing that tradeoff themselves.
 
I don't think they struggle nearly as much as places like WashU or Feinberg or Vandy. I totally agree with their point, no doubt a need-only policy would shift the class composition a little away from the wealthiest at a place like WashU. I just don't think they'd really be cool with a 20% yield rate and big drop in numbers if they were facing that tradeoff themselves.

Okay, but in order for that argument to work, you'd have to assume that the pool of top applicants with top numbers is much smaller than the class sizes of the top, say, 10 or 20 schools combined. In other words, you'd be saying that WashU attracts what, 30-40% of its class away from Harvard, etc. with the merit scholarships, resulting in Harvard getting less-than-top applicants with less-than-top numbers.

I think the pool of top applicants with top numbers could easily fill the top ten schools. The scholarships are a matter of attracting exactly who they want. A school might want one top applicant over another top applicant for many reasons apart from their top numbers.
 
Do you not think it's a valid argument? I don't think anyone blames Anheuser-Busch or grocery stores for alcoholism in St. Louis. It's not like US News went out and invented the idea of Ivy League prestige and selectivity when it started making college profiles.

No, I was just pointing out that the existence of demand for something does not make it inherently good. The rankings were created to sell copy and get attention, but the unintended consequences are now grossly apparent (particularly in admissions). If the magazine went under and the rankings ceased tomorrow the world would be a better place. People and institutions would still be obsessed with comparisons, but they wouldn't be getting led by the nose to do whatever USNWR tells them to do.

efle said:
Maybe my class is an exception, but people that are dead set on a specific research area for their career as a junior/senior in undergrad are few and far between in my experience. General NIH funding values, reputation, and academic metrics are all valid things for an applicant to want to consider, and all US News does is put all that info into a table for you.

This critique would be more meaningful if NIH funding and academic metrics were not available (in more detail) from other sources. On its face, letting a news magazine tell you where you should apply to college, graduate, or professional school is an absurd proposition. The data gathered are superficial and the weighting is arbitrary. Saying Wisconsin is #28 and OHSU is #29 doesn't mean anything, and it doesn't help anyone make a decision about their future.
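To put numbers on the "arbitrary weighting" point, here's a toy sketch in Python. The schools, component scores, and weights are all invented for illustration (this is not USNWR's actual data or formula); it just shows how two equally defensible weightings can swap who lands at #28 vs. #29:

```python
# Invented component scores on a 0-100 scale: (peer assessment, NIH funding, selectivity).
# None of these numbers come from USNWR; they exist only to illustrate the point.
schools = {
    "School A": (70, 85, 60),
    "School B": (75, 70, 72),
}

def composite(scores, weights):
    """Weighted sum of component scores, the generic shape of a composite ranking."""
    return sum(s * w for s, w in zip(scores, weights))

# Two "reasonable" weightings produce opposite orderings of the same two schools.
for weights in [(0.4, 0.4, 0.2), (0.5, 0.2, 0.3)]:
    ranked = sorted(schools, key=lambda name: composite(schools[name], weights), reverse=True)
    print(weights, "->", ranked)
# (0.4, 0.4, 0.2) -> ['School A', 'School B']
# (0.5, 0.2, 0.3) -> ['School B', 'School A']
```

Nothing about either weighting is more "correct" than the other, which is exactly why adjacent ranks carry no real information.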
 
All I have to say is that I've known residents from Harvard and UCLA who had their contracts not renewed, and a resident from Hawaii who swept graduation honors. A person is more than where they went to school and can be successful anywhere they go.

Just like in any other job, once you're in residency you're judged by the intangibles, like work ethic, how you get along with others, bedside manner, and helpfulness, not by how well you take tests or by prestige.
 
No, I was just pointing out that the existence of demand for something does not make it inherently good. The rankings were created to sell copy and get attention, but the unintended consequences are now grossly apparent (particularly in admissions). If the magazine went under and the rankings ceased tomorrow the world would be a better place. People and institutions would still be obsessed with comparisons, but they wouldn't be getting led by the nose to do whatever USNWR tells them to do.



This critique would be more meaningful if NIH funding and academic metrics were not available (in more detail) from other sources. On its face, letting a news magazine tell you where you should apply to college, graduate, or professional school is an absurd proposition. The data gathered are superficial and the weighting is arbitrary. Saying Wisconsin is #28 and OHSU is #29 doesn't mean anything, and it doesn't help anyone make a decision about their future.
From personal experience I disagree about the world being better off for applicants without these lists. When I was in high school I knew I'd love to go somewhere with an Ivy-like student body, but I wasn't interested in living in the far northeast or the heart of some of the major cities. Pulling up US News gave me a bunch of names like Northwestern, Vandy, WashU and Rice that I had never heard of before but ended up liking a lot and being some of my top choices.

Same thing for med school, if you had me guess which schools had the most funding and best reputations early in college, I would never have guessed names like Pitt or Michigan were up there. I could have gotten a similar idea using the NIH tables, MSAR and asking around with the academic docs at work, but I don't think any damage was done to me by looking at that same data in US News tables instead.

Agree about choosing #28 vs #29 being meaningless. That would be stupid and I don't think anyone actually uses rankings like this.
 
Same thing for med school, if you had me guess which schools had the most funding and best reputations early in college, I would never have guessed names like Pitt or Michigan were up there. I could have gotten a similar idea using the NIH tables, MSAR and asking around with the academic docs at work, but I don't think any damage was done to me by looking at that same data in US News tables instead.

The problem isn't that US News can be a source of consolidated data. The problem is that, incidental to this, US News has become a textbook case of the cart driving the horse. Which schools have the best reputation? The ones that are ranked highly by US News? You and I both know that PD rankings differ significantly from US News rankings. Reputation and US News ranking have been melded into one and are now viewed by many, many people as equivalent concepts.
 
From personal experience I disagree about the world being better off for applicants without these lists. When I was in high school I knew I'd love to go somewhere with an Ivy-like student body, but I wasn't interested in living in the far northeast or the heart of some of the major cities. Pulling up US News gave me a bunch of names like Northwestern, Vandy, WashU and Rice that I had never heard of before but ended up liking a lot and being some of my top choices.

For me it was the old Fiske Guide to Colleges, which offered a wealth of both subjective and objective data but never ranked anything. Imagine that.

efle said:
Same thing for med school, if you had me guess which schools had the most funding and best reputations early in college, I would never have guessed names like Pitt or Michigan were up there. I could have gotten a similar idea using the NIH tables, MSAR and asking around with the academic docs at work, but I don't think any damage was done to me by looking at that same data in US News tables instead.

The rankings don't damage individuals who use them casually in the course of a more extensive process of gathering data and weighing it according to personal priorities. That's not to say the rankings aren't damaging.

How U.S. News College Rankings Promote Economic Inequality on Campus
Why College Rankings Are a Joke
Your Annual Reminder to Ignore the U.S. News & World Report College Rankings
14 Reasons Why US News College Rankings are Meaningless
Why US News College Rankings Shouldn't Matter to Anyone
The Big College Ranking Sham
Why U.S. News' College Rankings Hurt Students

From the perspective of medical school admissions, the rankings reward schools with the highest metrics and lowest percent accepted. How is that good?

efle said:
Agree about choosing #28 vs #29 being meaningless. That would be stupid and I don't think anyone actually uses rankings like this.

Tell that to #29.
 
The problem isn't that US News can be a source of consolidated data. The problem is that, incidental to this, US News has become a textbook case of the cart driving the horse. Which schools have the best reputation? The ones that are ranked highly by US News? You and I both know that PD rankings differ significantly from US News rankings. Reputation and US News ranking have been melded into one and are now viewed by many, many people as equivalent concepts.
Re bold: Like I said, I don't buy the cart-before-horse theory. US News didn't go out and convince everyone Ivy League schools (or Stanford, MIT, etc.) were the best. They just went out and documented the high SAT scores and high regard. Schools like WashU and Duke and Rice had always been selective too (same admit rates circa 2000 as places like Penn and Dartmouth), but they are not names a coastal applicant would have come across, in my opinion, until the lists became popular. It's a useful way to get a more complete picture than the names everyone knows/expects to see.

Re red: This supports my point, I think, more than yours? NYU, as an example, has shot up a ton in the last decade in ranking, yet sits a lot lower down in reputation. That looks like US News rank doesn't determine how other institutions score NYU? And regardless, these are rare cases. The top 20 by funding and by reputation are the same except for like 2 names.
 
For me it was the old Fiske Guide to Colleges, which offered a wealth of both subjective and objective data but never ranked anything. Imagine that.



The rankings don't damage individuals who use them casually in the course of a more extensive process of gathering data and weighing it according to personal priorities. That's not to say the rankings aren't damaging.

How U.S. News College Rankings Promote Economic Inequality on Campus
Why College Rankings Are a Joke
Your Annual Reminder to Ignore the U.S. News & World Report College Rankings
14 Reasons Why US News College Rankings are Meaningless
Why US News College Rankings Shouldn't Matter to Anyone
The Big College Ranking Sham
Why U.S. News' College Rankings Hurt Students

From the perspective of medical school admissions, the rankings reward schools with the highest metrics and lowest percent accepted. How is that good?



Tell that to #29.
I actually used the Fiske guide too! It did score colleges on a 5-point scale in different areas, though, and going through it picking out the highest-rated academics gives you the same names that sit at the top of the US News ranks.

Did the "best" schools not already heavily emphasize metrics and have the lowest admit rates back in the day? Like, was it not places like the Ivy League topping the lists of SAT scores and lowest admit rates in the 1990s?

Appreciate the links, will read up
 
NYU, as an example, has shot up a ton in the last decade in ranking, yet sits a lot lower down in reputation. That looks like US News rank doesn't determine how other institutions score NYU?

My understanding is that NYU shot up the rankings because of how the funds to rebuild after Sandy were counted. If true, the idea that an institution's rank can be manipulated by natural disaster aid should cause some concern.
 
My understanding is that NYU shot up the rankings because of how the funds to rebuild after Sandy were counted. If true, the idea that an institution's rank can be manipulated by natural disaster aid should cause some concern.
My understanding of the ranking methodology is that it uses the yearly NIH grants to the institution. Did Sandy somehow cause them to still be winning tens of millions more in annual biomed grants 5 years later?
 
Re bold: Like I said, I don't buy the cart-before-horse theory. US News didn't go out and convince everyone Ivy League schools (or Stanford, MIT, etc.) were the best. They just went out and documented the high SAT scores and high regard. Schools like WashU and Duke and Rice had always been selective too (same admit rates circa 2000 as places like Penn and Dartmouth), but they are not names a coastal applicant would have come across, in my opinion, until the lists became popular. It's a useful way to get a more complete picture than the names everyone knows/expects to see.

I would not argue that schools like WashU or Duke or Rice are held in the same regard reputation-wise as the Ivies or the schools that people colloquially held in high regard before 1983. Are they selective? Yes. Do they boast high SAT scores? Yes. But reputation has to do with a lot more than that. Reputation has to do with what your graduates go on to do, with what names they make for themselves and for the school. That's the essence of reputation, not whether a school is highly selective. If you want to say that US News made it easy to figure out which schools are highly selective and have high numerical marks, then sure, it did.

Re red: This supports my point, I think, more than yours? NYU, as an example, has shot up a ton in the last decade in ranking, yet sits a lot lower down in reputation. That looks like US News rank doesn't determine how other institutions score NYU? And regardless, these are rare cases. The top 20 by funding and by reputation are the same except for like 2 names.

My point is that people on here tend to conflate US News ranking with reputation where it isn't true. NYU is ranked high in US News but not as high in reputation. That's the point. You can use US News to gauge one school's statistics vis-a-vis other schools but it would be incredibly stupid to use it to gauge reputation.
 
"NYU is a sleeper mid-tier trying to punch above its weight" is one of my favorite SDN memes.

NYU is associated with the oldest public hospital in the US: Bellevue. It was among the very first medical schools ever to obtain funding from the NIH to start up an MD/PhD program, because its dean at the time, Lewis Thomas, a physician-scientist, anticipated the molecular revolution and understood that basic science and a mechanistic understanding of disease, not merely memorizing the "natural history" of disease and guiding the patient through its turns and symptoms as he had been taught, would become the drivers of modern medical practice. Through the decades NYU has distinguished itself in community service, diversity of clinical offerings, patient population, and research opportunities.

Stanford, on the other hand, is a true upstart. A teenager in academic years and even younger in the world of medicine, Stanford has leveraged its titanic institutional resources and basic science departments to catapult itself to the highest echelons of the medical world, in spite of being located in a ritzy suburb miles away from the nearest major medical center. Its name and science have made up for what it lacks in history, clinical diversity, and even patient beds.

The power of reputation is this: NYU is viewed as an upstart when it pumps up its numbers and gets a boost in the rankings following a large injection of federal funds. Stanford is viewed as merely taking its inevitable, rightful place on Olympus.

What this has to do with medicine is beyond me. Both are outstanding schools. Their ideal students, in spite of having similar grades and test scores, are very, very different. The people who would be best served by going to one, the other, or anywhere else should be able to figure that out on their own without the aid of regurgitated pulp spit out by some grocery-store shelf-warmer year after year. Maybe I'm wrong. Maybe the PD rankings hide some arcane wisdom that says one school is this much better than another, but even then those are available on their own without the aid of USNWR, and it might even be impossible to say how much one might influence the other over time.

That being said, top 10 or bust baby.
 
After interviewing at NYU earlier this cycle alongside a whole bunch of other T20 schools, I don't understand how anyone could get the impression that NYU is a mid-tier school masquerading as a top-tier school through rank manipulation. Bellevue (one of four hospitals associated with the school) has to be one of the most amazing hospitals associated with a medical school, and NYU's MD program and the opportunities there definitely seemed on par with or better than those at every other T20 school I visited. It's not like students there have any difficulty matching into top-tier residency programs either, and their Step 1 scores are absolutely ridiculous and have to be among the highest in the country (243 average this year).

I see people talk endlessly about the difference between "reputation" and USNews ranking for NYU on SDN, but the data/tables to back this up that have repeatedly been cited in every thread on SDN that I've found refer to rankings based on the "residency director rating score" from USNews' Compass subscription service. That survey is performed by USNews themselves using a questionable methodology that isn't even fully disclosed, and it is already incorporated (20% weight) into the USNews rankings as it is. If you put any stock into this score, it still puts NYU at #20 instead of their USNews rank of #12, which isn't anything close to the type of gulf that's portrayed on SDN.

Am I missing something, and is there another source of data backing this oft-cited reputation difference that isn't the above? It feels weird to even write this many words about a school I've withdrawn from, but the incredibly negative impression of NYU that you get from browsing SDN seems almost entirely disconnected from the impression I got in person, from every medical student and resident I've managed to talk to (even at other schools), and honestly from almost anywhere else.
 
After interviewing at NYU earlier this cycle alongside a whole bunch of other T20 schools, I don't understand how anyone could get the impression that NYU is a mid-tier school masquerading as a top-tier school through rank manipulation. Bellevue (one of four hospitals associated with the school) has to be one of the most amazing hospitals associated with a medical school, and NYU's MD program and the opportunities there definitely seemed on par with or better than those at every other T20 school I visited. It's not like students there have any difficulty matching into top-tier residency programs either, and their Step 1 scores are absolutely ridiculous and have to be among the highest in the country (243 average this year).

Match lists, step scores, etc. =/= reputation. Reputation is what the medical community as a whole thinks about the school, and you can't honestly believe that NYU has a name anywhere near as big as Harvard's or Hopkins'. Take a look at their match list as well (https://nyulangone.org/files/2017-match-list.pdf) and compare that to other schools in the top 10. If you don't have experience reading match lists, you might see that they match to a lot of top places. But take a look at match lists from other top ten programs and look at the detail.

I see people talk endlessly about the difference between "reputation" and USNews ranking for NYU on SDN, but the data/tables to back this up that have repeatedly been cited in every thread on SDN that I've found refer to rankings based on the "residency director rating score" from USNews' Compass subscription service. That survey is performed by USNews themselves using a questionable methodology that isn't even fully disclosed, and it is already incorporated (20% weight) into the USNews rankings as it is. If you put any stock into this score, it still puts NYU at #20 instead of their USNews rank of #12, which isn't anything close to the type of gulf that's portrayed on SDN.

I'm pretty sure that the PD score is quite straightforward. Straight from the US News website:

"Survey recipients were asked to rate programs on a scale from 1 (marginal) to 5 (outstanding). Those individuals who did not know enough about a program to evaluate it fairly were asked to mark "don't know."

A school's score is the average rating of all the respondents who rated it in the three most recent years of survey results. Responses of "don't know" counted neither for nor against a school."

The point is, if you want a specific metric on a school, go look it up for yourself. If you want to know what PDs think about a school, that information is there in raw form. If you want NIH funding data for a program, you can also see that data easily. Why do you feel the need to use a system that assigns arbitrary weights to many factors to arrive at a composite score that is actually meaningless?
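For the curious, the scoring rule quoted above is simple enough to sketch in a few lines of Python. This is just my reading of the US News description (average every 1-5 rating from the three most recent survey years, dropping "don't know"), and the sample ratings are invented:

```python
# Hypothetical survey responses for one school across the three most recent years.
# A response is either a 1-5 rating or the string "don't know".
ratings_by_year = {
    2016: [5, 4, "don't know", 3],
    2017: [4, 4, 5],
    2018: [3, "don't know", 4, 5],
}

# "Don't know" counts neither for nor against the school, so it is simply dropped.
numeric = [r for year in ratings_by_year.values() for r in year if r != "don't know"]
pd_score = sum(numeric) / len(numeric)
print(round(pd_score, 1))  # 4.1 on the 1-5 scale
```

The rule itself is trivial; the opacity is in who got surveyed and who answered, not in the arithmetic.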
 
Match lists, step scores, etc. =/= reputation. Reputation is what the medical community as a whole thinks about the school, and you can't honestly believe that NYU has a name anywhere near as big as Harvard's or Hopkins'. Take a look at their match list as well (https://nyulangone.org/files/2017-match-list.pdf) and compare that to other schools in the top 10. If you don't have experience reading match lists, you might see that they match to a lot of top places. But take a look at match lists from other top ten programs and look at the detail.

I wasn't implying that NYU was as big a name as Harvard or Hopkins, and I wasn't comparing NYU to those schools, or really even to top 10 schools at all, given that their "inflated" rank on USNews is #12. In terms of match lists, comparing theirs to those of the schools I interviewed at with a similar rank to NYU (so, in the 11-15 range), for what it's worth I actually did feel they were comparable.

The majority opinion on SDN is that NYU does not "deserve" its USNews rank and attained it purely through rank manipulation, on account of an apparent giant gulf between their reputation and their USNews rank. If we use the aforementioned residency director score as a standard for reputation, NYU would be ranked #20, representing a +15 point increase over the last decade. Based on USNews score, NYU is ranked #12, representing a +18 point increase over the last decade. There is absolutely a difference, but both in terms of the increase over the last decade and the rank itself, it is nowhere near as big as people on SDN portray it. Even based on the residency director score alone, they're still a top 20 school.


I'm pretty sure that the PD score is quite straightforward. Straight from the US News website:

"Survey recipients were asked to rate programs on a scale from 1 (marginal) to 5 (outstanding). Those individuals who did not know enough about a program to evaluate it fairly were asked to mark "don't know."

A school's score is the average rating of all the respondents who rated it in the three most recent years of survey results. Responses of "don't know" counted neither for nor against a school."

Yes, but in the paragraph above that text where it talks about how the survey itself was conducted, it also states "One survey dealt with research and was sent to a sample of residency program directors in fields outside primary care, including surgery, psychiatry and radiology. The other survey involved primary care and was sent to residency directors designated by schools as mainly involved in the primary care fields of family practice, pediatrics and internal medicine."

How many residency directors were surveyed? No one knows. How was the sample constructed? No one knows. What specific fields? No one knows. How was the decision made to assign residency programs to the "primary care" survey (which is excluded from the score) rather than the "research" survey? No one knows. The "residency director score" tables/rankings that get repeatedly linked on SDN represent the results of their "research" survey, and so exclude any residency program that was deemed by USNews to fall into "primary care" (family practice, pediatrics, internal medicine, and an unknown number of others). How wouldn't this alone heavily influence the results, given that all you get in terms of "raw data" is a single number per school?

An actual residency director score would be an extremely valuable metric to have, but the way that USNews performs this survey, especially with so much critical information kept private, inherently requires it to be taken with a good deal of skepticism.


The point is, if you want a specific metric on a school, go look it up for yourself. If you want to know what PDs think about a school, that information is there in raw form. If you want NIH funding data for a program, you can also see that data easily. Why do you feel the need to use a system that assigns arbitrary weights to many factors to arrive at a composite score that is actually meaningless?

I agree with you about USNews as a ranking system - I don't think the USNews ranking is a great metric, and the decisions they make about what contributes to the score and what doesn't are extremely arbitrary. However, I disagree that data about how PDs feel about a school is really that readily available in raw form - all that's available is a single number from USNews per school, with a lot of essential information about how this number was generated kept shrouded in mystery, and it's not exactly immune from USNews' tendency toward arbitrary decisions either, given things like their choice to separate "research" fields from "primary care" fields. Unless you want to blindly trust random graphs linked on SDN that are varying numbers of years out of date, this information isn't even that easily available as it is, since it's locked behind USNews Compass' paid subscription, which very few people are willing to shell out for.

More than that, I just wanted to dispute this perception SDN has of NYU as a mid-tier school that manipulated the USNews rankings to give the false illusion of being a top 20 school, supposedly evidenced by a giant gulf between their reputation and USNews ranking. This has become so pervasive that I've had it repeated to me more than once by fellow applicants on several of my interview days this cycle as a reason why they're not seriously considering NYU (you would think everyone capable of getting interview invites to top 20 schools would know better, but nope). If we take USNews' residency director score as the benchmark for "reputation," NYU's residency director score has increased almost in tandem with their USNews ranking over the last decade (+18 vs. +15), and while it's absolutely lower, it's not as huge a difference as people make it out to be - they would still be top 20 and nowhere close to "mid-tier" even if that residency director "reputation" score were all you looked at.
 
My understanding of the ranking methodology is that it uses the yearly NIH grants to the institution. Did Sandy somehow cause them to still be winning tens of millions more in annual biomed grants 5 years later?

The primary supposition is that a lot of the research-related disaster relief funds were disbursed through the NIH in multi-year contracts. For ranking purposes, USNWR appears to count only the source of funds and not their purpose. So yes, a lot of that money would end up bolstering the institution's ranking. Here is a piece on $300K of NIH money that may have gone a little wayward.

The secondary supposition is that in the wake of Sandy the NIH study sections were a little kinder, consciously or unconsciously, to NYU's research programs.
 
How many residency directors were surveyed? No one knows. How was the sample constructed? No one knows. What specific fields? No one knows. How was the decision made to assign residency programs to the "primary care" survey (which is excluded from the score) rather than the "research" survey? No one knows. The "residency director score" tables/rankings that get repeatedly linked on SDN represent the results of their "research" survey, and so exclude any residency program that was deemed by USNews to fall into "primary care" (family practice, pediatrics, internal medicine, and an unknown number of others). How wouldn't this alone heavily influence the results, given that all you get in terms of "raw data" is a single number per school?

Here - you could have found this via a simple Google search: https://www.usnews.com/education/be.../11/medical-school-deans-take-on-the-rankings. 1) Their language implies that they sent surveys to all of the schools they rank. They don't divulge which programs were surveyed at each school, and it is likely a random mix of them. 2) For the peer score, around 50% of med school deans responded, and for the PD survey just under 20% responded. There obviously could be selection bias here. So now you know how the sample was constructed.

I don't think this is the best system, for many reasons, not least because different PDs at the same school likely view medical students at another school differently depending on their field. If you're an MS4 at a school with a world-renowned CT surgery program, you're obviously going to be more competitive to the surgery PD than another MS4 at your school, where the family medicine department is mediocre, will be to the family med PD. You have no way of distinguishing this without more extensive surveying. However, at the same time, you need to accept that no survey will be perfect and free of bias. This is the only source of data we have (that I'm aware of). So the proper response isn't to disregard it but to look at the data and decide how much of it you believe. If you believe none of it, you'll have a hard time in medicine indeed.

I agree with you about USNews as a ranking system - I don't think the USNews ranking is a great metric, and the decisions they make about what contributes to the score and what doesn't are extremely arbitrary. However, I disagree that data about how PDs feel about a school is really that readily available in raw form - all that's available is a single number from USNews per school, with a lot of essential information about how this number was generated kept shrouded in mystery, and it's not exactly immune from USNews' tendency toward arbitrary decisions either, given things like their choice to separate "research" fields from "primary care" fields. Unless you want to blindly trust random graphs linked on SDN that are varying numbers of years out of date, this information isn't even that easily available as it is, since it's locked behind USNews Compass' paid subscription, which very few people are willing to shell out for.

See above about data analysis. Look at the data yourself and decide for yourself how much of it to discard. This is the only data we have right now on this metric.

More than that, I just wanted to dispute this perception SDN has of NYU as a mid-tier school that manipulated the USNews rankings to give the false illusion of being a top 20 school, supposedly evidenced by a giant gulf between their reputation and USNews ranking. This has become so pervasive that I've had it repeated to me more than once by fellow applicants on several of my interview days this cycle as a reason why they're not seriously considering NYU (you would think everyone capable of getting interview invites to top 20 schools would know better, but nope). If we take USNews' residency director score as the benchmark for "reputation," NYU's residency director score has increased almost in tandem with their USNews ranking over the last decade (+18 vs. +15), and while it's absolutely lower, it's not as huge a difference as people make it out to be - they would still be top 20 and nowhere close to "mid-tier" even if that residency director "reputation" score were all you looked at.

It depends on what people define as "mid-tier" on here. As you've noticed, SDN standards are much higher than those in the greater pre-med community. Some people might take "mid-tier" to mean top 10-20 schools whereas "top tier" is top 10. For others, that might mean that mid-tier is beyond #30. I would agree that NYU is of similar status and reputation as schools like Vandy, Pitt, and UCLA.
 