In Defense of US News and World Report


inyoface
I am a newcomer to this forum, but I have been reading posts over the past couple of days. I noticed an interesting trend that prompted me to make my first post: virtually any mention of the medical school rankings issued by US News is extremely negative. People have claimed that these rankings "confirm the sentiments of laypeople," "place too much weight on NIH grants," and are even "specifically designed to place medical schools that have built a reputation consistently on top of the list."

I think this is a rather unfair assessment of the rankings, although it seems to be the prevailing view on this message board. I could begin to speculate as to the reasons why people have such a strong aversion to the rankings, but to avoid speculation that could be perceived as "offensive," I'll simply try to present the other side of the argument.


The aim of these rankings is to provide an objective rating of the strength of over 100 medical schools from year to year. It should be noted that these rankings are by no means permanent, and schools are still held to a standard every year in order to maintain their position on the list.

To maintain that "laypeople" are somehow the primary determinants of this list is very shortsighted. Many institutions whose strength in medicine most laypeople probably aren't even aware of (UCSF, Baylor, UCLA, etc.) consistently place in the top 10-20 medical schools on the list every year, and rightfully so. A more accurate statement would be that reputation within the medical field absolutely matters in the rankings, due to the peer assessment factor in the ranking formula. I don't think I really have to make elaborate arguments about why that should matter anyway.

With regard to NIH funding, I will concede that the absolute amount of funding places institutions with more affiliated hospitals (more faculty) at a significant advantage. However, let's not forget that NIH funding per faculty member is also considered in order to counterbalance this effect. Also, limiting the funding analysis to the NIH is both practical and fair, as every institution in the country has equal opportunity and access to these grants, making it an acceptable standardized measure.
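
To make that counterbalance concrete, here is a toy comparison in Python; the two schools and all dollar figures are invented for illustration:

    # Invented numbers: the bigger school wins on total NIH dollars,
    # the smaller school wins on dollars per faculty member.
    schools = {
        "Big State Med": {"nih_total": 400_000_000, "faculty": 2000},
        "Small Private": {"nih_total": 150_000_000, "faculty": 500},
    }
    for name, s in schools.items():
        per_faculty = s["nih_total"] / s["faculty"]
        print(f"{name}: total ${s['nih_total']:,}, per faculty ${per_faculty:,.0f}")
    # Big State Med: total $400,000,000, per faculty $200,000
    # Small Private: total $150,000,000, per faculty $300,000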

The strength of accepted applicants in objective terms (GPA and MCAT) also can and should be considered, as it reflects a school's ability to graduate physicians with potential. Institutions that attract the most capable students are equipped with the facilities to realize that potential, so this factor absolutely makes sense.

Finally, schools' strength in residency placement corresponds very closely to the research ranking, due to the residency directors' rating factor. The primary care rating is also provided for a fair assessment of individuals who want to go into family practice or general practice.

All in all, it seems that no matter which way you look at it these rankings are a very fair assessment of the strength of institutions. Rebuttals?

 
It's the idea of ranking them linearly that's the problem, in addition to their methods. There's no way of completely quantifying it, and there's too much pressure not to have a 30-way tie, because then US News loses its "gravitas."

The problem with the linear rank is that if you look at school #50 and compare it to school #20, there's really not much of a difference. Likewise, compare school #20 to school #2 and there's not a big difference. And the schools in between might be better or worse than 20 or 50 depending on your definition of "better." Linear ranking is stupid when you're trying to characterize the top 0.1% of educational institutions on the planet.

In addition:


Mayo is ranked #23, but is arguably one of the best medical schools and medical facilities in the world.

Davis is ranked lower than Irvine.

I rest my case.
 
All in all, it seems that no matter which way you look at it these rankings are a very fair assessment of the strength of institutions. Rebuttals?

What about the fact that the peer assessment ratings and residency director ratings are based on a less than 50% return rate yet constitute a large percentage of the ranking? This is simply bad science.
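
To illustrate why a low, self-selected return rate matters, here is a toy simulation; every number in it is invented, and the response model (raters who like a school return the survey more often) is just an assumption:

    import random

    random.seed(0)  # reproducible toy run

    # 120 hypothetical raters whose true opinions average 3.0 on the 1-5 scale.
    true_ratings = [random.gauss(3.0, 0.8) for _ in range(120)]

    # Assumed bias: response probability rises with the rating, so roughly
    # half respond, and the happier raters respond more often.
    responses = [r for r in true_ratings if random.random() < r / 6]

    true_mean = sum(true_ratings) / len(true_ratings)
    observed_mean = sum(responses) / len(responses)
    print(f"response rate: {len(responses) / len(true_ratings):.0%}")
    print(f"true mean {true_mean:.2f} vs. observed mean {observed_mean:.2f}")
    # The observed mean overshoots the truth purely because of who responded.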

Plus, the characteristics are rated in a purely subjective way. Do we really think that if Harvard ended up somewhere in the middle, US News wouldn't change their criteria? Of course they would, or no one would give their rankings any credit. In effect, US News presents rankings only to justify what people already believe to be good and bad schools.

Additionally, the division between research/primary care isn't really fair. Schools are punished for focusing on clinical medicine rather than research in the "research" rankings. US News tries to compensate for this in the "primary care" ranking by giving them points for the percentage of grads going into primary care. What about a school well known for excellent clinical instruction that still places people in non-primary care residencies? They'd get the shaft.

Overall, I guess my biggest problem with the US News rankings is that it attempts to make a distinction between schools where it sometimes does not exist. Is there a difference between Harvard and Po-Dunk Medical School? Absolutely. But is there a difference between #21 and #22? Or even #20 and #30? Probably not.

Again, it is obvious to pre-med advisers and most grads which schools are considered prestigious: just look for the ones with high GPA/MCAT averages in the MSAR. Being able to be so selective is a definite indication of prestige. US News just goes about defining prestige (or what is a top school) in the wrong way. There simply isn't a way to quantify that sort of thing.
 
my problem with usnews is "outside the box." that is, it doesn't make much sense to produce fine-grained rankings of individual schools, when individual effort is far more important than school name such that you can get just about any residency from any school. it's like ranking high schools. sure, a handful might give you a leg up such that they deserve to be in a separate tier from other schools, but these rough tiers are mostly common sense judgments that we don't need usnews to make for us. meanwhile, implying that "ooh, columbia is four spots better than cornell" is just nonsense. the ultimate value of school rankings is in the results (residency match), and the results just don't vary substantially enough across most schools to make these fine-grained distinctions.

as a for-profit news rag, it's pretty clear why usnews would want to make fine-grained distinctions among schools that change every year, when the most accurate way to think about med school "rankings" is to place them into two or three rough tiers, with many schools occupying a vague spot and most schools staying in a tier for a long time. students from the lowest-tier schools still have a shot at the best residencies, which shows that even tiered rankings will always be a somewhat ambiguous construct.
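
for what it's worth, that kind of tiering could even be automated crudely: sort by overall score and start a new tier wherever the gap to the next school is large. a minimal sketch (the school letters, scores, and gap threshold are all invented):

    # Crude tiering: sort by overall score, break wherever the drop to the
    # next school exceeds a threshold. All scores are made up.
    scores = {"A": 100, "B": 97, "C": 96, "D": 81, "E": 79, "F": 78, "G": 60}
    GAP = 10  # minimum drop that separates tiers (arbitrary choice)

    ranked = sorted(scores.items(), key=lambda kv: -kv[1])
    tiers, current = [], [ranked[0]]
    for prev, nxt in zip(ranked, ranked[1:]):
        if prev[1] - nxt[1] > GAP:
            tiers.append(current)
            current = []
        current.append(nxt)
    tiers.append(current)

    for i, tier in enumerate(tiers, 1):
        print(f"tier {i}: {[name for name, _ in tier]}")
    # tier 1: ['A', 'B', 'C']  tier 2: ['D', 'E', 'F']  tier 3: ['G']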
 
What about a school well known for excellent clinical instruction that still places people in non-primary care residencies? They'd get the shaft.

that's where schools 20-50 come from :thumbup:
 
It's the idea of ranking them linearly that's the problem, in addition to their methods.

Overall, I guess my biggest problem with the US News rankings is that it attempts to make a distinction between schools where it sometimes does not exist.

beat me to it.
 
What about the fact that the peer assessment ratings and residency director ratings are based on a less than 50% return rate yet constitute a large percentage of the ranking? [...] There simply isn't a way to quantify that sort of thing.

well put. :thumbup:
 
For me, the ONLY use for the rankings is to keep them in mind when deciding where to apply, so that you do not overload your cycle with reach/dream schools. I cannot see much difference, other than public name recognition, among the schools in the top 30 or so, and I see lots of great schools in the second 30 as well. There will also be plenty of, God forbid, "unranked and unwashed" schools that I am very interested in and will definitely apply to.
 
Being a great hospital and being a great research medical school are two different things. Yes, I agree Mayo is great for clinical care... basic research is another matter. I feel its ranking is appropriate.
 
Being a great hospital and being a great research medical school are two different things. Yes, I agree Mayo is great for clinical care... basic research is another matter. I feel its ranking is appropriate.

good thing that's what this thread is about.
 
An old post of mine:

usnews rankings are stupid. look at the methodology.

Quality Assessment (weighted by .40)

* Peer Assessment Score (.20 for the research medical school model, .25 for the primary-care medical school model)
In the fall of 2005, medical and osteopathic school deans, deans of academic affairs, and heads of internal medicine or the directors of admissions were asked to rate programs on a scale from "marginal" (1) to "outstanding" (5). Survey populations were asked to rate program quality for both research and primary-care programs separately on a single survey instrument. Those individuals who did not know enough about a school to evaluate it fairly were asked to mark "don't know." A school's score is the average of all the respondents who rated it. Responses of "don't know" counted neither for nor against a school. About 54 percent of those surveyed responded.


* Assessment Score by Residency Directors (.20 for the research medical school model, .15 for the primary-care medical school model)
In the fall of 2005, residency program directors were asked to rate programs on two separate survey instruments. One survey dealt with research and was sent to a sample of residency program directors in fields outside primary care, including surgery, psychiatry, and radiology. The other survey involved primary care and was sent to residency directors in the fields of family practice, pediatrics, and internal medicine. Survey recipients were asked to rate programs on a scale from "marginal" (1) to "outstanding" (5). Those individuals who did not know enough about a program to evaluate it fairly were asked to mark "don't know." A school's score is the average of all the respondents who rated it. Responses of "don't know" counted neither for nor against a school. About 28 percent of those surveyed for research medical schools responded. Twenty-three percent responded for primary care.

Research Activity (weighted by .30 in the research medical school model only)

* Total Research Activity (.20): measured by the total dollar amount of National Institutes of Health research grants awarded to the medical school and its affiliated hospitals, averaged for 2004 and 2005. An asterisk indicates schools that reported only research grants to their medical school in 2005.


* Average Research Activity Per Faculty Member (.10): measured by the dollar amount of National Institutes of Health research grants awarded to the medical school and its affiliated hospitals per full-time faculty member, averaged over 2004 and 2005. Both full-time basic sciences and clinical faculty were used in the faculty count. An asterisk indicates schools that reported research grants only to their medical school in 2005.

Primary-Care Rate (.30 in the primary-care medical school model only): the percentage of M.D. or D.O. school graduates entering primary-care residencies in the fields of family practice, pediatrics, and internal medicine, averaged over 2003, 2004, and 2005.

Student Selectivity (.20 in the research medical school model, .15 in the primary-care medical school model)

* Mean MCAT Score (.13 in the research medical school model, .0975 in the primary-care medical school model): the mean composite Medical College Admission Test score of the 2005 entering class.


* Mean Undergraduate GPA (.06 in the research medical school model, .045 in the primary-care medical school model): the mean undergraduate grade-point average of the 2005 entering class.


* Acceptance Rate (.01 in the research medical school model, .0075 in the primary-care medical school model): the proportion of applicants to the 2005 entering class who were offered admission.

Faculty Resources (.10 in the research medical school model, .15 in the primary-care medical school model): measured as the ratio of full-time science and clinical faculty to full-time M.D. or D.O. students in 2005.

Overall Rank: Indicators were standardized about their means, and standardized scores were weighted, totaled, and rescaled so that the top school received 100; other schools received their percentage of the top score.

Specialty Rankings: The rankings are based solely on ratings by medical school deans and senior faculty from the list of schools surveyed. They each identified up to 10 schools offering the best programs in each specialty area. Those receiving the most nominations appear here.


------------> Notice that the survey response rates are low, even though the surveys make up a staggering 40% of the ranking. Notice also that many schools will have a hard time improving their rank, because it's not as if NIH grants can increase substantially year to year or a school can hire significantly more doctors (40%). Schools do have some control over their rank, i.e., MCAT score, acceptance rate, etc. (20%).
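
For anyone who wants to see what "standardized, weighted, totaled, and rescaled" actually does, here is a minimal sketch of the research-model arithmetic using the weights quoted above. The three schools and every input number are invented, and the final rescaling is only one plausible reading of the published description:

    import statistics

    # Research-model weights as quoted above; everything else is invented.
    WEIGHTS = {
        "peer":        0.20,  # peer assessment score
        "rd":          0.20,  # residency director assessment
        "nih_total":   0.20,  # total NIH research dollars (millions)
        "nih_per_fac": 0.10,  # NIH dollars per full-time faculty member
        "mcat":        0.13,  # mean MCAT of the entering class
        "gpa":         0.06,  # mean undergraduate GPA
        "accept":      0.01,  # acceptance rate (lower is "better")
        "fac_ratio":   0.10,  # faculty-to-student ratio
    }

    schools = {
        "A": {"peer": 4.8, "rd": 4.7, "nih_total": 900, "nih_per_fac": 0.25,
              "mcat": 36, "gpa": 3.85, "accept": 0.04, "fac_ratio": 4.0},
        "B": {"peer": 3.9, "rd": 4.0, "nih_total": 300, "nih_per_fac": 0.30,
              "mcat": 33, "gpa": 3.70, "accept": 0.08, "fac_ratio": 2.5},
        "C": {"peer": 3.2, "rd": 3.4, "nih_total": 150, "nih_per_fac": 0.10,
              "mcat": 30, "gpa": 3.55, "accept": 0.15, "fac_ratio": 1.5},
    }

    def zscores(values):
        # "standardized about their means"
        mean, sd = statistics.mean(values), statistics.pstdev(values)
        return [(v - mean) / sd for v in values]

    names = list(schools)
    totals = dict.fromkeys(names, 0.0)
    for ind, weight in WEIGHTS.items():
        sign = -1 if ind == "accept" else 1  # a low acceptance rate should help
        for name, z in zip(names, zscores([schools[n][ind] for n in names])):
            totals[name] += weight * sign * z

    # "rescaled so that the top school received 100": standardized totals can
    # be negative, so shift onto a 0-100 scale first (one plausible reading).
    top, low = max(totals.values()), min(totals.values())
    for name in names:
        print(name, round(100 * (totals[name] - low) / (top - low)))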
 
Being a great hospital and being a great research medical school are two different things. Yes, I agree Mayo is great for clinical care... basic research is another matter. I feel its ranking is appropriate.

Actually, Mayo could be used as the poster child for problems with the ranks; if you've seen its medical education, it's right up there with its graduate medical education, which is considered some of the best in the world. The school is new, though, and gets punished for that. Also, they do a TON of funded research through sources other than NIH, which also hurts them in the rankings. I actually turned down two top 10 schools to go there, simply because there isn't much difference between it and, I'd say, even some other top 1-5 schools, except that for a lot of us it's free :D
 
One survey dealt with research and was sent to a sample of residency program directors in fields outside primary care, including surgery, psychiatry, and radiology. The other survey involved primary care and was sent to residency directors in the fields of family practice, pediatrics, and internal medicine.

Even worse is the way that they arbitrarily use the residency director info they do have, scant though it may be. Many of the schools at the top of the research ranking in fact send a majority of their students into primary care, and yet those fields' directors were not included in the analysis for the research ranking. So fields like radiology get a bigger vote, notwithstanding the much smaller fraction of people at any school going into that field.

Without a reasonable cross-section and a higher response rate, the rankings would actually be more accurate if they dropped the residency director category altogether. The research ranking is mostly a ranking of research grants anyhow, which doesn't equate to better academics, but perhaps equates to better-known personnel with whom to do research.
 
I read an article, in the NYTimes maybe, last year that talked about how, because the peer assessment score figures so heavily into the usnwr rankings, schools would flood other institutions' admissions staff with brochures as propaganda.

The problem I have with the usnwr med school rankings is that they incorporate a bunch of factors I couldn't care less about. Peer assessment??? Who gives a flying goat what other med school admissions committees think of your school. If you argue that it just reflects the overall prestige of your school: yeah, the more "prestigious" your school is, possibly the better your chances for future applications. But prestige has a fair amount of flexibility, and would be better served by tiers, not numeric rankings.

Residency director score, I'll give you, and I'll sidestep the whole who's-surveyed-and-who's-not business.

NIH grants - honestly, do you think that will determine 30% of the quality of your education? Seriously, try very hard not to be naive about this. There are talented faculty everywhere these days. Senior faculty members, academic stars, up-and-coming faculty: they are all over the place, and schools are stealing from each other left and right.

I do agree that the caliber of the student body is definitely important to me in what I want in a med school. And I also think GPA is huge in med school admissions, BUT I think GPA really is no measure of a student body. It is a measure of your work ethic and your ability to push yourself (and maybe how easy your classes were), and that's it. More importantly, what I want in a student body is an interesting, fun one, something usnwr will never measure.

Of course these rankings are not permanent... but there was this really interesting post on another thread where someone listed the rankings of the top 50 or so schools for the past decade or so: almost all of them stagnated. What does that mean? Nothing!!!!! Just like usnwr rankings!!!!
 
Actually, Mayo could be used as the poster child for problems with the ranks... they do a TON of funded research through sources other than NIH, which also hurts them in the rankings.
I am not saying Mayo is a "bad" school. I just don't think it is up to other top ten schools in terms of basic research. That is what the research med school ratings are really about. I guess in the end we are arguing about two different things. Yes, if you think the US News rankings really tell you what type of education you will get, then they are useless. However, if you are interested in basic research, I think they are fairly accurate.
 
I am not saying Mayo is a "bad" school. I just don't think it is up to other top ten schools in terms of basic research.

I would disagree that Mayo isn't up to par in research. They just receive their money from the Sultan of Arabia, not the NIH. That is why they rank lower in research. They have enough in-house money to fund anything they want.
 
might be good for a macro perspective (i.e., Harvard > #48), but try to find the pros vs. cons between two schools and US News falls apart horribly
 
I would disagree that Mayo isn't up to par in research. They just receive their money from the Sultan of Arabia, not the NIH. That is why they rank lower in research. They have enough in-house money to fund anything they want.

Funny enough, they actually have their own FLOOR in the hotel across the way from the clinic :laugh:
 
interesting points about the utility of ranking schools linearly. also good points about how institutions that train clinically for non-primary care residencies can get shafted, although i still feel there is a strong correlation between good research universities and this aspect.
 
it is obvious to... most grads which schools are considered prestigious: just look for the ones with high GPA/MCAT averages in the MSAR. Being able to be so selective is a definite indication of prestige. US News just goes about defining prestige (or what is a top school) in the wrong way. There simply isn't a way to quantify that sort of thing.

I agree with the post, but this last statement is circular reasoning. Why is a school good? Because it has high GPA/MCAT averages. Well, why does it have high GPA/MCAT averages? Because it's good. You're defining prestige based on selectivity. The problem is that the average med school applicant will see that a school is prestigious, so only the more competitive applicants will apply, furthering the cycle of "academic reputation."
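
That cycle is easy to caricature in a few lines of Python; this is a toy model with invented numbers, not anything US News computes:

    # Toy feedback loop: whichever school currently looks more prestigious
    # skims the stronger applicants, raising its stats, raising its apparent
    # prestige next year. Starting GPAs and increments are made up.
    stats = {"Haute Med": 3.80, "Humble Med": 3.78}  # mean matriculant GPA

    for year in range(2000, 2006):
        leader = max(stats, key=stats.get)
        trailer = min(stats, key=stats.get)
        stats[leader] += 0.010   # stronger applicants pick the "prestigious" one
        stats[trailer] -= 0.005
        print(year, {s: round(g, 3) for s, g in stats.items()})
    # A 0.02 starting gap widens every year even though nothing about the
    # schools themselves changed: selectivity defines prestige, and prestige
    # defines selectivity.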
 