2008 US News Medical School Rankings

would not argue that a school that has a median GPA of 3.9 cares more about the MCAT than grades.
Point taken and conceded. It was probably not wise of me to make this comparison with any given pair of schools. You'd have to do the actual statistical analysis to know this for sure.

I will tentatively stand by the original point, for now: I do think that when one variable has a larger spread than the other, it indicates that different "types" of medical schools place different weights on those variables.

There are other ways to interpret this, of course. You could be seeing a threshold effect: that nobody cares once the GPA is above (say) 3.65 and then all that matters is MCAT scores. But I would be willing to hazard a guess that it's the former rather than the latter.
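To make the spread comparison concrete: GPA and MCAT sit on different scales, so the fair comparison is each variable's *relative* spread across schools (e.g., the coefficient of variation). A rough sketch, with made-up medians rather than real admissions data:

```python
# Hypothetical school medians -- illustrative numbers only, not real data.
gpa_medians = [3.65, 3.70, 3.75, 3.80, 3.85, 3.90]
mcat_medians = [33, 34, 35, 36, 37, 38]  # old (pre-2015) MCAT scale

def coefficient_of_variation(values):
    """Relative spread: population standard deviation divided by the mean."""
    n = len(values)
    mean = sum(values) / n
    variance = sum((v - mean) ** 2 for v in values) / n
    return (variance ** 0.5) / mean

# If one variable varies more across schools relative to its mean, that is
# (loosely) consistent with schools weighting it differently.
print(coefficient_of_variation(gpa_medians))
print(coefficient_of_variation(mcat_medians))
```

With these placeholder numbers the MCAT medians show the larger relative spread, but of course the real question needs real data.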

 
Wuh oh. The weight of academic institutions versus their GPAs is a can of worms I wouldn't touch with a ten-foot pole.

Oh, and dude. Worms are gross. I wouldn't even touch a can full of those bastards with an 11 foot pole.
 
Source? I'd be curious to actually see this. You have data that shows trends in which physicians from lower-ranked schools are sued more frequently than physicians from higher-ranked schools? I'd find this surprising.

It does not seem to be correlated with damage done, any clinical outcome, general patient satisfaction, lawyers per square mile, lawyer advertising, or -- to refute your claim -- medical school degree, including DO's and overseas degrees.

Not dead and devilish, please read all of the posts.


I am clueless as to where you got that I gave a blanket rejection of any kind of hierarchy among US med schools. What I think I said was that lawyers would (and do) get more joy destroying a Hopkins grad's career than a Kentucky one. Such is the nature of being a hired gun -- you always want to take down the bigger mark. (Actually in this respect I think I am probably validating a hierarchy, not disputing one, just saying that in the world of litigation, a better school may put you in a worse place on the food chain -- a tastier morsel for voracious litigators).

See devilishlyblue's post a few posts back for a better (but less colorful) explanation than I gave re why this is so.

L2d, I'm referring to your overall post history.
 
all of the posts

All of the posts where? This thread? (Which one specifically did you have in mind?) All of SDN? The internet in general?
 
L2d, I'm referring to your overall post history.

Then you have misunderstood a lot of what I have written. I don't deny a hierarchy, and don't dispute that folks, including myself, have used the research rankings and found them helpful, to a degree. I do suggest that as you go down this road, you will likely have a very different view about how much it really matters, in terms of where you ultimately end up. Everything counts, but a lot of things that seem important when you are in pre-allo, turn out to have a significantly smaller impact than folks on this board would have you believe. No point debating this -- you will use the rankings as a lot of us did. But when you get further down the road your posts might take a very different tack.
 
jesus, you guys are idiots. i'm out of here for a while again.

edit: I appreciate your post, L2d. You must have been writing at the same time I was. As a side note, I would guess that the two of us are probably at about the same position in our medical careers, and probably have similar previous 'real world' experience. Personally, I think that many people here are staunchly anti-elitist, to the point of denying real credible information about how much of the world works. But then, that is what often happens on internet forums.

peace
 
The most notable shooting star that continues to be on the upswing, however, is Pittsburgh. It went from merely an above-average academic institution to elite class in a period of about 20 years, thanks in part to sound economic management, a lack of real local competition, and a seismic shift in philosophy. This is a school that will continue to rise -- you can bank on that.

Good analysis.

Another rising star is Brown, due to reasons similar to Pitt's. They have begun to capitalize on their brand and location. Just got a $100m gift. Lots of new buildings and faculty. New leadership and plan. It still has a long way to go, but it has been rising over the past few years and will continue to do so.

Another falling star is Tufts. Similar problems as NYU IMO.
 
Finally, despite what's going on in NY, MSSM has steadily risen back to prominence over the past ~10-15 years, and it's climbing even faster today. I don't know much about their location or how they're pulling it off, but it definitely looks like a strong trend.

MSSM is pulling off one of the biggest turn-arounds in school history. Last year it was the only major NYC hospital to get into the black. Other centers around MSSM (notably NYU, Cornell and Columbia) are stagnant and/or sinking further into the red. I attribute this to the superior CEO management of Dr. Ken Davis, who has taken Sinai from being the sickest NYC hospital to the healthiest in a span of less than 5 years by using a creative combination of streamlining and resource management.

Additionally, MSSM is making huge strides in NIH funding on a yearly basis.
 
Re: trends, I keep a large collection of data that examine things like NIH funding and US news stuff over time, in addition to other indicators.

The long-term trends are really interesting if you take the time to examine them. Looking at the data, you can often pinpoint exactly when a school decided to adopt a more academic philosophy, often due to a change in leadership. Typically, the schools that successfully manage to climb up the academic ladder either: A) have monopolies in large markets, B) are clearly distinguished over their peers in very large markets, or C) benefit from outside industrial influences.

The typical school that consistently trends downward is the one that suffers at the hands of a more reputed neighbor in a saturated market. The down-trending schools usually don't have much room to expand and are typically located in the northeast.

The most notable shooting star that continues to be on the upswing, however, is Pittsburgh. It went from merely an above-average academic institution to elite class in a period of about 20 years, thanks in part to sound economic management, a lack of real local competition, and a seismic shift in philosophy. This is a school that will continue to rise -- you can bank on that. Other prominent med schools that are steadily rising and have yet to hit their peaks include Emory and OHSU.

The most under-the-radar rankings-climber this year inhabits the OC. In terms of research dollars, UC Irvine has taken off in an extremely short period of time. Expect a sizeable jump in next year's issue of US News, which will rely heavily on FY 06 and FY 07 awards from the NIH.

Of the schools trending downward, the most prominent is NYU. It went from top 10 in research funding to top 40 (and soon to be top ~45) in a period of about 30 years, and it continues to nosedive in this category at an alarming rate. This is likely a consequence of its location in a saturated market and its inability to expand much more. Other schools that have also fallen hard include Einstein, SUNY Downstate, SUNY Buffalo, Upstate, Albany Med, NYMC, Temple, and UMiss (don't ask me how that one got in there). Pritzker had also been gradually falling until about 10 years ago, when it reversed course and picked up again. It is still, however, a far cry from its previous top ten status.

Finally, despite what's going on in NY, MSSM has steadily risen back to prominence over the past ~10-15 years, and it's climbing even faster today. I don't know much about their location or how they're pulling it off, but it definitely looks like a strong trend.


nice stock market analysis... and I think Wayne is a definite strong buy with high potential for the future ;)
 
Does anyone have any thoughts about the direction VCU/MCV is heading? A few years ago it was ranked 60 in research.
 
Does anyone have any thoughts about the direction VCU/MCV is heading? A few years ago it was ranked 60 in research.

[attached image: vcuef7.jpg]


I don't see any real trend over the past ~6 yrs.

Of course I have no idea where it's headed. I don't know anything about the school, the size of Richmond, or if there's any new leadership or philosophical change that might suggest new research priorities. Maybe others would know.
 
[attached image: vcuef7.jpg]


I don't see any real trend over the past ~6 yrs.

Of course I have no idea where it's headed. I don't know anything about the school, the size of Richmond, or if there's any new leadership or philosophical change that might suggest new research priorities. Maybe others would know.

where did you get this data? Do you have this for any other schools? Where did you get stuff from that far back ?
 
[attached image: vcuef7.jpg]


I don't see any real trend over the past ~6 yrs.

Of course I have no idea where it's headed. I don't know anything about the school, the size of Richmond, or if there's any new leadership or philosophical change that might suggest new research priorities. Maybe others would know.

Does anyone understand why they made this graph look like getting worse is a good thing?
 
Bear in mind that to some extent the direction these schools go has more to do with how well they can entice (poach) big name folks away from other programs than any improvement in focus or management. Many schools jump a bit in the research rankings not because they have done anything different year to year but because they have lured someone with good grants or clout away from a competitor. When you see roller coaster trends, this is sometimes the case. Also bear in mind that research grants don't always translate into a different experience for students, so if you are talking about the difference between school X and school Y which are only a few ranking spots apart, the difference could easily be some scientist who works in a building off campus with no contact with med students, but decent grant funding, not that the school is "better". So use the rankings to some degree, but I wouldn't get carried away with the minute differences or year to year blips and dips.
 
where did you get this data? Do you have this for any other schools? Where did you get stuff from that far back ?
I collect this kind of stuff, though you can get all the raw NIH data here.

And yeah, there's data for every school.
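For anyone who wants to reproduce this kind of trend-spotting from the raw NIH numbers, here's a minimal sketch, assuming you've already pulled yearly award totals per school from the public data. The school names and dollar figures below are placeholders, not real awards:

```python
# Year-over-year NIH funding trend check. All values are hypothetical.
funding = {
    "School A": {2002: 120.0, 2003: 131.0, 2004: 145.0, 2005: 160.0},  # $M
    "School B": {2002: 150.0, 2003: 148.0, 2004: 140.0, 2005: 133.0},
}

def trend(series):
    """Least-squares slope of funding vs. year -- positive means rising."""
    years = sorted(series)
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(series[y] for y in years) / n
    num = sum((y - mean_x) * (series[y] - mean_y) for y in years)
    den = sum((y - mean_x) ** 2 for y in years)
    return num / den

for school, series in funding.items():
    direction = "rising" if trend(series) > 0 else "falling"
    print(f"{school}: {direction} ({trend(series):+.1f} $M/yr)")
# School A: rising (+13.4 $M/yr)
# School B: falling (-5.9 $M/yr)
```

A single slope obviously flattens out the kind of inflection points described earlier (a leadership change mid-series, say), but it's enough to separate the risers from the sinkers.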
 
I collect this kind of stuff, though you can get all the raw NIH data here.

And yeah, there's data for every school.

Allegheny University... there's a blast from the past
 
can someone post rankings past the 60th position?
 
Can someone please list the schools that ranked #50-60? Thanks!
 
I agree with this, except for the "great city" part. Philly blows. You'll learn this if you go to Penn. Even people from Philly seem to hate Philly.

You've obviously never had pizza fries or a scrapple breakfast hoagie. There's also another specialty it's famous for, can't quite remember. Philly rocks. Go Birds.
 
You've obviously never had pizza fries or a scrapple breakfast hoagie. There's also another specialty it's famous for, can't quite remember. Philly rocks. Go Birds.

And, it's the birthplace of this great nation! Ben Franklin, Thomas Jefferson! Democracy! Independence! There's an old-fashioned soda fountain there, where you can get served by people in bowties and little caps. I mean, how cool is that?
 
Just as a side note on the rankings. For some of the scores, I was surprised to see the percentage of those surveyed who actually responded. The data can be found here:

http://www.usnews.com/usnews/edu/grad/rankings/about/08med_meth_brief.php

For the "Peer Assessment Score" about 49% responded.

In the "Assessment Score by Residency Directors," in the research model, only 25% responded, and in primary care, only 18% responded.

The residency director assessment response percentage seems extremely low to me.
 
Just as a side note on the rankings. For some of the scores, I was surprised to see the percentage of those surveyed who actually responded. The data can be found here:

http://www.usnews.com/usnews/edu/grad/rankings/about/08med_meth_brief.php

For the "Peer Assessment Score" about 49% responded.

In the "Assessment Score by Residency Directors," in the research model, only 25% responded, and in primary care, only 18% responded.

The residency director assessment response percentage seems extremely low to me.

That's because most of them do not give a damn about the US News rankings.
 
Just as a side note on the rankings. For some of the scores, I was surprised to see the percentage of those surveyed who actually responded. The data can be found here:

http://www.usnews.com/usnews/edu/grad/rankings/about/08med_meth_brief.php

For the "Peer Assessment Score" about 49% responded.

In the "Assessment Score by Residency Directors," in the research model, only 25% responded, and in primary care, only 18% responded.

The residency director assessment response percentage seems extremely low to me.


"Responded" is a misleading term without the rest of the paragraph. If the director's answered , "do not know," it didn't count as a "response," but they still responded to the survey.
 
Can someone please post those peer assessment and residency director scores for the top 15-20?
 
That's because most of them do not give a damn about the US News rankings.

I have to disagree. As stupid as it seems, people live and die by these rankings. I work for a department at a prestigious medical school, and one of the department goals is to climb the US News rankings. They actually have department meetings about it. It's sad.
 
I have to disagree. As stupid as it seems, people live and die by these rankings. I work for a department at a prestigious medical school, and one of the department goals is to climb the US News rankings. They actually have department meetings about it. It's sad.

It is true that these rankings are stupid, but some people just want to get more and more recognition and rankings are just one way of getting recognized.
 
I have to disagree. As stupid as it seems, people live and die by these rankings. I work for a department at a prestigious medical school, and one of the department goals is to climb the US News rankings. They actually have department meetings about it. It's sad.

I'm not doubting that, Towelie, but that is only one department at one medical school. If the vast majority of the residency directors out there did care about the US News rankings, trust me, they would have participated.
 
I'm not doubting that, Towelie, but that is only one department at one medical school. If the vast majority of the residency directors out there did care about the US News rankings, trust me, they would have participated.

True. I've come to realize that my department is full of *****s.

Also, I guess "participation" by voting on other schools' prestige is different from what my department is doing--pandering to US News in a pathetic attempt to rise in the rankings.
 
I'm not doubting that, Towelie, but that is only one department at one medical school. If the vast majority of the residency directors out there did care about the US News rankings, trust me, they would have participated.

I don't know about that. I doubt that a residency director can vote for his or her own school, so their responses only serve to boost or depress the rankings of other schools. Each director at a ranked school is probably keenly interested in the ranking of his or her school, but can't really do anything about it by simply participating in the survey by US News. But I bet that they do plenty of other things to influence their ranking when they can. Presumably some of the things that they do amount simply to improving their program, but that surely isn't everything.

Furthermore, as someone has already pointed out, whenever a director states 'no opinion' on the survey, it's counted as 'no response' even though a response was actually given. Maybe most directors respond to the survey, but each only feels qualified to give an opinion about a handful of programs.
 
True. I've come to realize that my department is full of *****s.

Also, I guess "participation" by voting on other schools' prestige is different from what my department is doing--pandering to US News in a pathetic attempt to rise in the rankings.

Nice! :laugh: :laugh:
 
I don't know about that. I doubt that a residency director can vote for his or her own school, so their responses only serve to boost or depress the rankings of other schools. Each director at a ranked school is probably keenly interested in the ranking of his or her school, but can't really do anything about it by simply participating in the survey by US News. But I bet that they do plenty of other things to influence their ranking when they can. Presumably some of the things that they do amount simply to improving their program, but that surely isn't everything.

Furthermore, as someone has already pointed out, whenever a director states 'no opinion' on the survey, it's counted as 'no response' even though a response was actually given. Maybe most directors respond to the survey, but each only feels qualified to give an opinion about a handful of programs.

"In the fall of 2006, residency program directors were asked to rate programs on two separate survey instruments. One survey dealt with research and was sent to a sample of residency program directors in fields outside primary care, including surgery, psychiatry, and radiology. The other survey involved primary care and was sent to residency directors in the fields of family practice, pediatrics, and internal medicine. Survey recipients were asked to rate programs on a scale from "marginal" (1) to "outstanding" (5). Those individuals who did not know enough about a program to evaluate it fairly were asked to mark "don't know." A school's score is the average of all the respondents who rated it. Responses of "don't know" counted neither for nor against a school. About 25 percent of those surveyed for research medical schools responded. Eighteen percent responded for primary-care. The source for the names for both of the residencey directors'surveys was the Graduate Medical Education Directory 2006-2007 edition, published by the American Medical Association."

The portion in bold is not consistent with your statement.

Nothing is said about not allowing a residency director to rate their own program. Who cares if they give their affiliated medical school a 5.0?
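For what it's worth, the scoring rule quoted above is easy to sanity-check: a school's score is the mean of the 1-5 ratings it received, with "don't know" answers dropped from both the numerator and the denominator. A tiny sketch with made-up ratings:

```python
# Scoring rule from the quoted methodology: average the 1-5 ratings,
# excluding "don't know" (it counts neither for nor against the school).
# The ratings below are invented for illustration.
ratings = [5, 4, "don't know", 3, 5, "don't know", 4]

numeric = [r for r in ratings if r != "don't know"]
score = sum(numeric) / len(numeric)
print(round(score, 2))  # mean of 5, 4, 3, 5, 4 -> 4.2
```

This is also why the "response rate" reads so low: a director who mailed the survey back but marked "don't know" for a given school contributes nothing to that school's average.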
 
Just to be clear, I believe that the majority of residency directors do not care about the US News rankings. This is not to say that medical schools do not care - I think a lot of medical schools have a vested interest in where they fall in the rankings.

Residency directors, on the other hand, are probably more concerned about how their hospital stacks up compared to others. Since a mechanism to rank individual residency programs does not exist, the hospital rankings in particular specialties are probably as close as they come.
 
"In the fall of 2006, residency program directors were asked to rate programs on two separate survey instruments. One survey dealt with research and was sent to a sample of residency program directors in fields outside primary care, including surgery, psychiatry, and radiology. The other survey involved primary care and was sent to residency directors in the fields of family practice, pediatrics, and internal medicine. Survey recipients were asked to rate programs on a scale from "marginal" (1) to "outstanding" (5). Those individuals who did not know enough about a program to evaluate it fairly were asked to mark "don't know." A school's score is the average of all the respondents who rated it. Responses of "don't know" counted neither for nor against a school. About 25 percent of those surveyed for research medical schools responded. Eighteen percent responded for primary-care. The source for the names for both of the residencey directors'surveys was the Graduate Medical Education Directory 2006-2007 edition, published by the American Medical Association."

The portion in bold is not consistent with your statement.

Nothing is said about not allowing a residency director to rate their own program. Who cares if they give their affiliated medical school a 5.0?

Hmmm, I stand quite corrected regarding the number of directors who respond. As far as directors ranking their own schools, I have no evidence one way or the other and am just speculating, of course; but it seems that US News would want to avoid the skewing of their results by such a clear conflict of interests. However, since every director could do it for his or her own school, I suppose it would all come out in the wash anyway.
 
Just to be clear, I believe that the majority of residency directors do not care about the US News rankings. This is not to say that medical schools do not care - I think a lot of medical schools have a vested interest in where they fall in the rankings.

Residency directors, on the other hand, are probably more concerned about how their hospital stacks up compared to others. Since a mechanism to rank individual residency programs does not exist, the hospital rankings in particular specialties are probably as close as they come.

Good thought, good point. You've clearly thought about this more thoroughly than I have. And hell, maybe you're just smarter than I am. ;)
 
Good thought, good point. You've clearly thought about this more thoroughly than I have. And hell, maybe you're just smarter than I am. ;)

No way! I have probably spent way too much time trying to figure out exactly how important medical school rank/prestige is in the grand scheme of things. Ultimately, I want to attend a medical school that will allow me to get into the residency program of my choice. So, I thought the residency director assessment component of the US News rankings would be worthwhile. Unfortunately, after reading the methodology, I found out how useless those rankings truly are.
 
Have the primary care rankings after #10 been posted yet? Thanks in advance.
 
Okay I have a question.
There are two lists for medical schools...
1) Research
2) Primary care.

My question (which is probably a really noobie question) is what is the difference? I mean... it seems obvious but what exactly are the criteria?
 
Okay I have a question.
There are two lists for medical schools...
1) Research
2) Primary care.

My question (which is probably a really noobie question) is what is the difference? I mean... it seems obvious but what exactly are the criteria?

They have ranking methodologies on the website that explain all this. That's probably the best place to find out.
 
anyone have the rest of the rankings for the patient care side?
 
These are the 2008 U.S. News Rankings - Primary Care

(1) Univ. of Washington
(2) UNC-Chapel Hill
(3) Univ. of Colorado-Denver and HSC
(4) OHSU (Oregon)
(5) Michigan State University COM (Osteopathic)
(6) East Carolina University Brody
(7) Univ. of Vermont
(8) UCSF
(9) Univ. of Wisconsin-Madison
(10) BCM (Baylor)

I think you got #10 wrong on the Primary Care list.
1. University of Washington
2. University of North Carolina–Chapel Hill
3. University of Colorado–Denver and Health Sciences Center
4. Oregon Health and Science University
5. Michigan State University College of Osteopathic Medicine
6. East Carolina University (Brody) (NC)
7. University of Vermont
8. University of California–San Francisco
9. University of Wisconsin–Madison
10. University of Nebraska College of Medicine

http://education.yahoo.com/college/essentials/school_rankings/med/
(USNews' website... my subscription ended a couple of days ago).
 
I think you got #10 wrong on the Primary Care list.
1. University of Washington
2. University of North Carolina–Chapel Hill
3. University of Colorado–Denver and Health Sciences Center
4. Oregon Health and Science University
5. Michigan State University College of Osteopathic Medicine
6. East Carolina University (Brody) (NC)
7. University of Vermont
8. University of California–San Francisco
9. University of Wisconsin–Madison
10. University of Nebraska College of Medicine

http://education.yahoo.com/college/essentials/school_rankings/med/
(USNews' website... my subscription ended a couple of days ago).

11. BCM
 