Internal Medicine Residency Rankings


Paulraj
Joined Oct 6, 2004
I have just a simple question. Does anyone have access to a relatively accurate ranking list of Internal Medicine Residency programs? If so, I think that posting of the top 25 or so, even from US News and World Report, would be tremendously helpful to all.

Thanks.

 
Paulraj said:
I have just a simple question. Does anyone have access to a relatively accurate ranking list of Internal Medicine Residency programs? If so, I think that posting of the top 25 or so, even from US News and World Report, would be tremendously helpful to all.

Thanks.

you know about the search function, right?

Peter
 
i just cut and pasted from that thread...


Newly ranked in 2004....

1. Johns Hopkins University (MD)
2. Harvard University (MA)
3. University of California–San Francisco
4. Duke University (NC)
5. University of Washington
6. University of Pennsylvania
7. Washington University in St. Louis
8. University of Michigan–Ann Arbor
9. Stanford University (CA)
10. U. of Texas Southwestern Medical Center–Dallas
11. Yale University (CT)
12. Columbia U. College of Physicians and Surgeons (NY)
13. University of California–Los Angeles (Geffen)
14. Mayo Medical School (MN)
15. University of Chicago
16. University of North Carolina–Chapel Hill
17. Vanderbilt University (TN)
18. Cornell University (Weill) (NY)
19. University of Alabama–Birmingham
20. University of Colorado Health Sciences Center
21. Emory University (GA)
University of California–San Diego
23. Northwestern University (Feinberg) (IL)
24. Baylor College of Medicine (TX)
New York University
26. University of Iowa (Roy J. & Lucille A. Carver)
University of Virginia
28. Case Western Reserve University (OH)
Indiana University–Indianapolis
Mount Sinai School of Medicine (NY)
 
That list is really tough to swallow b/c it seems heavily NIH grant-related. But it's better than any list I could come up with. I think the general perception of the entire university takes a whole lot longer to change for the better or for worse despite these rankings. E.g., I would have ranked UCLA or Columbia or Mayo ahead of UTSW or UW. I would not have guessed Case Western was in the top 25. When I think of Indiana, I think Hoosiers. Never would've guessed U Virginia was in there either. I think you'll get an equally good education at all those places, but like most things it's what you put in. The environment can only help you so much.
 
pufftissue said:
That list is really tough to swallow b/c it seems heavily NIH grant-related. But it's better than any list I could come up with. I think the general perception of the entire university takes a whole lot longer to change for the better or for worse despite these rankings. E.g., I would have ranked UCLA or Columbia or Mayo ahead of UTSW or UW. I would not have guessed Case Western was in the top 25. When I think of Indiana, I think Hoosiers. Never would've guessed U Virginia was in there either. I think you'll get an equally good education at all those places, but like most things it's what you put in. The environment can only help you so much.


There is no doubt in my mind that UVA is a top 20 program, if not a top 15 program.
 
pufftissue said:
That list is really tough to swallow b/c it seems heavily NIH grant-related. But it's better than any list I could come up with. I think the general perception of the entire university takes a whole lot longer to change for the better or for worse despite these rankings. E.g., I would have ranked UCLA or Columbia or Mayo ahead of UTSW or UW. I would not have guessed Case Western was in the top 25. When I think of Indiana, I think Hoosiers. Never would've guessed U Virginia was in there either. I think you'll get an equally good education at all those places, but like most things it's what you put in. The environment can only help you so much.

On what basis would you rank UCLA, Columbia, and Mayo ahead of UTSW? I'm trying to think what negative the program here has, but still can't think of one. We've just added St. Paul University Hospital to further increase the clinical variety and volume and every subspecialty is strongly represented here. Graduates routinely get the fellowship of their choice or jobs in prime locations.
 
I'm not so sure that these rankings are of the Internal Medicine Residency programs. I think they're just a further extension of the US News Hospital Rankings. So I wouldn't put too much into these or any other rankings.
 
mediocre said:
I'm not so sure that these rankings are of the Internal Medicine Residency programs. I think they're just a further extension of the US News Hospital Rankings. So I wouldn't put too much into these or any other rankings.


I agree. These rankings are largely affected by how much NIH grants these places get for research.
 
These rankings have nothing to do with residency programs, only research and hospital reputations. The Mayo Clinic, where it would be great to have my dad cared for, is really an average IM residency (see the list of those accepted and their medical schools, as well as fellowship placement); ditto Case Western; ditto Emory.
Talk with your PD for better advice. Also, no ranking is as good as the feeling you get from visiting and talking with the housestaff, so don't get too caught up in the numbers.

peace
 
Where can I find a list of where graduates from Mayo, Case, and Emory come from and go for fellowship? I agree that the rankings are almost completely NIH funding based, but to say that those are "average" programs is a bit much.
 
The Case program that is ranked is "University Hospitals of Cleveland." It is affiliated with Case Western Reserve, but so is "Metro," which is not nationally ranked.

As for University Hospitals, where we match depends on the field. From the past year or two we have had people go to the Cleveland Clinic for GI, Johns Hopkins for Immunology, and Michigan for Cardiology, and we also keep several people each year in many areas, as most of the subspecialties at University are well ranked too.

As for the people who blame NIH funding for the rankings: what do you think makes these programs the best? It is money. Without it you do not recruit top-notch talent or get the national/international recognition that attaches to the names of the faculty.

The downside to all this for residents is that the ranking is usually directly related to the number of hours you work. :)

Hope that helps. If you interview at University, they should provide you with a packet that lists where all of the IM graduates go post-residency.
 
Art Vandelay said:
Where can I find a list of where graduates from Mayo, Case, and Emory come from and go for fellowship? I agree that the rankings are almost completely NIH funding based, but to say that those are "average" programs is a bit much.

Look, I don't train at Mayo Rochester, Case, or Emory, but how could you base how good a program is on where its residents went to medical school?? That is the most idiotic advice you could give a med student.. there are many people who go to "middle tier" or "lower tier" med schools and are SOLID residents.. much better than "high tier" med school students who were at the bottom of their class..

look, you can use rankings all you want if you want to get caught up in this web of always worrying about where your program ranks and how hard you can possibly work to maintain that status, as opposed to working hard to LEARN medicine.. those are two very different things..

Someone asked me why the hell they would want to go to UTSW.. I'm not a resident there for geographic reasons, but I told him (he was from that area) that it is a place KNOWN for producing solid residents who develop awesome skills through repetition and faculty.. that's what you want to be at the end of your training.. and you want to go to a program that will supply you with awesome faculty who love to teach and ample opportunities to apply this knowledge...

NOT a place, like some of these "top places," where you will scut your whole 3 years and learn from disgruntled attendings who have their heads so far up in the air that they don't even notice whether you're learning anything or not...
 
That was a nice little rant there, but I am not sure why you used my post as its basis. I was responding to someone who made it sound like those were weak programs because of where they got their residents from. It so happens that those are programs that I am interested in. I would very much like to know how residents from programs I am interested in fare as far as obtaining fellowships goes. I agree w/ most of your post, but I think reputation does make a difference w/ regards to fellowship placement.
 
Art Vandelay said:
That was a nice little rant there, but I am not sure why you used my post as its basis. I was responding to someone who made it sound like those were weak programs because of where they got their residents from. It so happens that those are programs that I am interested in. I would very much like to know how residents from programs I am interested in fare as far as obtaining fellowships goes. I agree w/ most of your post, but I think reputation does make a difference w/ regards to fellowship placement.


Sorry Art,

I replied to the wrong post... sorry about that...

about fellowship, I do agree with you, but if you run a Medline search on fellowship placement (seriously, you won't believe what will come up), you'll see the most important item is your LOR from a specialist in the area that you want to specialize in, AND his/her connections...

I guess what I was trying to say is that there are many programs that may not be top 10, or even top 25, but may set you up well for fellowships... and if all you want to do is IM, then almost any program that meets your PERSONAL criteria will do...

hope this helps, and sorry for the post, Art..
 
Residency training has nothing to do with NIH funding. Why do residents give a hoot about NIH funding? Funding only means money for the medical school, and maybe that helps the med school's reputation somewhat, but the training hospital needs to carry itself and create its own reputation. If the hospital does not have people of national prominence, it is still a community hospital at best. Such is the case with University Hospitals of Cleveland/Case Western; its presence is local at best, and the only reason it is up there is CWRU's NIH funding.

For the recent fellowship matching, residents at best match at UH (which, for cardiology, is falling apart at this point with no chief/program director; the other specialties are also average at best). The Michigan match from UH is only if you are chief resident (for cards); there is the Hopkins GI match, but that guy is an MD/PhD with 20 publications. Other matches include ID at MGH (if you want to rave about that). If you rank Case Western ahead of Mount Sinai of NYC, I would think you are delusional (Sinai, with Valentin Fuster as cardiology chief, has been sending residents to UCSF/UCLA for fellowship). Other NYC programs like Columbia/Cornell also send residents for card fellowships at places like UCSF/Penn/Texas Heart/JHU/Brigham. Coming from a regional Case Western, you would probably be matching a card fellowship at the Brigham in your dreams.

Obviously the BWH/MGH/JHU/Duke/Penn IM programs are always solid (CCF especially likes residents from these institutions for card fellows, along with Vandy/UVA/Michigan/Stanford). Most people from the west coast tend to stay on the west coast (i.e., SF people stay in SF for fellowship; LA people have 4 IM/fellowship programs within UCLA to choose from). Mayo sends a good number of residents to Texas Heart/Emory/Duke for cards. Other IM programs like Emory/UAB/UTSW/U Wash/Wash U/Michigan/Vandy/UNC are all very solid (able to get residents into top 10 fellowship programs). But I would not even rank Case Western (very low in reputation) on that list (I forgot to mention NWU/U Chicago, both also with good IM reputations).
 
So if Case is so good and has so much NIH funding, then tell me why the cardiology division is falling apart at this moment, with no chief/program director, dwindling attendings (from 45 down to <20; the Case website lists more attendings, including ones who left a long time ago), and a precipitous fall in cath case volumes. Who of national prominence in cards is left at Case? (I don't think there were too many to begin with, and don't mention the ones who are about to retire.)


csullivan said:
The Case program that is ranked is "University Hospitals of Cleveland." It is affiliated with Case Western Reserve, but so is "Metro," which is not nationally ranked.

As for University Hospitals, where we match depends on the field. From the past year or two we have had people go to the Cleveland Clinic for GI, Johns Hopkins for Immunology, and Michigan for Cardiology, and we also keep several people each year in many areas, as most of the subspecialties at University are well ranked too.

As for the people who blame NIH funding for the rankings: what do you think makes these programs the best? It is money. Without it you do not recruit top-notch talent or get the national/international recognition that attaches to the names of the faculty.

The downside to all this for residents is that the ranking is usually directly related to the number of hours you work. :)

Hope that helps. If you interview at University, they should provide you with a packet that lists where all of the IM graduates go post-residency.
 
i am wondering why you would say that about UVA. as an applicant, i am just trying to learn more about their program.

handsports said:
There is no doubt in my mind that UVA is a top 20 program, if not a top 15 program.
 
mediocre said:
I'm not so sure that these rankings are of the Internal Medicine Residency programs. I think they're just a further extension of the US News Hospital Rankings. So I wouldn't put too much into these or any other rankings.

Actually, these ARE specifically rankings of the departments of medicine at these particular institutions, according to the US News site (here are the new ones, although you can only see the top 3 unless you pay: http://www.usnews.com/usnews/edu/grad/rankings/med/brief/medsp03_brief.php)

Contrary to what's been presumed in some posts here, the methodology behind these rankings is actually NOT based on research, as opposed to the formula used to calculate the medical school rankings in US News. Rather, it's based 100% on the opinion of deans and faculty. The US News page states: "Specialty Rankings: The rankings are based solely on ratings by medical school deans and senior faculty from the list of schools surveyed. They each identified up to 10 schools offering the best programs in each specialty area. Those receiving the most nominations appear here." (here's the link; it's at the very bottom of the text on this page: http://www.usnews.com/usnews/edu/grad/rankings/about/06med_meth_brief.php)

So basically, these rankings reflect the opinion of medical school deans and faculty as to which medicine departments/programs are "the best." I think this is a fairer measure, since it would presumably be based at least in part on the quality of the residents these departments produce. That said, I'm sure it's tough to eliminate the influence of NIH funding on those "opinions." And of course, the fact that popular opinion doesn't place a program in the top 25 hardly implies that the training there isn't top-notch.
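Mechanically, the methodology US News describes is just a nomination tally: count how many respondents name each program, then list the most-named first. A minimal sketch in Python (the ballots and the handful of program names here are invented for illustration, not actual survey data):

```python
from collections import Counter

# Hypothetical ballots: each dean or senior-faculty respondent names up
# to 10 programs they consider the best (names invented for illustration).
ballots = [
    ["Hopkins", "Harvard", "UCSF", "Duke", "Penn"],
    ["Harvard", "Hopkins", "UCSF", "Michigan"],
    ["Hopkins", "UCSF", "Duke", "Vanderbilt"],
]

# The published list is just the nomination tally, most-named first.
tally = Counter(name for ballot in ballots for name in ballot)
ranking = [name for name, _ in tally.most_common()]
print(ranking)
```

Note that with so few effective voters, a handful of ballots can reorder the bottom of the list, which is exactly the sensitivity the rest of this thread argues about.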
 
Thanks for clarifying tommy.
 
Here's an op-ed that the University of Chicago economist Austan Goolsbee wrote last year. Read it and understand it, and realize that much of this talk about "rankings" is meaningless.
How Rankings Rate
New York Times
April 12, 2004
By Austan Goolsbee

From best-selling book lists to Zagat dinner recommendations to David Letterman's top 10, it's clear that Americans love rankings. Little surprise, then, that the release this month of U.S. News & World Report's annual ratings of the nation's graduate schools has prospective students swarming newsstands for advice on where to go. The list has an undeniably huge impact. Applications typically soar at universities that get top marks; some state governments even use the rankings to help them decide how much money to give to their universities.

Critics argue that rankings like these try to quantify the unquantifiable - that variables used to measure quality are often flawed and incomplete. But the often overlooked, and perhaps more troubling, problem is that the average person who uses these rankings may not understand basic statistics. If he did, he would know that even if all the right variables were included, once he got past the top spots, the distinctions between schools (or restaurants or places to retire) are often meaningless.

Take a look at the sales ranking data at Amazon.com. I once needed a publication, a World Bank technical paper on the regulatory environment in Bulgaria. The demand for technical working papers on Bulgaria being what it is, the publication's Amazon sales rank was extremely low (about 2.5 million down on the list). Yet after I bought it, a most amazing thing happened. My one purchase moved this working paper past almost one million other books. This happened because once buyers get out of the best sellers - where the difference in sales can be enormous - almost everything else is basically tied. The differences in rank are statistically meaningless, and small blips cause big changes.

The same is true for rankings of all sorts of things.

In the New York City Zagat guide, for example, a restaurant that raised its food rating by three points would pass 50 restaurants if it went from 25 to 28 (from "excellent" to "extraordinary"), but 439 establishments if it went from 19 to 22 (both considered "very good").

Then there are the education rankings. For the large mass of schools that rate in the middle of lists like U.S. News & World Report's, rankings are extremely sensitive to small blips. Any criteria included in a survey, like undergraduate grade point average or percentage of applicants accepted, will have random variations over time.

This is perfectly normal. These fluctuations, though, cause the rankings of schools in the middle to move all over the place. It's why schools at the top seldom move more than a place or two, but the Tippie College of Business at the University of Iowa can move up five spots one year and down 18 the next (which is what just happened).

This sensitivity to small changes also tends to encourage bad behavior among university administrators. Some, for example, no longer report foreign students' verbal scores on standardized admissions tests to get an overall better result, or have wait-listed candidates that are likely to turn their schools down so their university looks more selective. These things may raise measured performance a tiny bit, but tiny bits can mean a lot when everyone is clustered in the middle.

The lessons are pretty obvious: outside the very top, be careful about basing decisions on rankings data. The differences are often statistically meaningless. If possible, whether you're interested in schools or restaurants, look at the last several surveys, instead of just the most recent one, since random blips tend to average out over time. Certainly don't pick a Chinese restaurant tonight because its rating is 20 versus another's 19.

For those who really can't get enough, it's now possible to find rankings of the rankings. U.S. News & World Report may get lots of attention, but as of last week its "Ultimate College Directory" for 2004 had an Amazon sales rank of 22,305, worse than the Intercollegiate Studies Institute's "Choosing the Right College 2004" at 19,752.

How is a rankings addict to choose? Next time, perhaps, you would do well to consider an alternative: the informative (and surprisingly popular) "Cartoon Guide to Statistics." Its sales rank is 8,600. •

Austan Goolsbee is a professor of economics at the University of Chicago Graduate School of Business. This article reprinted with permission from the New York Times.
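Goolsbee's claim that mid-pack rankings are far noisier than the top spots is easy to reproduce numerically. Here is a rough simulation (entirely illustrative; the Zipf-like score distribution and the noise level are my own assumptions, not data from the article):

```python
import random

random.seed(0)

# 1,000 items with Zipf-like "true" scores: the leaders are far apart,
# while the middle of the pack is packed into tiny score differences.
n = 1000
true_scores = [1.0 / (i + 1) for i in range(n)]

def rank_with_noise(scores, noise=0.0005):
    """Return each item's rank after adding a small random blip to its score."""
    noisy = sorted(
        ((s + random.uniform(-noise, noise), i) for i, s in enumerate(scores)),
        reverse=True,
    )
    rank_of = [0] * len(scores)
    for rank, (_, i) in enumerate(noisy, start=1):
        rank_of[i] = rank
    return rank_of

# Two "survey years" with independent blips.
year1 = rank_with_noise(true_scores)
year2 = rank_with_noise(true_scores)

top_moves = [abs(year1[i] - year2[i]) for i in range(10)]        # items near rank 1
mid_moves = [abs(year1[i] - year2[i]) for i in range(450, 460)]  # mid-pack items

print("avg rank change, top 10:  ", sum(top_moves) / 10)
print("avg rank change, mid-pack:", sum(mid_moves) / 10)
```

With these numbers the top 10 don't move at all, while mid-pack items jump dozens of places between "years": the same pattern as a stable top tier and a Tippie-style swing in the middle.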
 
atsai3 said:
Here's an op-ed that the University of Chicago economist Austan Goolsbee wrote last year. Read it and understand it, and realize that much of this talk about "rankings" is meaningless.

Great article! Thanks.
 
golytlely said:
These rankings have nothing to do with residency programs, only research and hospital reputations. The Mayo Clinic, where it would be great to have my dad cared for, is really an average IM residency (see the list of those accepted and their medical schools, as well as fellowship placement); ditto Case Western; ditto Emory.
Talk with your PD for better advice. Also, no ranking is as good as the feeling you get from visiting and talking with the housestaff, so don't get too caught up in the numbers.

peace

Our average IM residents would be delighted to take great care of your dad, or any of your other family members, including yourself, while matching at Mayo's average GI, Cards, Heme/Onc, Endo, Neph and Rheum programs or at other average institutions like Hopkins and Harvard. As for the average medical schools we came from, we, on average, made AOA.
 
lol, that's funny. I personally loved Mayo when I visited there last week for an interview; of the places I have been, only the Harvard hospitals seemed as teaching-oriented. Looking at their fellowship placement, they are anything but average. They do recruit more from the midwest, I think, so snobbish types who think a program should be all Ivy League to be good may be turned off. The only downside is that Rochester is probably not a single person's town, but it seems a great place to raise a family. Will rank Mayo top 3 for sure.
 
here's something i thought was interesting. here are the USN&W rankings from 2006:

1. Johns Hopkins University (MD)
2. Harvard University (MA)
University of California–San Francisco
4. Duke University (NC)
5. University of Pennsylvania
6. University of Washington
7. Washington University in St. Louis
8. University of Michigan–Ann Arbor
9. U. of Texas Southwestern Medical Center–Dallas
Yale University (CT)
11. Columbia U. College of Physicians and Surgeons (NY)
12. Stanford University (CA)
13. University of Chicago (Pritzker)
14. Mayo Medical School (MN)
15. University of California–Los Angeles (Geffen)
16. University of North Carolina–Chapel Hill
17. University of Alabama–Birmingham
18. Emory University (GA)
Vanderbilt University (TN)
20. Northwestern University (Feinberg) (IL)
21. Cornell University (Weill) (NY)
University of California–San Diego
23. University of Pittsburgh
24. Baylor College of Medicine (TX)
25. Boston University
26. University of Colorado–Denver and Health Sciences Center
University of Rochester (NY)

if you look at the numbers, there are a total of 410 internal medicine residencies in the united states. that means that this list encompasses the top 7% of programs. comparing the lists of 2004 and 2006, for the most part they're consistent. the only real exceptions to this are towards the end where, for example, some of the nyc programs dropped off and some new kids showed up.

does that mean that mt sinai doesn't still have a solid reputation? of course not. if you wanted to do GI, for example, it would be a major mistake to gloss over them. but one point the article makes is that while it is hard to be accurate with rankings in the middle tiers, at the apex (ie. the top 7%), relatively big changes have to occur for a program to really move. so my point is that yes, i do think these rankings mean something.

of course i'm not particularly committed to using this list to pick where to do your training. it's based on the opinion of deans and program directors as well as research funding. choosing the right program for you has a lot more to it than that. to some degree, though, reputation has something to do with getting a competitive fellowship, and this list is the closest thing we have to a consensus panel.
 


Well, as far as I know, "Harvard University" doesn't have an internal medicine program. There are 3 different programs affiliated with Harvard Med - MGH, BWH and BIDMC. So those rankings must be Medical School rankings, not IM rankings.
 
Residency training has nothing to do with NIH funding. Why do residents give a hoot about NIH funding? Funding only means money for the medical school, and maybe that helps the med school's reputation somewhat, but the training hospital needs to carry itself and create its own reputation. If the hospital does not have people of national prominence, it is still a community hospital at best. Such is the case with University Hospitals of Cleveland/Case Western; its presence is local at best, and the only reason it is up there is CWRU's NIH funding.

For the recent fellowship matching, residents at best match at UH (which, for cardiology, is falling apart at this point with no chief/program director; the other specialties are also average at best). The Michigan match from UH is only if you are chief resident (for cards); there is the Hopkins GI match, but that guy is an MD/PhD with 20 publications. Other matches include ID at MGH (if you want to rave about that). If you rank Case Western ahead of Mount Sinai of NYC, I would think you are delusional (Sinai, with Valentin Fuster as cardiology chief, has been sending residents to UCSF/UCLA for fellowship). Other NYC programs like Columbia/Cornell also send residents for card fellowships at places like UCSF/Penn/Texas Heart/JHU/Brigham. Coming from a regional Case Western, you would probably be matching a card fellowship at the Brigham in your dreams.

Obviously the BWH/MGH/JHU/Duke/Penn IM programs are always solid (CCF especially likes residents from these institutions for card fellows, along with Vandy/UVA/Michigan/Stanford). Most people from the west coast tend to stay on the west coast (i.e., SF people stay in SF for fellowship; LA people have 4 IM/fellowship programs within UCLA to choose from). Mayo sends a good number of residents to Texas Heart/Emory/Duke for cards. Other IM programs like Emory/UAB/UTSW/U Wash/Wash U/Michigan/Vandy/UNC are all very solid (able to get residents into top 10 fellowship programs). But I would not even rank Case Western (very low in reputation) on that list (I forgot to mention NWU/U Chicago, both also with good IM reputations).

Why the rant against Case? I don't think anyone was saying that Case is the best program in the world or anything. I think it's a solid medicine program. The DEPARTMENT of medicine's NIH funding is in the top 10 in the country...that's just a fact. Case just appointed a new head of Cardiovascular Medicine from the Brigham as well (see pasted info below). Most residents who match in cards do go to UH, but most people stay at their home institution to do fellowship anyway. Plus this year's chief is going to CCF for cards...so it is possible to get into good places from Case. I'm not saying Case is the best program in the world, but if you're smart and motivated I think you can do well coming out of the program.

"Case Western Reserve University (Case) School of Medicine and University Hospitals of Cleveland (UHC) announced the recruitment of Daniel I. Simon, MD, a leading cardiologist and researcher, as the new chief of the Division of Cardiology and director of the Cardiovascular Center. Simon, who also will hold the Herman Hellerstein Chair, is currently with Harvard Medical School and Brigham and Women’s Hospital in Boston."
 
These are medical school rankings...NOT internal medicine residency rankings.

I don't think there are any lists of IM programs. The closest to any hospital ranking is the US News and World Report best hospitals list, but this is based on subspecialties!!!
 
Well, as far as I know, "Harvard University" doesn't have an internal medicine program. There are 3 different programs affiliated with Harvard Med - MGH, BWH and BIDMC. So those rankings must be Medical School rankings, not IM rankings.

no, these are rankings of the internal medicine departments. this point has been brought up in the past--for some reason they group the harvards into a single ranking.
 
US News rankings have little to do with educational quality, let alone where you'll be happy. I'd rather pass bilateral staghorn calculi than go to a few places on the US News top 10.
 
These are medical school rankings...NOT internal medicine residency rankings.

I don't think there are any lists of IM programs. The closest to any hospital ranking is the US News and World Report best hospitals list, but this is based on subspecialties!!!

The list posted by tum is of IM programs, not medical schools. See the list of specialties under 'Rankings' here (requires the paid online edition):

http://www.usnews.com/usnews/edu/grad/rankings/med/medindex.php

This is how the list of 26 IM programs is derived, according to US News:

"Specialty Rankings: The rankings are based solely on ratings by medical school deans and senior faculty from the list of schools surveyed. They each identified up to 10 schools offering the best programs in each specialty area. Those receiving the most nominations appear here."

So, depending on which faculty and deans respond to the US News surveys, this could be one of the better rankings out there.
 
Actually this is a ranking of Internal Medicine departments and has nothing really to do with the residency programs. This would be a better list if you were trying to figure out where to seek tenure track as a clinical researcher.
 
Actually this is a ranking of Internal Medicine departments and has nothing really to do with the residency programs. This would be a better list if you were trying to figure out where to seek tenure track as a clinical researcher.

This is very likely. Looking more closely at the USNews website, "Internal Medicine" is listed alongside rankings for "AIDS" and "Rural Medicine." Who's ever heard of an "AIDS" residency or fellowship?
 
Actually this is a ranking of Internal Medicine departments and has nothing really to do with the residency programs. This would be a better list if you were trying to figure out where to seek tenure track as a clinical researcher.

Taken from usnews.com: "Specialty Rankings: The rankings are based solely on ratings by medical school deans and senior faculty from the list of schools surveyed. They each identified up to 10 schools offering the best programs in each specialty area. Those receiving the most nominations appear here."

again, it's admittedly flawed, but likely the best survey of internal medicine program reputation per deans and senior faculty.

best wishes.
 
Taken from usnews.com: "Specialty Rankings: The rankings are based solely on ratings by medical school deans and senior faculty from the list of schools surveyed. They each identified up to 10 schools offering the best programs in each specialty area. Those receiving the most nominations appear here."

again, it's admittedly flawed, but likely the best survey of internal medicine program reputation per deans and senior faculty.

best wishes.

I don't see how your quote mentions residency/fellowship/training programs anywhere. It just says "programs," which can be a variety of things. Again, there's no such thing as an "AIDS" residency or fellowship program, so there's also no reason to conclude that the "Internal Medicine" list refers to residency programs. IMO, we are so used to having rankings to aid our decisions, whether it be college rankings or med school rankings. Now all of a sudden we are without one come residency match, and so we cling to anything that resembles rankings, even if the list in question does not claim to rank the best residency training programs.
 
I don't see how your quote mentions residency/fellowship/training programs anywhere. It just says "programs," which can be a variety of things. Again, there's no such thing as an "AIDS" residency or fellowship program, so there's also no reason to conclude that the "Internal Medicine" list refers to residency programs. IMO, we are so used to having rankings to aid our decisions, whether it be college rankings or med school rankings. Now all of a sudden we are without one come residency match, and so we cling to anything that resembles rankings, even if the list in question does not claim to rank the best residency training programs.

this is a ranking of the reputation/prestige of the department. obviously that has little, or nothing to do with the quality of training. it has quite a bit to do with getting a competitive fellowship. especially if you're trying to go outside your institution.
 
...we are so used to having rankings to aid our decisions, whether it be college rankings or med school rankings. Now all of a sudden we are without one come residency match, and so we cling to anything that resembles rankings, even if the list in question does not claim to rank the best residency training programs.

points well taken. the limitations of these rankings are immense, including, but certainly not limited to, how usnews implements their surveys and the wording/ semantics they use.

there's a great post by 'atsai3' on this thread from 12/16/05 w/ a NYtimes article written by a UofC economist. excellent elaboration on how humans tend to be addicted to rankings- rankings that are often statistically meaningless and carry such pejorative power.
 
Taken from usnews.com: "Specialty Rankings: The rankings are based solely on ratings by medical school deans and senior faculty from the list of schools surveyed. They each identified up to 10 schools offering the best programs in each specialty area. Those receiving the most nominations appear here."

again, it's admittedly flawed, but likely the best survey of internal medicine program reputation per deans and senior faculty.

best wishes.

Unless the questionnaire they fill out asks "which internal medicine residency programs provide the best training," which remains to be seen, who knows what the survey means.
 
The Mayo Clinic, where it would be great to have my dad cared for, is really an average IM residency (see the list of those accepted and their medical schools, as well as fellowship placement). Ditto Case Western. Ditto Emory.

Ignorant statement. I went to Mayo, and I agree it may not be a "top 5" program, whatever that means. But the smartest people I met there went to the University of South Dakota or Oklahoma Osteopathic Medical School. Saying a residency is not good because it doesn't get all Harvard- or Hopkins-trained people (and I've met some Harvard/Hopkins-trained people along the way, and there is nothing special about one med school versus another) is just foolish.
 
points well taken. the limitations of these rankings are immense, including, but certainly not limited to, how usnews implements their surveys and the wording/ semantics they use.

there's a great post by 'atsai3' on this thread from 12/16/05 w/ a NYtimes article written by a UofC economist. excellent elaboration on how humans tend to be addicted to rankings- rankings that are often statistically meaningless and carry such pejorative power.

i completely agree. someone said this in another forum, but i have thought of it while reading this thread. a good friend of mine who was cambridge trained for law school gave me an interesting insight. harvard and yale provide excellent theoretical training, but relatively limited "real world" experience compared to their less prestigious peers. they are still more often the ones making 200k first year out--not the courtroom brawlers.

in terms of clinical training, the vast majority of my own knowledge thus far has come from private interactions with residents and (less often) attendings. a far greater amount has come from self-directed study. i don't really care about the quality of training at different institutions--i am sure i will be in good shape regardless of where i end up. for me, my main criteria were the quality of fellowships obtained combined with an ambiguous 'gut feeling' about the vibe of the place. to say that those two things exist independent of the perceived national prestige of a program (which i think this list accurately measures) is myopic.
 
Fellowship matches are surely important to many of us. However, why use this USNews list to indirectly gauge fellowship opportunities when we often have access to the actual matchlists? Just like we should all take what we read on SDN with a grain of salt, we ought to trust our own judgements before blindly putting faith in the printings of a magazine.
 
Fellowship matches are surely important to many of us. However, why use this USNews list to indirectly gauge fellowship opportunities when we often have access to the actual matchlists? Just like we should all take what we read on SDN with a grain of salt, we ought to trust our own judgements before blindly putting faith in the printings of a magazine.

some programs are very ambiguous or not forthcoming about their year to year subspecialty matches. especially for my interviews at less "prestigious" (grain of salt) programs.
 
some programs are very ambiguous or not forthcoming about their year to year subspecialty matches. especially for my interviews at less "prestigious" (grain of salt) programs.

I understand where you are coming from. Not just "less prestigious" programs; a few better known programs didn't include this information either (or forgot to). Still, in cases where the match list is known, I find it pointless to refer to USNews. There were a few top programs on that list that had (somewhat) less impressive fellowship match lists FOR ME than programs ranked lower. IMO, this defeats the whole purpose of being ranked higher.

Another thing to consider is that, since you mentioned "less prestigious" programs, USNews only ranks 25 or 26 schools, so the vast majority of these "less prestigious" programs are not included on the list anyway.

For me, basically the only time this USNews list is of any interest is when a well known program is less forthcoming with its fellowship match list. Otherwise, I use the fellowship match information provided to me to make my own judgments about a program.
 
in terms of clinical training, the vast majority of my own knowledge thus far has come from private interactions with residents and (less often) attendings. a far greater amount has come from self-directed study. i don't really care about the quality of training at different institutions--i am sure i will be in good shape regardless of where i end up. for me, my main criteria were the quality of fellowships obtained combined with an ambiguous 'gut feeling' about the vibe of the place. to say that those two things exist independent of the perceived national prestige of a program (which i think this list accurately measures) is myopic.

I am sure that you will not learn as much at a residency program with a limited or unvaried case load, poor attendings and senior residents, and an administrative atmosphere not concerned with providing a useful learning experience, regardless of how much you think you will read. Your program's strength is probably as important as your own motivation in becoming a competent physician.
 
Ignorant statement. I went to Mayo, and I agree it may not be a "top 5" program, whatever that means. But the smartest people I met there went to the University of South Dakota or Oklahoma Osteopathic Medical School. Saying a residency is not good because it doesn't get all Harvard- or Hopkins-trained people (and I've met some Harvard/Hopkins-trained people along the way, and there is nothing special about one med school versus another) is just foolish.
I fully agree about Mayo in particular, and about many schools in general. The perceived difference between the motivated, hard-working students and grads of the top 5 med schools and those of average med schools is really tiny, if it exists at all. Motivated people anywhere can and do become great docs.

The other part of the equation, of course, is what happens to those bright, motivated students after 2 years of residency. I imagine that, as during med school, some people work very hard and become amazing docs, whereas others don't work as hard and don't seem as amazing. Perhaps the best programs are those that take motivated though not amazing people and turn them into amazing fellowship applicants. Hard to measure this, though.

Oh, I only counted 1 DO currently at the Mayo in Rochester when I went through their roster (he was, by the way, a very nice guy and apparently very, very smart).
 
I am sure that you will not learn as much at a residency program with a limited or unvaried case load, poor attendings and senior residents, and an administrative atmosphere not concerned with providing a useful learning experience, regardless of how much you think you will read. Your program's strength is probably as important as your own motivation in becoming a competent physician.

true. i guess i just factor the atmosphere into my 'gut feeling' that i spoke about. angry attendings suck for learning. i also don't like when a program is so lazy that it almost looks bad when you want to learn. i definitely got this feeling from certain places on the trail, although they tended to be on the lower end of the academic spectrum.
 
I'm an IMG hoping to apply for the 2008 match. Could anyone tell me which of the programs listed are known to be open to accepting IMGs? A few places like MGH, Hopkins and Cornell are quite categorical (on their websites at least) that they don't generally accept graduates from abroad.
Thanks
 
I'm an IMG hoping to apply for the 2008 match. Could anyone tell me which of the programs listed are known to be open to accepting IMGs? A few places like MGH, Hopkins and Cornell are quite categorical (on their websites at least) that they don't generally accept graduates from abroad.
Thanks

Do they really say that on their websites? Can you show us links please? I am pretty sure there are some IMGs at MGH. I think Duke, UT Southwestern, Emory and UAB also have a few IMGs. Look into Mayo and Pittsburgh also.
 
There is at least one IMG that I know of at MGH, from Toronto. I think they are likely to rank an IMG from a prestigious medical school overseas. I definitely think they do not rank IMGs who are American-born and simply left the country to go to med school.
 
There is at least one IMG that I know of at MGH, from Toronto. I think they are likely to rank an IMG from a prestigious medical school overseas. I definitely think they do not rank IMGs who are American-born and simply left the country to go to med school.

Canadians are not IMGs!!!
 
There were a few top programs on that list that had (somewhat) less impressive fellowship match lists FOR ME than programs ranked lower. IMO, this defeats the whole purpose of being ranked higher.

Another thing to consider is that, since you mentioned "less prestigious" programs, USNews only ranks 25 or 26 schools, so the vast majority of these "less prestigious" programs are not included on the list anyway.

For me, basically the only time this USNews list is of any interest is when a well known program is less forthcoming with its fellowship match list. Otherwise, I use the fellowship match information provided to me to make my own judgments about a program.

those are two very good points. you have to look at the bottom line.
 