IM match list analysis

pyrois, between doing this analysis and your comic strip, how do you find time to sleep? :D

 
pyrois, between doing this analysis and your comic strip, how do you find time to sleep? :D

Well, I also run a non-profit, run an internet technology company, and am planning out a medical excursion program for med students to the Himalayas (wanna come?).

As for sleeping, I usually get some in during my midterm weeks since they don't assign problem sets :p
 
I meant that the USNews IM specialty rankings more accurately reflect the opinions of people like myself who've gone through the IM residency application process, interviewed at these places, compared their fellowship matches, and had long conversations about this with IM program directors at top places.

The USNews ranking of "Best Hospitals" is way off if you're comparing IM programs. For instance, the Cleveland Clinic is #3 on the best hospital list, but the IM program is certainly not in the top 25. Cornell and Columbia are listed as the same hospital, but they are two separate IM programs and Columbia is regarded as the stronger one.

The USNews ranking of "Top Medical Schools-Research" somewhat correlates with the strength of IM programs, but it's still badly flawed. For example, you have WashU above Duke, Baylor above Columbia, Case above Mayo, etc. Most people would not agree with these ranks. Other glaring errors: Cornell's med school is ranked 15th but its IM program doesn't belong in Tier 2, USC is 38th but its IM program is not in the top 100, UC Davis is 49th but its IM program deserves to be in a higher tier, and so on.

I don't think it's that bad that they lumped the major Harvard programs (MGH, BWH, and BID) together when they ranked them. They're all run by Harvard faculty, and as a whole they belong in the top tier. BID is not a shabby place to train.

Well, that depends. Who provided the data for the ranking, and what response rate did the survey get? Not all US News data is particularly compelling: for the residency director portion of the research rankings, for example, they have a tiny percentage of a smattering of specialties responding, which is hardly scientific data. Is the response on this one better? Only with such info can you conclude that it is "more accurate".
 
I meant that the USNews IM specialty rankings more accurately reflect the opinions of people like myself who've gone through the IM residency application process, interviewed at these places, and compared their fellowship matches, as well as had long conversations about this with IM program directors at top places.

Most people would agree that those relative rankings in the US News research list are wrong even for research :p
 
Most people would agree that those relative rankings in the US News research list are wrong even for research :p

Well, isn't that ranking largely driven by grant money (i.e., an objective standard)? If so, the research ranking is what it is, whether or not the school is particularly known as a good place for research.
 
Well, isn't that ranking largely driven by grant money (i.e., an objective standard)? If so, the research ranking is what it is, whether or not the school is particularly known as a good place for research.

I wonder how much of a role grant money really plays in research strength? Isn't clinical research a lot cheaper than basic research?
 
Well, isn't that ranking largely driven by grant money (i.e., an objective standard)? If so, the research ranking is what it is, whether or not the school is particularly known as a good place for research.

Hard to say. They include a lot of affiliated programs in the "grant money" calculation. Most of these programs are located in places a med student would never (and should never) actually be.

If US News only measured grant money in facilities accessible by students, that might be a different story.
 
Hey Pyrois, could you spare a moment to do:

Brown
-----
johns hopkins
cornell
columbia - 2
university of maryland med center
Penn
BU - 2
beth israel deaconness
Einstein
Tufts New England med center
Barnes Jewish
Rhode Island Hospital/Brown - 2
Cedars Sinai/ucla
Mt. Sinai


UVa
----
Thomas Jefferson - 2
UVa -4
Pitt
Nat. Naval Med Center
McGaw/Northwestern -2
Baylor
Mayo
Boston
UMaryland
University of Utah
New England Med Center
UNC-2
URochester
hopkins
duke
georgetown
wake forest


Thanks a ton

btw, is barnes jewish in your tier 2? What about cedars-sinai and harbor?
 
btw, is barnes jewish in your tier 2? What about cedars-sinai and harbor?

Finishing up a project right now, but I'll get around it tomorrow.

Barnes-Jewish, Cedars-Sinai, and Harbor all fall into tier 2 (along with main UCLA, etc.)
 
Same with UAB:

2006:
UAB (6), Baptist Health - Birmingham (2), Carraway Methodist, Duke, Emory, Kentucky, LSU - New Orleans, MGH, Penn (2), Brown/Rhode Island, Stanford, Texas A&M/Scott & White, UTSW, UT Knoxville, UT Chattanooga, U of Washington, Vanderbilt, Virginia, Washington U, Wake Forest, UCSD, Yale, Indiana

2007:
UAB (8), Baylor, Cincinnati, Duke, Michigan, Penn, Pitt (2), Stanford, UT Memphis, U of Washington, VCU(2), WashU, MGH
 
Not trying to be nitpicky, but Cedars-Sinai and Harbor-UCLA really don't belong in the same tier as WashU (a top-10 program) and UCLA. Cedars is a good program, but it accepts 41 categoricals, IMGs, and DOs, making it a lot less competitive and less prestigious than somewhere like UCLA. Harbor-UCLA, even though it's in LA, is still not nearly as difficult to match at as WashU. Also, if you compare their fellowship matches, it becomes evident they don't belong in the same category, regardless of the analysis.

Finishing up a project right now, but I'll get around it tomorrow.

Barnes-Jewish, Cedars-Sinai, and Harbor all fall into tier 2 (along with main UCLA, etc.)
 
where can i find that fellowship data? Really interested to see it
 
Not trying to be nitpicky, but Cedars-Sinai and Harbor-UCLA really do not belong in the same tier as WashU (a top 10 program) and UCLA.

My tiers are very large.

If I moved Harbor-UCLA and Cedars-Sinai to the next level, they'd be in the same league as Kaiser:p

Just remember, residency matches are MUCH different from medical schools. There are many MORE residency hospitals than there are medical schools you've heard of (since each major school has anywhere from 2 to 10 affiliated hospitals, rest assured the numbers add up).

So the tiers must be a lot broader than you might expect, and hospitals will fall into tiers you may not have thought they would fall into.

I understand your concern, though. Harbor-UCLA and Cedars-Sinai are noticeably less competitive to get into than Barnes-Jewish, but they are even more noticeably more competitive than the tier 3 programs.
 
Pyrois, I'm just trying to help you come up with a more accurate algorithm. If you're giving equal weight to a med school matching its students into UCLA and into Cedars-Sinai, then it's pretty flawed. Most people who match at Cedars don't even get interviews at UCLA.

I suggest making more tiers and stratifying programs further. Tiers 1 and 2 shouldn't include that many programs either. It should be a pyramid, with fewer and fewer programs the higher you go. Tier 1 should have 4 programs (MGH, Brigham, UCSF, JHU), Tier 2 with 10 programs (Stanford, Columbia, Duke, etc.), Tier 3 with 20 programs (BU, Harbor-UCLA, etc.) and so on.

My tiers are very large.
 
Pyrois, I'm just trying to help you come up with a more accurate algorithm.

There are many ways I could make the algorithm more accurate but...

...my goal isn't to make the most accurate algorithm. My goal is to make the most accurate algorithm that can be completed in a reasonable amount of time.

The reason why I am only using four broad tiers is that the whole idea of a "tier" is very artificial. To increase the number of tiers is a slippery slope. If I go from 4 tiers to 5... why not 5 tiers to 8? But if I'm at 8 tiers there will still be programs that aren't strictly "equal" to each other, so maybe I should make 32 tiers, or 100, or one tier for every medical school.

Also, you make a blanket statement about Harbor-UCLA not belonging with UCLA. With only 1 tier, all of them would be together. With 2, they most likely still belong in the same group, and with 4, looking at the whole field, they definitely belong in the same tier, because Harbor doesn't belong with the schools in the lower tier.

As for people applying to Harbor and not even getting interviewed at UCLA, I'm sure there were plenty of people who matched into UCLA and didn't even get an interview at Harbor.

So yes, I agree. More tiers = more accurate ranking. When I release the program online, I'll make it so you can customize your tiers and allocate programs as you wish, but I've already played around quite a bit and I've found (as you will find) that it doesn't make much of a difference in terms of relative rankings (the numbers simply all just shift up or down slightly).
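For anyone curious, the scoring idea pyrois describes (assign each residency program a tier, then score a medical school as the average tier of its match list, as in the per-school numbers posted in this thread) can be sketched in a few lines of Python. This is my own reconstruction, not pyrois's actual code, and the tier assignments in the dictionary are purely illustrative:

```python
# Sketch of a tier-average match-list score (hypothetical reconstruction).
# Tier assignments here are examples only; pyrois's real table is larger.
TIERS = {
    "MGH": 1, "Brigham": 1, "UCSF": 1, "Hopkins": 1,
    "WashU": 2, "UCLA": 2, "Barnes-Jewish": 2,
    "Harbor-UCLA": 2, "Cedars-Sinai": 2,
    "BU": 3, "Kaiser": 3,
}
DEFAULT_TIER = 4  # unrecognized programs fall into the bottom tier

def match_list_score(programs):
    """Average tier across a school's match list (lower = stronger)."""
    tiers = [TIERS.get(p, DEFAULT_TIER) for p in programs]
    return sum(tiers) / len(tiers)

print(round(match_list_score(["MGH", "WashU", "BU", "Kaiser"]), 2))  # 2.25
```

As the thread notes, you could swap in your own tier table (more tiers, a steeper pyramid) and the relative ordering of schools tends to shift only slightly.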
 
Obviously you haven't applied to these programs and don't really know what it's like. UCLA is a LOT more selective than Harbor-UCLA. Of the 20+ people I know who got interviews at UCLA, all got interviews at Harbor; it's not the other way around. Ask around: Harbor interviews many more people per slot than UCLA does. Check your facts.

Your algorithm is based on ranks/tiers. If your tiers aren't accurate, then what's the point of the research project? If I'm using this to gauge the competitiveness of candidates from a certain school, it becomes meaningless when top-flight academic places like Barnes-Jewish are thrown together with community programs. In other words, people care about exactly the distinctions you're eliminating by making these places equal.

As for people applying to Harbor and not even getting interviewed at UCLA, I'm sure there were plenty of people who matched into UCLA and didn't even get an interview at Harbor.
 
Of the 20+ people I know who got interviews at UCLA, all got interviews at Harbor; it's not the other way around.

You apparently haven't asked as many people as I have.

When all is said and done, like I said earlier, you can input your own tiers, and see what it spits out.

Right now, I'm not really bothering with the whole "who's in what tier" argument. I'm more interested in inherent statistical trends that can be calculated purely objectively.

When I do a final release, you'll be able to either take my word for it, or put in your own values.
 
hey pyrois, can you do Brown/UVa? I posted their lists a few posts above
 
as requested:

UMDNJ-NJMS = 2.28
 
pyrois, you should totally throw in a few more useful statistics and come up with your own rankings of the best med schools - time to rebel against that usnews rubbish
 
pyrois, you should totally throw in a few more useful statistics and come up with your own rankings of the best med schools - time to rebel against that usnews rubbish

Hah, I dare not challenge the US News gods. I might be struck down by a bolt of e-lightning.
 
Feed me, pyrois!
 
Feed me, pyrois!

Haha, okay okay. I'm essentially doing a problem set, then entering a match list, in alternation :p It's actually a rather cathartic break activity, and somewhat educational.

Seriously, I have at least a dozen different match lists floating around in my head.
 