Another Ranking Question


Infinity Cola

I know these threads cause a lot of controversy, but here goes anyways:

Columbia
Cornell
TJU
UTSW
UVA
Wake Forest

I would be happy with matching at any of the above, but I'm looking for some insight on how to rank these programs based on program strength/training alone. I'm not limited by geography (though of course it's a factor as well). Also, I have no concrete idea what type of fellowship I'd be interested in.
 
The only place where I interviewed was UTSW, which I really liked. I felt like it's a program that's going to be top 10 or even top 5 in the next couple of years. They are pumping tons of money into new research and research scanners. Also, it sounds like the residents do pretty well with moonlighting, and Texas is one of the better job markets. I did not end up ranking them super high just because there were other programs I liked better, but they ended up higher on my list than I initially expected. Good luck!
 
Very geography dependent. You can't go wrong with any of them. None of them has a red flag, and none has the name brand of MGH. Long story short, from a practical point of view, they are all at the same level.

UVA has top-notch IR and MSK departments and is probably very slightly better than the other programs. However, its location does not appeal to everyone.

UTSW is a very good program, but it is far from being a top 5 or even a top 10 program. Just by pumping money into research, you cannot compete with the name brand of MGH or UCSF or .... I'm not saying those programs are necessarily better, but once you reach the quality level that all top 25-30 programs have, the name might play a role. Otherwise, I am pretty sure that a graduate of any of the above programs can be as good as or better than a graduate of MGH.
 
Agree with shark2000. These programs are excellent and have so much overlap that you can't really differentiate them without more input parameters.

Some biased and subjective opinions for whatever they're worth:

UVA has the strongest IR on the list.
TJU has the best MSK on the list. Body MR is pretty good too.
Wake Forest has the best body imaging on the list, although the others are pretty good. UVA is not quite at the level of the others yet.
Can't compare them for mammo, peds, or neuro.
In general New York programs get more play than they deserve because of their location. They're probably the weakest on your list.

TJU, UVA, and Wake Forest residents have pretty busy call, which is probably a good thing to go through, although it sucks at the time. I hear that Wake has the heaviest workload overall, but TJU residents have a pretty heavy neuro component to their call.

Any program in this list would be a good choice, all factors being equal. If you have no opinions whatsoever, I would probably rank UVA or TJU at the top.
 
Also, as an antidote to all this ranking stuff: if you choose a program in the "top half" (whatever that is), whether it turns out to be a great program is mostly about what you make of it. Selecting a name brand to give yourself an advantage in the job search is something else entirely from being good at what you do.
 
UTSW has the nicest hospital. Wake has the best call (heavy like real life) and residents seem happy. UVA has the best IR program. Can't go wrong with any of them.
 
Where do MUSC and Emory fall in with the above?
 
According to Doximity, attendings surveyed ranked these programs:

Emory>Cornell>UVA>WFU>TJU>UTSW>Columbia>MUSC

We can all give our own subjective opinions, but this at least is the averaged subjective opinion of people with real experience on the topic.
But yeah, geography should be pretty important.
 

That lineup is pretty amusing.

Beware of this "averaged" opinion, as it's weighted toward departments that knew the ranking was taking place and voted as a bloc. Departments that ignored the Doximity attempt to rank residencies got frozen out.

Personally, I like the idea of ignoring third parties who decide they want to step in and become medical metrics experts (à la USN&WR). We just hand this phenomenal power over to these third parties. Now hospitals are scrambling to make the USN&WR list, and physicians pride themselves on making "Top Docs" in magazines that rank hot dog stands.
 
Agreed. It is ridiculous to say Emory is better than all of those programs. Come on.
 
I agree, I like the idea of ignoring third parties and their made-up metrics.
The Doximity rankings didn't try to make up any metric. They just used the clearly subjective opinions of people in the field whose opinions matter for at least some purposes, which makes them at least more useful than a) some made-up metric or b) any random SDN user's opinion. But the information would be even more useful if they better documented response rate, affiliations, etc.
 
Agree. "Dox" + "metry" is appealing to our sense of the law of large numbers to add validity to their survey, but to my understanding (and the 3 PDs I've spoken with about it) the methodology is flawed and the results questionable.

Doximity is about as much a public service as "Top Docs." Ranking by third parties (subjective or "objective") seems to me to be an unnecessary incursion and kind of a carrot-and-stick game.

In a similar vein, I'm a big believer that we (doctors) should develop our own transparent metrics without the need for a third party. Part of the definition of being a "professional" is belonging to a group that independently regulates itself.
 