Ultimate Radiology Rankings


RadiologyRankings

Hello all current and future radiologists. After countless forums and years of debating, I thought I would spend some time and create an "objective" ranking of radiology programs. Please feel free to critique/add/etc., and changes can be easily made since this was all done on an Excel sheet. I attempted to think up as many different factors as I could, but I am open to others. There were some surprises, I must admit.

Methodology: Point Based System (points that counted toward final score are bolded)
- US News Adult: Points were awarded based upon hospital rankings
- 1 point for "high performing" specialties, 2 points for 26-50 rank, 2.5 points for 11-25 rank, 3 points for 1-10 rank
- When compiled, programs were then awarded overall points: 8 points for 45+, 7 points for 40-44, 6 points for 35-39, 5 points for 30-34, 4 points for 25-29, 3 points for 20-24, 2 points for 15-19, 1 point for 10-14, 0.5 points for < 10

- US News Pediatrics: 1 Point was awarded if a program has an on-site pediatric hospital or direct institutional affiliation with a pediatric hospital
- Programs then were awarded points in the same way as the adult programs from US News except 2 points for 1-10, 1.5 for 11-25, and 1 for 26-50
- When compiled, programs were then awarded 2 points for 15-20, 1.5 points for 10-14, 1 point for 5-9, 0.5 points for 1-4

- Doximity
- This year's radiology residency rankings were used
- Points: 6 for 1-5 rank, 5.5 for 6-10, 5 for 11-20, 4 for 21-30, 3 for 31-40, 2 for 41-50, 1 for 51-60

- Aunt Minnie: compilation of semifinal appearances (1 point) + finals appearances (3 points) from 2002-2014
- Compiled points: 4 for 20+ points, 3.5 for 10-19, 3 for 6-9, 2.5 for 3-5, 2 for 2, 1 for 1

- NIH Funding: Radiology department specific funding rankings from 2014
- 2 points for rank 1-10, 1.5 for rank 11-20, 1 for rank 21-30, 0.5 for 31+

- Case Volume: This was tricky, as only about 70% of programs list their volume; an average was taken after throwing out the outliers. An estimate was then made for the remaining 30% based upon hospital/city size and similar institutions. Points were awarded based on studies per resident.
- 3 points for > 30,000, 2 points for > 20,000, 1 point for > 15,000, 0.5 points for 10,000-14,999

- VA hospital: 0.5 points awarded for VA affiliation

- Cancer Center: 1 point awarded for NCI-affiliated cancer center and 0.5 points additional for "comprehensive" designation

- Fellowships: 0.25 points awarded per fellowship at an institution
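
For anyone who wants to recompute the totals or tweak the weights, here is a minimal Python sketch of the tallying described above. This is not the original Excel sheet: the function and field names are hypothetical, only the US News adult and Doximity bin tables are written out, and the other categories are assumed to be passed in already compiled per their own bins.

```python
# Minimal sketch of the point tallying described in the methodology.
# Field names and the example inputs are hypothetical.

def usnews_adult_points(total_specialty_points):
    """Convert summed US News adult specialty points into overall points."""
    bins = [(45, 8), (40, 7), (35, 6), (30, 5), (25, 4), (20, 3), (15, 2), (10, 1)]
    for cutoff, pts in bins:
        if total_specialty_points >= cutoff:
            return pts
    return 0.5  # < 10

def doximity_points(rank):
    """Points from this year's Doximity residency rank."""
    bins = [(5, 6), (10, 5.5), (20, 5), (30, 4), (40, 3), (50, 2), (60, 1)]
    for cutoff, pts in bins:
        if rank <= cutoff:
            return pts
    return 0  # assumption: no points beyond rank 60

def total_score(program):
    """Sum all category points for one program (a dict of raw/compiled inputs)."""
    return (
        usnews_adult_points(program["usnews_adult_specialty_points"])
        + program["usnews_peds_points"]        # already compiled per the peds bins
        + doximity_points(program["doximity_rank"])
        + program["aunt_minnie_points"]        # compiled from semifinal/finals appearances
        + program["nih_points"]
        + program["case_volume_points"]
        + (0.5 if program["va_affiliation"] else 0)
        + program["cancer_center_points"]      # 1 for NCI, +0.5 if comprehensive
        + 0.25 * program["num_fellowships"]
    )

# Hypothetical example program (made-up inputs, not real data):
example = {
    "usnews_adult_specialty_points": 38, "usnews_peds_points": 1.5,
    "doximity_rank": 8, "aunt_minnie_points": 3, "nih_points": 1.5,
    "case_volume_points": 2, "va_affiliation": True,
    "cancer_center_points": 1.5, "num_fellowships": 8,
}
print(total_score(example))  # 6 + 1.5 + 5.5 + 3 + 1.5 + 2 + 0.5 + 1.5 + 2.0 = 23.5
```

Ties were then broken by the highest average of the US News, Doximity, and AM points, as noted in the list header below.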



THE LIST (Ties broken by highest avg. of US News + Doximity + AM)

1. MGH 26 points
2. Johns Hopkins 25.75
3. Mayo - MN 25.5
4. UCSF 25
5. Duke 24
6. MIR 24
7. UPMC 24
8. Penn 23.5
9. Stanford 23.25
10. UCLA 22.25
11. BWH 21.75
12. Michigan 21.75
13. CCF 21.75
14. UCSD 21.5
15. NYP - Cornell 20.75
16. Emory 20.25
17. Univ. of Wash 19.75
18. NYU 19.75
19. NYP - Columbia 19.25
20. Indiana 18.5
21. Yale 18
22. Wisconsin 17
23. Wake Forest 16.75
24. Northwestern 16.75
25. Iowa 16.5
26. Vanderbilt 16.5
27. Thomas Jefferson 16.25
28. USC 14.75
29. UAB 14.75
30. Virginia 14.25
31. UT-Houston 14
32. Case Western 14
33. UTSW 13.5
34. Oregon 12.75
35. Colorado 12.25
36. UC-Davis 11.75
37. Utah 11.25
38. Dartmouth 11.25
39. Maryland 10.75
40. Med Coll. WI 10.75
41. MUSC 10.25
42. Beaumont 10
43. Cincinnati 9.75
44. Cedars-Sinai - LA 9.75
45. Univ. of Chicago 9.75
46. BID 9.25
47. Minnesota 9.25
48. Arizona 9
49. Baylor Univ. 8.5
50. North Carolina 8.5

 
Region Ranks if interested:
Northeast
MGH
Hopkins
UPMC
Penn
BWH
NYP - Cornell
NYU
NYP - Columbia
Yale
Thomas Jefferson

Midwest
Mayo
MIR
Michigan
CCF
Indiana
Wisconsin
Northwestern
Iowa
Case Western - Univ.
Med College WI

Southeast
Duke
Emory
Wake Forest
Vanderbilt
UAB
Univ. of Virginia
MUSC
UNC
Mayo - FL
Miami/Jackson

Southwest
UT - Houston
UTSW
Univ. of Arizona
Baylor Univ.
UT - SA
Mayo - AZ
Univ. of OK
Baylor College

West
UCSF
Stanford
UCLA
UCSD
Univ. of Wash
USC
Oregon
Colorado
UC - Davis
Utah
 
Thanks for the hard work in putting this together. Good intentions.
The idea of taking the strength of other hospital departments into account I think is a valuable one, and one not often discussed.

A meta-ranking is only as good as one's inputs, however.
The major point inputs (USN&WR, Doximity, AM) are unfortunately bogus. Two of them are mostly subjective opinion.
NIH funding is worthless to you as a resident.
AM opinion is worthless relative to a truly objective measure like case volume.
Case volume is misleading without seeing resident case volume.

This leads to some bizarre conclusions... like UPMC twenty-three ranks above UVA, Duke twenty-two ranks above Thomas Jefferson, or Yale twenty-nine ranks above UNC.
 
Thank you for the compliments. I'm kind of a data nerd, haha.
I completely agree with you on the subjectivity of those rankings; that's why they were given less weight than US News & World Report, which actually uses real data for its rankings. There is also the hope that the more subjective data points you have, the more legitimacy they gain.
The case volume was total cases divided by the number of residents, which is the only way I could possibly give it any credence.
NIH funding is not directly reflective of the resident experience, but it gives a sense of the "research-oriented nature" of the program.
I also agree that some programs were ranked more highly/lowly than I thought they would be. However, I think it's unfair to say that AM and Doximity are bogus due to subjective opinion but then state that there are "bizarre conclusions" that also seem to be based on your own subjective opinion.
Maybe UPMC is 23 ranks above UVA, maybe it's 5, maybe it's worse. All I can do is compile objective and subjective data.
What are the reasons UVA is closer to UPMC, or the reasons Yale isn't ahead of UNC? I would absolutely love to refine the criteria, add/subtract criteria, and attempt to make the "best" rankings we can come up with.
 
I thought that might be your response... that somehow the ranking was revealing a secret truth, resulting in nonintuitive results. The reason why UPMC is not that far ahead of UVA is obvious to anyone who has worked with residents from both institutions. They're both good with wide individual variation. The same thing goes for other comparisons on the list.

My objection to using AM and Doximity is that it's masking subjective inputs as an objective output. Those lists are driven by name brand, which is driven by research, which is driven by big programs with attendings who are too interested in their grant applications and R01s to teach their residents well, which results in graduates from big-10 programs who can do DTI research but who can barely read a radiograph. It's hard to capture this in your rankings.

The biggest problem with a list like this is that: 1) the objective is not entirely clear... what is "best"? The most research money? Residents who are competent in PP? These two endpoints (out of many) are not well associated, and this has never been a controversial point. 2) It doesn't give a good sense of the spread of the data. The "top 20" programs are basically all the same in terms of their resident product. Maybe even the "top 30." Individual resident variation trumps institutional variation in terms of new-graduate competence. That's why most of these ranking schemes (no offense) are total baloney.

If you want to rank programs in terms of academic reputation, please go right ahead. It's an interesting exercise.
To think that this would make a good list in terms of residency applications is missing the bigger picture.
 
To my mind, in order to really answer the question being implicitly asked ("which residency program will train me to be the best radiologist I can be"), you really need some metric of new resident graduate competence.

I have no idea how to do this.

But when I meet an R5 in one program who does not know any of the inner and middle ear structures on a CT IAC vs. an R4 at another program who can pick up a Mondini malformation... I can tell the difference in resident product.
This difference is not reflected in any ranking scheme I know of.
 
This is most definitely an attempt to gauge the overall reputation of programs, and exactly for the reason you stated: a fun and interesting exercise. I attempted to compile all areas involved in a radiology program: academic opinions (Doximity), peer opinions (AM), rankings by non-medical people (US News), and some real data such as VA systems, cancer centers, case volume, and fellowships. I actually realize I forgot to mention in my original methodology that I gave a program 0.25 points per fellowship offered (fixed now), with the assumption being that if they have a fellowship, then the department must be of good relative quality/volume. The endpoint is the overall "floor" of the program.

I 100% agree with you that individual variation is HUGE. You can get more out of UNC than MGH if you are a truly driven individual.
I found this just mostly fun and if anything, a good discussion point.
Maybe US news will rank radiology departments someday 🙂

I also want to say that I completely agree that if you are at any of the top 20-30 up there, there's not much major difference. Perhaps that's why people end up either going for the high-powered name for their resume or the awesome location. The "name" means a bit more these days with a poor job market, but maybe the best thing to take away from this list is verification of the "go where you want to live" philosophy.
 
To my mind, in order to really answer the question being implicitly asked ("which residency program will train me to be the best radiologist I can be"), you really need some metric of new resident graduate competence.

I have no idea how to do this.

But when I meet an R5 in one program who does not know any of the inner and middle ear structures on a CT IAC vs. an R4 at another program who can quickly pick up a Mondini malformation... I can tell the difference in resident product.
This difference is not reflected in any ranking scheme I know of.


Perhaps CORE exam score averages by program, but I doubt those would ever be released. Rankings are hard to do...haha
 
Like any other ranking, there are always lots of arguments and disagreements

Like any other rankings there are lots of mistakes in the list. The truth is there is no REAL or GOLD STANDARD list so we can argue forever.

I don't want to enter any discussion about this list but as an example 11 is too low for BWH, 3 is too high for Mayo and 7 is too high for UPMC.

Many factors that you used are not independent.

For practical purposes, this is how I rank programs:

1- TOP 5-6, which have national and international reputations. If you work in Texas or California, people will still be impressed by the MGH or BWH name, but not so much by UPMC or Michigan.

2- Big regional academic centers. I don't rank between them. In Indiana, radiologists, clinicians, and the general public think that Indiana has a better program than UT Houston.

3- Community programs.
 
Like any other ranking, there are always lots of arguments and disagreements

Like any other rankings there are lots of mistakes in the list. The truth is there is no REAL or GOLD STANDARD list so we can argue forever.

I don't want to enter any discussion about this list but as an example 11 is too low for BWH, 3 is too high for Mayo and 7 is too high for UPMC.

Many factors that you used are not independent.

For practical purposes, this is how I rank programs:

1- TOP 5-6, which have national and international reputations. If you work in Texas or California, people will still be impressed by the MGH or BWH name, but not so much by UPMC or Michigan.

2- Big regional academic centers. I don't rank between them. In Indiana, radiologists, clinicians, and the general public think that Indiana has a better program than UT Houston.

3- Community programs.


I like your rankings. I actually considered adding a point modifier for just that but decided against it. National names vs regional names vs state names vs. community programs with academics vs. pure community programs. I thought then people would argue over what is a national vs. regional name, haha.

I would like to think it's safe to say that MGH, Hopkins, Mayo, BWH, UCSF, and MIR are true national brands, but everyone also knows the names of places like CCF and Yale that are not "tippy-top" programs.

I had hoped we could just have discussions. There will always be argument, but that's part of the fun.
 
Sure.

There are two important relatively different rankings that we should consider: Ranking of the radiology department versus ranking of the institution.

As an example, if you tell an IR guy that you did your IR fellowship, they will be VERY IMPRESSED. It is very, very highly respected among IR doctors. But many clinicians or the general public may not know about it. On the other hand, though the general public and other physicians are very impressed by the name of BWH, it does not impress IR people that much (relatively) if you tell them that you did your IR training there.

The other part of the discussion goes back to quality of training versus name brand. I personally believe that there is not a significant difference in quality of training among probably the TOP 50 programs. You can get the same quality of training at any BIG ACADEMIC REFERRAL center that you would get at MGH. However, obviously the name brand of MGH is better than that of most of these programs.

In real life, as a radiologist you deal with three groups of people: your colleagues (other radiologists), other physicians, and patients. As I mentioned above, these three groups each have their own RANKING in their minds. If you go to Oregon, patients and even some clinicians and radiologists may consider their own medical center better than UPMC or Michigan. If you tell a patient that you were trained at Yale, they will be impressed; not so much among radiologists, and mixed opinions among clinicians.

As we can see, things are complicated. My ranking is based on the above logic, with the following two points to consider.
1- Quality of training: You can get similar training in any big academic center, and the rest depends on you yourself. You need the pathology, the diversity, sub-specialized attendings, and so on, which are available at most big academic centers. On the contrary, many community programs cannot provide you with these elements, and thus your training is not as comprehensive.

2- Name brand: As I mentioned above, reputation does matter if it is a Hopkins or Harvard or similar name. Otherwise, it is very regional. People, including clinicians, are biased toward the medical center in their own area.

So, IMO, there is no point in moving from Seattle to Texas because people think UTSW may be better than UW. However, moving from Seattle to Boston because you want that Harvard name brand is reasonable. Even if you want to go back and work on the West Coast later, that name still follows you. Radiology groups will advertise to referring doctors and patients that their new doctor is "Harvard trained." Right or wrong, other radiologists and clinicians will respect you more because they feel you may have had somewhat better training, or at least they feel that you were a smarter medical student/resident. None of the above is true for UPMC or Michigan or UTSW or ..., though all of these programs will provide you with equal training quality. After those brand-name programs, most people are highly biased toward their own local big academic center.


This is my personal opinion from being in private practice and interacting with different patients, radiologists, and clinicians. Having said that, if you are good at what you do, after a few years you will receive a lot of respect from your colleagues and other clinicians. On the other hand, if you are not good, you will lose that respect even if you put that "Harvard" name above your desk.


Now I go back to dictating some more studies on this very, very unusually slow day.
 
The hardest part for me was that I don't know where I want to live... so I didn't want to get "stuck" in a certain region. There aren't very many "national names," though, and most of them are almost impossible to get into.
I guess it's my own fault, but how do you all know with certainty where you want to live for the next 20-30 years?
 
All of this preoccupation with rankings. I guess they are nice if you want to make yourself feel better than everyone else. I am waiting for the "crappiest programs that should be closed" list. Human nature, I suppose.

This is my personal opinion from being in private practice and interacting with different patients, radiologists, and clinicians. Having said that, if you are good at what you do, after a few years you will receive a lot of respect from your colleagues and other clinicians. On the other hand, if you are not good, you will lose that respect even if you put that "Harvard" name above your desk.

This I definitely agree with. Of course, you would have the opportunity to say "The Mass gen" as they like to call it. 😉
 
The hardest part for me was that I don't know where I want to live... so I didn't want to get "stuck" in a certain region. There aren't very many "national names," though, and most of them are almost impossible to get into.
I guess it's my own fault, but how do you all know with certainty where you want to live for the next 20-30 years?

Then choose One of those TOP 5-6 programs. After that, choose a fun city to live in for your residency. Even if you end up in a different place, you will always have great memories from your residency.

A good or fun city means different things to different people. You may love or hate NYC. But for most young people, metropolitan areas like SF, Boston, San Diego, Chicago, Houston, and others are fun places to live. It is also nice that most of these large cities have excellent programs. You can never go wrong with NW or UT Houston.

If you are married and have kids, or you have very strong family ties to a certain location, then it is a no-brainer.
 
"Then choose One of those TOP 5-6 programs."

If only I could just call up MGH and let them know I chose them... I'm sure I'll get in then 🙂

Haha, in all seriousness, I tried to look toward more "national brands" that are not as difficult to get into, such as Yale and CCF. Yes, the radiology community knows the difference, but as previously mentioned, there are three types of people you associate with: "your colleagues (other radiologists), other physicians and patients." I could argue for a fourth, the business administration people who do the hiring and such, but not everyone deals with them. Meaning 2/3 of those types of people are going to think places like Yale and CCF are more powerful than they may seem in the radiology world.

I'm picking out those two, but arguments could be made for others such as Ivy League institutions that get less press, Northwestern, Vanderbilt, and maybe Emory. Deciding which programs are "national brands" is difficult because it varies so much with your audience. I'm in medicine and I couldn't tell you which is the strongest IM department, for example, but would just assume it was Hopkins, Mayo, CCF, or MGH.

I wish I were like most of you and knew where I wanted to live, but my strategy was to try and go after an obtainable national brand. I think that's why those top 5-ish are so desirable. Everyone knows that if you go to one of them, you have a "golden ticket" to live pretty much anywhere, and people will know you were well trained and respect you (at least at the beginning). Any large referral academic center is going to train you well, but only some of them offer that future freedom.
 
I'm in medicine and I couldn't tell you which is the strongest IM department, for example, but would just assume it was Hopkins, Mayo, CCF, or MGH.

The top IM residencies would be MGH, Hopkins, UCSF, and BWH.
 
How do places like UPenn, Columbia, and NYU rank within this 3 tier ranking?

Excellent programs but tier 2. UPenn maybe tier one.

My Tier one programs are MGH, BWH, Hopkins, UCSF, Stanford and probably UPenn (tier 1.2, haha). MIR is definitely tier one but because of its location, I would put it tier two unless someone is 100% sure that he wants to end up in academics. Duke is tier 1.5 but if you don't like the location, rank it as tier two.

Stanford was my tier 1.5 a few years back but it is now my tier one.

Yale and CCF are solid programs but not as good as their names. Yale does not carry a lot of weight in the radiology world. Also, its other departments are not at the level of its name brand.

One person mentioned business administrators. They don't care, and they are not hiring directly. Even if the physicians are hospital employees, the credentials of job candidates are evaluated by the current group of physicians.

Seriously, I cannot imagine that someone does not have ANY preference for where they want to live.
 
To reiterate again... "tier 1" is brand name, NOT quality of training. e.g. Stanford.

An analogy I like (although not perfect) are professional sports teams.

The Yankees are the Yankees. Big market team. Everyone knows them. Their reputation was secured a long time ago, but they can steamroll the reputation into a brand, getting top picks and good players for themselves, maintaining a decent team. The NYY with Babe Ruth is not the NYY of today, but the branding confuses things for those not in the field. Same with the L.A. Dodgers.

A lower tier residency is like... I dunno... the Florida Marlins or the Cincinnati Reds or something (I'm sure somebody's upset now). They still can have competitive teams. They can actually be better than the NYY, and usually people in the game understand this. The 3rd baseman on the Reds could be wayyy better than his Yankee counterpart (theoretical example).

So... if somebody is trying to decide which team they should try to join... you see why saying "NYY is tier 1 and the best!" is misleading. You could be drafted onto that team and fall into a situation where it's not a good fit for you. You could get the training and support you need at a smaller team, and if the team clicks and is managed well, you're going to win more games than some big clunky organization. You could be drafted into one big-market team (Mets), hate it, and get yourself traded to some podunk team where you have a better chance to shine (e.g. LAA in the 70s, then the Astros in the 80s), and the rest is history. You can't tell every player "You need to be a part of the Yankees!" That's not a very nuanced look at what may be right for them. Why would you want to be a part of the Yankees anyway (assuming salaries are constant)... will it make you a better pitcher, or do you just want to be associated with the name?

The analogy falls down because there actually is a way to measure which team is better in MLB: the win-loss record. Rads doesn't have something like that. But I still think the analogy is useful. Personally, I'd join the Padres if I thought it would make me into a better player and I liked the team dynamics, rather than be someone who washed in and out of the Yankees just to say I'd been a Yankee. Personal choice.
 
Fun exercise to put together a list like this, and fun to discuss it as well. But as someone who just interviewed at a number of these top places, I agree with much of what Shark2000 and Gadofosveset have said above. From an applicant's perspective, all of the top 20-30 places seem to be excellent, with plenty of research, famous faculty, etc. What seemed to vary much more was:

1. Quality of training (+/- independent call, volume of studies, and IR experience seem to vary quite a bit)
2. Location
3. Culture/vibe of the department

So a list like this functions well to give totally lost applicants (ie myself 1 year ago) some idea of where to apply. But I think you have to go interview at these places to figure out how YOU should actually rank them.
 
Fun exercise to put together a list like this, and fun to discuss it as well. But as someone who just interviewed at a number of these top places, I agree with much of what Shark2000 and Gadofosveset have said above. From an applicant's perspective, all of the top 20-30 places seem to be excellent, with plenty of research, famous faculty, etc. What seemed to vary much more was:

1. Quality of training (+/- independent call, volume of studies, and IR experience seem to vary quite a bit)
2. Location
3. Culture/vibe of the department

So a list like this functions well to give totally lost applicants (ie myself 1 year ago) some idea of where to apply. But I think you have to go interview at these places to figure out how YOU should actually rank them.

I'm glad that you see a use for the above rankings for some people. I completely agree that you must go to these places and find a fit. There's no good way to "rank" the 3 points you bring up, but they are definitely extremely important when considering a personal ranking. My hope was to give a starting point: programs that are able to provide the resources, reputation, and training that most seek.
 
I know this list isn't the "be all, end all" or anything, but I did want to use it to help out future DO applicants. As far as I know, these are the programs that 1. have had DOs, 2. have DOs, or 3. have interviewed DOs. I'll use the rankings from the OP just for fun.


7. UPMC
13. CCF
21. Yale (just took one this year but been interviewing them for years)
23. Wake Forest
28. USC
35. Colorado
38. Dartmouth
41. MUSC
42. Beaumont
43. Cincinnati
47. Minnesota
50. UNC

Other pretty good programs I know have/interview DOs: UF-Gainesville, USF, VCU, Penn State, Loyola, Nebraska, Med College GA, Christiana Care, Hartford Hospital, UT-SA, Texas A&M

As you can see, DOs still have a long way to go... Things may change in the coming 5-10 years. It's still not an easy process knowing that only approx. 13 of the "top 50" programs will even take a look at a DO, and really none in the top 10, since UPMC is probably closer to the 15-20 range.

Others, please feel free to add to this list if you know of other programs that have interviewed or have DOs. Additionally, there are lots in the "second half" of all the programs that have or have interviewed DOs.
 
Instead of arguing about which good programs are in the top 50, let's set our sights on the bottom 20 that should shutter their doors.

Not sure if anyone has the balls to really "bad mouth" programs...but there are definitely some that could drop some spots or close and wouldn't be missed...
In general (in any specialty), I'm not a fan of private practice groups using residents to offset work and providing little education. Not sure which ones these would be in rads, but I know first-hand of some in some other specialties.
 
Thanks for the hard work in putting this together. Good intentions.
The idea of taking the strength of other hospital departments into account I think is a valuable one, and one not often discussed.

A meta-ranking is only as good as one's inputs, however.
The major point inputs (USN&WR, Doximity, AM) are unfortunately bogus. Two of them are mostly subjective opinion.
NIH funding is worthless to you as a resident.
AM opinion is worthless relative to a truly objective measure like case volume.
Case volume is misleading without seeing resident case volume.

This leads to some bizarre conclusions... like UPMC twenty-three ranks above UVA, Duke twenty-two ranks above Thomas Jefferson, or Yale twenty-nine ranks above UNC.

I feel like a tier system would be more effective. Obviously it's going to be basically futile to attempt to differentiate 43 from 44; maybe something like 1-10, 11-30, 31-50, and then after that just identifying the malignant ones?
 
Not sure if anyone has the balls to really "bad mouth" programs...but there are definitely some that could drop some spots or close and wouldn't be missed...
In general (in any specialty), I'm not a fan of private practice groups using residents to offset work and providing little education. Not sure which ones these would be in rads, but I know first-hand of some in some other specialties.

It's ballsy for someone named Cubsfan to suggest shutting down programs for inadequacy, given the baseball analogies that have been thrown around in here.
 
Can we access the raw data? Very interested to see all the information. Nice work! I love the idea of a "brand name" modifier. I'm sure there is data floating around on recognisability. You could also construct a "lifestyle" modifier using inverse salary on the assumption that the more desirable the place is, the lower the salary. It would be amazing to see an IR subcategory.
 
Can we access the raw data? Very interested to see all the information. Nice work! I love the idea of a "brand name" modifier. I'm sure there is data floating around on recognisability. You could also construct a "lifestyle" modifier using inverse salary on the assumption that the more desirable the place is, the lower the salary. It would be amazing to see an IR subcategory.

I have no problem attaching the raw data. A brand-name modifier was definitely considered, but I was curious how programs stacked up before that "bias." What I had planned was a point system as follows:
- 10 points (National Name)
- 7 points (Regional Name)
- 5 points (Local/State Academic Name)
- 4 points (Community/Academic with Regional Name)
- 3 points (Community with Regional Name)
- 2 points (Community with Local Name)

One of the main reasons I did not include this was the debate that would rage over the subjectivity of what is considered a national name, and whether we are talking medical or non-medical, radiology or non-radiology. Some programs "live up" to their national brands and others do not. I thought places like Yale and CCF would benefit, while places like UCSF and UCSD, which honestly are powerhouses in radiology, would be hurt, so I didn't include that modifier. I can't consider those "national brands" because here in the Midwest people don't know them at all. Without living in all parts of the country, it's impossible to know what is truly national vs. regional.
My best guess would have very few "national brands" that everyone in all areas of the country would know and they wouldn't all be considered "top" radiology programs.
- MGH, Mayo, Hopkins, Yale, Cleveland Clinic and maybe Stanford and Duke are places EVERYONE knows but obviously some are far better quality than others.

As far as your inverse-salary modifier goes, that would most definitely work in theory. You could have a true multiplier in terms of desirability, or award points as above. Perhaps score x 1.2 for the most desirable, score x 0.8 for the least desirable, and everything else in between. Or just a simple 10-point scale, as above for the brands.
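
As a sketch of how those two modifiers might be layered on, here is one possible (hypothetical) combination in Python: brand points are added per the tier scheme above, and the desirability multiplier is scaled linearly between the 0.8 and 1.2 endpoints mentioned. Whether the multiplier applies before or after the brand points is a design choice, not something settled here.

```python
# Hypothetical sketch of layering a brand-name bonus and a location
# "desirability" multiplier onto a base score. Tier points follow the
# scheme listed above; the 0.8-1.2 multiplier range follows the
# inverse-salary idea. Names and inputs are illustrative only.

BRAND_POINTS = {
    "national": 10,
    "regional": 7,
    "local_state_academic": 5,
    "community_academic_regional": 4,
    "community_regional": 3,
    "community_local": 2,
}

def adjusted_score(base_score, brand_tier, desirability):
    """desirability: 0.0 (least desirable location) to 1.0 (most desirable)."""
    multiplier = 0.8 + 0.4 * desirability  # maps 0..1 onto 0.8..1.2
    return (base_score + BRAND_POINTS[brand_tier]) * multiplier

# Example: a 22-point program with a regional name in a fairly desirable city.
print(round(adjusted_score(22, "regional", 0.75), 2))  # (22 + 7) * 1.10 = 31.9
```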

An IR subcategory was also considered due to the popularity of the field right now; all programs could be ranked as "total program" and "IR program." The difficulty was finding good data on which programs have strong IR departments. I didn't want to use the hearsay on Aunt Minnie as verification, but it could be used as part of the criteria. It would be most beneficial to judge them based upon procedures done, diversity of cases, etc., but that data would be arduous to collect.

If anyone would like to add to what I have done, please feel free. This once again was an attempt to give a good sense of what the good programs are.

On another note, I appreciate the "DO modification" done by Cubsfan. I did not consider this originally and actually am shocked to see how few top places actually consider DOs.
 


Where were you scraping the data from? I wouldn't mind taking a crack at adding a lifestyle modifier. I'm also interested in adding an IMG and DO modifier as well. Depending on the source data, it might be interesting to throw in the number of CT, MRI, and PET scanners as an input into volume, since some of the volume was estimated. There should be a strong correlation between the number of machines and true volume.
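
For what it's worth, a scanner-count proxy could be fit with a simple least-squares line once a handful of programs with published volumes are in hand. The numbers below are made up purely for illustration, not real program data.

```python
# Rough sketch of imputing unpublished case volume from scanner counts
# with a one-variable least-squares fit. All numbers are made up.
import numpy as np

# (total CT + MRI + PET scanners, reported annual studies) for programs
# that do publish their volume.
scanners_reported = np.array([12, 18, 25, 30, 40])
volume_reported = np.array([350_000, 480_000, 620_000, 730_000, 950_000])

slope, intercept = np.polyfit(scanners_reported, volume_reported, 1)

def estimate_volume(num_scanners):
    """Estimate annual study volume for a program that doesn't publish it."""
    return slope * num_scanners + intercept

print(f"{estimate_volume(22):,.0f}")  # estimate for a hypothetical 22-scanner program
```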
 
Where were you scraping the data from? I wouldn't mind taking a crack at adding a lifestyle modifier. I'm also interested in adding an IMG and DO modifier as well. Depending on the source data, it might be interesting to throw in the number of CT, MRI, and PET scanners as an input into volume, since some of the volume was estimated. There should be a strong correlation between the number of machines and true volume.

Most of the data came from program websites, and some came from brochures from my and my friends' interviews. Some info came from Freida. Some from US News. Most programs list "current residents," so you could find out how many IMGs and DOs there are. There's also the post above that lists places that have taken/interviewed DOs.
A lot of the volume data is not published, so it's hard to get an accurate number for each program.
Have fun with it though!
 
"On another note, I appreciate the "DO modification" done by Cubsfan. I did not consider this originally and actually am shocked to see how few top places actually consider DOs."

Thanks...
We are still 2nd-class citizens... 3rd-class at the "top" places, behind FMGs and AMG MDs.
The thing that really bothers me personally is that we are going for a medical residency, yet we are judged basically on what we did before medical school rather than what we did during it. We can have better applications and not even get interviews.
 
You can add Case Western and Jefferson to the list of places that have taken DO's. Also, Case Western has a VA affiliation, can't miss out on that arbitrary 0.5 point.
 
You can add Case Western and Jefferson to the list of places that have taken DO's. Also, Case Western has a VA affiliation, can't miss out on that arbitrary 0.5 point.

I agree it's ridiculous that otherwise competitive DOs still have trouble even getting interviews at certain programs. For what it's worth, though, I just wanted to point out that there aren't any DOs among the current Jefferson PGY2-5s, nor in the incoming accepted class of 2016. Not sure about the class in internship right now.
 
You can add Case Western and Jefferson to the list of places that have taken DO's. Also, Case Western has a VA affiliation, can't miss out on that arbitrary 0.5 point.

I went by the Freida directory. They list all affiliated institutions per specific residency program. If Case Western radiology residents actually do rotate over at the VA then I apologize for missing it.
 
I agree it's ridiculous that otherwise competitive DOs still have trouble even getting interviews at certain programs. For what it's worth, though, I just wanted to point out that there aren't any DOs among the current Jefferson PGY2-5s, nor in the incoming accepted class of 2016. Not sure about the class in internship right now.

There is a DO starting at TJ in July; currently in their intern year.
 
I'm guessing that DO applicant was a std dev or two above the mean.

Probably but that doesn't guarantee anything...How do you think a 256/268 and a no. 1 class rank with 3 publications would go over as an MD compared to DO 😛
 
Probably but that doesn't guarantee anything...How do you think a 256/268 and a no. 1 class rank with 3 publications would go over as an MD compared to DO 😛

MD: front-runner
DO: back-up
 
You can add Case Western and Jefferson to the list of places that have taken DO's. Also, Case Western has a VA affiliation, can't miss out on that arbitrary 0.5 point.

Case/Metro or Case/UH?
 
Do you know what Doximity bases their rankings on?
 
Doximity rankings are based on "nominations" by practicing physicians in that specialty who responded to their survey. Likely this means that somewhere around 10% of radiologists (at best) weighed in... obviously giving programs that are bigger and more well known a huge advantage. Aka, it's complete crap.
 
Geez, this makes me have second thoughts that I ranked, and matched at, my #1 program, which I had thought was solid, ahead of many of my other programs that were much higher on this list [and also in good locations].
 
Geez, this makes me have second thoughts that I ranked, and matched at, my #1 program, which I had thought was solid, ahead of many of my other programs that were much higher on this list [and also in good locations].

That's why you should always trust your gut and personal opinion over online "rankings" and such. I'd be interested to see what you thought was "solid" and ahead of other good programs.
 
Geez, this makes me have second thoughts that I ranked, and matched at, my #1 program, which I had thought was solid, ahead of many of my other programs that were much higher on this list [and also in good locations].

Pfffft. Don't believe these rankings. Any rads rankings that don't have Mallinckrodt as #1 are clearly flawed anyway.
 