Rad onc rankings

Before defending some of the logic of this ranking, I must first point out that it is still more objective than the hearsay that has mostly been used in this thread.

In addition to the reasons mentioned above by beemer, I still maintain that there are good justifications for thinking a med school's overall quality contributes to program quality (not in a major way, but it deserves to be taken into account, as it is below).

It is not true that rad onc funding and med school funding from the NIH are completely dissociated. The data say otherwise: there is a general correlation wherever this information is available (with some exceptions, of course). Further, higher-ranked schools are likely to have more clinical/translational project grants (such as the CTSA), which are multidisciplinary and allow for large trials, resources that may not necessarily exist at lower-ranked schools. Finally, top-tier schools may be able to attract top faculty even if their programs are no match, simply due to resources, reputation, research support, and the quality of other cancer-related services (e.g. surgery, heme/onc, patient databases).

Third, top-tier schools with mid-tier programs on the upswing may have more momentum than other programs on the upswing, simply because of the resources available to back them up.

And while med school prestige doesn't equal quality, for the reasons mentioned above, it is nevertheless "a minor" factor that deserves to be incorporated.

Finally, my apologies for leaving out programs; that was an oversight. This is meant to be a preliminary starting point with a more objective basis that can hopefully be further modified. And in my humble opinion, I submit that it is a more logical alternative to many of the subjective hearsay lists mentioned above.

1-10

1 MD Anderson
2 Harvard
3 MSKCC
4 UCSF
5 UPenn
6 U Michigan
7 Johns Hopkins
8 Wash U
9 Stanford
10 Yale

11-20

11 Duke
12 U Chicago
13 Columbia
14 U of Washington
15 U Wisconsin
16 Mayo
17 Vanderbilt
18 UCLA
19 U Maryland
20 U of Pittsburgh

21-30

21 Cleveland Clinic
22 NYU
23 Ohio State
24 Emory
25 U Minnesota
26 Northwestern
27 UNC Chapel Hill
28 U of Iowa
29 UCSD
30 Thomas Jefferson U

31-40

31 Baylor
32 Mount Sinai
33 U Florida
34 U Colorado
35 U of Alabama
36 UT Southwestern
37 University of Virginia
38 Cornell (Weill) - Queens
39 U Rochester
40 UC Davis

41-50

41 OHSU
42 U Southern California
43 UC Irvine
44 Fox-Chase (Temple)
45 U South Florida (Moffitt)
46 Georgetown U
47 Albert Einstein
48 Tufts
49 SUNY Upstate

 
I felt compelled to chime in for the first time. I agree that medical schools are far removed from radiation oncology programs; however, it is hard to overlook the positive effect of a top medical school on all the programs within it. I have been involved in hiring decisions, and that factor is definitely taken into account, for better or for worse.

wagy27, your remarks are well taken, and there are many non-affiliated programs that are beyond excellent. That said, words such as "junk," "failure," "ridiculous," and "fallacy" don't necessarily make your case stronger when you haven't presented any methodology of your own or concrete objections beyond your personal opinion. I fail to see the "standardized thoughts".
 
This is turning nasty very quickly. Last I heard, we still live in a country with freedom of speech. Anyone has the right to disagree with someone's observations, but to suggest this should be shut down is pretty ridiculous. Of course, there are going to be situations where a method fails (as in the case of Beaumont, which is not associated with a strong med school), but that doesn't mean there is absolutely no value in looking at things through a different prism. I, for one, believe we should try to be constructive and point out deficiencies rather than be condescending and shut down people we disagree with.
Medstudent: the US News ranking already takes NIH funding of the med school into consideration. In my opinion, using both the US News ranking AND med school NIH funding skews this too much away from rad onc. There is data available on department-specific NIH funding, and you can find it here:
http://www.brimr.org/NIH_Awards/2012/NIH_Awards_2012.htm
You may want to use that data instead of NIH funding to med schools. The only limitation is that radiology and radiation oncology are reported together, and I don't know of any rad-onc-only data. Also, you may want to average funding over the last 3 years or so to distinguish outliers from consistently top-performing departments.
People can take it or leave it but trying to shut other people down is not very American.
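The three-year-averaging suggestion above can be sketched roughly as follows. All department names and dollar figures here are hypothetical, purely to illustrate why a multi-year average separates a one-off outlier from consistent performance:

```python
# Hypothetical illustration of averaging department NIH funding over the
# last three reported years, so a single outlier year doesn't look like
# consistent top performance. All figures below are invented.
funding_by_year = {
    "Dept A": {2010: 4.0, 2011: 4.2, 2012: 4.1},  # steady funding ($M)
    "Dept B": {2010: 0.5, 2011: 0.6, 2012: 6.0},  # one big outlier year
}

def three_year_average(by_year):
    recent = sorted(by_year)[-3:]  # most recent three reporting years
    return sum(by_year[y] for y in recent) / len(recent)

ranking = sorted(funding_by_year,
                 key=lambda dept: three_year_average(funding_by_year[dept]),
                 reverse=True)
print(ranking)  # Dept A's consistency outranks Dept B's single spike
```

On 2012 funding alone, Dept B would come out ahead; averaged over three years, the steady performer does.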
 

I think as scientists we all prefer objective data, rather than speculation. Unfortunately, as has been pointed out, we don't have access to the data that would be truly necessary to create a comprehensive Radiation Oncology ranking system.

So Medstud, thank you for making an effort to objectify this. That said, I also agree that including anything related to medical school doesn't make sense. Perhaps some brainstorming as to what objective measures WOULD be good to include in a "strength of department" ranking list? (I feel like quality of training is a completely different list)

Strength of department -
- Research funds (The included NIH grant data is probably the best we can do)
- Publication rate of department (https://dx.doi.org/10.1016/j.ijrobp.2008.10.022, clearly dated but the best I could find.)
- Faculty to resident ratio
- Trajectory of department under current leadership
- Leaders in the field's opinion of the departments (Not objective, but clearly important to consider)

I think as a group there are probably many more, or even better, criteria than I listed above so please suggest!

Unfortunately, the things that seem to make for good training are hard to find and different for each applicant: rotation structure, support of research, how good the faculty are at teaching, etc.

TL;DR - objective data is good, but there must be something better than medical school ranking.
 
Part of the issue I see is that we have posters, in medstud and you, with a total of 20 or so posts who, from what I can see, are med students with limited experience at this point to evaluate a whole group of programs. Yet medstud is putting up an "objective" evaluation based on 1) the flawed concept that med school rankings translate into radiation oncology residency quality, 2) no correction for institutions without a med school, rad onc history/prestige, rad onc publications, rad onc funding, etc., and 3) a weighting system that, from what I can ascertain, came out of medstud's mind.

I'm not suggesting shutting down rankings; rather, this "objective" system is significantly flawed and needs to be scrapped, since there are so many subjective factors involved, and what is currently presented is significantly misleading to those who come here to look at the rankings. Simply picking 1-3 objective factors and omitting the far more important subjective factors that previous posters have ranked on is not a solution.
Your response is a lot more civil now than your initial response, which was rude, condescending, and uncalled for (even if some of us agree with some of your points), IN MY OPINION.
I strongly disagree that you can ask someone to "scrap" something because you disagree with them. I reject the notion that other medical students will somehow fall into a trap just because these rankings were described as "objective" by the poster. If someone is smart enough to get into med school and apply for rad onc, they are smart enough to process the information and decide what to take seriously and what to take with a grain of salt.
 
wagy27, in case you didn't read my previous posts well: med school ranking was weighted at 18%, whereas other factors such as rankings on this thread were weighted at 35%, and cancer hospital ranking and NIH funding were each weighted at 24%. Thus I'm not solely basing this on med school rankings, nor am I completely scrapping previous posts. I am sorry you're very upset about my perspective (which I never claimed was entirely objective, only more so than previous ones); nonetheless, I still maintain that med school ranking and overall performance make a small but real contribution to radiation oncology program quality. I don't think I am in the minority on this.

beemer and sheldor: you both make great points; obviously, implementing some of them would be time consuming. I do agree that a field such as radiation oncology, with so much research emphasis, should have more respectable means of ranking programs. My intention was to raise the issue and maybe stimulate means of ranking beyond the merely subjective.

Radiadouken: I agree it's turkey time. But first, let me post 50 more messages, because apparently the number of messages posted on SDN is one objective metric of success according to wagy27.
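Concretely, the weighting described above reduces to a simple weighted sum. Here is a minimal sketch; the weights are the percentages quoted in this post, reproduced as stated, and the component scores are hypothetical values normalized to a 0-100 scale:

```python
# Minimal sketch of the composite described above. The weights are the
# percentages quoted in this post; the component scores are hypothetical,
# normalized to a 0-100 scale.
WEIGHTS = {
    "thread_rankings": 0.35,   # rankings on this thread
    "cancer_hospital": 0.24,   # cancer hospital ranking
    "nih_funding":     0.24,   # NIH funding
    "med_school":      0.18,   # med school ranking
}

def composite_score(scores):
    """Weighted sum of normalized component scores (0-100)."""
    return sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)

example = {"thread_rankings": 90, "cancer_hospital": 85,
           "nih_funding": 80, "med_school": 70}
print(round(composite_score(example), 1))  # 83.7
```

Under this scheme a program strong on the thread rankings but with a weaker med school still scores well, since the med school component carries the smallest weight.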
 
Your response is a lot more civil now than your initial response, which was rude, condescending, and uncalled for (even if some of us agree with some of your points), IN MY OPINION.
Sometimes, the truth hurts...

I strongly disagree that you could ask someone to "scrap" something because you disagree with them. I reject the notion that somehow other medical students will fall into a trap just because these rankings were described as "objective" by the poster. If someone is smart enough to get into med school and applying for rad onc, they are smart enough to process the information and decide what to take seriously and what to take with a grain of salt.

I'd probably weigh the opinion of a bunch of insiders from different institutions over an arbitrary med-student-created grading system, but that's just me.
 
Wagy: you too have presented a ranking with questionable logic, including dozens of programs that you have no first-hand knowledge about, but chose to include anyway. I'm not sure your system is any more valid than this one. At the very least, medstud explained the rationale behind his system, not that I agree with it.

If this thread is purely for entertainment purposes, it serves its function well. However, if we really want to construct a valuable resource for med students, why don't we just rank the handful of programs that we are personally familiar with and include some specific rationale to back it up?
 
Take it easy, everyone. Every list of program rankings is inherently subjective. Furthermore, there is no magical, universal list of program rankings that awaits discovery through complex mathematical formulas or world-wise insight.

So my advice is: come one, come all, and submit your lists. But be prepared to defend your views and be called out for perceived slights in your rankings. For all the hoopla over the years, the top 3 programs have not changed, and the top 10 is essentially a reshuffling of about 15 total programs.

Once you go below the top 15 or so, national prominence goes by the wayside and regional prominence is much more important. For that reason, ranking programs below 10-15 is probably an exercise in futility anyway.
 
I don't care whether you thought I was civil or not, to be quite honest. You can think I was rude and condescending; I couldn't care less. I was honest and objective in my assessment. Maybe you think people should be patted on the back when they have a good idea that is poorly executed, but I call a spade a spade. I can't get anyone to do anything, whether I agree or disagree, but I can point out futility and suggest that something be abandoned, which is what I tell my residents and med students when they present ideas unlikely to lead to meaningful or valuable results. You make a lot of assumptions; as someone who has gone through the application process and residency, and who now, as an attending, mentors students and residents, I can tell you that these rankings hold a lot of cachet, and even those smart enough to get into med school and apply for rad onc can be biased by them.
If you think you were honest and objective in your assessment, and that everybody who doesn't agree with your assessment needs to be "scrapped," I don't want to waste my time arguing with you. The med students on this forum are smart and don't need an arrogant, condescending attending to act as a filter and tell them which posts are useful and which are not. Sorry, but you have no authority to "scrap" other people.
 
Wow.

I think it's constructive to think about ways to rank, but if you're ranking the overall quality of radiation oncology residencies (I think? Or is it department quality?), using medical school data is misleading and unhelpful. For one, the top medical schools fill their residencies with their own students, so it confounds things. For two, it's relying mostly on GPA/MCAT, which doesn't drive any of my decisions. For three, NIH money is being used redundantly.

I think resident publications matter, along with whether it is an NCI-designated comprehensive center, the number of site-specific tumor boards, board scores of residents (dumb, but somewhat quantitative), location, exposure to modalities (protons, SRS, etc.), whether or not you get your cases without outside rotations, and many more, but not one of these has been counted in your model.

Try again!
S
 
This has gotten pretty ugly.

Couple thoughts.

Rad onc is a small, highly competitive field, and current residency programs are composed of top-tier applicants out of med school. Additionally, at many programs, a component of teaching is led by the senior residents to supplement faculty-led instruction. These are both good things in my estimation, and they have improved the quality of training in our field. Nowadays, any of the top 70 programs offers high-quality training. Research opportunities in residency, and academic careers at top-tier institutions, may be better facilitated by training at a top 10 program, but training at a well-respected regional program should equip you to land a top-tier job in private practice.

Program quality is important, but independent study is equally important to learning rad onc. Landing a spot at a top-three residency program is not a golden ticket. The volume of knowledge we have to acquire to sit for radiobio, physics, and written/oral clinical boards, and to manage a general practice, cannot be satisfactorily learned through a passive approach during training. So debating which program is #6 vs. #12 is less important than coming up with a sound self-study plan to complement your program's clinical training and didactics.

Program rankings are a nice resource, but probably less important than fit, as it relates to personality and learning style. Four years is a long time, so you don't want to be miserable training at a program that doesn't fit your persona. And as I suggested before, I think that training at the #11 vs. the #25 institution on a fabricated rank list is unlikely to influence your job prospects in a meaningful way.

These rankings are a combination of objective (research dollars, resident publications, didactic structure, faculty prestige, etc.) and subjective (applicants'/residents'/attendings' perceptions) measures. The lists by medstud and others are one component of the latter. I tend to agree with Wagy in his criticism of the methods, but a program's reputation is a product of the collective impressions of that program by others in the field, including applicants.

Agree with previous posters that resident publications are important. Average resident publications at each program often get overlooked in these rankings. In my opinion, having a significant research component on your CV can make up for training at a lesser-tier program, both for academic AND private jobs!

Lastly, rad onc is a small field. These forums are anonymous, but it is not hard to figure out who is who based on information derived from posts. As such, having a knock-down, drag-out fight with a future employer/partner on one of these forums is probably ill-advised.
 
If your ranking system omits, or ranks extremely low, programs like UTSW, MCW, Beaumont, etc., your ranking is a failure.

This. Utah, Wake, and Henry Ford not in top 50 either?
 
Any thoughts on the UCLA rad onc program? Status/trajectory/research support?

Any insight is much appreciated
 
What do people think are the top 10 programs explicitly for training and placing physician scientists (basic science / translational medicine) into 80:20 research : clinical positions?
 
What do people think are the top 10 programs explicitly for training and placing physician scientists (basic science / translational medicine) into 80:20 research : clinical positions?

In my opinion, which stems from personal experience on the interview trail, talking to others, and reading SDN :

Eight schools that push for basic science, with good track records of residents doing basic science projects, tenure-track faculty with labs, etc.:

- Wash U- St Louis (Basic science center)
- Yale
- MSKCC (research at MSK, Cornell, Rockefeller labs)
- U of Chicago
- U of Michigan
- U of Wisconsin
- UCSF (they oversubscribed to the Holman, so it's hard to tell whether they will be able to maintain the support for basic science. Their non-Holman track is rather fragmented for a successful basic science project.)
- Harvard (protected year for everyone, 1-2 Holmans per year; can do research at any Harvard lab. One note of caution is that they do not seem to hire many tenure-track faculty with startup funds in 80/20 positions. I know this has been discussed before on the interview impressions thread, but the examples cited there were instructor positions (i.e. part-time clinical and part-time an extension of the postdoc, not full tenure-track positions with startup funds as at other institutions). Whether these instructors will, in the future, be hired as permanent members is unknown to me. For training purposes this may be irrelevant, since you can obviously apply to other institutions coming out of there, but you may not have a large proportion of basic-science radiation oncology tenure-track faculty at present.)

3 programs that have chairs who are pushing hard for basic science:
- UCLA
- NYU
- UTSW

Obviously this list is subjective and I'm sure I might have missed schools that should belong to this list. But these were my two cents.
 
In response to Beemer, above:

I have not submitted rankings and probably will not for many of the reasons already hashed out here. But I think that the specific question of programs supporting physician-scientist pathways is a great one for those of us thinking along those lines. I do not have a comprehensive list, and I do not believe there is a top 10. What I do think is that if you are looking to get a physician-scientist job when you are done, there are some specific characteristics that could be key to your decision that will significantly differ from applicants planning other types of careers. Here are some of the ones I think matter:

1. Basic science infrastructure already in place *in the area you are interested in pursuing*. You cannot create infrastructure during residency. But it is not critical that the basic scientists you want to work with are in the department. Several otherwise *very strong* programs are weak here, and that may make the prospect a non-starter in my book.
2. Departmental culture enthusiastic about basic science. Much is made of whether there are a lot of faculty in the department who are basic scientists themselves. That certainly can't hurt, but I think what matters most is that the department leadership (read: *chair* and PD) establish the idea in the department that residents pursuing basic science will be supported with the magic quantities *money* and **time** (not just lip service). That trickles down to clinical faculty, who will therefore be more understanding when you have to be somewhere other than clinic due to scientific pursuits.
3. Research mentorship. This is where some physician-scientist faculty in the department could be useful. But you can get away without them if the institution has other faculty who are enthusiastic about mentoring you. I think to qualify they need to have a track record of success and enthusiasm for mentoring you in publishing and *grantsmanship*.

The issues of research time and Holman support deserve attention as well. A lack of time means you have no hope. How much time you need, and whether it is continuous, is somewhat dependent on your type of research, but programs that give a full year of protected research and/or support Holman are demonstrating tangible evidence of number 2 above. I think that counts for something. However, caution is advised when programs talk a lot about number 2 but have no evidence of number 1. There are many of those, and while I do not doubt their sincerity, I do question the likelihood of success.

Young physician-scientist faculty in the department are an excellent indicator of number 2 and number 1. Also, your best shot at a faculty position with protected research time straight out of residency is probably your own program--if they have hired such people before, that is certainly encouraging.

A track record of graduates securing 80:20 jobs is extremely encouraging. There aren't that many.

OK, so here is a non-comprehensive list of programs that I think are promising based on some combination of the criteria above:

-Michigan
-Stanford
-WashU
-Hopkins (no Holman, unfortunately, but otherwise strong)
-Harvard
-UCSF

I agree with the poster above that UCLA wants to be there and has the tools to make it happen. I would add that UCSD and Yale have all the pieces in place and are likely to have success with recruitment of appropriate residents (Yale has one who is PGY4 or so and doing Holman).

Finally, I just note that an idea very popular among chairs is that prospective physician scientists should spend some time as postdoctoral research fellows after residency.
 
I think everyone has a good idea of what the top programs are. What about the programs that are more likely to accept students with lower-than-rad-onc-average Step 1 scores (220-230 range)? All I've seen is students with 240+ posting in the other threads.
 
I'm a rad onc who's been in practice for 15 years. I stumbled upon this thread and chuckled at some of the responses... remembering those days too. Here are a couple of thoughts:

1. When looking for a program, know what you want out of it. The primary distinction is whether or not you want to keep your options open for academics. If so, then you will want to go to a nationally recognized top school and get some good experience doing research. Even at the schools that produce a lot of academic physicians, only about 50% go into academics, but those who choose private practice tend to be highly competitive candidates in that arena as well. If you want maximum options when you finish, go to one of these schools. If you are sure you want to go into private practice, then (almost) all of the programs will provide the necessary opportunities to become a competent radiation oncologist.

2. Something else that is harder to determine, but that I believe is underrecognized, is how "ethical" the attendings are. Mine held strongly to the ideals of "evidence-based medicine." When quoting the literature and/or teaching, they didn't just talk about their own work, but knew the national/worldwide literature. We all have biases, but they were very intentional about making the right choices for the patient, and were openly cautious about potential conflicts of interest related to overtreating, or any other inherent incentives. Call me old-fashioned, but I loved that all of the attendings where I worked were married to their first wives, and there was never any weird drama. They weren't afraid to talk about cost-effectiveness. They saw themselves as oncologists first, not radiation technicians who treat everyone who walks through the door. Everyone has their flaws, but their motivations were ethical.

3. If potentially interested in research, inquire about how many finishing residents enter academic programs, how many residents publish as first authors (and how many publish more than once), and what their aspirations are for their ideal resident.

4. The worst kind of arrogance is arrogance that EITHER can't back up the confidence in a peer-reviewed setting (how many publications, how many invited speaker events, etc.) OR is unaware of what the rest of the world is doing (all they can talk about is how they do it, and they seem uncaring about how anyone else thinks about it; they shouldn't always agree, but they should know the opposing opinions and be able to respond to them with evidence-based medicine). Harvard probably has some arrogance, but perhaps it has been earned, as they do a great job in those two areas.

BTW, I noticed some mixed comments about Duke. My personal opinion is that, based on the criteria above, it is one of the top 5 programs in the country. Some of their surgery residency programs might be a bit malignant, but the rad onc program is quite balanced. Their technology base is also now strong, their volume and diversity of patients are great, their attendings are highly capable and respected, there are great research opportunities for those who are interested, they highly value ethical/evidence-based medicine, etc.

Anyway... good luck!
 
So, I wanted to contribute based on what I heard out on the interview trail this cycle, and this is a very interesting process. I know that, generally speaking, we are fans of a distinct top 10, but I found there is so much more ambiguity in the real world that such lists can be misleading to those who rely on them.

I’d also be remiss if I didn’t say that there are SO many great programs out there providing wonderful training that, while a “top 10” is a noble goal, many programs will help you on your path to becoming a top-notch rad onc.

“Tied for 1st”, aka Elite Programs
It goes without saying, but in terms of reputation, it’s hard to argue with the fact that these three are nearly always mentioned in the same breath when listing the “top programs.”
1) Harvard
1) MD Anderson
1) Memorial Sloan Kettering

“Rest of the top 10”, aka Top tier programs (nearly arbitrary order)
4) U Penn - Arguably making strides toward being considered alongside the other three elite programs, I couldn’t believe how many great things I heard from nearly everyone about this program.
5) UCSF - How much of it is location? I don’t know. Personally, I’m not a fan of California, but I believe the general consensus is that this is still a top-tier program, as it is often mentioned immediately after the above group.
6) Yale - Everyone loves this program. Happy residents, productive research and a killer name via the university.
7) Wash U - Research power house, and with new leadership could continue to rise.
8) Chicago - Mentioned consistently as THE place for basic science research, but perhaps at the expense of clinical training/research.
9) Johns Hopkins - Used to never be mentioned on these boards, but talking with people recently, they seem to agree that it is beginning to live up to the reputation of the rest of the medical center.
10) Stanford - Rad Bio, huge historic name.
11) Michigan
12) Duke
13) University of Wisconsin - Amazing faculty, clearly one of the best places for training, and great research. One of the better-balanced departments.

“Lower top tier” (arbitrary order)
Mayo Clinic (Some of the best training in the country, huge research machine)
UCLA (Name is gold in So-cal, and the department is moving in the right direction)
Vanderbilt (Lost a couple faculty recently, but new chair who trained at Harvard has some great ideas to diversify the research goals of the department, and they have one of the most diverse faculty groups in the south)
Beaumont (Probably one of the best clinical training places in the country, took a small hit with the faculty exodus, but still a research power house that again, has incredible training. )
Emory (Strong name, and heard nothing but positive things about its growth potential on the trail.)

“Up and coming”, aka the Upper mid-tier but potential to rise to Lower Top shortly
UCSD (Could be placed higher, but still needs a few years to mature)
Moffitt
Ohio State (Not top tier yet, but has potential with the new cancer center coming online in 2014)
UT Southwestern
Maryland (Mehta and protons with a near doubling of faculty in the near term could make this even better)
University of Washington (Location, amazing name in medicine, and a new chair with new vision)

“Solid middle tier”, amazing training but just not as much name recognition
Utah, UAB, Cleveland Clinic, Florida, and many others (MCW, UNC, etc.). These are all incredible places to train and very stable. One of the chairs at these programs told me, when I asked him what changes were coming, “Well, we’re pretty happy with the way things are here.” That’s fair.

Well, that was fun, and stressful. I’m sure I forgot some programs or placed them where you’d disagree. Personally, I got the impression that it was the top 3, then the rest of the top tier as listed above, followed by the mid tier. Ranking them within those tiers is multifactorial. I feel like the SDN top 10 makes up the top tier, then there are 5-7 programs in the “lower top tier” that fill the gap on the way down. Then there are mid-tier programs that are solid, amazing places to train, just without the prestige of a research powerhouse.

Also, as a note: I don't necessarily agree personally with everything in this list, but this is the prestige list. Depending on which factor you weight more heavily, this list could change; I feel the clinical training at some of the lower programs is superior to some at the top, so really there needs to be a nuanced list that weights the few big factors people use to make decisions. That said, this is the "prestige/reputation"-weighted list. Enjoy!
 
What do people think are the top 10 programs explicitly for training and placing physician scientists (basic science / translational medicine) into 80:20 research : clinical positions?

During interviews I asked very point-blank questions about research support. I was not really interested in going to a place that would not support basic science research, so I wasn't worried about the impression these questions made upon my interviewers. There are two important components - track record and trajectory. Personally, I believe trajectory is more important than track record (and is largely dependent on the chair). In the end, your academic job options largely depend upon your personal research productivity and funding track record.

I think that most programs are aligning themselves as either pro-science or ambivalent-at-best. I agree with those posted above, but wanted to emphasize a strong research commitment from the following institutions as well:

MSKCC - Simon Powell (chair) is a top level basic scientist and the faculty is very supportive of research (including Holman). Also, you have full access to Rockefeller (!!!), Cornell, and MSK labs.
MDACC - new president Ron DePinho is a world-class scientist who is redirecting MDACC's resources into basic and translational research. His quote is that they have reached the limit of improving cancer care via clinical improvements and that any further improvements will be built on the back of basic science discovery. Expect more and more science from this traditionally clinical powerhouse.
UCSD - AJ Mundt is building an outstanding department within an institution that has nearly unparalleled research opportunities (Salk, Scripps, UCSD), and he is very interested in supporting self-directed inquisitive residents. Not a fan of Holman, however.
OHSU - Charles Thomas is a visionary within this small department who will do anything to support motivated scientists, including Holman. Research opportunities are growing with the Knight Cancer Institute, where Joe Gray, Brian Druker, and Lisa Coussens are hiring new cancer research faculty.
UTSW - Legendary research institution that is now focusing its efforts on building a clinical reputation. Hak Choy is a very ambitious chair who wants to attract scientists to his program. Lots of recent hires of physician-scientist basic science radonc faculty and an institutional history of providing excellent support for physician scientists.
 
Macbergleton makes some good points. I agree that MSKCC has made excellent connections with nearby institutions. Along the same lines, I wonder if MDACC will build collaborative basic science programs with Baylor or capitalize on the UT connection in pursuit of their basic/translational goals. Re: UCSD, I agree Dr. Mundt appears determined to make great strides on the basic and translational fronts, and I have no doubt they will be successful. It will be interesting to see what they do with Holman; during interviews, Dr. Mundt told me that they certainly will have a Holman resident at some point and he envisioned it as a great program for a non-PhD resident interested in a physician-scientist career.
 
As I understand it, there are few-to-no barriers for MDACC residents wanting to work with PIs at Baylor, UT, Texas Children's, or the Methodist Hospital Research Institute. One current resident has been quite successful working at TMHRI.
 
Agreed - no barrier whatsoever for MDACC residents. In terms of research opportunities/collaborations/grants, you're limited by your own interests essentially.
 
FYI...this is totally just for fun! I know people will most likely disagree. And I am a resident, though not smart enough to be at any of the following "Top 10" :)

10. U Michigan

9. Hopkins

8. Stanford

7. UCSF

6. Wash U

5. Yale

4. UPenn

3. MD Anderson

2. MSKCC

1. Harvard
 
Many of the rankings on this thread are highly suspect as they come from medical students applying to the specialty rather than from residents or attendings. Medical students tend to be influenced by factors that do not come into consideration by people already in the field (such as how easy or cush a program is, as for example is the case with Yale). The best way to get a sense of how departments are actually considered, imo, is as follows:

1. Talk to professors, off the record, from the top departments. They'll often give you a frank assessment if you've already established rapport with them.
2. Look at where a program's graduates are being hired on as new faculty.
3. Look at which departments newly hired chairmen are coming from.

Additionally, there are a few things on these boards that have been hinted at but I think should be stated explicitly for the benefit of medical students. For whatever reason (probably the small nature of the field) people seem hesitant to state these widely known perceptions. The most apparent omission is acknowledgment of the fact that there happen to be some very malignant programs out there. Everyone knows this. The two most prominent examples are MSK and UChicago. People should be very aware of this before deciding to go to either of these programs, and be very careful when valuing reputation over how malignant a program is.
 
1. Talk to professors, off the record, from the top departments. They'll often give you a frank assessment if you've already established rapport with them.
2. Look at where a program's graduates are being hired on as new faculty.
3. Look at which departments newly hired chairmen are coming from.

This is really relevant for academics only. Otherwise this information will be of little consequence.

The most apparent omission is acknowledgment of the fact that there happen to be some very malignant programs out there. Everyone knows this. The two most prominent examples are MSK and UChicago. People should be very aware of this before deciding to go to either of these programs, and be very careful when valuing reputation over how malignant a program is.

There is malignant and there is 'Rad Onc' malignant. We are not talking about 'Surgery' malignant where residents have to work 100 hours per week and then lie about it so that the program doesn't run afoul of the ACGME.

Also, I can't speak for Chicago, but MSKCC is quite up-front about what it takes to be a resident there.
 
Looks like I'm going to have to come to the defense of my fellow med students on this one.

First, I know it is popular to criticize online rankings, but I actually think the rankings on this website reflect reality. As RollTideRadOnc suggests, I spoke with a number of attendings and residents while I was on the trail this year about program rankings and got the impression that the online rankings, while imperfect, are fairly accurate. Second, I spoke with a well-connected ASTRO fellow on the interview trail who reviews the rankings on SDN and was told that, overall, the rankings are pretty accurate.

Medical students do rank programs, but their rankings haven't differed significantly from the residents who have posted their own rankings. Medical students are the ones who are actively visiting programs, talking with attendings across the country, and comparing programs.

When considering the competitiveness of rad onc and the types of applicants applying to the specialty, I don't think med students rank programs based on 'easiness' or some other lame reason--they are seeking the best training for a competitive job market.

So on the whole, you could ignore the rankings here and write them off as useless and inaccurate, but I think they are a reasonable reflection of reality.
 
For anybody who has trained at or visited Yale, the "happy residents" concept you hear about from there is not necessarily defined by an "easy/cush" experience. It's the environment that makes the difference, and that, apparently, has evolved over a lengthy period of time. Based on volume, or how late people stay at work, or whatever metric you choose, I'm not sure that those in the know would think it's necessarily "easy/cush" at all.
 
I respect your objection, and think that the real benefit of this thread is that you can see many people's opinions and therefore get a more general sense of the top 10 rather than a hard-and-fast set of departments.

I am still a medical student, but posted a ranking based not on what I thought but rather on what I gathered over many, many interviews. Between my interviews and away rotations, I feel like I had a pretty good idea of what people thought in the academic world. As I mentioned in my post, I didn't necessarily agree with my own list. However, for the reasons you mentioned, I didn't want my bias to influence the list since I am at the very beginning of my career.

As I have progressed these last 4 years, this list has gone from "the grail" to, "Eh, Harvard, MSK, and Anderson are a notch above." I imagine that in another 5 years my opinion of the list will change again.
 
wagy27, in case you didn't read my previous posts carefully: med school ranking was weighted at 18%, whereas other factors such as rankings on this thread were weighted at 35%, and cancer hospital ranking and NIH funding were each weighted at 24%. Thus I'm not basing this solely on med school rankings, nor am I completely scrapping previous posts. I am sorry you're very upset about my perspective (which I never claimed was entirely objective, only more so than previous ones); nonetheless, I still maintain that med school ranking and overall performance make a small but real contribution to radiation oncology program quality. I don't think I am in the minority on this.
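For anyone curious how that weighting works in practice, here is a sketch. This is purely illustrative and not the original methodology: the program components and scores are invented, and since the quoted weights sum to 1.01 rather than 1.00, the score is normalized by the total weight.

```python
# Hypothetical sketch of a weighted composite program score.
# The weights are the ones quoted in the post (thread ranking 35%,
# cancer hospital ranking 24%, NIH funding 24%, med school ranking 18%).
# They sum to 1.01, so we normalize by the total weight.

WEIGHTS = {
    "thread_ranking": 0.35,
    "cancer_hospital_ranking": 0.24,
    "nih_funding": 0.24,
    "med_school_ranking": 0.18,
}

def composite_score(components: dict) -> float:
    """Weighted average of component scores (each on a 0-100 scale)."""
    total_weight = sum(WEIGHTS.values())
    weighted = sum(WEIGHTS[k] * components[k] for k in WEIGHTS)
    return weighted / total_weight

# Made-up example program: higher score = stronger on that component.
example = {
    "thread_ranking": 90,
    "cancer_hospital_ranking": 85,
    "nih_funding": 80,
    "med_school_ranking": 70,
}
print(round(composite_score(example), 1))  # prints 82.9
```

The point is just that med school ranking, at 18%, nudges the composite but can't dominate it.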

beemer and sheldor: you both make great points; obviously, implementing some of them would be time-consuming. I do agree that a field such as radiation oncology, with so much research emphasis, should have more respectable means of ranking programs. My intention was to raise the issue and maybe stimulate ranking methods beyond the merely subjective.

Radiadouken: I agree it's turkey time. But first, let me post 50 more messages, because apparently the number of messages posted on SDN is an objective metric of success according to wagy27.

your ranking made me LOL
 
Haven't been on this forum in a while, but found it very useful when I was applying 2 years ago. Decided to check out this thread- it has become hilarious!

1. Someone posts an arbitrary (plus or minus arbitrarily-chosen "objective" components) ranking.
2. A current resident (and, apparently, now recent graduates!) of a program that gets the shaft in said ranking posts a scathing reply to that ranking.
3. 5-10 more posts continue the bickering.

I suggest we stop this thread before we all kill each other at the next ASTRO meeting. Also (admittedly, I have a conflict of interest in this prioritization) shouldn't we be more focused on getting jobs for our recent graduates? Even MD Anderson didn't hire any of their own residents last year apparently!
 
Can anyone who interviewed last year update this? I know it's totally variable and people have biases, but it is very helpful for those of us applying this year. Thanks!
 
My list a little bit up was written after I interviewed last year, so those are my thoughts from the trail!
 
I'll contribute because I totally used this thread when applying and making my rank list, with the caveat that the interview-impressions thread might be a better place to figure out which programs fit you best. For example, some lists are based on basic science research, some put more emphasis on location, etc., and it depends on what you are looking for personally. Also, I obviously didn't interview at all these places, so this is based on my opinion from faculty, fellow interviewees, and total randomness... but it was kinda fun!

1. Harvard
2. MSK
2. MD Anderson
4. Wash U
4. Yale
6. UPenn
6. UCSF
8. Hopkins
9. Stanford
10. U of Chicago
11. U of Michigan
12. U of Wisconsin
13. Duke
14. Mayo Clinic
15. Vanderbilt
16. UT Southwestern
17. Beaumont
18. U of Florida
19. U of Maryland
20. U of Washington
21. UCLA
22. Ohio State
23. U of Colorado
24. U of Pitt
25. U of Iowa
26. Columbia
27. UNC
28. UCSD
29. U of Alabama
30. U of Utah
30. Cornell
32. Emory
33. Henry Ford
33. Thomas Jefferson
 
I hope you didn't pick your institution based on the above ranking system because to be honest it's kind of stupid.

Really, Lazers? ...So lame.
We all know that for any one person to rank all programs is silly, but it's better than nothing, it's fun, and it makes for some fun debate. People read through the thread to get a general idea of where programs rank, so there is really no need to criticize any one post.
 
I hope you didn't pick your institution based on the above ranking system because to be honest it's kind of stupid.

Wow, sorry if I didn't include your program on my list!

For those applying: that list is totally my opinion only and just for fun. There are programs that I'm sure deserve to be ranked higher or lower, or included or not included. For example, places like Cleveland Clinic, Moffitt, MCW, UVA, etc. probably deserve to be somewhere on the list, but to be honest I just didn't interview there and didn't have friends interview there with strong opinions, so I have no clue as to placement in my totally arbitrary list. Hopefully more people can contribute and applicants can see others' opinions. I know the "Top 10" stays basically the same and that there will be controversy in the 10-30 range, but I think these mid-upper-tier rankings are informative/fun for med students.
 
Thanks! Keep them coming!

Any thoughts to where NYU, Mt. Sinai, Fox Chase would factor into the above list?
 
Well, since you can't quantify it, maybe some people think it is, especially people who prefer the East Coast. It's subjective... not any stupider than anything else discussed.
 
True
 
Like who in their right mind would think Columbia (which is one of the worst programs to train at in the country)

May or may not be true, but many people have commented on how much that program has changed/is changing since K.S. Clifford Chao came on board. That's one reason why these rankings really have to be taken with a huge grain of salt.

Similarly, Beaumont was a great program but unfortunately had a massive exodus of faculty to the private world. I won't claim to know the impact on that program, but I imagine it wasn't subtle.

That's why I think "tiers" make sense. There are probably 10-20 top programs which are research powerhouses and then some upper-middle and middle-tier places that are fantastic places to train from a clinical standpoint. The stuff at the bottom IMO should be programs that are on probation or that have issues meeting case requirements/diversity of cases.
 
Also, I think it helps to add a bit of commentary with rankings of programs. It helps the reader appreciate the perspective of the poster.

I do concur with what RaDoNc2014 said in regard to not being able to include all programs. For instance, I know a few programs that are "good" purely by reputation - I do not know faculty, current/former residents, or med students who rotated there. This may lead to inadvertent omissions in rankings.
 
Actually I think this sort of random ranking is quite harmful for medstudents, where someone randomly assigns a rank to programs. This gives them the false impression that one program is better than another. Like who in their right mind would think Columbia (which is one of the worst programs to train at in the country) is better than UCSD?

I've heard that MD Anderson also scores quite poorly in the "is in midtown" category, important to those that have either married well or enjoy living in Manhattan on 60k/year.
 
Like who in their right mind would think Columbia (which is one of the worst programs to train at in the country) is better than UCSD?

I think you could make an argument for it. UCSD is a brand new program with no current graduates. I think it would generate remarkably little buzz if it wasn't for the location. I'm not saying it's a bad program, but I think it's laughable when a program that new is listed as a top program.
 
but I think it's laughable when a program that new is listed as a top program.

Perhaps, but when UTSW and Moffitt opened, I don't think too many people were laughing ;) If a department is solid from a faculty, technology and volume standpoint (and has NCI designation on top of that), it's not unrealistic to think it will be a "top" program rather quickly.
 
Do I dare add my rankings to the hallowed list? Apparently yes, but please be gentle if you don't agree. This is my list of what I think the "top" programs are, from the perspective of someone interested in academics. I didn't rank past the top 10 or so since there are just too many programs that I'm not familiar with outside of these. I'm basing this on personal rotation and interview experience, discussions with department chairs, and recent job placement.
Rankings are organized in general tiers; within each tier, programs are listed alphabetically.

The Big 3:
- Harvard: unmatched diversity in clinical settings, limitless research opportunities, and the biggest name in academia.
- MDACC: nice people, friendly program, great clinical research with good basic options as well.
- MSKCC: considered "overrated" by some, but still one of the top three programs in my book. Huge clinical volume and increased emphasis on basic research with Dr. Powell. Very strong brachytherapy training.


The Next Best (A)
- Michigan: fantastic academic placement, insane physics and bio, and clinical leaders in many disease sites.
- Penn: getting a lot of hype on the trail, mostly deserved I think. Lots of recent growth and good research. Big emphasis on protons. Supposedly correcting previous deficiency in brachytherapy training.
- UCSF: big names and (disputed) king of the west coast.
- Wash U: good record of academic job placement, lots of research options with very strong Holman support, and good tech.


The Next Best (B)
- Chicago: great history, great basic science, and just a little bit scary.
- Duke: one of the "old boys," still highly regarded by the academic community. Possibly correcting deficiency in brachytherapy training.
- Hopkins: is the hype from the trail warranted? Maybe, but it's definitely a strong program heading in the right direction. Strong support for basic science research, although Holman is not supported. Charismatic chair.
- Stanford: nice location (although crazy expensive without the housing stipend offered by some other Stanford residency programs), big name, great options for research. Concerns in quality of life and education holding it back from its true potential.
- Wisconsin: a fan favorite among applicants. Happy residents, strong physics.
- Yale: another fan favorite, with good training in a nice environment and Yale's research powerhouse at your disposal.

The Honor Roll: I'm not as familiar with these programs, but I think the next best, in no order, would be: Emory, Vanderbilt, UCSD, Moffitt, Mayo.
 
Do I dare add my rankings to the hallowed list? Apparently yes, but please be gentle if you don't agree. This is my list of what I think the "top" programs are, from the perspective of someone interested in academics. I didn't rank past the top 10 or so since there are just too many programs that I'm not familiar with outside of these. I'm basing this on personal rotation and interview experience, discussions with department chairs, and recent job placement. Programs are organized in "mini-tiers" and order within these is arbitrary.

The big three:
-Harvard
-MDACC
-MSKCC

The next best (A):
-Wash U
-Penn
-Michigan
-Hopkins
-UCSF

The next best (B)
-Duke
-Yale
-Stanford
-Chicago

I would remove Hopkins from the A list and Duke from the B list.
 