Alternative to USNews Rankings for Med Schools


Narmerguy

Full Member
Moderator Emeritus
15+ Year Member
Joined
Jul 14, 2007
Messages
6,874
Reaction score
1,357
http://journals.lww.com/academicmed...op_Research_Medical_School__A_Call.98873.aspx

This is a very interesting approach: basically, the authors measured things like publications, grants, clinical trials, and awards to gauge the research output of graduates from all medical schools going back to the 1950s. They then normalized the data and charted which schools do better on average, and also show some trend data. They directly pitch this as an alternative to USNews, and perhaps one that has more relevance to students interested in a research career. There are interesting disparities (UCSF is ranked 4th by USNews but 17th with this tool). HMS is #1 pretty much forever.
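(For the curious, the arithmetic could look something like the toy sketch below: count outcome events per graduate, normalize each metric across schools, and average into a composite. Every school, count, metric, and the equal weighting here is invented for illustration; the paper's actual methodology differs in its details.)

```python
# Toy per-graduate composite research score. All data and weights are
# invented; this is NOT the paper's actual methodology.
schools = {
    # school: (graduates, publications, grants, clinical trials)
    "School A": (8000, 120000, 4000, 900),
    "School B": (5000, 60000, 1500, 400),
}

# Normalize each metric to a per-graduate rate
rates = {
    s: [pubs / grads, grants / grads, trials / grads]
    for s, (grads, pubs, grants, trials) in schools.items()
}

# z-score each metric across schools, then average into a composite
n = len(rates)
means = [sum(r[i] for r in rates.values()) / n for i in range(3)]
stds = [
    (sum((r[i] - means[i]) ** 2 for r in rates.values()) / n) ** 0.5
    for i in range(3)
]
composite = {
    s: sum((r[i] - means[i]) / stds[i] for i in range(3)) / 3
    for s, r in rates.items()
}
for school, score in sorted(composite.items(), key=lambda kv: -kv[1]):
    print(f"{school}: {score:+.2f}")
```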

Haven't had a chance to think too much on this yet but found it interesting and worth sharing.

This is a very interesting approach: basically, the authors measured things like publications, grants, clinical trials, and awards to gauge the research output of graduates from all medical schools going back to the 1950s. They then normalized the data and charted which schools do better on average, and also show some trend data.
Fascinating. One of their premises is that focusing on the qualities of incoming students, as USN&WR does (GPA, MCAT, acceptance rate), ignores the "value added" by a medical school, which is what should guide a prospective student's decision. As in, rather than looking at what students do before medical school, we should be looking at what they do afterward.

They admit, however, that they aren't necessarily measuring value added, because they don't control for the "quality" of students matriculating at an institution in any way. If USN&WR research rankings were a proxy for this, however, you can see which schools seem to have graduates who are more accomplished than one might expect.

As for UCSF, they write:
The University of California, San Francisco, School of Medicine (UCSF) was ranked fourth by USN&WR, in part because the faculty, not the graduates, excelled in securing NIH grants.
One of the authors of the study is a UCSF faculty member. Edit: A fellow, actually.
 
Could there be a better way to measure the (perceived) quality of medical education with respect to non-academically minded students? I wouldn't be surprised if 90% of Stanford's class ended up in academia and was very productive, but what metric do we have for physicians who don't end up in academics? Is there a feedback system for residency PDs to rate the interns coming out of program X or Y? Is the measurement of non-academics even important? That is, do we have any reason to believe there is a serious enough deficit across programs for an analysis of school quality to be warranted?
 
Did they do this as pubs/grants per grad or such? Some schools have way more students per class.
 
I don't think there will ever be an ideal methodology to rank medical schools. There are too many variables to measure, and a lot of them are not pertinent to medical education. If anything, future studies (regarding research) will continue to scramble the US News top 30 schools.

I realize this discussion is focused on academia, but I found this article about the social mission of medical education in case anyone is interested.
http://www.medicaleducationfutures.org/sites/default/files/article-internal/804.full_.pdf

This link might be better because it provides comments (at least in the desktop version). Perhaps they are also somewhere in the mobile format. (Can't look; I have to get back to work :depressed:)
http://annals.org/article.aspx?articleid=745836
 
Going to a medical school where many graduates succeeded doesn't guarantee your success or imply that the school itself was responsible for those successes. Medical school is geared toward producing clinicians rather than prolific researchers. And from my experience, it seems that a lot of what you get out of medical school depends on individual effort and luck in getting good preceptors who are invested in student education. Excellent schools have more excellent students.
 
Going to a medical school where many graduates succeeded doesn't guarantee your success or imply that the school itself was responsible for those successes... And from my experience, it seems that a lot of what you get out of medical school depends on individual effort and luck in getting good preceptors who are invested in student education.
These rankings are intended to be useful for applicants. We all expect to do our best, put in the work, and hope for good luck. But some parts of medical education are about more than hard work and luck. It would be helpful to be able to measure these factors, as the authors of the study are attempting to do.
 
Measuring research productivity gives little to no indication of the quality of the school or the education. Success in medical school is about ability, hard work, and connections, with a little bit of luck sprinkled in, and it is measured by scores, grades, and letters of recommendation. Research is still optional unless you are going into a competitive field or area, at which point it is still a secondary goal at best.
 
Measuring research productivity gives little to no indication of the quality of the school or the education. Success in medical school is about ability, hard work, and connections, with a little bit of luck sprinkled in, and it is measured by scores, grades, and letters of recommendation. Research is still optional unless you are going into a competitive field or area, at which point it is still a secondary goal at best.
You seem to be measuring success in terms of what constitutes a strong residency application. The authors of the study are measuring research output not by students, but by graduates who have completed residency.

Aggregated over many graduates across more than 50 years, I think this measures something other than hard work or luck. Whatever exactly that is might not be relevant to every applicant, yet it seems worth investigating.
 
Measuring research productivity gives little to no indication of the quality of the school or the education. Success in medical school is about ability, hard work, and connections, with a little bit of luck sprinkled in, and it is measured by scores, grades, and letters of recommendation. Research is still optional unless you are going into a competitive field or area, at which point it is still a secondary goal at best.

Oh come on. You're arguing that this kind of ranking has no utility for students with research interests?

What about networking after med school, when your former classmates are productive researchers?

Your end game is too short. Think longer term. These things play out.
 
I like this. There are clearly plenty of possible confounds, which the authors acknowledge (residency training, selection of capable students vs. capable med schools), but it's nice to see an alternative to USNews.

UCSF, as mentioned, falls significantly because their faculty are great at securing grants, but their graduates not so much. I wonder why that is. I bet the University of WA is not in their top 25 for the same reason; tons of research happens at UW, hence the USNews #10 research position, but they send a lot of students into primary care who do less research themselves.

But can we totally discount faculty funding? I mean, on paper it sure sounds good to go to a med school with well-funded faculty, but does that really even matter? To what extent does the funding at UCSF "trickle down" to student education/opportunities/pubs?
 
There isn't a supplement online that shows the full data/list, is there?

Where's U Pittsburgh? They're all about touting themselves as Research Central; maybe that's a more recent reinvention, whereas this analysis goes too far back in time?
 
I'm definitely noticing that all of the USNWR top 25 schools that have been pushed out are public schools that happen to excel in both research and primary care (no UCLA, UNC, Pitt, UCSD, UWash), replaced by Dartmouth, Brown, Rochester, Boston, and Einstein.

It probably isn't a coincidence that the USNWR schools ranked 26-29 are all public schools as well (UVA, UT-Southwestern, Wisconsin, Oregon, Iowa).

And guess who is in the early 30s? Boston, Einstein, Rochester, Dartmouth, and Brown.
 
I like this. There are clearly plenty of possible confounds, which the authors acknowledge (residency training, selection of capable students vs. capable med schools), but it's nice to see an alternative to USNews.

UCSF, as mentioned, falls significantly because their faculty are great at securing grants, but their graduates not so much. I wonder why that is. I bet the University of WA is not in their top 25 for the same reason; tons of research happens at UW, hence the USNews #10 research position, but they send a lot of students into primary care who do less research themselves.

But can we totally discount faculty funding? I mean, on paper it sure sounds good to go to a med school with well-funded faculty, but does that really even matter? To what extent does the funding at UCSF "trickle down" to student education/opportunities/pubs?
Some of this might be related to the career preferences of graduates. Also, as you mentioned in your newer post, this study goes back decades.
 
Fascinating. One of their premises is that focusing on the qualities of incoming students, as USN&WR does (GPA, MCAT, acceptance rate), ignores the "value added" by a medical school, which is what should guide a prospective student's decision. As in, rather than looking at what students do before medical school, we should be looking at what they do afterward.

They admit, however, that they aren't necessarily measuring value added, because they don't control for the "quality" of students matriculating at an institution in any way. If USN&WR research rankings were a proxy for this, however, you can see which schools seem to have graduates who are more accomplished than one might expect.

As for UCSF, they write:
The University of California, San Francisco, School of Medicine (UCSF) was ranked fourth by USN&WR, in part because the faculty, not the graduates, excelled in securing NIH grants.
One of the authors of the study is a UCSF faculty member.

I agree; I don't think this measures value added any better than USN&WR. I think it is basically a truer "research" ranking of medical schools, which USN&WR attempts to get at, but without any real attempt to tie research output to students rather than to professors. A follow-up study that should be done is to see how schools perform for students with certain "preclinical" attributes (MCAT, GPA, previous research output, etc.). So, to ask the question: for students who enter medical school with an MCAT between 33 and 37 and a GPA between 3.7 and 3.9, and with no prior research publications, how does their research output change with admission to various medical schools?
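(In code, that stratified question might look like this toy pandas sketch; every school, column name, and number below is invented for illustration.)

```python
# Hypothetical sketch of the proposed follow-up: within one
# "preclinical" stratum, compare downstream research output by school.
import pandas as pd

grads = pd.DataFrame({
    "school":      ["A", "A", "B", "B", "B"],
    "mcat":        [34, 36, 33, 35, 37],
    "gpa":         [3.80, 3.75, 3.90, 3.72, 3.85],
    "prior_pubs":  [0, 0, 0, 0, 1],
    "career_pubs": [12, 4, 7, 2, 30],  # output after graduation
})

# The stratum from the question: MCAT 33-37, GPA 3.7-3.9, no prior pubs
stratum = grads[
    grads["mcat"].between(33, 37)
    & grads["gpa"].between(3.7, 3.9)
    & (grads["prior_pubs"] == 0)
]
print(stratum.groupby("school")["career_pubs"].mean())
```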
 
Some of this might be related to the career preferences of graduates. Also, as you mentioned in your newer post, this study goes back decades.
And I don't think the interests of the graduates necessarily reflect the capability of the schools. It's possible that, because of the lower in-state tuition offered at a lot of these public schools, graduates are more likely to pursue lower-paying specialties that are less research-focused. Just because more Boston University alums conduct research than UWash alums doesn't mean that Boston University provides better research opportunities for its students than those provided by UWash.

Edited for grammar.
 
I'm definitely noticing that all of the USNWR top 25 schools that have been pushed out are public schools that happen to excel in both research and primary care (no UCLA, UNC, Pitt, UCSD, UWash), replaced by Dartmouth, Brown, Rochester, Boston, and Einstein.
Wow, good catch. In fact, the top public school on the list is the same as the top public school on USN&WR's research rankings list... none other than UCSF. The only other public schools in the top 25 are Michigan at 21st and UVA at 24th.

I do not see any mention of this striking trend in the paper. Why not?
 
Wow, good catch. In fact, the top public school on the list is the same as the top public school on USN&WR's research rankings list... none other than UCSF. The only other public schools in the top 25 are Michigan at 21st and UVA at 24th.

I do not see any mention of this striking trend in the paper. Why not?
Well, this is based on the most recent USNWR rankings, so the rankings will have shifted quite a bit over the years.

As a slight correction, Emory was also pushed out of the top 25, but I think the pattern I pointed out is valid nevertheless.
 
And I don't think the interests of the graduates necessarily reflect the capability of the schools. It's possible that, because of the lower in-state tuition offered at a lot of these public schools, graduates are more likely to pursue lower-paying specialties that are less research-focused. Just because more Boston University alums conduct research than UWash alums doesn't mean that Boston University provides better research opportunities for its students than those provided by UWash.

Edited for grammar.

Perhaps, but given the time span over which the data was pulled, I imagine this is relatively unimportant in the big scheme of things. Remember, it's really within the last 20 years that the cost of education exploded. In addition, the huge discrepancies in reimbursement are also a relatively recent phenomenon. Again, there was a time in the not-too-distant past when your run-of-the-mill generalist or family physician could make a very, very decent living.

In terms of educational opportunities, I'm not really sure what you're getting at. Almost every school will offer some kind of research opportunities. However, the quality of those opportunities can vary substantially. It doesn't seem like a large leap to go down this thought process: inherent interest in research -> choose a school with high research output and/or superior research opportunities -> get good research training -> continue doing research after graduation and over the course of your career.

There are myriad individual exceptions in any kind of examination like this. I don't think anyone would imply that, to use your example, UW doesn't have research opportunities available to its students. That doesn't mean that the trends are invalid, and it doesn't counter the possible implication that a school like BU may have more substantial research opportunities available to be a part of.
 
Wow, good catch. In fact, the top public school on the list is the same as the top public school on USN&WR's research rankings list... none other than UCSF. The only other public schools in the top 25 are Michigan at 21st and UVA at 24th.

I do not see any mention of this striking trend in the paper. Why not?

Don't want to rock the boat too much, I imagine.
 
Perhaps, but given the time span over which the data was pulled, I imagine this is relatively unimportant in the big scheme of things. Remember, it's really within the last 20 years that the cost of education exploded. In addition, the huge discrepancies in reimbursement are also a relatively recent phenomenon. Again, there was a time in the not-too-distant past when your run-of-the-mill generalist or family physician could make a very, very decent living.

In terms of educational opportunities, I'm not really sure what you're getting at. Almost every school will offer some kind of research opportunities. However, the quality of those opportunities can vary substantially. It doesn't seem like a large leap to go down this thought process: inherent interest in research -> choose a school with high research output and/or superior research opportunities -> get good research training -> continue doing research after graduation and over the course of your career.

There are myriad individual exceptions in any kind of examination like this. I don't think anyone would imply that, to use your example, UW doesn't have research opportunities available to its students. That doesn't mean that the trends are invalid, and it doesn't counter the possible implication that a school like BU may have more substantial research opportunities available to be a part of.
I guess what I was saying was that a lot of these public schools that are strong in both research and primary care are more likely to get a mixture of students with great variance in their interest in research. It's impossible for me to quantify the opportunities available at a school, but I would imagine they're proportional to the research money acquired by that school. Just because certain students don't utilize the available funding/resources doesn't mean that it's not there.
 
I guess what I was saying was that a lot of these public schools that are strong in both research and primary care are more likely to get a mixture of students with great variance in their interest in research. It's impossible for me to quantify the opportunities available at a school, but I would imagine they're proportional to the research money acquired by that school. Just because certain students don't utilize the available funding/resources doesn't mean that it's not there.

Ah, I see your point. With that I would agree, and there very likely is some degree of selection bias in terms of the students who want to attend a certain institution. This could theoretically be quantified to some degree, but I doubt many people have the interest or desire to put in the necessary work to figure out just how much that plays a role in this whole shebang.
 
Ah, I see your point. With that I would agree, and there very likely is some degree of selection bias in terms of the students who want to attend a certain institution. This could theoretically be quantified to some degree, but I doubt many people have the interest or desire to put in the necessary work to figure out just how much that plays a role in this whole shebang.

I'm not sure what kind of quantification you're thinking of. Could you elaborate?
 
I'm not sure what kind of quantification you're thinking of. Could you elaborate?

You could survey M1s to ask about their interests at matriculation to get a sense of whether people have research, clinical, academic, community, etc. interests. Obviously these things could change over the course of their training, but it might provide some insight into people's interests irrespective of the institution's "strengths." You could also survey M4s and ask similar things, though I would think surveying at matriculation would provide more insight into how the institution might shape someone's initial interests into what they actually end up being "in reality."

Since the report in the OP seems to place a premium on students who ultimately do some form of research in their regular career, figuring out what proportion of students are actually interested in research, and how many of those people actually end up doing research, would be a more useful metric. This might be a more valuable metric than "how many people in the class end up pursuing research," since it accounts for the people who have no interest in research to begin with, as @aprimenumber brings up.

I'm sure there's a more elegant way to explore things like this, but that's all the brainpower I've got to contribute to this theoretical investigation.
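(For what it's worth, a minimal version of that "interest conversion" metric could be computed like the sketch below; all the survey data is invented.)

```python
# Of the matriculants who professed interest in research, what share
# actually ends up doing research? Schools and responses are made up.
from collections import defaultdict

surveys = [  # (school, interested at matriculation, does research later)
    ("A", True, True), ("A", True, False), ("A", False, False),
    ("B", True, True), ("B", True, True), ("B", False, True),
]

counts = defaultdict(lambda: [0, 0])  # school -> [interested, converted]
for school, interested, does_research in surveys:
    if interested:
        counts[school][0] += 1
        counts[school][1] += does_research

for school, (n_interested, n_converted) in sorted(counts.items()):
    print(f"School {school}: {n_converted}/{n_interested} interested "
          f"matriculants went on to do research")
```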
 
I thought it was interesting that Harvard and JHU are basically in their own league when it comes to the composite ranking. And the list is very top-heavy, with the difference between #1 and #5 being larger than the difference between #5 and #24.
 
You could survey M1s to ask about their interests at matriculation to get a sense of whether people have research, clinical, academic, community, etc. interests. Obviously these things could change over the course of their training, but it might provide some insight into people's interests irrespective of the institution's "strengths." You could also survey M4s and ask similar things, though I would think surveying at matriculation would provide more insight into how the institution might shape someone's initial interests into what they actually end up being "in reality."

Since the report in the OP seems to place a premium on students who ultimately do some form of research in their regular career, figuring out what proportion of students are actually interested in research, and how many of those people actually end up doing research, would be a more useful metric. This might be a more valuable metric than "how many people in the class end up pursuing research," since it accounts for the people who have no interest in research to begin with, as @aprimenumber brings up.

I'm sure there's a more elegant way to explore things like this, but that's all the brainpower I've got to contribute to this theoretical investigation.

Is there a significant number of premeds/entering MS1s who would claim to have zero interest in research?
 
I'd imagine there is a greater proportion at primary-care-focused schools.

And that level of interest is likely to be different even amongst students professing interest in research. How does having a previous graduate degree or being MSTP factor into this?
 
And that level of interest is likely to be different even amongst students professing interest in research. How does having a previous graduate degree or being MSTP factor into this?
The survey wouldn't have to be binary. It could be a 1-to-10 scale or something.

The additional factors (i.e., a graduate degree, MSTP) wouldn't need to be external factors to be considered. They can be shown through the responses returned. An MD/PhD candidate will likely mark down a 10 for their interest, and even a non-MSTP, non-dual-degree applicant without a previous degree can mark down a 10 if they are still absolutely motivated to perform research.
 
The survey wouldn't have to be binary. It could be a 1-to-10 scale or something.

The additional factors (i.e., a graduate degree, MSTP) wouldn't need to be external factors to be considered. They can be shown through the responses returned. An MD/PhD candidate will likely mark down a 10 for their interest, and even a non-MSTP, non-dual-degree applicant without a previous degree can mark down a 10 if they are still absolutely motivated to perform research.
I actually mean the degree as a factor external to professed interest in research. Namely, does one's previous exposure to graduate training or attainment of a graduate degree actually change one's ability to achieve research career success, given a particular level of professed interest in research? To illustrate: does an MD/PhD with a research interest of 5 achieve more than an MD-only graduate with a research interest of 5? If yes, one should also consider the proportion of a school's students who possess dual degrees.
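(A hypothetical sketch of that comparison, assuming you had cohort data with professed interest, degree status, and career output; the numbers here are invented.)

```python
# At the same professed interest level, do dual-degree graduates
# out-produce MD-only graduates? Toy data, not a real analysis.
import pandas as pd

df = pd.DataFrame({
    "interest":    [5, 5, 5, 5, 9, 9],      # professed interest, 1-10
    "md_phd":      [True, False, True, False, True, False],
    "career_pubs": [18, 6, 22, 9, 40, 25],  # publications after residency
})

# Mean research output by (interest level, dual-degree status)
print(df.groupby(["interest", "md_phd"])["career_pubs"].mean())
```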
 
I'm definitely noticing that all of the USNWR top 25 schools that have been pushed out are public schools that happen to excel in both research and primary care (no UCLA, UNC, Pitt, UCSD, UWash), replaced by Dartmouth, Brown, Rochester, Boston, and Einstein.

It probably isn't a coincidence that the USNWR schools ranked 26-29 are all public schools as well (UVA, UT-Southwestern, Wisconsin, Oregon, Iowa).

And guess who is in the early 30s? Boston, Einstein, Rochester, Dartmouth, and Brown.

Exactly.

There's no way schools like Einstein and Rochester have more/better research opportunities than UCSF and Michigan. Not to mention UCLA and Pitt.

Public research schools usually have a mission to produce primary care physicians for their state in addition to physician-researchers. Because of this, they tend to recruit more students who aren't interested in research compared to T50 private schools. In addition, they usually have cheaper tuition, and their students are less pressured to enter more competitive, higher-paying specialties. It only makes sense, then, that their alumni produce less research down the line. However, this isn't really relevant to the research opportunities available to current students.

IMO US News would be better for ranking schools according to research opportunities.
 
And that level of interest is likely to be different even amongst students professing interest in research. How does having a previous graduate degree or being MSTP factor into this?

Can this new alternative model just be a measure of which schools are more focused on recruiting students with more professed interest in research? Harvard and Hopkins might be making sure that anyone who walks through their door has at least a 6 (on our imaginary scale), while other schools are simply enrolling a more diverse student group. Does this new model really say anything about the value added to an incoming student, or about the opportunities available to them?
 
Actual research opportunities would be a more useful metric. For example, Cornell's neighboring institutions (HSS, Rockefeller, and MSK) enhance the research options for its students. However, this isn't factored into the US News rankings, which makes Cornell underrated, IMO.

In addition, I'm guessing that UT Houston's location (i.e. the Texas Medical Center) provides significant advantages to students who are interested in research.
 
Most rankings have little utility for students. They don't tell you much besides a number, which gives you a general idea of how other people perceive the school. What do you think medical school will teach you that will help you become an amazing researcher? Schools with more money will have more labs to choose from, but most medical students are in the lab for, what, 2 months? Many students go to other institutions to do work; MSK has people coming there from all over the country. Clinical research can be done anywhere.

Things that actually matter for medical school:
• Minimal classroom time.
• No mandatory obligations besides anatomy lab.
• Sufficient time for Step 1.
• Faculty who address student concerns.
• Access to upperclassmen and physicians to guide your specialty choice.
• Mentors who have the time and ability to guide students, especially retired physicians.
• Useful feedback from mentors.
• The ability to do more than sit around on rounds writing useless notes, e.g., doing procedures and putting in orders.
• Elective time in third year to explore options.
• A strong department with nationally known faculty in the specialty of your interest.
It's difficult to rank all of that.
 
My school did better in this ranking, so it must be more accurate.
 
My school did better in this ranking, so it must be more accurate.

You joke, but I imagine the weights assigned to the different parameters in rankings like USNews are partly intended to line up with "expectations." If their weights for, say, the MCAT caused too many unusual schools to rise or fall, they would probably "de-emphasize" that parameter. That's the trouble with rankings that use multiple categories: who gets to decide what weights get assigned to what? Obviously, everyone prefers a ranking that gives the most weight to the things they are best at and the least weight to the things they are poorest at.
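(To make the weighting problem concrete, here's a two-school, two-metric toy example with made-up scores: flip the weights and the "ranking" flips with them.)

```python
# The same raw scores produce different orderings depending on the
# (arbitrary) weights chosen. Scores and weights are invented.
scores = {  # school: (research score, MCAT score), each scaled 0-1
    "School X": (0.9, 0.6),
    "School Y": (0.6, 0.9),
}

for w_research, w_mcat in [(0.7, 0.3), (0.3, 0.7)]:
    ranked = sorted(
        scores,
        key=lambda s: -(w_research * scores[s][0] + w_mcat * scores[s][1]),
    )
    print(f"weights {w_research}/{w_mcat}: {ranked}")
```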
 
You joke, but I imagine the weights assigned to the different parameters in rankings like USNews are partly intended to line up with "expectations."
Yep, it's "convenient" to maintain a formula that will keep Harvard at the top spot for eons...
 
Is there a significant number of premeds/entering MS1s who would claim to have zero interest in research?
Me. I have no interest in doing research as a main component of my career. I will only do it to be competitive for competitive residencies. Pretty sure I'm not the only one, either.
 
Fascinating. Thank you for this. I guess this shows that the US News rankings can be pretty meaningless....
 
Fascinating. Thank you for this. I guess this shows that the US News rankings can be pretty meaningless....
Not saying that the US News rankings are meaningless, but after examining the rationale behind this alternate ranking system and the pattern of results that emerges (namely, the inherent bias against public schools), I still think US News' criteria for ranking schools based on research capabilities and output are better.
 
Not saying that the US News rankings are meaningless, but after examining the rationale behind this alternate ranking system and the pattern of results that emerges (namely, the inherent bias against public schools), I still think US News' criteria for ranking schools based on research capabilities and output are better.
Well, to me, this alternate ranking system shows that schools in that 10-35 USNews range just might be more interchangeable than we think when it comes to research opportunities for students. This alternate ranking method seemed reasonable and completely shook up schools 10-35. It makes you question the strict adherence to USNews, where people will always regard a school ranked 15th as better than a school ranked 25th. In actuality, if you change the ranking criteria a bit, the schools may flip places. And the criteria that USNews uses are largely arbitrary.
 
This has already been posted, and it's really not too different from US News. Almost all privates move up, and there is some shuffling among the top 20. UCSF also goes way down.
 
This has already been posted, and it's really not too different from US News. Almost all privates move up, and there is some shuffling among the top 20. UCSF also goes way down.
Thanks for posting the other thread. I missed it!

While you're right that it's mostly the same as USNews, at least it's actually based on something meaningful and transparent. As long as pre-meds look at rankings (which we always will), isn't it about time we stop respecting USNews' for-profit garbage and focus instead on rankings like this one?
 
Thanks for posting the other thread. I missed it!

While you're right that it's mostly the same as USNews, at least it's actually based on something meaningful and transparent. As long as pre-meds look at rankings (which we always will), isn't it about time we stop respecting USNews' for-profit garbage and focus instead on rankings like this one?

Do you care about research? If not, this ranking means nothing for you. If you do, US News is still pretty good.
 
I'm definitely noticing that all of the USNWR top 25 schools that have been pushed out are public schools that happen to excel in both research and primary care (no UCLA, UNC, Pitt, UCSD, UWash), replaced by Dartmouth, Brown, Rochester, Boston, and Einstein.

It probably isn't a coincidence that the USNWR schools ranked 26-29 are all public schools as well (UVA, UT-Southwestern, Wisconsin, Oregon, Iowa).

And guess who is in the early 30s? Boston, Einstein, Rochester, Dartmouth, and Brown.
Contrary to popular belief, Pitt med is actually a private institution.
 
Rankings don't mean a thing. Reputation is what matters.
 