School Average Step 1 vs USNWR Rank

liquidcrawler
Data: USNWR Research rankings (2018); USNWR-reported Step 1 averages (2016)

Hi guys, there is an ongoing Reddit thread with a list of schools' Step 1 averages. Shout-out to @Serine_Minor for posting each school's average from the USNWR data. I took the liberty of organizing the data a bit and thought I'd share it with the larger pre-med community. While USNWR rank isn't a definitive metric for how good a school is or how happy you'll be there, it may be a rough indicator of the type of students who attend those institutions. When you match Step 1 average against USNWR rank, there are some interesting outliers in the data. Of course, I'd imagine the standard deviations are pretty high for these averages, so again we can probably only use this as a rough metric.

Comment: blue = higher score/rank, yellow = lower score/rank

https://i.imgur.com/WjxjMvR.png

 
Some interesting observations right off the bat are how high the Cincinnati and Mizzou Step 1 scores are compared to similarly ranked peers. Another, perhaps less surprising, observation is how low the UNC and UW Step averages are compared to similarly ranked peers, perhaps because they function as state schools whose mission is to train physicians willing to practice and perform primary care in state.
 
Some interesting observations right off the bat are how high the Cincinnati and Mizzou Step 1 scores are compared to similarly ranked peers. Another, perhaps less surprising, observation is how low the UNC and UW Step averages are compared to similarly ranked peers, perhaps because they function as state schools whose mission is to train physicians willing to practice and perform primary care in state.

Try average Step 1 vs. mean MCAT.
 
...why in the world would Hofstra be ranked 71st with a 236 average score?

Median MCAT at Hofstra is 516, for what it's worth, which might explain things. It's also a fairly new medical school (opened in 2008), which may have contributed somewhat to its rank on US News.
 
Why even bring USNWR ranking into this? The only useful thing here is the average Step 1 scores.
 
Why even bring USNWR ranking into this? The only useful thing here is the average Step 1 scores.

Because USNWR rank is generally used as a rough indicator of school caliber and Step 1 average is generally used as a rough indicator of student caliber, I wanted to see how well they correlate. I think, for the most part, this shows they do, and it highlights some interesting exceptions.
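If anyone wants to put a number on how well they line up, a rank correlation is the natural choice, since USNWR position is ordinal. A minimal sketch below; the school names and values are placeholders, not the figures from the chart:

```python
# Hypothetical sketch: rank correlation between USNWR research rank and a
# school's reported Step 1 average. All names and numbers are placeholders.
from scipy.stats import spearmanr

schools = {
    # school: (USNWR research rank, reported Step 1 average)
    "School A": (1, 245),
    "School B": (12, 243),
    "School C": (35, 238),
    "School D": (60, 233),
    "School E": (71, 236),
}

ranks = [rank for rank, _ in schools.values()]
steps = [avg for _, avg in schools.values()]

# Expect a negative rho: a lower rank number (better-ranked school) tends to
# go with a higher Step 1 average.
rho, p = spearmanr(ranks, steps)
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
```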
 
It's all cyclic:

School has top ranking -> school gets top students -> school has higher Step 1 scores -> school maintains top ranking

The MCAT is such a huge confounding variable that correlating USNWR rankings with USMLE Step 1 scores is absolutely pointless.
 
If each resident forum could produce a crude list of the top X places for their particular discipline (defined as the most competitive programs), you could then use the match lists from each school (if they are published) to see what percentage of each graduating class, on average, matches at a top-X place in its discipline. Alongside median MCAT, Step 1, and USNWR ranking, you could develop a more complete picture of how schools stack up.
 
Try average Step 1 vs. mean MCAT.

Median is probably better than mean, but haven't many people already tried this? IIRC Bio correlates the best, at around 0.6, and every other section is only weakly correlated with Step 1.

In any case it would be trivial but tedious to do what you propose since all of the Step 1 scores are here in one place.

One question to raise: where do these Step 1 scores come from, and do we have a reason to completely trust their source?

IIRC there was a line graph doing something similar not too long ago. @efle might remember
 
...why in the world would Hofstra be ranked 71st with a 236 average score?



Hofstra is a relatively new school, and they have recently graduated their first class. But if they keep this up, their rank will rise quickly.
 
I think if you adjusted for matriculating class MCAT you would see some of the outliers explained away.
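One simple way to do that adjustment: regress each school's Step 1 average on its median MCAT and look at the residuals, so a school like Missouri shows up as a large positive residual. A rough sketch with invented numbers (not the real data):

```python
# Hypothetical sketch: fit Step 1 average ~ median MCAT across schools and
# flag schools sitting well above the fitted line. Numbers are invented.
import numpy as np

mcat = np.array([506, 509, 512, 515, 518], dtype=float)   # median MCAT
step1 = np.array([224, 228, 240, 234, 238], dtype=float)  # reported Step 1 avg

# Ordinary least squares: step1 ~ slope * mcat + intercept
slope, intercept = np.polyfit(mcat, step1, deg=1)
residuals = step1 - (slope * mcat + intercept)

for m, s, r in zip(mcat, step1, residuals):
    note = "  <-- outperforming what its MCAT predicts" if r > 5 else ""
    print(f"MCAT {m:.0f}: Step 1 {s:.0f}, residual {r:+.1f}{note}")
```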
 
If each resident forum could produce a crude list of the top X places for their particular discipline (defined as the most competitive programs), you could then use the match lists from each school (if they are published) to see what percentage of each graduating class, on average, matches at a top-X place in its discipline. Alongside median MCAT, Step 1, and USNWR ranking, you could develop a more complete picture of how schools stack up.
Isn't that what Doximity does?
 
Median is probably better than mean, but haven't many people already tried this? IIRC Bio correlates the best, at around 0.6, and every other section is only weakly correlated with Step 1.

I'm curious to see how well the current data set performed relative to other attempts.

Lucca said:
In any case it would be trivial but tedious to do what you propose

Well, it is research.
 
Because USNWR rank is generally used as a rough indicator of school caliber and Step 1 average is generally used as a rough indicator of student caliber, I wanted to see how well they correlate. I think, for the most part, this shows they do, and it highlights some interesting exceptions.
The only people who believe this are pre-meds and medical school deans. The incest by PDs at the uber-programs for Top Schools is a different matter.
 
[attached chart]
With all the limitations of the data quality and my abstraction method in mind, just look at Missouri. They are hitting way above what their median MCAT would predict, and apparently it isn't a fluke for one year. What are they doing? Are they better liars? Curious minds want to know. But I think the most illuminating thing out of this entire conversation is that medical schools publish input data (MCAT, GPA, and such) but do not publish performance data on the Steps. Why not? Why rely on data like percent passing the Step or percent completing medical training in four years? If they are so sure of their performance or the quality of education provided, why not publish the data?

I did this some time back. There are really no good data sources to do this properly, since schools may inflate or massage their Step 1 averages and some report medians vs. means.
[attached chart]
 
I recently attended an interview at FIU where they extensively bragged about their 240 average Step 1 score, even with an average MCAT of 509. It's interesting because, if that number is correct, it would put them way up near the top ranking-wise, but they're definitely not considered a "top school" by most.
 
I did this some time back. There are really no good data sources to do this properly, since schools may inflate or massage their Step 1 averages and some report medians vs. means.

How can schools inflate their scores?
 
How can schools inflate their scores?
Run a multi-year average; run a single-year average; run an average excluding failures; run an average on a certain subset of the program; run an average excluding retakes; report medians; report medians excluding failures; report medians over multiple years; run averages excluding linkage programs; run averages including linkage programs. There are an endless number of permutations and methods by which you can game the system and game the numbers.
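To make that concrete, here's a toy sketch of how the same class of scores comes out under a few of those reporting rules; every number is invented and the pass cutoff is only approximate:

```python
# Hypothetical sketch: one class's Step 1 scores reported under different
# rules. Every score here is invented for illustration.
from statistics import mean, median

PASS = 192  # rough Step 1 pass mark for that era; used only to split toy data

scores = [178, 201, 208, 215, 220, 224, 228, 231, 236, 242, 251]
retakes = [205]  # the failing attempt retaken and passed

passing = [s for s in scores if s >= PASS]

print("mean, everyone:             ", round(mean(scores), 1))
print("mean, failures excluded:    ", round(mean(passing), 1))
print("mean, retake replaces fail: ", round(mean(passing + retakes), 1))
print("median, everyone:           ", median(scores))
print("median, failures excluded:  ", median(passing))
```

Same students, five different "averages."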
 
The only people who believe this are pre-meds and medical school deans. The incest by PDs at the uber-programs for Top Schools is a different matter.

Ah yes, because schools 1-20 definitely have the same match lists and reputation as 21-40. A student can be successful and match into a competitive specialty no matter where they go, and that's why I used USNWR ranks as a ROUGH indicator. I'm well aware many people don't take the list too seriously, but if I came from Mars and wanted to determine the best medical schools in the US, the USNWR would provide a rough idea of what's up.
 
Ah yes, because schools 1-20 definitely have the same match lists and reputation as 21-40. A student can be successful and match into a competitive specialty no matter where they go, and that's why I used USNWR ranks as a ROUGH indicator. I'm well aware many people don't take the list too seriously, but if I came from Mars and wanted to determine the best medical schools in the US, the USNWR would provide a rough idea of what's up.

You should check out this Malcolm Gladwell piece on college rankings. The same basic problem applies to medical school rankings. When USNWR first decided to start constructing these rankings they had to appear legitimate to laypeople, so they reverse engineered an algorithm that put places like Harvard, Hopkins, and Stanford at the top. For medical schools the cleanest way to do this is to emphasize institutional NIH grant funding, which has little or no direct effect on medical education.

With regard to undergraduate rankings, Gladwell summates the conundrum thusly: "A school like Penn State, then, can do little to improve its position. To go higher than forty-seventh, it needs a better reputation score, and to get a better reputation score it needs to be higher than forty-seventh. The U.S. News ratings are a self-fulfilling prophecy."
 
I think most of us understand that the USNWR is not the most accurate when it comes to rankings, but I had a few questions regarding rankings in general. When residency programs consider medical school reputations, how are those reputations determined? How much do rankings/reputations matter? And how do medical students know if a school is one of the top schools?


 
I think most of us understand that the USNWR is not the most accurate when it comes to rankings, but I had a few questions regarding rankings in general. When residency programs consider medical school reputations, how are those reputations determined? How much do rankings/reputations matter? And how do medical students know if a school is one of the top schools?

I got this from another SDN thread where residency program directors were supposedly asked to rank schools. The list is as follows...

Harvard, Hopkins, UCSF
Stanford, Penn
WashU, Duke, Columbia
Michigan
Cornell, UCLA, U of Wash, Vandy
Northwestern, Yale
Baylor, Emory, U Chicago, Pitt
Mayo
UTSW, UVA
NYU, Oregon, UCSD, UNC
Brown, Case Western, Dartmouth, Gtown, Sinai, Rochester, USC, U of Wisconsin
Indiana, Tufts, Colorado, Iowa, U of Minnesota
Boston U, Ohio State, U of Alabama, Wake Forest, U of Utah
Miami, Einstein

This is probably more relevant than the USNWR rankings, but it seems to correlate with them anyway...

source: Med School Rank List - Residency Directors?
 
I think most of us understand that the USNWR is not the most accurate when it comes to rankings, but I had a few questions regarding rankings in general. When residency programs consider medical school reputations, how are those reputations determined? How much do rankings/reputations matter? And how do medical students know if a school is one of the top schools?

US News is an accurate ranking as far as US News rankings are concerned. There's no universal "ranking" that is the "true" value. Another metric you can use is how PDs view schools and that is posted right above this post. In terms of residency, if you want to go to X residency, then your best bet at getting in there is by going to X school. Schools feed into their own residency programs - to the point where Harvard sends ~50% of its own grads into its own residency programs. At Stanford, it's about 1/3 to Stanford and 1/3 to Harvard. But beyond that, from people I've talked to, connections matter more. If a PD knows the department chair at your school who vouches for you, then that would give you a boost.
 
I got this from another SDN thread where residency program directors were supposedly asked to rank schools. The list is as follows...

Harvard, Hopkins, UCSF
Stanford, Penn
WashU, Duke, Columbia
Michigan
Cornell, UCLA, U of Wash, Vandy
Northwestern, Yale
Baylor, Emory, U Chicago, Pitt
Mayo
UTSW, UVA
NYU, Oregon, UCSD, UNC
Brown, Case Western, Dartmouth, Gtown, Sinai, Rochester, USC, U of Wisconsin
Indiana, Tufts, Colorado, Iowa, U of Minnesota
Boston U, Ohio State, U of Alabama, Wake Forest, U of Utah
Miami, Einstein

This is probably more relevant than the USNWR rankings, but it seems to correlate with them anyway...

source: Med School Rank List - Residency Directors?

Yale and Columbia are definitely considered higher than represented (probably in the Stanford range) -- have you seen their match lists!?
 
Yale and Columbia are definitely considered higher than represented (probably in the Stanford range) -- have you seen their match lists!?
Looking at match lists requires the ability to read match lists. This is a fool's errand.
 
Yale and Columbia are definitely considered higher than represented (probably in the Stanford range) -- have you seen their match lists!?

The difference is not significant. It’s like 4.7 vs 4.8 out of 5 or something. Doesn’t really matter.

Neither does this list tbh
 
Why does this list not matter? :/ As of now I've been accepted to a few med schools and I've been slightly looking at the match list and average step scores as factors to partially sway my decision. Ex: Arizona's in years past has always been 220-226 but Vermont's is 228-232 with some better matches. Should that not really be a factor? @Goro
There are a few problems with the evaluation you are conducting.
1. Step scores are not reported in a standardized fashion, so they are subject to gaming.
2. MCAT scores differ between institutions, and subsequent Step differences may only reflect differences in matriculating students' test-taking abilities.
3. If you are not in a field, it is extremely difficult to determine what constitutes a "good" program for the match, or the relative quality of a program within the field.
4. Match differences may occur due to differences in the interests of the student bodies, and with a small N these can look like artificially large fluctuations (20 people going into Ortho one year and zero the next).
5. The match doesn't indicate whether students matched at their first-choice program, or first-choice field if they dual applied.
6. You are better off making evaluations based on curricula and opportunities at each school: pass/fail/unranked, 18-month vs. traditional preclinical, EBL, PBL, mandatory attendance, instructor-written exams, home programs for competitive specialties, research opportunities and funding.
 
Why does this list not matter? :/ As of now I've been accepted to a few med schools and I've been slightly looking at the match list and average step scores as factors to partially sway my decision. Ex: Arizona's in years past has always been 220-226 but Vermont's is 228-232 with some better matches. Should that not really be a factor? @Goro

When given lots of choices I think a lot of people lose sight of the forest for the trees. By this I mean students end up focusing on the minor details and/or differences between schools that ultimately don't matter. Right now it might seem that a 230 vs. 235 average is a big deal, but there's no guarantee you're going to get that score. What you should ultimately be considering is whether or not you're going to be happy being in that place for 4 years and how much debt you'll be accruing. Those are 100% going to impact your overall quality of life now and in the future. Miserable at the school? Scores go down. Class scores and Step score suffer? Kiss highly competitive specialties and your first choices goodbye. Match into a specialty you don't want, one that doesn't pay as well, while you're drowning in student debt? You're going to be that one pissed-off attending residents and colleagues hate working with.

I remember when I was trying to choose residency programs to rank and parsing through little details in what "perks" one residency has over another. Then I realized these "perks" get old fast and will not carry you or make up for the fact that you hate being where you are or hate working with the malignant people in the program. So I decided what it is that will truly make me happy and chose a program based on that. I loved my 4 years at my residency program and am now very happy as an attending at the VA.
 
Wow, thanks for all those points. I'm most definitely going to have to rethink how I'm narrowing it down and picking a school to attend. I'm hoping I get into one or two I interviewed at that would make it an extremely clear/easy decision. If not, there are a few similar schools with a lot of pros and cons for me that are going to be tough to choose between, especially if match lists and Step scores aren't that much help. Wow, and I've been like 75% looking at match and Step scores.

I know of several schools that are using predictive analytics and can estimate (with disturbing accuracy) student Step 1 scores less than halfway through the preclinical curriculum. In the USMLE nature vs. nurture argument it seems that nature is the big winner.
 
I know of several schools that are using predictive analytics and can estimate (with disturbing accuracy) student Step 1 scores less than halfway through the preclinical curriculum. In the USMLE nature vs. nurture argument it seems that nature is the big winner.
Can you share some of the variables they use in their models?
 
I know of several schools that are using predictive analytics and can estimate (with disturbing accuracy) student Step 1 scores less than halfway through the preclinical curriculum. In the USMLE nature vs. nurture argument it seems that nature is the big winner.
Some interviews I've been on mention stuff like this. They say they pay for NBME questions and craft a mock Step 1 at the end of the first year. Based on their historical data, they can provide you with an estimate of the score you'd get on the real thing in a year.
Is that similar to what you're talking about?
 
Why does this list not matter? :/ As of now I've been accepted to a few med schools and I've been slightly looking at the match list and average step scores as factors to partially sway my decision. Ex: Arizona's in years past has always been 220-226 but Vermont's is 228-232 with some better matches. Should that not really be a factor? @Goro

Which school do you like better? Which curriculum do you like better? Arizona is hot, Vermont is cold; do you like hot or cold? What's the price differential? How happy are the students you've met?
 
Can you share some of the variables they use in their models?

It all boils down to performance on basic science exams during M1 and M2. They alter the weight of certain subjects, and some also include data from the CBSE. Some use NBME questions, others use in-house questions that have been well characterized.
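For the curious, a hedged sketch of what such a model could look like: a weighted composite of basic-science exam percentages blended with a CBSE-derived estimate. The weights, the blend, and the inputs below are all invented for illustration and are not any school's actual model:

```python
# Hypothetical sketch of a Step 1 predictor in the style described above.
# All weights and example inputs are invented.

# Invented subject weights for the M1/M2 exam composite.
WEIGHTS = {
    "anatomy": 0.10,
    "physiology": 0.25,
    "pathology": 0.35,
    "pharmacology": 0.30,
}
INTERCEPT = 160.0  # invented regression intercept
SLOPE = 0.85       # invented: Step 1 points per weighted exam percent
CBSE_BLEND = 0.5   # invented: weight given to the CBSE-derived estimate

def predict_step1(exam_pct, cbse_equiv):
    """Blend a weighted exam composite with a CBSE-derived Step 1 estimate.

    exam_pct: dict of subject -> exam percentage (0-100)
    cbse_equiv: CBSE result already converted to an approximate Step 1
                equivalent (conversion assumed done elsewhere)
    """
    composite = sum(WEIGHTS[k] * exam_pct[k] for k in WEIGHTS)
    exam_based = INTERCEPT + SLOPE * composite
    return CBSE_BLEND * cbse_equiv + (1 - CBSE_BLEND) * exam_based

print(round(predict_step1(
    {"anatomy": 82, "physiology": 88, "pathology": 85, "pharmacology": 80},
    cbse_equiv=224,
), 1))  # -> 227.7 with these made-up inputs
```

In practice a school would presumably fit the weights to its own historical exam-to-Step-1 data rather than hand-picking them.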
 
Some interviews I've been on mention stuff like this. They say they pay for NBME questions and craft a mock Step 1 at the end of the first year. Based on their historical data, they can provide you with an estimate of the score you'd get on the real thing in a year.
Is that similar to what you're talking about?

Yes, it's similar.
 
It all boils down to performance on basic science exams during M1 and M2. They alter the weight of certain subjects, and some also include data from the CBSE. Some use NBME questions, others use in-house questions that have been well characterized.
Ah, I wasn't sure if they were doing something more complicated, like incorporating MCAT, SES, etc. into the model.
 
Ah, I wasn't sure if they were doing something more complicated, like incorporating MCAT, SES, etc. into the model.

There is evidence that SES and very high/low MCAT scores have some predictive value, but once you matriculate and start taking real medical school exams those factors are baked in.
 
I know of several schools that are using predictive analytics and can estimate (with disturbing accuracy) student Step 1 scores less than halfway through the preclinical curriculum.

I believe one of those schools is Dell in Austin. The chair of medical education indicated their analytics predict that their first class will score 1 SD above the national average when they take Step 1. But who knows....
 
I believe one of those schools is Dell in Austin. The chair of medical education indicated their analytics predict that their first class will score 1 SD above the national average when they take Step 1. But who knows....
Jesus. A 228 average with an SD of 21; 249 would place their average above every other school's by about 5 points. With an incoming MCAT average of 512, I call bull****.
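For scale, taking the quoted numbers at face value (national mean 228, SD 21 for individual test-takers), a 249 class average sits a full SD above the individual mean; a quick sketch of what that implies:

```python
# Sketch of the arithmetic in the post above, assuming a national Step 1
# mean of 228 and SD of 21 for individual test-takers (as quoted).
from statistics import NormalDist

national = NormalDist(mu=228, sigma=21)

class_avg = 228 + 21                  # one SD above the mean -> 249
pct_below = national.cdf(class_avg)   # share of individual takers below 249
print(f"Predicted class average: {class_avg}")
print(f"An individual score of {class_avg} beats about {pct_below:.0%} of test-takers")
```

In other words, the claim is that their average student would land around the 84th percentile of all individual test-takers.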
 
Jesus. A 228 average with an SD of 21; 249 would place their average above every other school's by about 5 points. With an incoming MCAT average of 512, I call bull****.

IDK whether she misspoke or I misheard. I guess we'll find out soon enough. They indicated that starting rotations in the 2nd year instead of the traditional 3rd helps in this regard.
 
I guess we'll find out soon enough. They indicated that starting rotations in the 2nd year instead of the traditional 3rd helps in this regard.
Everyone and their uncle has been trying to adopt new curricula that introduce clinicals earlier. Even under the best-case scenario I suspect they will end up at 230±10, more likely 230±5.
 