Average step 1 score for each medical school?

I'm bored. Another failed attempt at a hit thread. This isn't even a conversation worth having. OP doesn't even make it entertaining. At first he is all: "omg im not entitled to know my schools step 1 avg if im going to be going into debts!!1111" Then says: "Lol.. diary is gone.. having a great time in med school now lololol!!!11111"

Troll harder next time plz 🙄
 
If you can form a half-cogent argument why USMLE scores should or should not be publicized, you can publish it in some crappy medical education journal and add it to your resume, like this gunner from Brown: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3746973/

Also, this issue is becoming less contentious because USNWR is already reporting scores. You have to pay for their online Compass subscription service. It's a step in the right direction until schools get caught fabricating their scores (as several undergrad schools have in the past), at which point there might be more impetus for the NBME to give the data straight from the source in order to restore dignity to medical education administrators. The next question is whether/when USNWR will integrate the scores into their rankings.

2012 scores of top 25 research medical schools in USNWR 2014:
Penn (Perelman) 243
WashU 241
Harvard 240
U Chicago (Pritzker) 240
Baylor 240
Columbia 240
Northwestern (Feinberg) 239
Yale 239
Stanford 239
Vanderbilt 238
Johns Hopkins 238
Cornell (Weill) 237
UCSD 236
Duke 236
Mount Sinai (Icahn) 235
UCLA (Geffen) 234
Mayo 234
U Michigan 233
Emory 232
UNC 232
NYU 231
UCSF 230
Case Western 230
U Washington 227
 
A student who's spending thousands of dollars on a degree wanting to know a school's average Step 1 score is an entitlement? What am I missing here?

You are missing the point. Your money (and the money that you're borrowing from the feds/other sources) buys you a medical school education. Nowhere does anything say "your tuition payment gives you the right to know the average Step 1 score" or anything to that effect. W19 continues to insist that this knowledge is actually a right (aka an entitlement, by definition). Perhaps he can make an argument that it should be a right, but he continues to insist that it already is one, which it isn't.

No one has suggested that you shouldn't be critical in choosing your medical school program. I am reminding posters that one's individual desires (with or without money involved) do not constitute a right.
 

I was talking from a student's point of view... I think students should have the right to know this info...

I stand by that... If people choose not to use it in their decision making, I am fine with that... I don't see any convincing arguments that there is no value at all in knowing that info... I won't buy the argument that what goes on at a particular school has ZERO effect on your Step 1 score...
 

I don't think anyone is arguing that there is zero effect, but a lot of people who are past the OMGSTEP1 stage of medical school are trying to say that looking at Step 1 scores in relative isolation is not only shortsighted but also a badly confounded metric of program quality, and one would be better suited to look at other things to find out whether the education is good or not. The analogy here is looking at some biomarker as an endpoint in a clinical trial rather than an obvious patient-applicable metric, like progression-free survival. We're suggesting you look at a real endpoint that everyone actually cares about by the day after Match Day.
 

UCSF is only 23rd in the country in Step 1 rankings? No way I'm going there, definitely going to close doors for me.
 
It appears people like to argue here just so they can show off their intellectual "prowess". That is the main reason I stay away from SDN... I did not see anything wrong with W19's statement above... When we are buying stuff these days, we like to know what we are buying...

I think someone made a reference to those for-profit universities like DeVry, Keiser, etc... I have a friend who is paying back a massive 50k loan because one of these for-profit schools told him they had good job placement, which they didn't... The guy could not get a job and had to go back to get another AS degree at a CC...
 
I don't think we are going anywhere with this argument... It was just something I thought should be made available to students... Anyway, Merry Xmas!🙂
 
Lol.

That's what's so ridiculous about those numbers (beyond the fact that they're completely self-reported).

Not surprisingly, the schools that put more grads into primary care have lower averages (UW, UCSF, UNC).
 
Yes, I have a right to know that... If a school's first-time passing rate is less than 90% for whatever reason, most people will try to avoid that school... If I am spending 140k on a school, all relevant info regarding my success, which I think Step 1 is, should be made available to me so I can make an informed decision... I guess we will have to disagree on that one, and I don't think this is an entitlement... I don't buy that curriculum, quality of professors, type of exams given during MS1-2, and time given to prepare for the Step do not play a major role in the score... Of course I know the quality of the students plays a big role as well, but I am not ready to minimize the other components that can help students achieve a good score...

There are so many resources available; if someone does UFAP, they should be able to get 200+. The only time you're going to see rates that low is when a school takes a bunch of bottom-of-the-barrel candidates who don't study hard for it.

I mean, the fact that you're even debating passing rates kinda shows you don't understand the situation. If you want to be anywhere near competitive as any kind of applicant, you had better pass by a good margin... The average person scores about 35 points higher than passing... If you're worried about passing, that bodes very poorly for you. That would be like if 20 were passing on the MCAT and you were worried about that.
 

Are these scores only for the top 20 or for the whole country?
 
I was not debating about passing at all; I was debating about average scores... I hope all LCME schools have at least a 90% first-time passing rate... For the specialties (Psych/FM) that I am interested in, I just have to have an average or hopefully above-average score... I just thought the average Step 1 score should be made available for people who want that info...
 
It doesn't matter though; the student selection of the schools is going to be more significant for scores than what the school teaches. Just do UFAP and know it and you can get average for sure.
 
True to some extent, but I think curriculum, quality of professors, etc. will also have an impact.
 

I'd argue curriculum has ZERO effect on Step score. Top schools have the highest averages because they end up matriculating the best students, not because they have a better curriculum. Test scores are dependent on the individual student, which is why there are random students at lower-tier schools who crush the boards.

Edit: Caveat for places with reduced time for dedicated Step studying or compressed curricula. I'm referring to a standard 2-year preclinical curriculum; I don't believe it changes anything.
 

I completely agree. Anyone who has taken Step 1 and has half a brain realizes that Step 1 is all self-study and the score is entirely self-driven. I would say the curriculum at my school had absolutely NOTHING to do with my Step score at all. I learned everything from the big 3: Pathoma, First Aid, and UWorld.
 
Average Step 1 score seems like an entirely worthless metric that would amount to little more than needless d**k-waving by premeds.
 
Not sure if those are even legit, and once you drop out of those "superstar" schools, you are essentially average. UCSF's Step 1 score is right at the national average, despite it being a powerhouse school.
 
Those scores are from 2012, when the average was 227.
Step 1 averages seem to be obsolete as soon as they are posted, since they rise all the time these days.
When someone posted the UVA website, people saw that they averaged 237 in 2014, which is nice, but if you look at the historical averages, you see that it was 215 in 1999-2001, stayed around 216 until 2004, and then began rising.

http://www.med-ed.virginia.edu/handbook/academics/licensure.cfm
 
I think means are relatively worthless in this context. Now if schools would publish a histogram like UVA does, that would be far more informative. Ideally it would report aggregate data from the last three years. Even then, it probably says more about the people admitted than the educational experience they receive.

There is also the issue of error in the three-digit score. The scores posted above do not differ significantly from the national mean of 230. The USMLE's published data say that two scores must differ by 16 points or more for the difference to be significant. Obviously their document was talking about individual scores, not institutional averages, but it puts the differences in score in perspective.
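For what it's worth, that 16-point rule for two individual scores is consistent with a per-score standard error of estimate (SEE) of roughly 6 points. Treat SEE = 6 here as a hypothetical input chosen to reproduce the quoted figure, not the official value (check the USMLE document for that): the difference of two independent scores has standard error sqrt(2) x SEE, and 1.96 of those comes out to about 16.6 points.

```python
import math

# Hypothetical figure: a standard error of estimate (SEE) of ~6 points
# per individual three-digit score, chosen to reproduce the 16-point
# rule quoted above. Not an official USMLE value.
SEE = 6.0

# The difference of two independent scores has standard error
# sqrt(SEE**2 + SEE**2) = sqrt(2) * SEE.
se_diff = math.sqrt(2) * SEE

# Two-sided 95% cutoff (z = 1.96): differences smaller than this are
# indistinguishable from measurement noise.
cutoff = 1.96 * se_diff
print(f"two individual scores must differ by ~{cutoff:.1f} points")
```

So the 16-point figure is about the noise in one person's score, which is a different question from comparing class averages.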
 
I'd argue curriculum has ZERO effect on step score. Top schools have the highest averages because they end up matriculating the best students, not because they have a better curriculum. Test scores are dependent on the individual student - which is why there are random students at lower tier schools that crush the boards.

Edit: Caveat for places with reduced time for dedicated step studying, compressed curriculums. I'm referring to a standard 2 year preclinical curriculum - I dont believe it changes anything.

Very true.

Using Step 1 as even a very minor factor in choosing a medical school is dumb. Far and away, the largest determinant of your score is YOU.
 

That's not how it works statistically. You have 2 samples. To compare them, you'd use the standard error of the mean: take the SD of all Step 1 test scores and divide it by the square root of the number of students in the school in question's sample. I mean, your way shouldn't even pass the sniff test. That's like flipping a coin 100 times and getting 60 heads and 40 tails, which is barely statistically significant; if you did it 10,000 times and got 6,000 and 4,000, that would be significant at much higher levels.

Between two individual scores, a 16-point difference can be due to chance; between two samples, the threshold is much smaller than you're making it out to be. If you have a 100-kid sample and the population SD is 16, then a sample mean just 3.2 points off is already significant. I'm not saying that's due to some phenomenal teaching method, but let's allow the stats to show their true meaning here. Having the class average be 12 points above the national average is extremely statistically strong.
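Just to make the math concrete, here's a quick sketch of both calculations. The n = 100 class size, SD of 16, national mean of 230, and the 3.2- and 12-point gaps are all hypothetical figures pulled from this discussion, not real school data:

```python
import math
from statistics import NormalDist

def z_for_mean(sample_mean, pop_mean, pop_sd, n):
    """z-statistic for a sample mean against a known population mean."""
    sem = pop_sd / math.sqrt(n)   # standard error of the mean
    return (sample_mean - pop_mean) / sem

# A class of n = 100 whose average is 3.2 points over a national mean
# of 230 (SD 16) sits right at 2 standard errors.
z_class = z_for_mean(233.2, 230, 16, 100)
print(f"3.2-pt gap:  z = {z_class:.2f}")   # ~2.0, significant at p < .05

# A class averaging 12 points above the mean is far beyond chance.
z_big = z_for_mean(242, 230, 16, 100)
p_big = 2 * (1 - NormalDist().cdf(z_big))
print(f"12-pt gap:   z = {z_big:.1f}, two-sided p ~ {p_big:.1e}")

# The coin-flip analogy: 60/100 heads is borderline, 6000/10000 is not.
def coin_z(heads, flips):
    se = math.sqrt(0.25 * flips)  # SD of the heads count when p = 0.5
    return (heads - 0.5 * flips) / se

print(f"60/100:      z = {coin_z(60, 100):.1f}")
print(f"6000/10000:  z = {coin_z(6000, 10000):.1f}")
```

Same machinery in both cases; the coin example only becomes overwhelming because the sample grows, which is exactly the point about institutional averages.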
 
WashU is so obsessed with standardized test scores, I bet they brainwash their kids with Pathoma. Their people are obsessed with the MCAT, so I doubt Step 1 is any different.
 

Did you actually read my post? I could respond by simply copy-pasting my first post. Please re-read, this time paying attention to all of the words.

You should also Google the USMLE's document on comparing scores and the inherent error in the scaling process itself.
 

What does statistically strong even mean?
Have you even taken Step 1?
 

I read your post, and there's a huge difference between saying one score varies by 16 points and using that in the context of a large sample.
 
Absolutely true. 16 points is quite a large difference to anyone reading these forums but it's hard to argue with an official document. I've attached the PDF to this post for anyone else who's curious about this discussion.

Also, a while ago a number of people were speculating about the percentiles each USMLE score correlates with and came up with their own values based on data extracted from the 2011 Charting Outcomes, but the norm tables included in this document are slightly different. Note that these percentile ranks only apply to the 2011-2013 cohort. Hopefully this alleviates some anxiety when people see all the amazing scores that dominate these boards. It also seems that the theoretical top score is 300 for all three Step exams.

To the OP: it's been said before but I'll say it again. Step 1 scores have little to do with what school you go to and more to do with the kind of student and test taker you are. There are 260s at Harvard, UCSF and others as well as at state schools, and as someone posted in another thread, it's not like students at top schools get special textbooks with extra diseases that others aren't taught. Your education and your scores are largely dependent on your effort and intelligence.
 

Scores also vary from year to year. No one in the class below us got a 260... Yet 10 people in my class got above a 260 lol
 

Lol. Pre-clinical classes here aren't taught to the boards as much as at other schools, though professors say they're shifting towards more "Step 1 style questions". And historically we've had 4 weeks of dedicated time (I have heard it's more like 5 this year). I don't think the attitude here towards Step 1 is significantly different from any other school's.
 

I have a hard time believing that they wouldn't have a hard-on for Step 1 when they are unquestionably the school with the biggest hard-on for the MCAT. It's truly an obsession.
 

Median MCAT of admitted students factors into USNews rankings. However, the Step 1 scores produced by matriculants have no bearing on the rankings.

So, if the USNews rankings are behind the "stat obsession" of some top schools, then it would stand to reason that the concern with the MCAT needn't entail a similar concern with Step.
 

Idk, you could make a strong argument that WashU's curriculum could be restructured in a way that is much more conducive to rocking the boards (i.e. different style of questions, more even distribution of workload between 1st and 2nd year, more dedicated time, etc...). I mean, we still score well, but it's not like everyone here is "STEP 1!!!!1!!1!1!!!" all the time.
 
So if the average Step 1 is very low, the students are more likely the issue than the curriculum? Then why do curriculum changes result in average board score drops or rises, hmmm?

Nonsense, nonsense, and nonsense. Seriously, go to a school that teaches for the boards, since PDs see it as a more objective metric.

If a school wants to experiment with its curriculum, it should just abolish the boards. If it doesn't, it's knowingly toying with the future of its students.

The purpose of the first two years is to get the necessary preclinical knowledge, and that is what Step 1 is intended to test. All schools should teach for the boards. Period! In fact, that's the whole point of the first two years!!! If that's not the case, then abolish the boards, for they serve no purpose in assessing the required knowledge. Don't hate me 'cause I'm logical.
 

Plz tell me this is trolling.
 
Instead of saying that he/she is trolling, why not refute what he said? When you look at PD surveys, the overwhelming majority of them use Step 1 as the most important metric when choosing an applicant for a residency spot... so it would not be egregious if schools tailored their teaching methods to Step 1...
 
FWIW, COCA requires every DO school to start posting its COMLEX passing rate on its website next year..
 
I would argue for schools to do away with P/F entirely and go back to grades and reporting real class rank on the MSPE.

Step 1 was never intended to be used as it is now; you can read numerous articles and editorials on this in the academic medicine literature going back many years. Unfortunately, board scores are the only real objective metric PDs have as schools continue to eschew grades and ranks in favor of P/F and code words that may or may not be defined (see that paper in Radiology with the table of all the US med schools' MSPEs and code words/definitions). If PDs had more solid data from the schools, then Step 1 could be looked at a little more holistically rather than as the only marker of academic ability.

As it is now, I think schools should at least consider the boards as they plan their curricula and write their exams, and many do whether they admit it or not. Ensuring that they cover all the material and test with good questions would go a long way. Making class exams similar in length to the boards, with the same timing per question, etc., would also be helpful.
 

Do you know the link?
 

Who cares about passing rates? That's like having undergrad schools post the percentage of premeds who score above 20 on the MCAT. Very few people are using passing as their definitive metric for success. Typical academics, unable to comprehend what is actually going on.
 
You can still teach clinical skills and teach to the boards. I think the point @DocVader1990 is making is valid. We've all had lectures that were clearly covering material one should learn during a clerkship, or material that was obviously a lecturer's area of research and only loosely related to your course.
 

Die a hero or live long enough to be accused of trolling...

The true issue here is scarcity: there are simply not enough residency spots for everybody to get what they want, since most of the applicants (if not all of them) are more than qualified. This has resulted in medical school admissions committees and residency programs judging on a "relative scale" compared to one's peers. This system is "unfair" because people who are more than competent to perform a job may be deemed underperforming simply based on a comparison to their fellow peers. Unfortunately, there is simply no other choice.

This "relative scale" practice is justified when looking at the increased overall benefit to the system (due to having better, more driven, more fanatical applicants/future doctors) at the cost of the individual success of the students.

'Tis the nature of life. Accept it and do your best to work the system, but remember that in the end the house always wins.
 
How is scarcity a problem when the MD match rate is like 99% and DOs and FMGs essentially fill the spots that are left over?

Oh wait, it's not.
 

Even if we got rid of P/F, how would that make things more objective? It still wouldn't help to differentiate students at top schools versus lesser ones (whatever that means).

All I can see this doing is making first and second year outrageously competitive and non-collaborative, but maybe I'm wrong.
 
Who cares what Step 1 was intended for? It clearly works, or people wouldn't use it as such. Step 1 seems like a good evaluator of student performance to me; it's basically a holistic evaluation of a student's preclinical basic science knowledge. Even the best M2's clinical skills are going to suck compared to an older student or resident, so I don't really know what else would be on it to adequately evaluate student performance in a relative context. Not to mention it's a pretty just way to do it, given the plethora of resources out there for it, versus valuing individual performance in school more, where the resources to succeed aren't nearly as evenly distributed and are much more subjective from school to school.
 

It does not work; it's just the only thing that allows medical students across the country to be compared to one another. It's a pretty poor evaluation of basic knowledge, though it can tell you how much effort someone can put in, and it's a good test of how well people can integrate information. The fact that you can choose between answers from a list does not bear any resemblance to real life.
 

True, grades don't compare across schools, but knowing how someone compares to their peers who went through the same thing is valuable in itself. It would give Step scores some context that they currently don't have. Under a true P/F system, your Step score is the only metric that defines you. Sure, there's a fairness and objectivity to that, but having context could be valuable. A 240 from a top student vs. an above-average one vs. a below-average one each tells a different story.
 