Average board scores

Moreover, if a med school changes its curriculum at some point, and the new students of that curriculum have an average Step 1 score that's much higher than the year before's, can you still make the assumption that the curriculum played no part in that jump?

I assume you are talking about your own school here? I would hesitate to make much of an assumption based on one year of data, especially since I don't really know that 240 -> 243 or whatever it was (I've seen so many different quotes for Vandy's scores here) counts as "much" higher.
 
Ok, so I'm curious about something.

In theory, I totally agree that exam scores are based on your own performance, that everyone else's doesn't matter, that everyone uses the same books, etc. I also buy the "cream always rises to the top" theory, where top schools have top students who got top MCAT scores and are therefore top standardized test takers. Ok.

Now, I'm making the assumption that at least SOME of the scores in the list often quoted here on SDN are accurate, plus or minus a few points. You may of course reject this assumption altogether, in which case my point is moot.

However, if you do accept that assumption, what do you think leads some schools to have dependably high averages every year? Sure, they tend to be the "top schools," but if you think about it, Wash U always has the super-bookish students with the absurd MCAT scores, and their Step 1 average isn't the highest according to the list (which, again, may be inaccurate). If their students have the best chance of getting high scores to begin with, but other schools beat them more than once or twice, why do you think that is? (Btw, I'm absolutely not saying Wash U's averages are low or whatever, I'm just using the example of the biggest number-ho school.)

Moreover, if a med school changes its curriculum at some point, and the new students of that curriculum have an average Step 1 score that's much higher than the year before's, can you still make the assumption that the curriculum played no part in that jump? Last year, when I was applying to schools that had just changed their curricula, their professors/deans/admissions people always made comments like "we're waiting on the Step 1 scores for the first people with the new curriculum to see how it's working out" or "we're confident that the changes will show up in higher Step 1 scores," which means the administration and faculty at those schools seem to believe they have a part in shaping the average scores.


Sorry for the long-winded point, but I guess I'm just wondering why the dogma here is so anti-curriculum when my own common sense, and the people I've spoken to at various schools, seem to disagree.

Step 1 is all about effort and how good a test taker you are. As long as you've learned the basic knowledge from your preclinical years, it doesn't matter how you were taught it. Everyone has their own way of learning, whether it's attending lectures, participating in group discussions, or blowing off classes and reading at home. It doesn't matter, because as long as you've learned the basic foundation, you're on an equal playing field with everyone else. I don't buy into these "innovative curriculum" claims, because at the end of the day each school is teaching the same material, just maybe in a different way. The main focus of studying for Step 1 is trying to remember what you've forgotten. We're not talking simple physics equations like on the MCAT. No, you need to remember all those biochemical pathways, all those nerves that innervate which muscles, all those congenital disorders and the deficiencies they're associated with. That stuff requires a lot of time and effort to study, and a lot of it is extremely boring (especially after you've learned really cool pathophysiology during your second year). Your curriculum will not help you re-memorize these things; you have to do it yourself. That's why books like First Aid and Microbio Made Ridiculously Simple became so popular: they help you memorize all the little inane things you need to know for the boards.

The way your school runs its curriculum is certainly important, just not so much for Step 1. I don't agree that an increase in board scores after a curriculum change (even if the scores are accurate, which they most likely are not) means anything. It just means that that class did better than the previous class, which commonly happens anyway.

You will see next year when you take it for yourself.
 
I assume you are talking about your own school here? I would hesitate to make much of an assumption based on one year of data, especially since I don't really know that 240 -> 243 or whatever it was (I've seen so many different quotes for Vandy's scores here) counts as "much" higher.

Well, I was and I wasn't. My own school had a 10-ish point rise after the curriculum change, which made me wonder whether the curriculum had anything to do with it, but as I mentioned in my post, I heard at other schools that their curriculum changes were supposed to bring higher scores. So faculty at other places think there's a correlation; I just wonder why everyone on SDN disagrees. I really don't want to put anyone on the defensive, I'm just honestly curious.
 
If their students have the best chance of getting high scores to begin with, but other schools beat them more than once or twice, why do you think that is?

Moreover, if a med school changes its curriculum at some point, and the new students of that curriculum have an average Step 1 score that's much higher than the year before's, can you still make the assumption that the curriculum played no part in that jump?
I never considered WashU and I didn't check out how Vandy's curriculum changed (as a school that fits your second question), so of course I can't say what explains such differences, but what about the amount of time students have off for studying? Maybe WashU doesn't give students as much time to prep as other similarly ranked schools. Maybe the new curriculum changes added a couple of weeks to what students had off for Step 1. So is the selling point for a curriculum then how much time students have off to study?

Addressing the general nature of your questions: I think even if we can trust the numbers people have posted to within +/- 3 points (I really feel like if a school wants to reference its average scores on its website, it needs to man up and post the official NBME reports like UVa does, so applicants know there wasn't any tweaking of the numbers), that's still too much variation to compare schools. Someone posted that Wash U's average was 235, but there's no year attached. Given that Step 1 scores across the country have gradually risen over the years, comparing it to another school's 2008 or 2009 score would distort things if Wash U's number is in fact from 2006 or something. And Vanderbilt's 2009 score, with 25% of the class not factored in, could still end up changing a decent amount.

So considering a precision of +/- a few points, uncertainty about years, etc., that could account for 5-6 points alone. Then add in differences between classes (UVa jumped from 226 in 2007 to 235 in 2008, despite both classes having the same curriculum!), and it gets very hard to evaluate how effective a curriculum is.
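
To put rough numbers on that, here's a quick back-of-the-envelope simulation. The noise magnitudes are my own assumptions (a 3-point reporting slop per school, a couple points of national drift, class-to-class swings on the order of UVa's), not anything official:

```python
# Toy simulation, not real data: assume two schools with *identical* true
# averages, then see how big the gap in their reported numbers can look once
# you stack reporting slop, class-to-class swings, and a year mismatch.
import random

random.seed(0)

def reported_average(true_mean=230.0, year_drift=0.0):
    class_effect = random.gauss(0, 3)       # class-to-class variation (cf. UVa's 226 -> 235)
    reporting_slop = random.uniform(-3, 3)  # the "+/- 3 points" of rounding/tweaking
    return true_mean + class_effect + reporting_slop + year_drift

# Assume one school's posted number is a few years older, during which the
# national average rose ~2 points.
gaps = sorted(abs(reported_average() - reported_average(year_drift=2.0))
              for _ in range(100_000))

print(f"median apparent gap between two identical schools: {gaps[len(gaps) // 2]:.1f} points")
print(f"trials with an apparent gap of 5+ points: {sum(g >= 5 for g in gaps) / len(gaps):.0%}")
```

The exact percentages don't matter; the point is that noise of this size can easily swamp a few points of "curriculum effect."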

Some links to show UVa's awesome transparency about their scores:
http://www.med-ed.virginia.edu/handbook/pdf/usmle1-07.pdf
http://www.med-ed.virginia.edu/handbook/pdf/usmle1-08.pdf
http://www.med-ed.virginia.edu/handbook/pdf/usmle1-09.pdf
 
Step 1 is all about effort and how good a test taker you are. As long as you've learned the basic knowledge from your preclinical years, it doesn't matter how you were taught it. Everyone has their own way of learning, whether it's attending lectures, participating in group discussions, or blowing off classes and reading at home. It doesn't matter, because as long as you've learned the basic foundation, you're on an equal playing field with everyone else. I don't buy into these "innovative curriculum" claims, because at the end of the day each school is teaching the same material, just maybe in a different way. The main focus of studying for Step 1 is trying to remember what you've forgotten. We're not talking simple physics equations like on the MCAT. No, you need to remember all those biochemical pathways, all those nerves that innervate which muscles, all those congenital disorders and the deficiencies they're associated with. That stuff requires a lot of time and effort to study, and a lot of it is extremely boring (especially after you've learned really cool pathophysiology during your second year). Your curriculum will not help you re-memorize these things; you have to do it yourself. That's why books like First Aid and Microbio Made Ridiculously Simple became so popular: they help you memorize all the little inane things you need to know for the boards.

The way your school runs its curriculum is certainly important, just not so much for Step 1. I don't agree that an increase in board scores after a curriculum change (even if the scores are accurate, which they most likely are not) means anything. It just means that that class did better than the previous class, which commonly happens anyway.

You will see next year when you take it for yourself.

See, again, theoretically I agree with you (and I'm sure I'll have clearer opinions next year when I take Step 1 myself). But I can definitely see how being at a school where a lot of the test questions are presented as clinical vignettes from first year on would help you get used to that type of testing. Or how being at a school where a lot of the professors write Step 1 questions would mean a lot more emphasis on that material (as well as more Step 1-style questions). I mean, obviously everyone has to learn however many pages of First Aid worth of material, and it's tedious and awful and no one remembers all the steps in the Krebs cycle. But part of taking the test is also stamina (are you used to taking a really long multiple-choice exam?), preparation for those kinds of questions (are you used to thinking in terms of clinical cases?), and familiarity with the topics (are you re-learning something that was breezed through the first time, or something that has been stressed 17 times during your career?). See what I mean? I just don't understand how that wouldn't affect your score at all.
 
I never considered WashU and I didn't check out how Vandy's curriculum changed (as a school that fits your second question), so of course I can't say what explains such differences, but what about the amount of time students have off for studying? Maybe WashU doesn't give students as much time to prep as other similarly ranked schools. Maybe the new curriculum changes added a couple of weeks to what students had off for Step 1. So is the selling point for a curriculum then how much time students have off to study?

Addressing the general nature of your questions: I think even if we can trust the numbers people have posted to within +/- 3 points (I really feel like if a school wants to reference its average scores on its website, it needs to man up and post the official NBME reports like UVa does, so applicants know there wasn't any tweaking of the numbers), that's still too much variation to compare schools. Someone posted that Wash U's average was 235, but there's no year attached. Given that Step 1 scores across the country have gradually risen over the years, comparing it to a 2008 or 2009 score would distort things if it's in fact from 2006 or something. And Vanderbilt's 2009 score, with 25% of the class not factored in, could still end up changing a decent amount.

So considering a precision of +/- a few points, uncertainty about years, etc., that could account for 5-6 points alone. Then add in differences between classes (UVa jumped from 226 in 2007 to 235 in 2008, despite both classes having the same curriculum!), and it gets very hard to evaluate how effective a curriculum is.

Some links to show UVa's awesome transparency about their scores:
http://www.med-ed.virginia.edu/handbook/pdf/usmle1-07.pdf
http://www.med-ed.virginia.edu/handbook/pdf/usmle1-08.pdf
http://www.med-ed.virginia.edu/handbook/pdf/usmle1-09.pdf

Yeah, I mean obviously none of us has any evidence that the other schools have the scores that were mentioned, which is why these threads always end in name-calling and no conclusion ever comes of them. I just think it's interesting that the school is always completely discounted in these discussions, and people are called idiots for even suggesting that the curriculum might have something to do with a certain score. I'm really not speaking personally, by the way; I would have come here regardless of the Step 1 score, and I realize that I'll have to work a ton, yadda yadda. I'm in no way trying to imply that "my school is better than your school" or trying to validate my own superiority or whatever. I just find it curious. It's one of those peculiar SDN "mind of the masses" things, like "Carib sucks," "USNews is lame," the hatred of certain schools, and the sometimes irrational adoration of others.

I don't always disagree with the common opinion, nor do I necessarily disagree with this one, but I do wonder why it's so widespread. It almost seems like a lack of data (aka no official scores) has led to a very, very decisive conclusion: the data would show no correlation whatsoever, and if you argue that there might be a correlation, well you have no data to prove it. How does that make any sense?
 
Well, I was and I wasn't. My own school had a 10-ish point rise after the curriculum change, which made me wonder whether the curriculum had anything to do with it, but as I mentioned in my post, I heard at other schools that their curriculum changes were supposed to bring higher scores. So faculty at other places think there's a correlation; I just wonder why everyone on SDN disagrees. I really don't want to put anyone on the defensive, I'm just honestly curious.

I think it does depend, in part, on how the curriculum is structured. Duke has its students do ALL of their clinical rotations first and then take Step 1 whenever they want during their research year. Penn also has you take Step 1 after clinical rotations. On the flip side, these schools don't teach to the boards the way some other programs definitely do. Nevertheless, these schools both have great Step 1 scores, although they don't advertise it nearly as much as other schools seem to (*cough* Vandy/UVa *cough*). It's not a stretch to say that people will do better on an exam if they have more time/experience before they take it.
 
See, again, theoretically I agree with you (and I'm sure I'll have clearer opinions next year when I take Step 1 myself). But I can definitely see how being at a school where a lot of the test questions are presented as clinical vignettes from first year on would help you get used to that type of testing. Or how being at a school where a lot of the professors write Step 1 questions would mean a lot more emphasis on that material (as well as more Step 1-style questions). I mean, obviously everyone has to learn however many pages of First Aid worth of material, and it's tedious and awful and no one remembers all the steps in the Krebs cycle. But part of taking the test is also stamina (are you used to taking a really long multiple-choice exam?), preparation for those kinds of questions (are you used to thinking in terms of clinical cases?), and familiarity with the topics (are you re-learning something that was breezed through the first time, or something that has been stressed 17 times during your career?). See what I mean? I just don't understand how that wouldn't affect your score at all.

I'm pretty sure this is the definition of teaching to the boards. Personally, I think this is less important, since the goal is to be a good clinician, not to do well on an exam. Just because something's on the boards doesn't mean it's any more important than some of the other stuff they teach that isn't on the boards...
 
See, again, theoretically I agree with you (and I'm sure I'll have clearer opinions next year when I take Step 1 myself). But I can definitely see how being at a school where a lot of the test questions are presented as clinical vignettes from first year on would help you get used to that type of testing. Or how being at a school where a lot of the professors write Step 1 questions would mean a lot more emphasis on that material (as well as more Step 1-style questions). I mean, obviously everyone has to learn however many pages of First Aid worth of material, and it's tedious and awful and no one remembers all the steps in the Krebs cycle. But part of taking the test is also stamina (are you used to taking a really long multiple-choice exam?), preparation for those kinds of questions (are you used to thinking in terms of clinical cases?), and familiarity with the topics (are you re-learning something that was breezed through the first time, or something that has been stressed 17 times during your career?). See what I mean? I just don't understand how that wouldn't affect your score at all.

Like I said, it may affect your score slightly, but it's insignificant compared to how much effort you yourself put in and how good a test taker you are. Every school has professors who write questions for Step 1, but it won't really matter, because in the end you will be using a question bank to study, which is a much better simulation of the test than an old exam you took first year. Stamina is all about how good a test taker you are, and you're better off building up stamina in the short term (i.e., that 1-2 month study period) than from spread-out exams over the first two years. As for familiarity with topics: the more conceptual aspects of the test are easier to study for and will require less time to re-learn (if any at all). The harder parts of the test are the rote-memorization aspects, and there it doesn't matter how you were taught originally. Even if you understand the concepts from biochem, you're still going to have to memorize all of those pathways again when you're studying for Step 1. If you're trying to argue that a school fails to teach material present on Step 1, then I'd agree that its students will have a harder time. But every school teaches the appropriate material, just in different ways...

Average board scores are just a selling point for a school, nothing more.
 
Well, I was and I wasn't. My own school had a 10-ish point rise after the curriculum change, which made me wonder whether the curriculum had anything to do with it, but as I mentioned in my post, I heard at other schools that their curriculum changes were supposed to bring higher scores. So faculty at other places think there's a correlation; I just wonder why everyone on SDN disagrees. I really don't want to put anyone on the defensive, I'm just honestly curious.

No, it didn't. My class had a 240 average (down from 245 with ~25% unreported). The current 3rd years had a 243 or 244 (I saw two different numbers at two different meetings), down from 247 at initial reporting with ~25% unreported.
 
Nevertheless, these schools both have great Step 1 scores, although they don't advertise it nearly as much as other schools seem to (*cough* Vandy/UVa *cough*).
Honestly, neither Vandy nor UVa really played up their Step 1 scores. I read about Vandy's score on here and asked my host about it, but aside from one of the deans saying "we do very well on Step 1," nobody else mentioned it during the interview day. UVa actually didn't mention its scores a single time on my day; everything is posted online, but you have to go through their student resources section to find them.

Compare that to Duke, where my faculty speaker in the morning did the whole "we're not going to tell you our averages, but you need great scores to get great residencies and our match-list is awesome, so do the math" deal and students mentioned how well they do throughout the day.
 
The problem with average board scores by year is that most med school classes aren't that big. It only takes 2 or 3 people having a bad day to bring down the average quite a bit. You mainly want to look at the board PASS percentage. We hired a guy at our school who has been doing some pretty hardcore statistical analysis of MCAT scores and GPA against future board scores and all that stuff. He's supposedly been doing it on a level that really hasn't been done before. So far, it shows that the MCAT is pretty worthless as an indicator of med school performance, including board scores. There is a very, very weak correlation between the MCAT and boards; I forget the number, but it was almost statistically insignificant. One thing that has been noted is that more global and intuitive thinkers tend to do better on boards. That is the type of person who likes to think of things in the big picture and goes more by instinct. That instinct doesn't mean they didn't memorize the material; it just seems to be applied in a different fashion. I'm this type of person, and I can tell you that on any test where I really try to use a nice, logical thought progression, I WILL screw it up. It is a hard thing to explain.

Anyway, if 98%+ of the class is passing the boards every year, then you can be pretty safe in assuming the school is giving you the knowledge to be successful on them.
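
To put the class-size point in concrete (entirely made-up) numbers, here's a quick sketch:

```python
# Toy arithmetic with made-up numbers: in a class of 100, a handful of bad
# test days shifts the average by a visible amount on its own.
class_size = 100
baseline = 230                        # assumed score for every student
scores = [baseline] * class_size
mean_before = sum(scores) / class_size

# Suppose 3 students have a rough day and come in 40 points below baseline.
for i in range(3):
    scores[i] = baseline - 40
mean_after = sum(scores) / class_size

print(f"average before: {mean_before:.1f}, after: {mean_after:.1f} "
      f"(a {mean_before - mean_after:.1f}-point drop from just 3 people)")
```

A swing like that has nothing to do with how the class was taught, which is why the pass rate is the steadier signal.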
 
Honestly, neither Vandy nor UVa really played up their Step 1 scores. I read about Vandy's score on here and asked my host about it, but aside from one of the deans saying "we do very well on Step 1," nobody else mentioned it during the interview day. UVa actually didn't mention its scores a single time on my day; everything is posted online, but you have to go through their student resources section to find them.

Compare that to Duke, where my faculty speaker in the morning did the whole "we're not going to tell you our averages, but you need great scores to get great residencies and our match-list is awesome, so do the math" deal and students mentioned how well they do throughout the day.

Both played their scores up quite heavily on my interview day; I heard it at least a half dozen times at both places. At UVa, I took it to mean more that some really bright kids come here and you should consider doing so too. UVa actually phrased it as half their class scoring above 250, which is pretty impressive. Vandy had more of a "we're the best in the country" vibe to it. Either way, it's not a big deal. I don't even view it as a selling point; it's so completely unimportant in making a decision.

At Duke, I got more of a "you're responsible adults and you can take the test whenever you want. We trust you'll do well, as our students have for years and years."
 