Which schools are truly P/F?


as in no rankings, no honors, just truly P/F

 
no AOA either
 
Why? Do residencies use the fact that some schools have Alpha Omega Alpha as a way to figure out the top students in the class?

For AOA to be determined, there may be some sort of internal ranking system (even if the curriculum is supposedly P/F). These internal ranks can be included by medical schools on residency apps. So those schools don’t have “true” P/F, where everyone who passes is really viewed the same regardless of how high they passed.
 
I wouldn't mind AOA as long as the rankings aren't reported to residencies; I know of at least one school that does report them.
 
Schools can have both a true preclinical P/F system and AOA - in that case, AOA would usually be determined by M3 clerkship grades, along with other things like extracurriculars / leadership.
 
But do they report the internal ranking to residencies?
Hofstra, a true preclinical P/F school, includes a "Comparative Performance" section in the MSPE:

"The overall comparative performance category is based upon clerkship grades and USMLE
Step 1 scores and it is determined by the committee commissioned by the Dean. There are five
categories used and they are NOT based on strict quintiles. The five categories are: Very Good;
Very Good to Excellent; Excellent; Excellent to Outstanding; and Outstanding."
 
That's good to know, as I am waitlisted at Hofstra! But I was wondering specifically if BU reports internal ranking to residencies? 😱
 
There are some top schools with no internal or external ranking at the level of the school. Many no longer have AOA either. And some have gone, or are going, P/F for both preclinical and clinical years. Some are even petitioning the USMLE to make the Step exams P/F. It makes some sense at the tippy-top schools because there are so many fantastic students, but residency PDs complain that they cannot tell the students apart. One of the BMS-affiliated (best medical school!) residency programs shies away from students from a couple of top medical schools because they are so frustrated with the ungraded, unranked system.

However, within a specialty at a given medical school, faculty might internally rank the candidates and put that information in their LORs for residency applications. Keep in mind, however, that any internal ranking within a specialty might not be very objective and might favor those doing research with the chair or program director.

In EM, each med school with a residency writes a SLOE (Standardized Letter of Evaluation) for each student, which essentially ranks the students against the others applying that year across several domains. Orthopedics has been doing it for years at many schools. Many other surgical subspecialties also rank their applying students, as does general surgery. IM just started doing it at our school a couple of years ago: they do not numerically rank the students, but they do identify those in roughly the top 5%, and then another group that has distinguished itself as roughly the top 15%. It just makes me uncomfortable, as I am not sure what factors they are using for these rankings.
 
It would seem that any kind of evaluation during the clinical years would be difficult. Aside from the two ends of the spectrum - the students who just absolutely kill it and the students who you worry might kill a patient - how can clinical performance be evaluated when the evaluations are inherently so subjective?
 
It is really tough. It can sometimes feel like a popularity contest - the students who are best-liked by the residents often do get better evaluations. That is why a lot of schools are going P/F. And it never seemed fair to me that some students got assigned to spend a month with a notoriously tough grader who had gone to a school where only 10% of students got honors back in the day - that person would think a "PASS" was fine to give out at our school, while others gave "HONORS" as the default grade.

And a study looking at even the words chosen in these narrative evaluations showed implicit bias that hurt women as well as URiMs. So we have a long way to go. The best way to evaluate students for a residency is to have them come do a visiting rotation for a month. Obviously, students are limited in how many "audition" rotations they can do, and some specialties (IM, neurology, peds, psych, derm) have not really warmed up to the idea of accommodating visiting students. But in ortho, neurosurgery, plastics, and ENT, it is almost essential that students rotate away, and some programs will not even rank a student who has not done an away rotation at their place.
 
Is there any push at the NRMP level (I don't know the specifics of the organization names) for standardization of clinical evaluation letters across medical schools?

I know in the Army, for evaluations there is a specific template and format where the top 25% get "exceeds course standards", the next 30% down receive "met course standards", the next 25% down get "marginally met course standards", and the next 25% down get "did not meet course standards." And then it is literally just the same words within each category, with 2 or 3 sentences of unique student features at the end. Something tiered like that seems like it would work for HH/H/P/LP/F.
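
To make that concrete, here is a toy sketch of how a tiered percentile-bucket evaluation could work. The HH/H/P/LP/F labels and the cutoffs are just assumptions for illustration, not the Army's or any school's actual policy (I used cumulative cutoffs since I may be misremembering the exact percentages above):

Code:
def assign_tiers(students_by_rank):
    """Map a best-to-worst ranked class list to HH/H/P/LP/F tiers."""
    # (tier label, cumulative fraction of the class covered through that tier)
    # -- illustrative cutoffs only, not a real grading policy
    cutoffs = [("HH", 0.25), ("H", 0.55), ("P", 0.80), ("LP", 0.95), ("F", 1.00)]
    n = len(students_by_rank)
    tiers = {}
    for i, name in enumerate(students_by_rank):
        percentile = (i + 1) / n  # rank position as a fraction of the class
        tiers[name] = next(label for label, cut in cutoffs if percentile <= cut)
    return tiers

# A 10-student class, already sorted best to worst:
print(assign_tiers(["student_%d" % i for i in range(1, 11)]))
# students 1-2 -> HH, 3-5 -> H, 6-8 -> P, 9 -> LP, 10 -> F

The wording within each tier could then be boilerplate, with a couple of sentences of individualized comments appended, just like the Army format.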
 
It is internally ranked. Generally, the only true P/F schools without internal ranking are top 10/20, but it's good to double-check.

Uh, no, that sounds like a weird generalization, seeing as I go to an upper-mid-tier school and we have zero internal ranking. AOA is determined based on Step 1 and clerkship grades.
 
CCLCM has no preclinical or clinical grades, exams, internal rankings, AOA, or any comparative information reported to residency programs (not even coded adjectives). Shelf exams are also optional.
 
At the end of the day, I personally would want pass/fail for all four years of medical school, because I'd prefer learning how to be a great clinician to learning how to take standardized exams.

But in the long run I think this would only benefit students coming from the Hopkins/Yales of the world. There's already a lot of inbreeding among T20 programs *cough* Harvard. Eliminating clerkship grades and Step 1 scores altogether would just stratify medical students further, preventing truly deserving applicants at "lower" med schools from matching outside their program's bracket. A greater emphasis would be placed on medical school admissions, which isn't all that meritocratic in the first place. That is not cash money at all.
 
Frankly I think the best way to learn is under the pressure of having to perform. If you're not worried about how you will be evaluated based on your work, you probably won't perform as well. I know I wouldn't have pushed myself as much on my notes, my research, and my shelf studying if I knew I was going to get a P regardless. My learning happened most at the edges of my comfort zone, putting myself out there with an A/P on rounds and backing it up, even if I was off the mark.

Additionally, taking standardized exams is a huge part of training, like it or not. I think it's important to the integrity of medicine that we all prove we have the knowledge and reasoning skills expected of a physician and can deliver them when asked to. It would not be a good look for doctors not to have to prove their knowledge for licensing, boarding, etc. In this way I would argue that exams protect our profession and we should not be looking to substantially alter them (except for Step 2 CS, which is nonsense).

I agree with your second point. People have different goals in med school. If you're gunning for derm there should be a means for you to distinguish yourself on the basis of your knowledge, effort, and skill. If you can't produce, you won't match. Everyone in a given entering class has a basically equal opportunity based on the key components of a residency application—I don't think there's an obligation to provide everyone an equal outcome for different levels of knowledge, skill, effort, and achievement in school.
 
Sinai is true P/F. No AOA anymore either 👍👍👍

That said, the Army's model has all the same issues, if not worse. As is the case with any evaluation system, there are "standards", but outside of a small handful of objective tasks--e.g. maxing the APFT or qualifying expert at the range--you're at the mercy of the whims of your rater & senior rater. Evaluations in the Army are rife with bias and subjectivity--the specific issue that gorowannabe mentioned (language/attitudes that disadvantage women & minorities) is a huge problem in the Army. Basically, you won't address bias through standardization as long as 1) the standards themselves are biased, or 2) those enforcing the standards are biased.
 
At my interview at TCU/UNTHSC, Dean Flynn said the school will be 100% P/F with only internal ranking to allow for tracking of progress through medical school.
 
Hackensack is true P/F. Also, I believe Hofstra is as well.
 
The EM standardized letter of evaluation, which is written for each student by their home institution and anywhere else they rotate, looks like the Army system. Students are evaluated in tiers like this, across several different domains. So even if a school is entirely P/F, if a student goes into EM, the EM department is forced to rank them. And they will be ranked by any programs where they do visiting rotations.

Ortho, ENT, and some other specialties are also moving in this direction as a way for residency program directors to be able to distinguish the candidates.
 
That seems like an unfortunate necessity given the competitive nature of many specialties.
 
The PD I work with told me that these letters are starting to become just as useless, since the vast majority of applicants are allegedly in the top 5% according to those standardized letters. Instead of ranging from lukewarm/good/excellent, the letters now range from excellent/amazing/best-I've-ever-seen, so "reading between the lines" is as important as ever.
 
Maybe instead of letters, schools should just have to assign those "good" to "excellent" categories in a national database where PDs can search by school and find their applicant. Idk.
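
Purely hypothetically, it could be as simple as standardized categories keyed by school and searchable by applicant; everything below (school names, applicant IDs, categories) is made up for illustration, since no such registry exists:

Code:
# Hypothetical sketch of the "national database" idea: standardized
# categories keyed by school, searchable by applicant. Illustrative only.
evaluations = {
    "Example School of Medicine": {
        "applicant-0001": "Excellent",
        "applicant-0002": "Good",
    },
    "Another School of Medicine": {
        "applicant-0003": "Very Good",
    },
}

def lookup(school, applicant_id):
    """Return the reported category, or None if the applicant isn't listed."""
    return evaluations.get(school, {}).get(applicant_id)

print(lookup("Example School of Medicine", "applicant-0001"))  # -> Excellent

That way the categories would at least be comparable across schools, even if inflation would still be a problem.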
 
This is probably a "different strokes for different folks" kind of thing. I also had grading on rotations, and I personally found it terrible for my learning, because learning itself was de-emphasized. After all, the students who already knew things looked better and got the H. If you needed to learn, you weren't performing well enough.

Post-MSPE fourth-year rotations have been far better for my learning, because instead of stressing out over how I'm perceived or whether I'm doing enough for honors, I get to focus on learning from the clinicians I'm working with. So it very much depends on your personality.
 
That is why my school is going P/F for third year. Fourth-year sub-internships will still be graded.
 
Harvard is unranked P/F in the preclinical years and the principal clinical year. Advanced clinical electives after second year are graded. Each student receives 1 or 2 summative evaluations in a discipline relevant to the specialty they are applying into. For example, those applying into neurology receive two separate evaluations, for their performance on the medicine and neurology rotations; those applying into neurosurgery receive separate evaluations for medicine and surgery.
 
Truly pass/fail schools with no other indicator of your likelihood to pass board exams will only make your Step scores exponentially more important to your future residency. They won't have anything else to go on as to whether you're a dud. And if you're an excellent test taker, you really don't have anything to worry about with pass/fail vs. grades. I wouldn't let it influence your decision too much. HTH.
 
Are 4th year grades the ones that ‘matter’ the most when it comes to matching?
Yes, there will of course be emphasis on 4th-year grades and on USMLE scores. But the best information about a residency applicant will come from "audition" rotations, where students rotate at an away institution for an entire month. It is hard to hide incompetence, craziness, or annoying behavior for an entire month. Obviously, it will behoove students to select appropriate places for their audition rotations, as it is only reasonable to do 2 or, at most, 3 away rotations. (Our school limits students to 2 away rotations in any one specialty.) It would not make sense to do an away rotation at a place that is too much of a reach.

And there are a lot of specialties that do not welcome visiting students, so those specialties will have to rely on the narratives provided by clerkship supervisors and USMLE scores.
 
Harvard's approach makes more sense imo
 
Until I got to your last paragraph, I was going to say I thought audition rotations were mostly for the highly competitive specialties among MDs, and for DO students.

Are audition rotations becoming more prevalent across most specialties? In particular, are the schools that have been reluctant to take visiting students transitioning to a more involved process?

It kind of sucks that it all just seems like a crapshoot. I can understand why the tippy-top medical schools have students matching into their best programs: if all of the evaluations look the same, the Step score and the name of the school are really all that matter, it seems.
 
My school has decided to do the same - P/F third year with graded fourth-year sub-Is. We're all very jealous of the MS2s, since they're the first class benefiting from the change. 😉
 
So should P/F third years be a major deciding factor in choosing between schools, assuming the schools are all of equal (high) prestige?
 
You really won't have many 4th-year grades to present when you apply - not more than a month or two.

Depends on the school. I had 4 sub-Is on my MSPE and could've fit in a 5th had I not taken time for Step 2.
 