as in no rankings, no honors, just truly P/F
> no AOA either

why? do residencies use the fact that some schools have alpha omega alpha as a way to figure out the top students in the class?
> I wouldn't mind AOA as long as the rankings aren't reported to residencies, which I know of at least 1 school that does that.

what about BU?
For AOA to be determined, there may be some sort of internal ranking system (even if the curriculum is supposedly P/F). These internal ranks can be included by medical schools on residency apps. So those schools don’t have “true” P/F, where everyone who passes is really viewed the same regardless of how high they passed.
> It is internally ranked. Generally, the only true P/F schools without internal ranking are top 10/20, but it's good to double-check.

but do they report the internal ranking to residencies?
> but do they report the internal ranking to residencies?

Hofstra, a true preclinical P/F school, includes a "Comparative Performance" section in the MSPE:

"The overall comparative performance category is based upon clerkship grades and USMLE Step 1 scores and it is determined by the committee commissioned by the Dean. There are five categories used and they are NOT based on strict quintiles. The five categories are: Very Good; Very Good to Excellent; Excellent; Excellent to Outstanding; and Outstanding."

> Hofstra, a true preclinical P/F school, includes a "Comparative Performance" section in the MSPE: ...

that's good to know as I am waitlisted at Hofstra! but I was wondering specifically if BU reports internal ranking to residencies? 😱
There are some top schools with no internal or external ranking at the level of the school. Many do not have AOA anymore either. And some have gone, or are going, P/F for preclinical and clinical years. Some are even petitioning the USMLE to go P/F for the Step exams. It makes some sense at the tippy-top schools because there are so many fantastic students, but the residency PDs complain that they cannot tell the students apart. One of the BMS-affiliated (best medical school!) residency programs shies away from students from a couple of top medical schools, because they are so frustrated with the ungraded, unranked system.
However, within the specialty at a given medical school, they might internally rank the candidates and put that information in their LORs for the residency application. Keep in mind, however, that any internal ranking within a specialty might not be very objective and might favor those doing research with the chair or program director.
In EM, each med school with a residency writes a SLOE (Standardized Letter of Evaluation) for each student, and they essentially have to rank the students compared to the others applying that year across several domains. Orthopedics has been doing it for years at many schools. Many other surgical subspecialties also rank the students applying, as does general surgery. IM just started doing it at our school a couple years ago - they do not numerically rank the students, but they do identify those in the top 5% or so, and then another group that has distinguished itself as the top 15% or so. It just makes me uncomfortable, as I am not sure what factors they are using for these rankings.
> It would seem that any kind of evaluation during the clinical years would be difficult. Aside from the two ends of the spectrum - the students who just absolutely kill it and the students who you think may kill a patient - how do clinical evaluations get evaluated when they are inherently so subjective?

It is really tough. It can sometimes feel like a popularity contest - the students who are best-liked by the residents often do get better evaluations. That is why a lot of schools are going P/F. And it never seemed fair to me that some students got assigned to spend a month with a notoriously tough grader who went to a school where only 10% of students got honors back in the day - and that person would think a "PASS" was ok to give out at our school, while others gave "HONORS" as the default grade.
> It is really tough. It can sometimes feel like a popularity contest - the students who are best-liked by the residents often do get better evaluations. ...

Is there any push at the NRMP level (I don't know specifics of any organization names) for standardization of clinical evaluation letters across medical schools?
And a study looking at even the words chosen in these narrative evaluations showed implicit bias that hurt women as well as URiMs. So we do have a long way to go. The best way to evaluate students for a residency is to have them come do a visiting rotation for a month. Obviously, students are limited by how many "audition" rotations they can do, and some specialties (IM, neurology, peds, psych, derm) have not really warmed up to the idea of accommodating visiting students... But in ortho, neurosurgery, plastics, ENT, it is almost essential that students rotate away, and some programs will not even rank a student who has not done an away rotation at their place.
> I'm pretty sure Mayo is true P/F. They don't do internal rankings and don't have AOA.

Mayo looking very appealing
> At the end of the day I personally would want to have pass/fail all four years of medical school because I'd prefer learning how to be a great clinician, to learning how to take standardized exams.

Frankly I think the best way to learn is under the pressure of having to perform. If you're not worried about how you will be evaluated based on your work, you probably won't perform as well. I know I wouldn't have pushed myself as much on my notes, my research, and my shelf studying if I knew I was going to get a P regardless. My learning happened most at the edges of my comfort zone, putting myself out there with an A/P on rounds and backing it up, even if I was off the mark.
I know in the Army, for evaluations there is a specific template and format where the top 25% gets "exceeds course standards", the next 30% down receive "met course standards", the next 25% down get "marginally met course standards" and the next 25% down get "did not meet course standards." And then it is literally just the same words in between the different categories of evaluation with 2 or 3 sentences of unique student features at the end. Something tiered like that seems like it would work for HH/H/P/LP/F
> Is there any push at the NRMP level (I don't know specifics of any organization names) for standardization of clinical evaluation letters across medical schools?

The EM standardized letter of evaluation that is written for each student by their home institution and anywhere else that they rotate looks like the Army system. Students are evaluated in tiers like this, in several different domains. So even if a school is entirely P/F, if they go into EM, the EM department is forced to rank them. And they will be ranked by any visiting rotations.
> The EM standardized letter of evaluation that is written for each student by their home institution and anywhere else that they rotate looks like the Army system. ...

That seems like an unfortunate necessity for the competitive nature of many specialties.
Ortho and ENT and some other specialties are also moving in this direction as a way for the residency program directors to be able to distinguish the candidates.
> That seems like an unfortunate necessity for the competitive nature of many specialties.

The PD I work with told me that these letters are starting to become just as useless, since the vast majority of applicants are allegedly in the top 5% according to those standard letters. Instead of ranging from lukewarm/good/excellent, the letters now range between excellent/amazing/best I've ever seen, so "reading between the lines" is still as important as ever.
> The PD I work with told me that these letters are starting to become just as useless, since the vast majority of applicants are allegedly in the top 5% according to those standard letters. ...

Maybe instead of letters, schools should just have to assign those "good" to "excellent" categories in a national database where PDs can search through a school and find their applicant. Idk
> Frankly I think the best way to learn is under the pressure of having to perform. ...

This is probably a "different strokes for different folks" kind of thing. I also had grading on rotations, and I personally found it to be terrible for my learning, because learning itself was de-emphasized. After all, the students who knew things already looked better and got the H. If you needed to learn, you weren't performing well enough.

Post-MSPE fourth year rotations have by far been better for my learning, because instead of stressing out over how I'm perceived/whether I'm doing enough for honors, I get to focus on learning from the clinicians I'm working with. So it very much depends on your personality.

That is why my school is going P/F for third year. Fourth year sub-internships will still be graded.
> That is why my school is going P/F for third year. Fourth year sub-internships will still be graded.

Are 4th year grades the ones that 'matter' the most when it comes to matching?
> Are 4th year grades the ones that 'matter' the most when it comes to matching?

Yes, there will of course be emphasis on 4th year grades and on USMLE scores. But the best information about an applicant for residency will come from "audition" rotations, where students rotate at an away institution for an entire month. Hard to hide incompetence or crazy or annoying for an entire month. Obviously, it will behoove students to select appropriate places for their audition rotations, as it is only reasonable to do 2, or at most, 3 away rotations. (Our school limits students to 2 away rotations in any one specialty.) It would not make sense to do an away rotation at a place that is too much of a reach.
> Harvard is unranked P/F in preclinical and the principal clinical year. Advanced clinical electives after second year are graded. Each student receives 1 or 2 summative evaluations in a discipline relevant for the specialty they are applying into. For example, those applying into neurology receive two separate evaluations for their performance on medicine and neurology rotations. Those applying into neurosurgery receive two separate evaluations for their performance on medicine and surgery.

This makes more sense imo
> Yes, there will of course be emphasis on 4th year grades and on USMLE scores. But the best information about an applicant for residency will come from "audition" rotations, where students rotate at an away institution for an entire month. ...

Until I got to your last paragraph I was going to say I thought audition rotations were mostly for the highly competitive specialties among MDs and for DO students.
And there are a lot of specialties that do not welcome visiting students, so those specialties will have to rely on the narratives provided by clerkship supervisors and USMLE test scores.
You really won’t have so many 4th year grades to present when you apply. Not more than a month or two.