I'll try to refrain from making political commentary about the individual mentioned in the article who doesn't understand the difference between proficiency and growth - oops.
Medical education does need to change, and teaching material in a different order (i.e., the shift from a traditional to a systems-based curriculum) does not constitute a curriculum overhaul, regardless of how med school PR departments spin it.
First, let's start with how material is presented. At my school, we have non-mandatory lectures that are recorded. The same lectures, with very minor changes, are taught year after year, and we also have access to lectures from previous years. So what's the benefit of going to lecture? The benefit of live lectures is supposed to be the ability to interact with your instructor, ask questions, etc. Our faculty are open to basic questions, but if a question would derail the class from the lecture, they usually defer it to after class or during breaks. They don't mean to be rude, but they often have far too many slides to cover in an hour anyway. Additionally, almost any question a medical student comes up with can be answered by searching the web. This effectively removes any benefit of live lectures; they are no more effective than pre-recorded ones. Thus, our live lecture attendance is absolutely abysmal.

Students now have two options for learning via lectures: watching lectures given by professors from our school, or watching lectures given by professors who have been vetted by thousands of medical students around the country (Kaplan, Pathoma, Goljan). The lectures from vetted professors are often well organized, explain concepts clearly, and provide 'high-yield' information that students can use to build their foundation of knowledge. Not only that, but companies in the business of providing high-quality content have a vested interest (read: profit $$$) in using the latest technology and animations to make learning efficient for students. What motive do faculty members have to completely revamp their 30-year-old slides? None. Class lectures end up being an obstacle to passing in-house exams, which are themselves obstacles to studying for Step 1.

That takes care of lectures; what else do students use to study? Well, most people I know go through some version of the following process:
1) Watch lectures and take some sort of notes to organize the information.
2) Find a way to master the information (reading through notes, flash cards, drawing diagrams, making tables).
3) Do questions to assess their mastery and fill in gaps. Rinse and repeat for every block.
Schools really do not offer structured activities for the later steps in this process; all they offer are lectures, which cover only the first step. My school does run small-group activities where we work through cases, but they literally copy/paste a classic patient presentation out of FA and present it as a case. We also don't get any practice questions or cases from faculty, which would be helpful since board-prep Qbanks often write questions that aren't anything like what you'd face when seeing real patients. The one thing I do find valuable at my school is the clinical skills component. Of course, this is best learned in person with faculty supervision.
Next, I'll address the issue of P/F grading systems leading to the increased importance of Step 1. Again, I'll pull from my own experience. It is not difficult to pass in-house exams at my school. It is entirely possible to scrape by every block, pass all in-house exams, and then be screwed for Step 1; my school showed us data proving exactly this. So why aren't our exams more difficult? Why is the bar for passing a block exam incongruent with the minimum amount of information a student should have mastered from that block? I imagine this is a common situation at other schools. The bar for passing in-house exams should be set such that passing students are on a reasonable trajectory to become competent physicians. The same holds true for board exams: the passing marks for Step 1, Step 2, etc., should be set so that a passing score reasonably ensures an appropriate level of competence.

Taking Step 1 as an example, it is an exam that assesses mastery of pre-clinical knowledge. Its passing score should reflect the amount of knowledge any M2 should have obtained during the first two years, regardless of desired specialty. The way residencies handle Step 1 scores assumes this isn't true: apparently a medical student wanting to match family medicine should have mastered x% of pre-clinical material, while a student wanting to match derm should have mastered 3x%. This is absurd, considering all medical students are taught a similar body of information during the pre-clinical years regardless of what specialty they want to pursue. I get the supply/demand argument for competitive specialties, and the argument that specialty-specific board exams may be more difficult in some fields than in others. These arguments only carry so much weight, though: are higher Step 1 scores unequivocally correlated with higher pass rates on specialty-specific board exams?
If so, is the correlation so strong that a student must score 250+ on Step 1 to pass the derm boards? The difference in pre-clinical knowledge mastery between a 192 and a 250+ on Step 1 is astronomical. Does a student really need that much more mastery of pre-clinical knowledge to be on a trajectory to pass the derm boards? Highly unlikely.
I have more concerns and questions than solutions and ideas. All I know is that we need forward-thinking people to find better ways to teach the pre-clinical years. As I've said in other threads, my pre-clinical education could have been largely replaced with about $1k worth of resources produced by companies that have been vetted by thousands of medical students from around the world.