Archives: The Future of Pathology Training


Johnny Sunshine
Full Member · 10+ Year Member
Joined: Feb 24, 2012 · Messages: 167 · Reaction score: 18
So, the March issue of Archives of Pathology & Laboratory Medicine is out. It seems that this month's issue, and also the upcoming April issue, include a series of articles under the heading "The Future of Pathology Training and Training Programs".

Click here for the list of articles from the March 2014 issue of Archives.

I am disappointed (but not surprised) to note a certain topic is not broached. Not a whiff. Can anyone guess what that topic might be?

Maybe one of the April articles will bring it up... 🙄
 
It's funny how they have an article about reducing laboratory costs right under an article about whole slide imaging, which is a waste of money. LOL
 
While I agree whole slide imaging is a waste of money, I like the idea of signing cases from the beach with a tablet. :joyful:
 
I get tired of "experts" claiming that whole slide imaging is the future and inevitable. Shows how out of touch these people are with reality. What lab wants to invest in this expense? You don't get paid any more money for doing it. Last time I checked, reimbursements were heading down. And you have to run validation studies. Just seems like a total pain. I foresee very little future for it.
 
One local hospital I know of has recently obtained a small scanner, with software to automatically evaluate for ER, PR, and Her2 immunohistochemistry for breast cancers by a computer algorithm. They are beginning with those IHC stains because "that's where the money is".

The machine may save some time, since presumably a pathologist wouldn't have to lay eyes on the slides. But even in a large medical center that is just a handful of slides per day. They say a machine is more consistent at evaluating percent positivity as well, but the benefit of increasing the accuracy by a few percentage points is not obvious to me. Besides, these machines habitually categorize normal internal control as "positive tumor," so their results are highly variable depending on the percent of normal vs. abnormal tissue within the specimen. I just don't think I can trust such results.
 
The scanner doesn't do it on its own. A pathologist selects the area to be analyzed. As long as the d bag pathologist isn't selecting regions with a lot of normal breast, it is very accurate.
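For what it's worth, the arithmetic these packages do is not mysterious. Below is a minimal sketch, in Python, of what a nuclear percent-positivity score boils down to, assuming a pathologist-drawn region of interest and pre-computed nucleus/stain masks; the names, thresholds, and toy data are invented for illustration and are not any vendor's actual algorithm. The second call shows the pitfall discussed above: sweep normally ER-positive benign epithelium into the region and the score drifts toward whatever the normal tissue is doing.

```python
import numpy as np

def percent_positivity(nucleus_mask, positive_mask, roi_mask):
    """Percent of segmented nuclei inside the ROI that stain positive."""
    nuclei_in_roi = nucleus_mask & roi_mask
    total = int(nuclei_in_roi.sum())
    if total == 0:
        return 0.0
    return 100.0 * int((nuclei_in_roi & positive_mask).sum()) / total

rng = np.random.default_rng(0)
h, w = 100, 100
nuclei = rng.random((h, w)) < 0.2            # toy nucleus "segmentation"

# Pretend the top half of the field is tumor with ~10% ER-positive nuclei
# and the bottom half is benign breast epithelium with ~90% positive nuclei.
positive = np.zeros((h, w), dtype=bool)
positive[:50] = rng.random((50, w)) < 0.10
positive[50:] = rng.random((50, w)) < 0.90
positive &= nuclei

roi_tumor_only = np.zeros((h, w), dtype=bool)
roi_tumor_only[:50] = True                   # pathologist circles tumor only
roi_sloppy = np.ones((h, w), dtype=bool)     # ROI that sweeps in normal tissue

print(percent_positivity(nuclei, positive, roi_tumor_only))  # ~10
print(percent_positivity(nuclei, positive, roi_sloppy))      # ~50, inflated by normal
```

The algorithm has no idea what is tumor and what is internal control; everything rides on what gets circled.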
 
Our group switched over to one of these image analysis programs, which allows us to charge for the computer fee involved. It's a bit more work (which is just a couple of mouse clicks) than just sending it out to a reference lab and having them do everything, but we get an extra $15 per breast panel...:meh:
 
Whole slide imaging (IMHO) will most likely be important in two situations:
1) Consults
2) Frozen sections, particularly at offsite labs.

The first will save money and time on transport. The problem will be being able to bill for it. The second is already being done.

For routine diagnosis it may be important in niche situations but will probably not attain huge importance, at least for a while anyway.

Slide scanning is really nice for tumor boards, picture taking, etc., however.
 
Flipped through some of it, starting with Suzanne Powell's intro [have to say I've never liked her...ever since she came to give our program a talk several years ago and shat all over anyone's aspirations to be anything other than a research-oriented ivory-tower academic schlub]. Jesus, what the hell are the "graphic representations of milestone self assessments" on pages 312-314? It's like a microarray of stupidity. I understand the desire people have to quantitate academic and professional achievement, but I can't help but get irritated at the overabundance of meaningless (or at least ambiguous) data.
I guess those that can, do, and those that can't go into politics, academia, and governing bodies and publish insane amounts of meaningless crap.
 
... It's like a microarray of stupidity....

LOL. I think they can use the data to identify molecular subtypes of residents.

I like how residents get more honest as they go from year 1 to 4. Also, I like how 5=proficient; yet none of the residents get a score of 5 for anything. Meaning they are not proficient at anything. It must be data from the Methodist Hospital program then.
 
This entire article validates the current system of indentured service, under threat, to large corporations that would like to continue their practice of not teaching and collecting corporate welfare, and that now expect everyone else to go along with the fraud of limited teaching, among other cardinal sins. There is no discussion of what protections are afforded the doctor in a residency program against abuses of power and corruption. When they figure that out, they can try again to write about education and milestones.
 
It's like a microarray of stupidity.

You should consider becoming a novelist.
 
We use whole slide imaging. I work at a big academic center and we receive consults from overseas.
 
LOL. I think they can use the data to identify molecular subtypes of residents.

I like how residents get more honest as they go from year 1 to 4. Also, I like how 5=proficient; yet none of the residents get a score of 5 for anything. Meaning they are not proficient at anything. It must be data from the Methodist Hospital program then.

Speaking of stupidity, a quick 20-second scan of the section shows that "Level 5 represents a cognitively and technically proficient provider of services who is in the early phases of independent practice and would typically be 2 or more years out of residency training".

Of course you wouldn't see many scores of 5.
 
Don't pathologists think it's embarrassing that the leaders of the field are absolutely ridiculous? I read this too, and it is a meaningless piece of garbage! What is the point of it? The only thing I've taken away from it is that the pathology education system expects that residents, once their training is complete, are still not ready to practice independently. Is this because the training is inadequate, the people recruited are inadequate, or the system is set up such that the perpetual presence of fellows is required for academic programs to run? I suspect all three are true. No wonder nobody wants to do pathology. Might as well just do radiology; at least nobody thinks you graduated at the bottom of the class.
 
The point is to somewhat quantify educational progress. Something that has proven difficult to do since, I don't know, the beginning of formal education systems some thousands of years ago. I don't see why some of you get your panties in a wad when someone writes an article on measuring resident progress - it's a pretty essential part of the training system.

My post was to make fun of the poster's thinly veiled insult of the author and her institution when he/she couldn't even take the time to understand the definition of a 5. Seriously...would you expect most residents to ever document that they believe they have the same skill set as somebody that has been independently practicing for 2 years given the context is for measuring competency advancement? The crap that some of you folks can pull from industry happenings or published articles is borderline conspiracy theory paranoia.
 
Those were some pretty boring articles. And the autopsy one just seems silly. Of course you integrate clin lab findings before death into the autopsy findings. Having to track down the PhD in charge of chemistry or microbiology is just a waste of time.
 
Don't pathologists think it's embarrassing that the leaders of the field are absolutely ridiculous? ... Might as well just do radiology; at least nobody thinks you graduated at the bottom of the class.

I have not read the articles. However, based on the previous posts, I wonder whether "The Future of Pathology Training and Training Programs" is akin to the Flexner report. If so, it would be a turning point for our specialty.
 
Whole slide imaging (IMHO) will most likely be important in two situations: 1) Consults 2) Frozen sections, particularly at offsite labs. ... Slide scanning is really nice for tumor boards, picture taking, etc., however.

In my opinion, this is an extraordinary technology that will revolutionize how we practice in the future. High cost, lack of high-powered hardware/software and (very) high-speed internet, and lack of regulatory infrastructure are hindering its wide adoption.
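To put rough numbers on the hardware and bandwidth point, here is a back-of-the-envelope calculation; the figures (0.25 µm/pixel at 40x, a 15 × 15 mm tissue area, 15:1 compression, a 100 Mbit/s link) are assumptions for illustration, since actual scanners, compression, and networks vary.

```python
# Rough size of a single whole-slide image, using assumed (typical-ish) figures.
um_per_pixel = 0.25          # ~40x scanning resolution (assumption)
tissue_mm = 15               # 15 mm x 15 mm tissue area (assumption)

pixels_per_side = tissue_mm * 1000 / um_per_pixel   # 60,000 px per side
total_pixels = pixels_per_side ** 2                 # ~3.6 gigapixels
uncompressed_gb = total_pixels * 3 / 1e9            # 3 bytes/px RGB -> ~11 GB
compressed_gb = uncompressed_gb / 15                # assume ~15:1 JPEG compression

link_mbps = 100                                     # assumed hospital link speed
transfer_min = compressed_gb * 8000 / link_mbps / 60

print(f"~{uncompressed_gb:.0f} GB raw, ~{compressed_gb:.1f} GB compressed, "
      f"~{transfer_min:.1f} min per slide over {link_mbps} Mbit/s")
```

Multiply something on the order of a gigabyte per slide by a few hundred slides a day and the storage, server, and network costs being complained about above follow pretty directly.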
 
... High cost, lack of high-powered hardware/software and (very) high-speed internet, and lack of regulatory infrastructure are hindering its wide adoption.

And lack of ability to bill for its use. Always follow the money.
 
Don't pathologists think it's embarrassing that the leaders of the field are absolutely ridiculous? ... Might as well just do radiology; at least nobody thinks you graduated at the bottom of the class.

Do you have a real healthcare job? Because if you did, you would realize that there is a massive trend in healthcare now to quantify just about everything in terms of a numerical value, so that it can be tracked, studied, and utilized for performance-based pay or other metrics. Almost every hospital now is "graded" based on a subjective and somewhat irrational collection of data related to patient experience or meeting arbitrary goals of questionable utility or significance (although many of these goals are truly valid and reasonable). Do you honestly think radiology isn't doing this too? And "practicing independently" does not always mean "completely efficient and proficient." This is true in every profession in the world - people do not reach their top performance immediately upon starting their career. If the existence of a 5 on a 5 point scale was equivalent to "adequately trained and competent" what would that even mean? If you are able to train and produce residents that are 5/5 on everything after completion of training it will essentially mean that trainees and practitioners have been reduced to slightly autonomous robots who are completing algorithms that can be memorized at the expense of real clinical decisionmaking and the importance of experience.

I think you need to use a little more common sense here.
 
Do you honestly think radiology isn't doing this too? And "practicing independently" does not always mean "completely efficient and proficient." This is true in every profession in the world - people do not reach their top performance immediately upon starting their career.

My program director has stated that all residencies, in all specialties, will eventually be moving to the "5 point" system. Pathology is just one of the relatively early guinea pigs.

And no one is expecting 5's from graduating residents. They expect at least 3's to complete the program, but preferably mostly 4's. They fully acknowledge that one would have to be several years out of residency/fellowships to have a chance of attaining "all 5's". The system was apparently designed intentionally to be that way.
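To make those expectations concrete, here is a toy sketch of the idea in Python; the year-to-level mapping and the half-point adjustment are invented for illustration and are not from the Archives article or any program's actual rubric. It just captures the point that, by design, nobody still in training reaches a 5.

```python
# Toy version of the milestone expectations described above: an expected level
# per training year, capped below 5 because level 5 is defined as someone
# already ~2 years into independent practice. Numbers are invented.
EXPECTED_LEVEL = {1: 1.5, 2: 2.5, 3: 3.5, 4: 4.0}   # PGY year -> expected milestone

def assign_milestone(pgy_year: int, above_expectations: bool = False) -> float:
    expected = EXPECTED_LEVEL.get(pgy_year, 4.0)
    score = expected + (0.5 if above_expectations else 0.0)
    return min(score, 4.5)                          # nobody in training reaches 5

print(assign_milestone(4))         # 4.0 -- "mostly 4's" for a graduating resident
print(assign_milestone(4, True))   # 4.5 at best, still short of level 5
print(assign_milestone(1))         # 1.5 for a typical first-year
```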
 
... And "practicing independently" does not always mean "completely efficient and proficient." This is true in every profession in the world - people do not reach their top performance immediately upon starting their career. If the existence of a 5 on a 5 point scale was equivalent to "adequately trained and competent" what would that even mean? If you are able to train and produce residents that are 5/5 on everything after completion of training it will essentially mean that trainees and practitioners have been reduced to slightly autonomous robots who are completing algorithms that can be memorized at the expense of real clinical decisionmaking and the importance of experience.

I think you need to use a little more common sense here.

Our training program always had these types of resident grading systems. I always hated them because they pretend to be objective but are just as subjective in practice as any other system. For example, looking at the "Example Pathology Milestone" of frozen sections, level 1 is someone who is not even doing the frozen (a med student) or a novice who requires direct supervision. OK, I get that. Level 2: a resident who understands the contraindications to the frozen and how to call back the results. OK, so basically every resident after 3 frozens is a level 2. Level 3: does the frozen with skill and quality; level 4: AT THE END OF RESIDENCY, does #3 with consistency. So really there is nothing to differentiate level 3 and 4 other than subjectivity and finishing residency. Level 5 is being proficient. Whatever that means. This also has a time requirement. I don't know about your training program, but at mine all frozens had to be called in within 20 min., and this was true for residents/fellows/attendings.

What ends up happening in reality is faculty read the nonsense above and just say, "is this resident performing as expected for a [appropriate year] resident?" and then give them whatever score that is. For each institution, this will still be subjective based on the relative quality of their residents, and they still won't have what they sought: a way to compare trainees from different systems. In the end, I think they more or less proved this with their data, in that all 4 programs had similar scores. They should have included some crap programs here too, and seen that the resident scores would probably also have been the same.

Then there is the whole proficient thing. It means either just "competent" (Google dictionary) or "well-advanced" (Merriam-Webster). So depending on who is reading the skill points, it will have a different meaning. Not only that, residents should at least be competent by the end of their training. This is not something that needs debate.

I understand grading honesty and responsibility and professionalism, but using this scale is kinda silly. How does one improve on honesty? Is this something taught in residency or an innate character trait? Of course it should be followed, but this does not require a 5 point system. I guess you can only be truly honest once you are a minimum 2 years out from residency.
 
I agree.
I'm not shocked or surprised by what's going on, I'm just annoyed at the level of documentation of metrics-based assessments in modern medical education (and practice), and I'm particularly annoyed with this approach in pathology because, as GB points out, it's entirely subjective...pathology residents are entirely 100% insulated from anything other than 'fear' of failure. Every other field has a direct patient diagnostic/therapeutic impact; path residents are sheltered from having any meaningful impact outside repercussions from faculty, which is to say, pathology faculty have more of a role in shaping path residents than many institutions acknowledge. In that vein, a disproportionate amount of time is spent in the gross room vs. at the scope, IMO.
 
I understand grading honesty and responsibility and professionalism, but using this scale is kinda silly. How does one improve on honesty? Is this something taught in residency or an innate character trait? Of course it should be followed, but this does not require a 5 point system. I guess you can only be truly honest once you are a minimum 2 years out from residency.

I don't disagree with this. IMHO it gets worse when you are in practice and the "patient satisfaction" and other such mandates for quality begin to count more. It reminds me a lot of the car dealership thing: "if you are satisfied, give us a 5, because otherwise we get docked pay." It's ludicrous. Everyone has different definitions of exceptional performance, yet it is still stratified that way. I never give anyone or anything the top point on the scale because there is always room for improvement (not a bad thing!), but I know other people who give everything a 5.
 