metrics to evaluate residency program?


coredump
Anyone know what metrics the ACGME (or any other agency) uses to evaluate pathology residency programs?

For example: percent graduating residents offered a job/fellowship? percent graduating residents passing the board exams?
 

Board pass rate.
 

Well, I'm sure the ACGME uses board pass rates, plus attrition rates (transfers out, quitting, etc.), plus evaluations and disciplinary actions. But that's not that relevant to rising residents. The better question is how students should evaluate pathology residency programs if they had access to any data they needed.

Well, I don't think the first one really exists, because so many residents leave one program and do a fellowship at another. Programs that send all their residents to fellowships elsewhere would have a 100% fellowship acceptance rate, excluding those who leave the country, leave the field, or the like.

% offered a job is probably not the greatest thing either, because true unemployment among pathologists is exceedingly low. NUMBER of job offers and TIMING of job offers would be good to know, but that data is too complex to be of any use in reality. Too many variables. Board pass rate is the best assessment of competence.

A really good metric would be "where are your graduates 5 years after they finish?" Because at that point they would have finished their 1-3 fellowships and be in their first, or often second, job. If 5 years after graduation all your residents are happy and productive, it's a good program. If after 5 years they are all working in 2-man groups, covering 6 hospitals, and still looking for a better job, it's not as good a program.
 
But that's not that relevant to rising residents.

I agree, but my curiosity does not lie in evaluating prospective programs.

I agree the metrics are not perfect, but it wouldn't stop programs from trying to show sufficient evidence of meeting (or exceeding) regulatory goals - especially if failing to do so would incur penalties, such as probation or diminished attractiveness as a place to train.

If there were no metrics at all, then programs would have less incentive to help residents secure fellowships and/or jobs. I'm fairly certain board pass rates are monitored, but wasn't sure about the placement rates.
 
The decision is entirely personal. The number one most important thing is the alumni contacts you have access to. And there is little way to evaluate this during the match. And of course if you ask, everyone will say they have good alumni contacts for job searching, so it is useless to try to find out. Roll the dice and pick a program that you like.
 
I'm fairly certain board pass rates are monitored, but wasn't sure about the placement rates.

Why are you fairly certain that board pass rates are monitored? I see no evidence that is the case where I am. I also see little evidence of any incentive to help residents secure fellowships or jobs. I doubt either of those things matter to the ACGME. The ACGME seems to care about work hours, whether people are being abused, evaluations, whether the training is broad enough, stuff that happens in the present while a resident is training, not so much about outcomes.
 
Why are you fairly certain that board pass rates are monitored? I see no evidence that is the case where I am.

According to the ACGME Common Program Requirements:

C. Program Evaluation and Improvement
1. The program must document formal, systematic evaluation of the curriculum at least annually. The program must monitor and track each of the following areas:
  a) resident performance;
  b) faculty development;
  c) graduate performance, including performance of program graduates on the certification examination; and,
  d) program quality. Specifically:
    (1) Residents and faculty must have the opportunity to evaluate the program confidentially and in writing at least annually, and
    (2) The program must use the results of residents’ assessments of the program together with other program evaluation results to improve the program.
2. If deficiencies are found, the program should prepare a written plan of action to document initiatives to improve performance in the areas listed in section V.C.1. The action plan should be reviewed and approved by the teaching faculty and documented in meeting minutes.

Note, "deficiencies" are not further defined. "Plan of action" just needs to be developed. None of this has any teeth.
 