ePortfolios & assessment of surgical skills


hajiyev · Full Member · Joined: Aug 27, 2021 · Messages: 15 · Reaction score: 0
Good day!
I'm researching ePortfolio usage among surgical residents around the world, as part of a thesis project. The following are questions some of you may have insight on.
The topic is "Application of the Technology Acceptance Model (TAM) in Digital Portfolios"...

Any insight on this narrow topic is highly appreciated (including telling me that a question below is totally invalid!):

  1. How much would you say the points below are pain points in surgical ePortfolio usage? What may cause portfolios to be viewed as ‘tick-box’ exercises rather than as educational tools?
    • Manual, time-consuming & error-prone data entry
    • Most evaluations in the form of simple text
    • Hawthorne effect (observer effect) or inter/intra-rater reliability
    • Complicated user experience with portfolio management system interfaces
    • A variety of assessment tools that supervisors must occasionally take time to go through
  2. In countries like the US & UK, surgical cases are logged separately from CBME learning instances/milestones. How much friction do you think this creates for wider acceptance of digital portfolios? (Or would you say that, since logging is a regulatory requirement, the question doesn't even arise...?)
  3. Do you think it's ever possible to innovate in surgical competency assessment, given country-specific needs and top-down controlled curricula?
  4. How would you rate the chances of the following novelties disrupting how surgeons are evaluated:
    • Addition of video analysis of recorded surgical operations into the portfolio
    • Combined analysis of all data in current portfolios by AI algorithms, i.e. continuous evaluation of data collected in a portfolio rather than occasional supervisor review and feedback
    • Combining progress data of peer residents to guide individual users
    • Other?
  5. The original TAM holds that Perceived Usefulness and Perceived Ease of Use are the primary factors in how a new technology is accepted. Which of the two would be critical for acceptance of a surgical training portfolio tool?
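To make question 5 concrete, the original TAM structure (Perceived Ease of Use partly shaping Perceived Usefulness, and both driving intention to use) can be sketched as a toy scoring model. All weights and inputs below are made-up illustration values, not fitted estimates:

```python
# Toy sketch of the original TAM (Davis, 1989): PEOU partly shapes PU,
# and both drive behavioral intention. All weights are hypothetical.

def tam_intention(peou: float, pu_direct: float,
                  w_peou_on_pu: float = 0.3,
                  w_pu: float = 0.6,
                  w_peou: float = 0.2) -> float:
    """Return a 1-7 intention score from 1-7 Likert-scale inputs."""
    pu = pu_direct + w_peou_on_pu * (peou - pu_direct)  # PEOU pulls PU toward itself
    intention = w_pu * pu + w_peou * peou + (1 - w_pu - w_peou) * 4.0  # 4 = neutral
    return max(1.0, min(7.0, intention))

# Useful-but-clunky vs. easy-but-useless tool:
print(tam_intention(peou=2.0, pu_direct=6.5))
print(tam_intention(peou=6.5, pu_direct=2.0))
```

The asymmetric weights are the point of the sketch: if usefulness dominates (as TAM studies of workplace tools often report), a portfolio perceived as genuinely useful can survive a clunky interface, but not the other way around.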
thanks a lot!

The vast majority (perhaps all?) of surgical residents who frequent this sub forum are US grads in US residencies and we do not have surgical eportfolios. I don't even know what that is. We just have the ACGME case log system and some milestone thingy that our program directors fill out every six months.
 
I know, and this is also a kind of feedback I wanted to hear.
Aren't the ACGME case logs also a kind of portfolio? I mean, you residents can log in, review, and evaluate past performance through that system, can't you?
When you say "milestone thingy" (the CBME part of it)... do you mean that, as surgeons, you consider it more of a bureaucracy than an assessment system that helps grow surgical skills?
Maybe part of the problem (if there is a problem, of course) is that those piles of information about the learning process - cases, milestones, evaluations, etc. - are only there for a supervisor or program director to check once every six months (or to get certified). The information should be analyzed more frequently and intelligently and become a real tool for growth.
 

From my experience, trainees only log cases in the system; there is no evaluation of the case in that system. You simply log a code that counts toward the minimum cases required for graduation. Cases are typically evaluated in other ways at conferences: interesting or complex case conferences, morbidity and mortality conference, etc.

I think milestones are an attempt at a more equitable and predictable means of evaluating residents as they progress through training. I don’t know anyone who thinks it is a perfect system, but I think overall it is an improvement over the prior way of doing things, which could seem more capricious.

I know Ortho and plastics have to keep some kind of portfolio of the cases they do in their first year in practice for their oral boards. But there is already a system for that, and I don’t know that there is a point to overhauling an entire system that already works well to create the kind of portfolio system you are talking about. 🤷🏼‍♀️
 
ACGME publishes a yearly case log national report where generalized statistics can be seen - though not at the individual level. I mean, if any intelligent analysis can be made from those statistics, it isn't there. Wouldn't it be valuable if a resident could see, on a continuous basis, how their case mix differs from the overall population of residents across the US?

Milestones are a good system for assessing how good a surgeon you are becoming in all aspects - not only clinical skills. But how valuable is the system if many see it as a "tick-box exercise" (I'm quoting a research article here)? Or, as @Lem0nz called it, a "milestone thingy"...

Btw, I'm still confused about why case logs are separate from the "Medical Expertise" or "Patient Care" competency. It seems logical that logging a successful surgical operation on a patient would count as progress in one of those categories...

I'm only a researcher in this field, of course, and what I'm talking about might be absolutely meaningless. But from what I have seen so far, almost everywhere - US, UK, Australia, Canada, Netherlands - portfolios are just what they are: collections of data or observations, be it about procedures done in the past or demonstrations of a certain capability (e.g., talking professionally to a patient).
Some individual (supervisor, program director, RRC, or the resident himself) has to go through it once in a while and derive conclusions. There is no automation at all...
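The peer-comparison idea above could start very simply. A minimal sketch, with invented category names, counts, and threshold (real data would have to come from the ACGME case logs and the national report):

```python
# Compare one resident's case-category mix against a national distribution,
# flagging categories where the resident lags noticeably behind.
from collections import Counter

# Hypothetical national shares and one resident's logged cases:
national_share = {"laparoscopy": 0.30, "hernia": 0.25, "endoscopy": 0.20,
                  "trauma": 0.15, "vascular": 0.10}
resident_cases = Counter({"laparoscopy": 40, "hernia": 35, "endoscopy": 5,
                          "trauma": 15, "vascular": 5})

total = sum(resident_cases.values())
for category, expected in national_share.items():
    observed = resident_cases[category] / total
    if observed < 0.5 * expected:  # arbitrary "lagging" threshold
        print(f"{category}: {observed:.0%} of cases vs ~{expected:.0%} nationally")
```

Run continuously as cases are logged, even something this crude would surface a gap (here, endoscopy) long before a six-month review.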
 

We see program operative statistics when interviewing for residency/fellowship, which is helpful when decisions are being made about how to rank programs for the Match. Each program presents them at interviews (and typically shows chief operative logs).

Milestones were initiated in part because of the admission that case numbers alone were not enough to evaluate progression. The idea is that you need both to fully evaluate surgical competency. Residents are evaluated constantly, and repeated errors or deficiencies are noted and corrected both in real time and, as I mentioned before, at conferences like M&M.

I’m still not really clear on what type of information would be included in this “portfolio” but overall it sounds like a ton of extra work with questionable benefit.
 
You need to look at the C-SATS program from J&J. It is the closest thing to what you are asking for and about. It is still in its infancy and is not widely adopted. It is probably the future direction of surgical training, but it will take another 10-20 years to roll out completely and may STILL not be universally adopted; I would be very surprised if some of the smaller surgical programs had the finances and ability to invest heavily in something like that.

There are other initiatives to record surgical residents' technical abilities in either a training environment or in real life in the US system, but all of them are experimental, and other than C-SATS I do not personally know of any that are run across more than a single training program.

We just haven't needed that as a country yet. As we continue to further specialize we might because some surgical trainees are losing important open surgical skills. Alternatively, we may continue to evolve the MIS space so well that we create more optimal bailout procedures that don't require open surgery at all and then... I'm just wrong. Which would actually be kinda neat.

But I doubt it.

And for what it's worth, the concept of C-SATS is really cool, but in actual real-world practice I would not find it useful. For example, I'm 3 months in as an attending, and while it would be really nice to have my cases reviewed by an attending surgeon who has been doing this for 20-30 years to evaluate economy of motion and surgical safety of my movements, in reality it would be absolutely silly, because the reasons prompting those inefficiencies are *so variable* and may have nothing to do with my technical ability as a surgeon. For example - I'm in the HPB space, and none of the nurses or my NP assistant have any experience with my large operations. So I'm spending way more time teaching them how to safely and efficiently assist me than working on improving my own skills at the moment. Something like C-SATS would not be able to factor that in or offer particularly useful feedback, which is why we still use a mentoring and/or proctoring system in the US when we're adopting new techniques, approaches, or building programs. Mentors and proctors can generally give much more holistic and practical advice.
 
Thanks for that tip - C-SATS really seems to be a great tool, and something close to what I described before...
I agree with your concerns about variations in context, and C-SATS can probably do something about that: it's not totally automated, after all, and when you upload a video (which is a choice - you may exclude non-representative ones) you can include other information to create a realistic picture.
That said, one of the C-SATS videos I watched on YouTube aptly mentions "creating a safe & protected learning environment" as one of its goals. Basically, it makes the point that the way a surgeon operates is very personal; "it's my business, it's my life, I'm dedicated to getting better, but when you start pulling covers that close to home it gets personal..." is literally what the surgeon said...
As you say, in 10-20 years it may become more widely accepted...
 
Agreed that case logs - especially their raw quantity - do not create a full picture of how a surgeon is progressing.
Milestones as a separate library (portfolio) of proofs are a good complement.
Things like video analysis may make portfolios richer still. C-SATS has a library of uploaded & rated videos - it could be nice to have that integrated into your ACGME portfolio, don't you think?

And when you think of all that textual information in the milestones & case logs, wouldn't it be nice to have some kind of intelligent system sitting on top that produces useful analytics? I mean, you mentioned that those portfolios are reviewed quite infrequently, and by humans. Such a system could work by constantly asking questions like: Under what topics (competencies) is a new activity registered? How often did it happen in the last few weeks? How fast is the progress compared to peers? What keywords come up most frequently in those instances? How short or long is the feedback? Etc...
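The questions above are mostly simple text analytics. A minimal sketch over invented evaluation entries (real input would be the free-text milestone and case-log comments):

```python
# Keyword frequency and feedback length across free-text evaluation entries.
import re
from collections import Counter

# Invented example entries; real ones would come from the portfolio.
entries = [
    "Performed laparoscopic cholecystectomy; good economy of motion.",
    "Needs improvement in suturing under tension; laparoscopic skills solid.",
    "Excellent communication with patient; consent discussion thorough.",
]

words = Counter()
for text in entries:
    words.update(re.findall(r"[a-z]+", text.lower()))

avg_len = sum(len(e.split()) for e in entries) / len(entries)
print(words.most_common(3))  # most frequent terms across all feedback
print(round(avg_len, 1))     # average feedback length in words
```

Trending such counts week over week, rather than reading everything at the six-month review, is exactly the "continuous analysis" being proposed.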
 
Sure, video analysis sounds great, except you have to have a widely validated system of evaluation and you have to be able to provide every program with the necessary technological tools to film these procedures, transfer them to this portfolio, rate, etc. Seems like we're a ways from having something like that realistically utilized across surgical training programs due to validation and resource limitations.

Milestones are a nice complement when used properly, though I am not clear that most programs and trainees actually use them in a meaningful way. And studying who is and isn't, and why, is really difficult for a number of reasons. Yes, it would be nice to have an objective, intelligent system integrated with these milestones to improve evaluations, and that is already happening. EPAs (entrustable professional activities) are a new form of competency evaluation in the works now and will be integrated as a standard tool in the ACGME evaluation process in the next 2-3 years for gen surg. Vascular surgery is also writing EPAs now. Not sure who else. I won't delve into explaining them here; there are plenty of publications on the topic. But the big-picture goal is to integrate them with the current milestones system to create more meaningful feedback and evaluation for trainees.

I have never heard of a surgical eportfolio frankly. Unless you're basically describing something like New Innovations, which is just a website where all these things are uploaded and organized (evaluations, milestones, duty hours, contract, anything related to your performance). This is what many places use to track all of these elements. Of course the formal case log system is separate, but I'm not sure that matters.
 

I’m only a couple of years out of training, so I don’t think I’m quite at the old dinosaur level yet, but honestly, no. I don’t think video analysis would be that useful in a portfolio format. Also, you overestimate the recording capability of the average OR. Some places are really set up for it, which is where those operative videos come from. But the average OR isn’t set up to record automatically, and a lot don’t have the ability at all. You also have to take into account that there are a lot of different ways to skin a cat, and each attending surgeon has their own preferences and quirks for how they want a resident to do something. If you show a group of surgeons a given video, there will be several opinions about the ideal technique or suture, etc., that they would use in a given scenario.
 
You also have to take into account that skinning cats is just plain weird so when you factor in how many of us do that it just confounds all of the data.

God damn cats.

#catdad
 
This comment made me look into the use of video training for surgical skills, because frankly I haven't ever used it, and for what it's worth, there's some pretty convincing evidence out there that it is a very effective training tool! I found this nice systematic review:


The big-picture message is: 'When compared with a nonvideo training group, video training was associated with improved resident knowledge (100%), improved operative performance (81.3%), and greater participant satisfaction (100%)'.

Anyway, it's always nice to be challenged on our perspectives and to learn something new! Everyone learns differently, so I respect that it isn't something you find helpful, but it's important to keep in mind that, based on the current evidence in the literature, the vast majority of those we train would seem to benefit from it.

EDIT: But, completely agree with you that it would be a resource-intensive evaluation system that would be difficult to implement in many (maybe even most) programs.
 
My two house panthers can’t read so shhhh they’ll never know about my horrifying metaphor. 😂
Good that you said that, or I'd have kept imagining operations on cats as part of surgical training o_O
 
Personally, I'm surprised that "ePortfolio of a surgeon" sounds strange in the US, when it is already actively used, e.g., in the UK.
I'm not a surgeon, so I may not feel the exact context, but it really should be a damn useful thing to have all your achievements, experience, and learning available at the click of a mouse - the ability to create a CV, to reflect back on your experiences, etc.
EPAs, as I understood them, only break the competencies down into smaller sub-units. In that sense they help identify what stage of competence acquisition a resident is at and where he needs to go faster...
It is not a completely new evaluation system, and I was expecting that Milestones were already based on EPAs (you are telling me they are not).
 
Yes, that is all available in New Innovations (or at least that is what every place I have ever trained at uses). I have never heard the term 'eportfolio', but it is exactly what you're describing.

EPAs are actually intended to encompass and integrate multiple milestones, not break them down further.
 
What is "New Innovations"? an application used in your residency program? Can you please share a link so I can study that as well?
 
Can you guys also comment on how big the "New Innovations" or "milestones" data gets per resident?
I mean, if ACGME requires you to record 850 cases during postgraduate training, that already makes that many small text logs, and then you probably have how many milestone instances to record on top? Would you say that 1,500 is a good estimate of how many textual records (small or big) need to be recorded and analyzed to capture a resident's progress?
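For scale, a rough back-of-envelope: the 850-case figure comes from the question above, and every other number is an assumed count, so treat the result as an order-of-magnitude guess only:

```python
# Order-of-magnitude estimate of per-resident text records; all
# multipliers below are assumptions, not official ACGME figures.
case_logs = 850                 # minimum case count cited in the question
subcompetencies = 16            # assumed number of milestone sub-competencies
reviews_per_year, years = 2, 5  # semi-annual milestone reviews, 5-year residency
milestone_records = subcompetencies * reviews_per_year * years
rotation_evals = 12 * years     # assume roughly one written evaluation per month

total_records = case_logs + milestone_records + rotation_evals
print(total_records)
```

Under these assumptions the total lands near 1,000, so an estimate in the 1,000-1,500 range seems plausible.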
 