ePortfolios & assessment of surgical skills

hajiyev · New Member · joined Aug 27, 2021 · Non-Student
    Good day!
    I'm researching ePortfolio usage among surgical residents around the world as part of a thesis project, and below are some questions you may have insight on.
    The thesis topic is "Application of the Technology Acceptance Model (TAM) in Digital Portfolios"...

    Any insight on this narrow topic is highly appreciated, even if it's just telling me a question below is totally invalid:

    1. How much would you say the points below are pain points for surgical ePortfolio usage? What might cause portfolios to be viewed as 'tick-box' exercises rather than educational tools?
      • Manual, time-consuming & error-prone data entry
      • Most evaluations in the form of simple text
      • Hawthorne effect (observer effect) or inter/intra-rater reliability
      • Complicated user experience with portfolio management system interfaces
      • A variety of assessment tools that supervisors must occasionally take time to go through
    2. In countries like the US & UK, surgical cases are logged separately from CBME learning instances/milestones. How much friction do you think this creates for wider acceptance of digital portfolios? (Or would you say that since logging is a regulatory requirement, the question doesn't even arise?)
    3. Do you think it's ever possible to innovate in surgical competency assessment, given country-specific needs and top-down controlled curricula?
    4. How would you rate the chances of the following novelties disrupting how surgeons are evaluated:
      • Addition of video analysis of recorded surgical operations into the portfolio
      • Combined analysis of all data in current portfolios by AI algorithms, i.e. continuous evaluation of data collected in a portfolio rather than occasional supervisor review and feedback
      • Combining progress data of peer residents to guide individual users
      • Other?
    5. The original TAM says that Perceived Usefulness (PU) and Perceived Ease of Use (PEOU) are the primary factors in how a new technology is accepted. Which of the two would be more critical for acceptance of a surgical training portfolio tool?
    thanks a lot!
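For readers unfamiliar with TAM, a minimal sketch of how PU/PEOU Likert items are typically aggregated and related to intention to use. All item values and the tiny sample here are invented purely for illustration; a real study would use validated item wording and proper statistics software.

```python
"""Toy TAM-style analysis: average Likert items into PU and PEOU
scores, then correlate each construct with intention to use.
All data below is made up for illustration."""
from statistics import mean

# Each respondent: 1-7 Likert answers for PU items, PEOU items,
# and a single "intention to use" item (all invented).
respondents = [
    {"pu": [6, 7, 6], "peou": [3, 2, 3], "intention": 5},
    {"pu": [4, 4, 5], "peou": [6, 6, 7], "intention": 4},
    {"pu": [2, 3, 2], "peou": [5, 4, 5], "intention": 2},
    {"pu": [7, 6, 7], "peou": [6, 5, 6], "intention": 7},
]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

pu_scores = [mean(r["pu"]) for r in respondents]
peou_scores = [mean(r["peou"]) for r in respondents]
intent = [r["intention"] for r in respondents]

print("PU vs intention:  ", round(pearson(pu_scores, intent), 2))
print("PEOU vs intention:", round(pearson(peou_scores, intent), 2))
```

Whichever construct correlates more strongly with intention in a given sample is the candidate answer to question 5, though real TAM studies use regression or structural equation modeling rather than simple correlations.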
     

    Lem0nz · Broke Rule 3 of GS · 10+ Year Member · joined Sep 30, 2011 · Attending Physician
      The vast majority (perhaps all?) of surgical residents who frequent this sub forum are US grads in US residencies and we do not have surgical eportfolios. I don't even know what that is. We just have the ACGME case log system and some milestone thingy that our program directors fill out every six months.
       

      hajiyev · New Member · joined Aug 27, 2021 · Non-Student
        (quoting @Lem0nz's reply above)
        I know, and this is also a kind of feedback I wanted to hear.
        Aren't the ACGME case logs also a kind of portfolio? I mean, residents can log in, review, and evaluate past performance through that system, can't they?
        When you say "milestone thingy" (the CBME part of it), do you mean that as surgeons you consider it more of a bureaucratic exercise than an assessment system that helps grow surgical skills?
        Maybe part of the problem (if there is a problem, of course) is that all that information about the learning process (cases, milestones, evaluations, etc.) is only there for a supervisor or program director to check once every six months, or for certification. If it were analyzed more frequently and intelligently, it could become a real tool for growth.
         

        LucidSplash · #LimbSalvage · Gold Member · 15+ Year Member · joined Feb 27, 2005 · Attending Physician
          (quoting @hajiyev's reply above)

          In my experience, trainees only log cases in the system; there is no evaluation of the case in that system. You simply log a code that counts toward satisfying the minimum cases required for graduation. Cases are typically evaluated in other ways at conferences: interesting or complex case conferences, morbidity and mortality conference, etc.

          I think milestones are an attempt at a more equitable and predictable means of evaluating residents as they progress through training. I don't know anyone who thinks it is a perfect system, but I think overall it is an improvement over the prior way of doing things, which could seem more capricious.

          I know Ortho and plastics have to have some kind of portfolio of cases they do in their first year in practice for their oral boards. But there is already a system for that, and I don't know that there is a point to overhauling an entire system that already works well to create some kind of portfolio system like you are talking about. 🤷🏼‍♀️
           

          hajiyev · New Member · joined Aug 27, 2021 · Non-Student
            (quoting @LucidSplash's reply above)
            ACGME publishes a yearly national case log report where generalized statistics can be seen, though not at the individual level. And whatever intelligent analysis could be made of those statistics isn't there. Wouldn't it be valuable if a resident could see, on a continuous basis, how their case mix differs from the overall population of residents across the US?

            Milestones are a good system for assessing how good a surgeon you are becoming in all aspects, not only clinical skills. But how valuable are they if many see them as a "tick-box exercise" (I'm quoting a research article here)? Or, as @Lem0nz called it, a "milestone thingy"...

            Btw, I'm still confused about why case logs are separate from the "Medical Expertise" or "Patient Care" competencies. It seems logical that logging a successful surgical operation on a patient would count as progress in one of those categories...

            I'm only a researcher in this field, of course, and what I'm talking about might be absolutely meaningless. But from what I have seen so far, almost everywhere (US, UK, Australia, Canada, the Netherlands), portfolios are just what they are: collections of data or observations, whether about procedures done in the past or demonstrations of a certain capability (such as talking professionally to a patient).
            Some individual (a supervisor, program director, RRC, or the resident himself) has to go through it once in a while and derive conclusions. There is no automation at all...
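A toy sketch of the kind of automated, continuous comparison described above: flag where one resident's case mix diverges most from a national average. All category names and numbers are invented for illustration; real ACGME reports group cases differently.

```python
"""Compare one resident's case counts to a (hypothetical) national
mean and rank the gaps, furthest behind first. Data is invented."""

national_mean = {"appendectomy": 60, "cholecystectomy": 100,
                 "hernia": 80, "colectomy": 40}
resident_log = {"appendectomy": 75, "cholecystectomy": 90,
                "hernia": 30, "colectomy": 45}

# Relative gap vs. national mean; most negative = furthest behind.
gaps = sorted(
    ((cat, (resident_log.get(cat, 0) - n) / n)
     for cat, n in national_mean.items()),
    key=lambda item: item[1],
)

for cat, gap in gaps:
    print(f"{cat:16s} {gap:+.0%} vs. national mean")
```

Even a report this simple, refreshed continuously, would be one step beyond the once-every-six-months review the thread describes; the hard part is access to individual-level national data, not the computation.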
             

            LucidSplash · #LimbSalvage · Gold Member · 15+ Year Member · joined Feb 27, 2005 · Attending Physician
              (quoting @hajiyev's reply above)

              We see program operative statistics when interviewing for residency/fellowship, which is helpful when decisions are being made about how to rank programs for the Match. Each program presents them at interviews (and typically shows chief operative logs).

              Milestones were initiated in part because of the admission that case numbers were not enough to evaluate progression. The idea is that you need both to fully evaluate surgical competency. Residents are evaluated constantly and repeated errors or deficiencies are noted and corrected both in real time, and as I mentioned before at conferences like M&M.

              I’m still not really clear on what type of information would be included in this “portfolio” but overall it sounds like a ton of extra work with questionable benefit.
               

              Lem0nz · Broke Rule 3 of GS · 10+ Year Member · joined Sep 30, 2011 · Attending Physician
                You need to look at the C-SATS program from J&J. It is the closest thing to what you are asking about. It is still in its infancy and is not widely adopted. It is probably the future direction of surgical training, but it will take another 10-20 years to completely roll out and may STILL not be universally adopted; I would be very surprised if some of the smaller surgical programs had the finances and ability to heavily invest in something like that.

                There are other initiatives to record surgical residents' technical abilities, in either a training environment or real life, in the US system, but all of them are experimental, and other than C-SATS I do not know of any personally that are done across more than a single training program.

                We just haven't needed that as a country yet. As we continue to further specialize we might because some surgical trainees are losing important open surgical skills. Alternatively, we may continue to evolve the MIS space so well that we create more optimal bailout procedures that don't require open surgery at all and then... I'm just wrong. Which would actually be kinda neat.

                But I doubt it.

                And for what it's worth, the concept of C-SATS is really cool, but in actual real-world practice I would not find it useful. For example, I'm 3 months in as an attending, and while it would be really nice to have my cases reviewed by an attending surgeon who has been doing this for 20-30 years to evaluate economy of motion and surgical safety of my movements, in reality it would be absolutely silly because the reasons prompting those inefficiencies are *so variable* and may have nothing to do with my technical ability as a surgeon. For example, I'm in the HPB space, and none of the nurses or my NP assistant have any experience with my large operations. So I'm spending way more time teaching them how to safely and efficiently assist me than working on improving my own skills at the moment. Something like C-SATS would not be able to factor that in or offer particularly useful feedback, which is why we still use a mentoring and/or proctoring system in the US when we're adopting new techniques, approaches, or building programs. They can generally give much more holistic and practical advice.
                 