Technology/AI implementation in Psychiatry

I know there are some posts that speculate about psychiatrists/therapists losing their jobs to AI and such... but I found them to be kind of unrealistic considering things like nuance, context, and legal and ethical problems.

However, I do think there is a bigger space for the implementation of AI and new technologies as complementary tools for the modern psychiatrist. Which technologies/uses of AI do you see being implemented in the field of psychiatry in the next few years?

So many things. For example, I asked ChatGPT to make me a patient information leaflet on things a patient could do to support their own recovery from depression. Then I asked it to tailor it for a patient of a certain age, and with certain physical limitations. It was basically perfect.
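For anyone curious what that looks like programmatically rather than in the chat window, here is a minimal sketch using the OpenAI Python SDK. The model name, prompts, and tailoring details are all illustrative placeholders, not recommendations, and nothing patient-identifiable should ever go into the request.

```python
# Minimal sketch: generating and then tailoring a patient leaflet with the
# OpenAI Python SDK. Model name and prompt wording are placeholders; never
# include patient identifiers in the request.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

base_prompt = (
    "Write a one-page patient information leaflet on practical steps a "
    "patient can take to support their own recovery from depression."
)
tailoring = (
    "Rewrite the leaflet for a reader in their 70s with limited mobility, "
    "using plain language at roughly an 8th-grade reading level."
)

draft = client.chat.completions.create(
    model="gpt-4o",  # assumed model; substitute whatever your org approves
    messages=[{"role": "user", "content": base_prompt}],
)

# Second turn: pass the draft back with the tailoring instruction.
tailored = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "user", "content": base_prompt},
        {"role": "assistant", "content": draft.choices[0].message.content},
        {"role": "user", "content": tailoring},
    ],
)
print(tailored.choices[0].message.content)
```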
 
I use it to help me make my website, write template letters such as prior authorizations or school letters, and proofread my stuff. Limited use right now.
 
Our university just put out a special issue to staff (internal) advising them not to feed patient data to ChatGPT.

Apparently, people are putting tons of PHI into it, which is an egregious HIPAA violation.
 
Yeah, you definitely shouldn't be using plain ol' ChatGPT with identified patient data. I believe they are working on a HIPAA-compliant implementation.
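As a toy illustration of the redact-before-send idea, something like the sketch below could mask the most obvious identifier patterns before text ever leaves your machine. To be clear, regex scrubbing is nowhere near real de-identification and does not make anything HIPAA-compliant; the patterns and the sample note are made up just to show the concept.

```python
# Toy illustration only: masking a few obvious identifier patterns before
# text is sent to a third-party API. This is NOT real de-identification.
import re

PATTERNS = {
    r"\b\d{3}-\d{2}-\d{4}\b": "[SSN]",                    # SSN-like numbers
    r"\b\d{2}/\d{2}/\d{4}\b": "[DATE]",                   # MM/DD/YYYY dates
    r"\b[\w.+-]+@[\w-]+\.[\w.]+\b": "[EMAIL]",            # email addresses
    r"\b\(?\d{3}\)?[-. ]?\d{3}[-. ]?\d{4}\b": "[PHONE]",  # US phone numbers
}

def redact(text: str) -> str:
    """Replace obvious identifier patterns with placeholder tokens."""
    for pattern, token in PATTERNS.items():
        text = re.sub(pattern, token, text)
    return text

note = "Pt called 555-867-5309 on 01/02/2024 re: refill; SSN 123-45-6789."
print(redact(note))
# -> "Pt called [PHONE] on [DATE] re: refill; SSN [SSN]."
```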

This came up in another thread recently, but I could see the transcription/note-writing apps being a huge quality-of-life improvement. I am hoping that does not lead to pressure to see more patients; I don't think the time savings on notes translate to typical psychiatrist schedules in a way that's very straightforward/helpful (unless you're a 15-minute med-check grinder).
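For the curious, the basic transcribe-then-draft pattern those apps use could be sketched like this with the OpenAI SDK. The file name, model choices, and prompt are assumptions on my part, and a real deployment would need a BAA or HIPAA-compliant vendor, patient consent, and clinician review of every draft.

```python
# Rough sketch of the transcribe-then-draft pattern behind ambient-scribe
# apps. File name, models, and prompt are placeholders, not a real product.
from openai import OpenAI

client = OpenAI()

# 1) Speech-to-text on a recorded (consented) visit.
with open("visit_audio.mp3", "rb") as audio:  # hypothetical file
    transcript = client.audio.transcriptions.create(
        model="whisper-1", file=audio
    )

# 2) Draft a SOAP-style note from the transcript for the clinician to edit.
note = client.chat.completions.create(
    model="gpt-4o",  # assumed model
    messages=[
        {"role": "system",
         "content": "Draft a concise SOAP note from this psychiatry visit "
                    "transcript. Flag anything ambiguous for review."},
        {"role": "user", "content": transcript.text},
    ],
)
print(note.choices[0].message.content)
```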

I wonder how a specifically tuned LLM would compare to the average therapist. We all know there are tons of LPCs and similar out there who mostly talk about themselves or do "friend" therapy. An LLM doing bog-standard CBT/ACT/DBT wouldn't have the common factor of an interested, empathically attuned human, but it would probably do a better job of teaching the therapy skills. (Cue debates about whether the benefits of behavioral therapies come from the behaviors themselves or from the structure that lets a dyad make progress "together", with accountability and encouragement to change.)
 
Not an LLM, but I did hear that those who use Woebot are more open to regular psychotherapy afterwards. It's as if it's a sampler/introduction.

When I was learning CBT, I felt it was really educational for the patient rather than about using the relationship between therapist and patient. As someone who is psychodynamically oriented, I was really turned off by this teaching and Socratic approach, but I think it is probably really helpful for people who need/want it. I do think that something like an AI could help, especially if there's difficulty coming up with or exploring alternative thoughts.
 
Those debates about the active ingredients of behavioral therapies will definitely be put to the test. I do not keep up on the psychology literature, but my understanding is that most implementations of online CBT worksheets/curricula have positive findings, and one would expect that to only improve further with realistic AI responses that adapt dynamically to the person's own thoughts/biases.
 
I've noticed AI helps with writing the body of a text, but from there you have to edit it.

IMHO this is like being a film editor: you're going to make a 2-hour movie, but you're provided with thousands of hours of footage. AI produces a lot of bad and wrong material, but it can be faster and easier to have it give you a lot of material and then prune out what you need.

I was writing a legal report and asked it to compare and contrast dementia vs. delirium, and it did so very well. Of course I didn't just copy and paste it. I edited it further, but it turned what would've been about 30 minutes of work into 5 minutes.

Now, this is with the author already knowing the difference full well; if I'd done it from scratch, it would've been about 30 minutes. The AI presented it very well.

If someone wrote a paper not knowing anything about the subject, the AI could very well put in something the "author" might regret.
 
That's interesting to hear. All of my CBT supervisors were also very dynamically oriented, so common factors and the utility of a dynamic orientation were continually reinforced. I think that detracted from learning more "manualized" basic CBT, but it highlighted a lot of overlap between the two with regard to figuring out which core beliefs should be challenged (stated differently: arriving at a useful shared formulation).

There's even "ancient" (1960s) literature on how patients sometimes do feel more comfortable with a computer interviewer, at least with certain subjects, so I could see the sampler phenomenon being spot-on.
 
Manualized CBT is so boring. I much prefer the relational aspect that goes into the therapeutic alliance. The CBT approach to the therapeutic relationship follows these guidelines in the manual:
  • Treat every client at every session the way I’d like to be treated if I were a client.
  • Be a nice human being in the room and help the client feel safe.
  • Remember, clients are supposed to pose challenges; that’s why they need treatment.
  • Keep expectations for my client and myself reasonable.
These seem extremely obvious, although I do know therapists who give "tough love" to patients, have too high expectations, and have told me that they would never go to therapy themselves (???).

The ways to demonstrate empathy, validation, acceptance, and hope all seem so stilted, and they play enough into therapist stereotypes to turn me off from CBT altogether. Many of my patients find it way too superficial to be beneficial. It's a good starting point for learning therapy, but it doesn't have as much nuance once you enter even slightly more complex cognitive structures.

EDIT: This seems like something that AI can help introduce to patients.
 
People have been trying to involve "AI" (i.e., machine learning classifiers) in psychiatric research for years now. I don't expect anything too drastic.

Where I see a lot of potential is basically with collecting data: things that are more useful than "have you been feeling down for the past 2 weeks", i.e., behavioral, linguistic, affective, etc. data that could really transform our field for the better.
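As a minimal sketch of what those classifiers actually look like, here is a toy linguistic-features pipeline in scikit-learn. The four-sentence "corpus" and its labels are synthetic filler just to make the example runnable; real work needs consented clinical data, proper validation, and far richer features than bag-of-words.

```python
# Minimal sketch of the "linguistic features -> classifier" idea using
# scikit-learn. The tiny corpus and labels below are synthetic filler.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "I can't sleep and nothing feels worth doing anymore",
    "work has been busy but I'm enjoying the weekends",
    "everything feels heavy and I stopped seeing friends",
    "training for a 10k and meal prepping on sundays",
]
labels = [1, 0, 1, 0]  # toy labels: 1 = screen-positive, 0 = screen-negative

# TF-IDF turns text into word/bigram frequency features; logistic
# regression learns a linear decision boundary over them.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["lately I just feel flat and exhausted all the time"]))
```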

ChatGPT is probably going to be useful for some menial tasks, like note writing or searching a medical record.

I don't think the technology is there to replace humans in therapy. No one is going to put that much trust and liability in a dumb system that has no understanding of what it's being told, never mind all the emotional nuances.
 