ChatGPT


starbucks12345

What do you think are the chances that AI can replace the role psychologists serve? I'm considering a Psych PhD but want to be realistic about potential job displacement. I'm also curious how psychologists might use AI in providing services. I can see it, possibly, during intake procedures.

> What do you think are the chances that AI can replace the role psychologists serve? I'm considering a Psych PhD but want to be realistic about potential job displacement. I'm also curious how psychologists might use AI in providing services. I can see it, possibly, during intake procedures.

0.00%

Do we really think that people are going to pay good money to 'hire' a Cylon / robot as a 'psychotherapist?'

Would you?

If people want a free or cheap way to do 'self-help' there are already plenty of excellent client workbooks for things like anxiety, depression, PTSD, etc.

What people are paying for when they hire a therapist is someone who is an actual person, who can actually listen, and who can possibly even empathize with you as you gather your thoughts and attempt to communicate your suffering to another living human being (at least initially). I don't really think that this has any chance of being 'replaced' by a book, an algorithm, an 'artificial intelligence,' or any other form of dead simulacrum.

As far as the technical tools, for example, the computer algorithm that serves as an aid in CBT-i (for insomnia) that automatically helps calculate a 'prescription' for a client's modifications in sleep schedule after they've fed it data from a week or two of sleep logs...I mean...it's a fantastic tool for therapist and client to utilize in the context of a collaborative relationship but it's not 'therapy' per se. Same with some of the excellent 'behavioral activation' books for depression or even the 'Feeling Good' book by David Burns, for example.
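For a sense of what that kind of tool is actually doing under the hood, here's a minimal sketch of the arithmetic, assuming the common sleep-restriction heuristic (average the week's sleep efficiency and titrate the time-in-bed window by 15 minutes). The thresholds, the 5-hour floor, and the names below are illustrative textbook-style values, not any particular product's algorithm.

```python
# Illustrative sketch only -- not any specific CBT-i product's algorithm.
# Assumes the common sleep-restriction heuristic: average the week's sleep
# efficiency (time asleep / time in bed) and nudge the prescribed
# time-in-bed window by 15 minutes. Thresholds and the 5-hour floor are
# typical textbook values, chosen here for illustration.

from dataclasses import dataclass

@dataclass
class SleepLogEntry:
    time_in_bed_min: float   # minutes spent in bed
    time_asleep_min: float   # estimated minutes actually asleep

def prescribe_time_in_bed(logs: list[SleepLogEntry],
                          current_tib_min: float) -> float:
    """Return next week's prescribed time-in-bed window, in minutes."""
    total_tib = sum(e.time_in_bed_min for e in logs)
    total_sleep = sum(e.time_asleep_min for e in logs)
    efficiency = total_sleep / total_tib  # sleep efficiency, 0.0-1.0

    if efficiency >= 0.90:       # sleeping most of the window: extend it
        new_tib = current_tib_min + 15
    elif efficiency < 0.85:      # lots of wakeful time in bed: restrict it
        new_tib = current_tib_min - 15
    else:                        # in the middle band: hold steady
        new_tib = current_tib_min

    return max(new_tib, 300)     # never prescribe under 5 hours in bed

# Example: a week of logs averaging ~92% efficiency on a 6-hour window
week = [SleepLogEntry(360, 330) for _ in range(7)]
print(prescribe_time_in_bed(week, current_tib_min=360))  # -> 375.0
```

Which is exactly the point: the calculation is trivial, and the hard part (getting an exhausted, frustrated client to actually keep the logs and stick to the new window) is the therapist's job.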

I think that two factors generally lead people to propose or speculate about a computer algorithm / cyborg 'replacing psychotherapists' and they are:

1) an unrealistic quasi-religious faith in 'technology' to do (all) things 'better' and/or
2) a lack of appreciation of how hard good therapy actually is to accomplish and what a polymath you have to be to do that job effectively. Protocols don't implement themselves; mentally ill people don't just sit in the chair while you go down a checklist of pre-fabricated 'agenda' items; you have to develop a working therapeutic relationship before you even really get into the nuts and bolts of interventions, and every single intervention you do takes place within a sometimes fragile rapport where things can easily be misconstrued or mis-timed.

We shouldn't be too hasty to hand control over to AI algorithms under the misguided faith that things will be 'better' as a result.

“Once men turned their thinking over to machines in the hope that this would set them free. But that only permitted other men with machines to enslave them.” ― Frank Herbert, Dune
 