ChatGPT

voxveritatisetlucis

Membership Revoked
Removed
2+ Year Member
Joined
Jun 22, 2021
Messages
4,923
Reaction score
4,497

Thoughts?

I mean, I could certainly pass the boards with Google and the processing speed of a computer.

Passing a board exam and working as a physician are two different things.

The main thing is I wouldn't sleep on how valuable a tool AI may be for many physicians in the future. Like the internet/EMR/etc., it's going to be something people get "left behind" on if they don't keep up. I'd bet my next loan disbursement that CME on AI integration in your practice will be available within the next 3 years, if it isn't already.
 
Yes but say in the future that AI increases productivity such that 1 radiologist can do the job of 5, I’m guessing it would lead to massive pay cuts. In my opinion, all med students, residents, physicians should be investing in AI companies related to health care as a back up plan. This would not be restricted to medicine however and would likely lead to massive deflation due to declining wages across the board
 
In the future we'll probably need 5x as many radiologists anyway. Also, I imagine in this scenario the AI company would have to take the brunt of any legal action, so they are certainly going to want a human to check each scan.

At the end of the day, if AI is able to phase out physicians to an appreciable degree, we're either going to live in a utopia with universal basic income, or a dystopia where people are out of jobs, no one is able to make money, and the economy is completely in shambles. Maybe that's a bit dramatic.

Though to your point, I do have LEAPS in an ETF that is essentially that. And ChatGPT has actually made me a decent amount of money in a way, though I'd take that discussion to DMs, heh
 
That's the thing: AI can certainly help reduce a lot of the administrative scut. The folks who should be worried are scribes and the like.
There is at least one AI scribe tool on the market.

 
It's honestly things like that that have me interested again in specialties I was iffy on because of "oof, the admin work" or "oof, the note writing" but whose actual medicine I liked.
 
I’m a big ChatGPT fan and actually use it quite a bit now in my practice. It’s not yet at the point where it can do notes for me but will surely get there.

I think this highlights the fundamental issue of what practicing medicine actually is. It isn’t passing a board exam - that’s basically a knowledge test and assessing minimum competency. But having knowledge is not really what medicine is all about.

For me, what I do that nobody and no bot can yet do is build a rapport with a patient, elicit a story, integrate that with exam and other data and synthesize that into a diagnosis or differential, and then engage the patient to aid them in making the right informed decisions for the next step in their care.

I’m sure a bot can read a note and make a good differential. It will probably be able to mimic much of the history as well. But it’s that interaction with another human, combined with high-level synthesis, that AI will struggle with for some time. Making a diagnosis is ridiculously easy 99% of the time; convincing a patient to trust you to perform surgery on them or their loved one is much harder. And that’s where the rubber hits the road - it’s the management after the diagnosis has been made that really counts.

I’m fond of saying that AI will replace doctors right AFTER it replaces patients.

For now, ChatGPT is saving me a ton of time and energy and I look forward to when it helps make tedious documentation a thing of the past. Where it listens to my encounter and then documents the whole thing and all I have to do is tweak the A&P/MDM. The first company that develops that along with maximizing coding for each visit will corner the market.
 

Artificial intelligence requires perfectly clear-cut scenarios to interpret... like Step vignettes.

Artificial intelligence will never be able to compensate for human error, and when I say human error I am referring to the history and information a patient will give a doctor.

It takes abstract thinking to interpret abstract thought.
 
How are you using it in your practice now? Like giving it key points and it writes a history for you or something like that?
 
One thing that comes to mind:

"Write an explanation of what psoriasis is for a patient." *Print.* *Hand to patient.* It's probably not too difficult to have a pre-written handout that serves the same purpose, but a colleague had it do that and said it was better than what he would have come up with.
 
Patient education materials have been especially good. Revising and refining post op instructions and info as well.

I use it to write emails all the time. I’ve used it to design meeting agendas and staff training programs.

I’ve even had it read some of the longer patient messages and summarize them for me. I just copy/paste and ask for a summary.

I keep finding new uses as well. I’ve had colleagues use it to write insurance denial appeals, but I don’t really get many of those so haven’t done that yet.
 
Are there any potential HIPAA issues with that?
That's the thing with this software: technically, selling ChatGPT as licensed, HIPAA-compliant software for healthcare companies is going to be quite hard, since the AI is trained on the entire internet's data, a recipe for lawsuits from the original data owners.
 
I'm less worried now.
 

Attachments: Screenshot 2023-01-30 235023.png
You have to be really careful with ChatGPT. As @futureapppsy2 informed me, it completely makes up believable-sounding references. It’s language-based AI, not so much a knowledge center.



On the one below, scroll down a bit to read the responses.

https://www.reddit.com/r/ChatGPT/c...utm_source=embed&utm_name=&utm_content=header
 
Yeah I’ve run into this when playing with it. It’s odd how sometimes it picks really good references and other times hallucinates nonexistent ones. I’ve noticed better refs when having it write humanities papers; the medical refs were completely fabricated.
 
Yet to see - ChatGPT calling out ChatGPT for trolling SDN. I guess some things are just purely reserved for human beings…
 
I tried asking it to write a simple discharge instruction/patient education for a relatively common pathology in my field. It was fine, would probably use it, but I think the premade Epic ones are better.

Haven’t asked it to formulate a ddx yet.
 
I've thought a lot about this, and I've also used ChatGPT pretty extensively for studying, writing, data analysis, and other menial tasks. I think physicians should be both excited and scared, but mostly excited.

Reasons to be excited:

1) Decreased administrative scut for physicians. People might say, "now physicians will be more productive and it will decrease wages," but that's very unlikely. Given a bit of breathing room, most physicians will take it rather than try to squeeze 20 minute visits into 10 minute visits, and patients will revolt if doctors start spending even less time with them. Look at the typical office worker's day. Work fills the time and expectations provided. Workforces as a whole relax once their menial tasks get automated. Expect a strongly buffered effect while completely cutting out lots of menial work.

2) Decreased administrative burden. Low-level admins will undoubtedly be cut by this sort of technology, while physicians will remain essential for practical and legal reasons. Private practice will fall eventually, but in the meantime it will increase PP income, and it will also increase employed physician income, if only slightly.

3) Fewer lawsuits if you use the technology effectively.

4) Extra-clinical opportunities. This is probably the biggest thing to roll into healthcare in the last 20 years. At least as big as EMRs if not significantly bigger. If you are savvy, you can probably make some money as a consultant.

Reasons to be scared:

1) Your deep investment in the specific skills of a physician. Physicians are unique in the workforce in that our training is 7+ years, our skills are poorly transferable, and our compensation is dependent on physician legal protections and scarcity. If Congress decides that AI + NP is as good as a physician (because it's more convenient for their campaign donations than actually taxing the wealthy to fund safe healthcare), then you're SOL. Physicians need to make ~2x the salary of their desired lifestyle to make up for the opportunity cost of training. If you get rug-pulled at 35+, you will never make up the income, and you'll be solidly middle class despite over a decade of toiling and sacrifice.

2) You're young and work in pathology (and maybe radiology). Pathology is one area where faster charting and quicker reads will actually increase productivity. Unlike rads, it feels unlikely that we'll just increase biopsy frequency. Maybe in some cases we'll increase the amount of work (e.g., read the entire margin of a basketball-sized liposarcoma to assess margin status post-resection), but path seems troubled. The field is already a bit saturated, and professionally they've shown they don't have the sway to limit residencies and keep the job market decent. In 20+ years this could be trouble.

3) Medical education will fail to keep up with advances to keep us relevant. We're still memorizing the Krebs cycle, and our licensing exams are still ~75% memorization. It will be on the individual to keep their skills relevant to the changing landscape.

Finally, reasons NOT to be scared:

1) Diagnosis/decision making takes basically no time for most physicians. M1s and M2s get freaked out because you can put a clinical vignette into ChatGPT and it comes up with some scary good answers. However, the hard part isn't coming to the diagnosis; it's getting all the info from the patient, reading past the subjective/irrelevant nonsense, synthesizing it into a vignette/note, and then actually carrying out care. That's why even a 270+ Step 2 MS4 is still bewildered and lost as an intern. Most of what a doctor does all day can't be emulated by an AI/ML model. Try putting something closer to an unfiltered patient history into ChatGPT. It comes nowhere close to where it's supposed to be.

2) Being a physician is one of the safest jobs from automation. Few jobs combine empathy, subjective interpretation, rational thinking, and physical labor in such a big way. When is the last time you spoke to an average person? Seriously, next time you're sitting in coach, go ahead and talk to the person next to you. Most people think their jobs are tough/require tons of skill, but realistically they just need a warm body to follow simple instructions. They hark back to times when life was "hard" in college because they had to study 5 whole hours to pass a statistics test. Idk if being a physician is the automatic ticket to a suburban McMansion and twice-yearly trips to Hawaii that it used to be, but we've got a long way to fall and a lot of much easier targets below us.
 