Don’t use ChatGPT during rotations


notEinstein

Just an FYI: I just watched a 4th year get roasted for showing up to our floor on the first day, immediately pulling up ChatGPT on the pod computer, and punching in patient symptoms.

Maybe other places will allow it, but this top 30 med school/hospital does not like it one bit.

 
ChatGPT is not HIPAA compliant either; you might inadvertently be giving away identifying patient information. If you want to use AI to look up weird symptoms and complaints, I would suggest using the Doximity AI, which is HIPAA compliant. At the same time, I also recommend not becoming too reliant on it. You are not doing yourself a favor by looking for shortcuts as a medical student.
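For what it's worth, if you ever tinker with an LLM API on your own time, the bare-minimum habit is scrubbing obvious identifiers before anything leaves your machine. Here's a rough Python sketch (the regex patterns and the OpenAI client call are illustrative assumptions on my part; naive pattern-matching like this is nowhere near real de-identification and does not make anything HIPAA compliant):

```python
import re

from openai import OpenAI  # assumes the official OpenAI Python SDK is installed

# Naive identifier scrubbing -- illustrative only. Real de-identification
# (e.g., HIPAA Safe Harbor's 18 identifier categories) needs far more than
# a handful of regexes, and none of this creates HIPAA compliance.
REDACTIONS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),                # SSN-like numbers
    (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),         # slash dates
    (re.compile(r"\(\d{3}\)\s*\d{3}-\d{4}|\b\d{3}-\d{3}-\d{4}\b"), "[PHONE]"),
    (re.compile(r"\bMRN[:#]?\s*\d+\b", re.IGNORECASE), "[MRN]"),    # record numbers
]

def scrub(text: str) -> str:
    """Replace obvious identifier patterns before text leaves the machine."""
    for pattern, token in REDACTIONS:
        text = pattern.sub(token, text)
    return text

# Hypothetical example prompt -- not a real patient.
prompt = scrub("60-something M, MRN: 4455667, new ascites and spider angiomata -- ddx?")

client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-4o",  # model name is an assumption; use whatever you have access to
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```

The point of the sketch is the habit, not the patterns: real de-identification has to cover all eighteen Safe Harbor identifiers, and even then you'd still need a BAA with the vendor, as someone points out below.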
 
Eh, I think it depends on how you use it. ChatGPT has been invaluable for my learning as a medical student. I only use it on my personal laptop, I use the paid version, which can pull from peer-reviewed literature and cite its sources, and of course I never put in identifying information. I typically use it for extra clarification on Anki cards or unfamiliar topics brought up during rounds.

As far as specific patients and symptoms go, I still think there is plenty of utility as a learner. For problem lists and next steps, I always challenge myself to think them through on my own first, but I sometimes use ChatGPT to come up with other differentials and ideas down the road -- again, using only vague terms and nothing that could be remotely PHI-revealing. If I'm truly going to suggest something ChatGPT said to a resident or attending, I have no issue disclosing that I used AI to help me find the UpToDate paragraph that gave me the idea.

I'm not smart enough to go to a ~top 30~ med school, but it was invaluable for crushing Step 2 and my sub-Is. I don't feel like my ChatGPT use has led me to cheat myself out of learning or misrepresent my knowledge base, but maybe I will be proven wrong once I hit residency.

I personally do not think that AI is going to take over medicine. I do think that the learners and providers who learn how to use it ethically and effectively will have a leg up on patient care relative to those who fight against it.
 
use "open evidence" it's legit AI for medical questions. i use it all the time on rounds.
 
Just want to point out that software / websites / services can advertise that they are "HIPAA compliant," but that does not make them so. What it means is that the software would be HIPAA compliant (i.e., everything is encrypted in transit and in storage, etc.) if there were a BAA (Business Associate Agreement) between the vendor and the end user. So without a BAA with Doximity, its software is no more HIPAA compliant than anything else. ChatGPT would not be HIPAA compliant even with a BAA.
 
If you've followed my last few posts, you know I'm an AI fanatic. That said, it is worrisome when medical students punch in patient symptoms instead of using their critical thinking skills to find diagnoses. I don't think AI should be used to circumvent the critical thinking behind making a diagnosis; it should be an aid for trickier situations.

I'm a big fan of AI for its potential as an aid, but not at the cost of your clinical skills.
 
Am I the only one who, after reading the original post, thought, "In other news, the sky is blue"?

Maybe being in my 40s is giving me a case of old fogey syndrome, but it seems really obvious that this is a ridiculous thing for a learner to do on rounds.

I agree.

On the other hand, at one point older attendings would have said, "Don't use your phone to look up the answer--use a book!" (even if you were just using UpToDate or a Google search to see what condition X is). Pocket references were quite popular for a while.

Presumably at some point AI will be as indispensable as Google, and maybe UpToDate.

But for now, it frightens me how often AI is wrong and doesn't know it.

But as pointed out by many others, medical students need to learn to think and reason. That's what makes them different from an NP or PA. NPs can Google too, but they don't have the medical thinking skills physicians do. Let's not dumb down our profession. Having an external brain is great (medical textbooks, pocket references, UpToDate, journal articles, other colleagues), but one needs to develop the brain in one's own head first and foremost.
 
Kind of stupid, as ChatGPT is about to have prescribing authority.

Well, while I agree that politicians are generally idiots about things involving healthcare and make a lot of questionable decisions, I'd be surprised to see this bill make it out of committee, let alone get to a vote.

But it's certainly something to follow. I wouldn't jump to the idea that AI is about to be prescribing drugs. I think it's a long shot to become federal law at all, but even then it would have to be made law in a given state; states regulate medical licensing, not the federal government.
 
I think it would make it very easy to get controlled substances. There would be, like, a script to get Norco, Adderall, Xanax, etc.
 
You can and should use it on rotations, and I wish it had existed when I was a third year. Just don't get caught using it, as there are clearly still Luddites who look down on AI even though it is accurate the vast majority of the time. If you don't use it, you're going to be behind everyone else, and this will only become more true in the coming years.

The obvious disclaimer is to double-check the information against more 'reliable' sources, but oftentimes ChatGPT is literally pulling its information from those sources in the first place.
 
If they can give a medical license, why can't they give a DEA license? You really don't think Trump would be willing to give Grok full authority?
What's your endgame here? If you really think AI is coming for us all (it won't), and you apparently also think the last group of people to graduate with an MD will be this cycle's applicants (they won't), then why are you even still in medical school, if that's even true???
 
So definitely use AI, but be careful because you'll get in trouble for using it, and you should cross-reference with traditional sources.

Um...
 