Discussion in 'Radiology' started by new2018, Mar 27, 2018.
Because the AI hype is so overblown
The Uber death should show how absurd it is to think AI will replace radiologists. For the past five years, it has been accepted as fact that autonomous driving is safer than human driving. But at the first sign of a tech failure, testing of the product was banned and multiple companies ceased testing on their own. The public’s faith is destroyed, and the timeframe for autonomous driving was probably set back five years. And this was a scenario where the pedestrian was doing something extremely dangerous. How is that going to work when the tech fails in the light of day?
How do you think the FDA is going to react when early AI rollouts lead to a patient’s death? You can’t blame the patient for being reckless in that scenario.
And this is completely ignoring the fact that AI is light years away from being considered equal to radiologists at the simplest of tasks.
Because AI can potentially make our jobs easier, not eliminate them. Also, for people with an engineering background, this is the best time to start training, so that by the time they finish residency they can start making meaningful contributions to the field as its development infrastructure becomes more structured.
I think you are greatly overestimating the extent to which the Uber accident will delay autonomous vehicles. Companies deliberately chose to invest tens of billions of dollars into this technology, and regulators deliberately gave the green light to testing. Both the companies and the regulators knew well ahead of time that accidents and deaths were inevitably going to be part of the process, yet chose to proceed anyway. Testing will continue apace once the recency effect fades, since this was an eventuality that was factored into the business plan before a single dime was spent on the tech. They might raise the hiring bar for their human backup drivers above "mentally unstable ex-felon" as a result, though.
Furthermore, autonomous vehicles face a far greater obstacle to commercial adoption than automation of radiology does, because a car either has a human driver or it doesn't. That's a huge leap to make. You don't have to make that leap for image recognition to disrupt radiology, since it can merely augment radiologist efficiency rather than cut radiologists out entirely. Instead of having 20 radiologists to cover a hospital, you might need only 10 augmented by AI. A vehicle, on the other hand, has but one driver: it's either entirely human or entirely AI, and you have to get both the tech and the public acceptance to the point where you can make that bold, binary leap.
If medical image recognition is developed and adopted in a significant way, it will indeed make radiologists' jobs much easier, to the huge detriment of radiologists. Also, a very, very tiny fraction of radiology applicants have engineering backgrounds and the intention of making a career out of developing image recognition technology. The vast majority just want to do traditional radiology, so the OP is very correct to question why radiology is getting more competitive despite the threat of AI. Whether or not AI is going to make a large impact in the short to medium term is anyone's guess, but the uncertainty is there, and logically speaking, uncertainty should be factored into the price of a stock, whether the stock is an actual equity or a medical specialty.
True. Can’t argue with that logic.
The push for automation and AI in radiology today is driven primarily by computer scientists and entrepreneurs who unfortunately don't understand radiology. There are far more radiologists with some background in computer science or engineering than vice versa, yet all the noise comes from the latter group. I myself have an engineering degree and briefly worked as an engineer prior to medical school. Unfortunately, these non-radiologists do not seem to understand that the field is more than just pattern recognition; it involves human cognition. While I don't doubt that one day AI will get very close to that point, it will not happen anywhere close to our lifetimes, and by that time many if not most other jobs and specialties, both in and out of medicine, will have significant parts taken over by AI. This seems conceptually hard for non-radiologists to grasp: because radiology is computer-based, most people assume it will be the easiest field to automate, but it involves far less algorithmic thinking than is seen in many other specialties.
If AI can be shown to do the radiologist's job as well as or better than we do, then I am all for it taking over where it can. It would reduce costs and drive up efficiency tremendously. I have no interest in maintaining human control of any field merely out of self-interest in preserving careers; that would all fall under the broken window fallacy of artificially supporting the job market. The problem is that we are nowhere near the technological capability to do that, despite what the non-radiologists hope for and imagine.
Would love for AI to spit out unintelligible ICU chest film reports and do a few PICCs/para/thora so I can get a real lunch break!
I always tell new residents to learn first to think as a radiologist rather than to search for image patterns; that is the difference between a smart radiologist and an average one. But I also think that AI will definitely change the way we practice medicine, maybe not today, but it will happen faster than we think.
Yes, it won't be something specific to radiology, but let's face it: radiology is the most technical field in medicine, and the infrastructure is already there (digital images and extensive networks, servers, PACS). It is a perfect starting point for AI in medicine. Radiology will be cannibalised first. It will make our job much easier but will definitely decrease the number of radiologists needed, at least for conventional reading. Maybe it will open other career options. We (all medical specialties) must adapt.
I'm going into radiology, but currently doing general surgery internship. My mindset has shifted somewhat over this past year regarding the role of AI in radiology. I also work as a software engineer training ML models for healthcare and have a reasonable idea about what AI can accomplish.
I believe AI/ML is capable of doing some tasks better than a radiologist within the domain of pattern recognition. We are in a renaissance of pattern recognition; it's not like the 80s, it's not like the 90s; it's new, it's state-of-the-art, and it's superhuman. Here are some examples of these tasks:
a. Is the central venous catheter in the right atrium?
b. Is the nasogastric tube in the stomach?
c. What is the bone age of this patient?
d. Is this lung lesion pre-cancerous?
e. Is there free air under the right hemi-diaphragm?
Each of these results in a critical decision that drives patient care but is ultimately binary in nature: Should we advance the catheter/tube? Does the patient need an endocrine workup? Should we biopsy the lung? Does the patient need an emergent ex-lap? I believe that highly optimized machine learning systems can read these radiographs better than any individual radiologist, if we constrain the system to answer a single question. Of course, the radiograph may contain findings beyond the scope of the asked question. Because of this limitation, computers will not replace radiologists anytime soon.
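The "single constrained question" framing can be sketched in code. This is a toy illustration only, not any real radiograph model or deployed product: synthetic 8x8 arrays stand in for images, a bright patch stands in for the finding, and plain logistic regression stands in for the deep network a real system would use. The point is purely the interface, one image in, one binary answer out.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_images(n, positive):
    """Synthetic 8x8 'images': positives carry a bright patch (the 'finding')."""
    imgs = rng.normal(0.0, 0.3, size=(n, 8, 8))
    if positive:
        imgs[:, 2:5, 2:5] += 1.0  # add the finding the model must detect
    return imgs.reshape(n, -1)   # flatten to feature vectors

# Build a labeled dataset: 200 positives, 200 negatives.
X = np.vstack([make_images(200, True), make_images(200, False)])
y = np.concatenate([np.ones(200), np.zeros(200)])

# Logistic regression trained with plain gradient descent.
w = np.zeros(X.shape[1])
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probabilities
    w -= 0.5 * (X.T @ (p - y)) / len(y)     # gradient step on weights
    b -= 0.5 * np.mean(p - y)               # gradient step on bias

# One image in, one yes/no answer out.
preds = (1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5
accuracy = np.mean(preds == y)
print(f"training accuracy: {accuracy:.2f}")
```

Note what the constraint buys: the model answers only the one question it was trained on, and any finding outside that question (the scenario the post warns about) is invisible to it by construction.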
However, I think radiologists should be aware that AI/ML tools will not stay in the reading room. For example, at least once a week I have to confirm placement of a line/tube/catheter, or verify that a chest tube hasn't caused a pneumothorax when put to water-seal. About 75% of the time, I don't call the reading room and just rely on my own ability to read images. When I'm in doubt, I consult my chief resident or fellow (if they are around). If we're still not certain, we call radiology. If there were a freely available tool that could answer our question, we would likely try it first, if only to save time during a busy day of surgery.
You are still in training. In the real world, 99% of lines are read by the referring docs.
If portable chest x-rays are where AI will charge the field, then I will be right here holding the gates wide open for them.
The experience is very different depending on the practice set up.
In some hospitals, IR does the lion's share of central catheters.
In some hospitals, there is a PICC team that relies on the radiology report to confirm the lines.
Right now, cardiac nucs software is semi-automatic and its interpretation sucks. And we are talking about a 0 or 1. In a cardiac stress test, it pretty much only looks for changes in myocardial signal after stress compared to rest imaging. Reading a CT of the abdomen and pelvis is a totally different beast.
To be honest, nobody can predict the future. It may or may not happen. But most of the time, the future is not what even experts predict.
Name a medical field and I will give you good reasons why it could go down the drain.
There is data in these scans with clinical implications that humans can't HOPE to comprehend without some sort of new processing/software/AI.
Fortunately, for our jobs anyway, I truly don't think that in our lifetimes, AI will be able to comprehend it either.
I think that, in our lifetime, AI will augment our jobs, not take our jobs.
Self-teaching is a human trait.
If AI learns the human way, it will be prone to the same human mistakes we make.
Mid-levels. It's much cheaper for an established derm practice to hire one rather than a new dermatologist. Last time I went to a dermatologist, I was being seen by their NP until they found out I was an MD. I was also asked by the receptionist if it was OK for the "medical" student (an NP student) to observe.
Agree with above.
1- Mid-levels already do a lot in Derm.
2- Let's say AI becomes sophisticated enough to interpret a CXR. Don't you think the same AI will also be capable of looking at a skin mole and characterizing it? Especially since it has the advantage of magnifying the lesion. A lot of work is being done on technology that can decrease the need for skin biopsy. It is not yet good enough, but who knows what happens in the future.
3- Cosmetic derm: a lot of family doctors are already doing it.
4- Dermatopathology: If AI becomes capable of reading MRI, it will also be capable of doing dermatopath.
5- Mohs surgery: First, it used to pay very well in the past, and it still pays well now, but that is just a matter of changes in reimbursement.
Second, if AI becomes very capable, this process will become semi-automatic and the reimbursement will definitely go down.
Overall, don't think too much about the future of a field. All it takes is one new technology for a field to become totally different within a decade. Do what you like and the rest will come.
For anyone following AI developments in healthcare, the FDA just approved the first device that uses ML to screen for diabetic retinopathy without an optometrist reading the image: FDA permits marketing of artificial intelligence-based device to detect certain diabetes-related eye problems
The significance is that no ocular expert examines the image; it's a sign that the FDA is willing to bypass "expert providers" if automation is shown to be superior.
Curious if you actually read the article?
That article has nothing to do with replacing anyone because the machine is "superior." It specifically talks about being for populations who are not able to see their eye doctor as often as they're supposed to, and if the image is deemed "more than mild diabetic retinopathy," they still go see their eye doctor, who does the same exam and determines if treatment is necessary. The article also says it's designed for clinics with healthcare professionals who aren't used to dealing with eye diseases, i.e., an FM doctor/midlevel in BFE who can use a machine to screen his/her patients and ship them to an eye doctor if needed. It's an automated screening tool, no different from the wonderful EKG machines. The fearmongering on here is ridiculous sometimes.