From Diagnosis to Dictation to Writing to Searches - How I use AI in 2024
AI is quickly changing the way we read, report, search, and dictate
In 2016, Geoffrey Hinton said radiologists were like coyotes already over the edge of a cliff, about to fall but not yet aware of it, and predicted that AI would soon replace them. Now, in 2024, despite new AI journals, thousands of articles and papers, and a stream of companies building image-interpretation apps that soak up investor money, there is still no AI application relevant to daily radiology reporting.
This is funny. Really. In 2019, the RSNA hosted over 200 AI companies on a separate floor devoted just to AI. You’d think these companies would have created products and revolutionized our field by now. But in 2024, what do we have? Some products that can read chest X-rays and bone age, and a few that can provide lung volumes, coronary calcium scores on ungated scans, and bone density from vertebrae. Everything else, including radiomics, works best only in published articles…in short, most radiology AI and radiomics applications are just…vaporware.
And things are about to change so much that virtually every AI company currently in the market is likely to face severe existential angst.
If you’re a radiologist, try this: drop a .jpg chest X-ray image into ChatGPT 4o with a prompt like, “This is a 20-year-old with fever. What do you think is happening?” Or take an arterial-phase CT of a liver mass and do the same. You’ll see that a large language model (LLM) like ChatGPT can read images reasonably well…and these models will only continue to improve. LLMs are easily accessible to all…patients and doctors and radiologists and lawyers…and they will transform how radiology images are interpreted, managed and sorted. One use case? If you are stuck for a differential diagnosis, you can drop a few images into the LLM and ask for help.
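If you prefer scripting to the chat window, the same experiment can be run against the OpenAI API. Here is a minimal sketch, assuming you have an API key and the openai Python package installed; the file name and prompt are illustrative, and needless to say, none of this is a diagnostic tool.

```python
# Minimal sketch: ask GPT-4o about a chest X-ray via the OpenAI API.
# Assumes `pip install openai`, OPENAI_API_KEY set in the environment,
# and a local file "chest_xray.jpg" (hypothetical). Not a diagnostic tool.
import base64
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Encode the image as base64 so it can be sent inline as a data URL
with open("chest_xray.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text",
             "text": "This is a 20-year-old with fever. What do you think is happening?"},
            {"type": "image_url",
             "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
        ],
    }],
)
print(response.choices[0].message.content)
```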
Besides image interpretation, AI has revolutionized report dictation. Three decades ago, I started with live dictation to typists sitting next to me. I then transitioned to dictaphones and transcription devices for asynchronous processing. I was among the first in India to use Dragon, but found it tedious and stopped after a few weeks. These days, I just speak into MacWhisper, an AI-based app on my MacBook Air, which spits out text with a lag of 50–60 seconds and an accuracy of over 95%. My dictation/transcription challenge is almost solved.
AI-powered speech-to-text solutions have become popular, charging Rs. 2,000 to 3,000 per month/year. However, you don’t even need them now. If you have a Mac, you can download MacWhisper, which uses OpenAI’s Whisper automatic speech recognition (ASR) model; with a one-time payment of 29 Euros (under Rs. 3,000), you unlock its larger, more accurate transcription models and you are done.
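In fact, the underlying Whisper model is open source, so if you’d rather skip the app entirely, you can run it locally with a few lines of Python. A sketch, assuming the openai-whisper package and ffmpeg are installed; the audio file name and model size are illustrative.

```python
# Minimal sketch: local dictation transcription with OpenAI's open-source
# Whisper ASR model. Assumes `pip install openai-whisper` and ffmpeg on PATH;
# "dictation.m4a" is a hypothetical recording.
import whisper

# Larger models ("medium", "large") are slower but more accurate
model = whisper.load_model("medium")
result = model.transcribe("dictation.m4a", language="en")
print(result["text"])
```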
I have an app called Whisper Memos on my phone and Apple Watch. Whenever I think of something…a patient-related issue, an article, a patient follow-up, or a task…I just speak into the watch and it sends me a transcribed email to review later.
Last month, I was reporting a PET/CT scan and needed to know whether capecitabine causes pseudoprogression. Normally I’d search PubMed, but this is the kind of question even PubMed can’t answer well, and finding the answer would have taken quite some time. ChatGPT gave it to me in 3 seconds (no, it does not cause pseudoprogression) and I quickly signed off the report.
Last week, I moved to Perplexity, which gives more focused answers than ChatGPT. When I asked, “Does paclitaxel cause pleuroparenchymal fibroelastosis?”, I got my answer in 15 seconds…No.
The other day, I had a French-speaking patient whom I had to counsel before a biopsy, with no human translator available. We used Google Translate. I spoke in English into the app, which spoke in French to the patient. Then the patient spoke in French into the app and I read the translation…and so on. It took extra time, but it was a Eureka moment. In the future, with Indian patients who speak languages I don’t understand, I can use Google Translate to improve comprehension for both…the patient and me.
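The app handles the speech, but the text-translation step underneath can also be scripted. A rough sketch of that two-way loop using the Google Cloud Translation API, the programmatic cousin of the Translate app…this assumes a Google Cloud project with the Translation API enabled and credentials configured, and the phrases are illustrative.

```python
# Rough sketch of the two-way translation loop with the Google Cloud
# Translation API. Assumes `pip install google-cloud-translate` and
# GOOGLE_APPLICATION_CREDENTIALS pointing at valid GCP credentials.
from google.cloud import translate_v2 as translate

client = translate.Client()

def to_patient(text_en: str) -> str:
    """Doctor's English -> patient's French."""
    return client.translate(text_en, source_language="en",
                            target_language="fr")["translatedText"]

def from_patient(text_fr: str) -> str:
    """Patient's French -> doctor's English."""
    return client.translate(text_fr, source_language="fr",
                            target_language="en")["translatedText"]

print(to_patient("We need to take a small tissue sample from your liver."))
print(from_patient("Est-ce que la biopsie sera douloureuse ?"))
```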
AI helps me in non-radiology and non-medical areas too.
I used lex.page’s AI to edit and shorten this article. The images were created using DALL·E 3.
ChatGPT and Perplexity are my default search engines…no cookies, no ads, no ranking of sites by SEO or other opaque criteria…in short, no nonsense…my Google searches have dropped by 90%.
LLMs do tend to hallucinate and sometimes give weird or wrong answers, but it’s not difficult to catch them. When an answer doesn’t make sense, I can check the same query with another LLM, ask the model whether it’s making things up, or simply ask the question again…the issue is usually resolved.
I’m not an expert in sociology, economics, or philosophy, so I cannot comment on AI’s current and future impact on the job market, labor force, radiologists, radiology, physicians and medicine.
However, AI is here to stay, and change will come not via dedicated radiology AI companies but gradually, then suddenly, through commercially available software accessible to anyone and everyone…doctors and patients alike. It may be a good idea to stay on top of these tools, experimenting with new apps and software as they are released, to better understand their capabilities.