By Paul V. DeMarco
Guest Columnist
Like many of you, I am struggling to understand the repercussions artificial intelligence might have on my life and on my neighbors around the globe.
I’ve read some articles and listened to many podcasts about AI. I’ve heard opinions from the sanguine to the apocalyptic. Experience being the best teacher, I have observed how AI has affected my life and my practice, and thus far, I am cautiously optimistic.
That said, it’s frustrating to have such a small window on the impact AI is having on so many of us. I know it’s eliminating some jobs, while creating others. I sense from a distance that it is transforming the way we educate our children. How do you teach children to think critically and write insightfully when AI can do both for them? AI is already changing the way we interface with the world. Have you called a business and spoken to an AI assistant yet? If not, you soon will.
There are myriad ways AI could influence my corner of the world, primary care medicine. Two are currently top of mind. First, I hope that AI will eventually do most of my documentation. Although there has been some buzz on this front, I have yet to see a system that is anywhere close to a human scribe. However, it’s possible to imagine an AI scribe that would be faster, less expensive, and more helpful than a human one (albeit less enjoyable to work with).
Second, and more interesting to me, is how AI will be used in the exam room. A recent visit to my mechanic may give a clue. Several months ago, a vibration began emanating from the front passenger side of my trusty 2016 Ford Escape. The noise had some peculiar characteristics – it was loudest when I first started the car and tended to improve as the engine warmed up and achieved high gear. Once I was at cruising speed, it was barely noticeable.
I took the car to my local mechanic, in whom I have absolute trust, for a regular service. The noise had just begun, and I had not yet listened carefully to it. Based on my vague description, he replaced a sway bar link. There was no improvement. Since the noise wasn’t diminishing the car’s performance, I waited several months before returning to him. Then I did what I tell my patients not to do: I went to the internet. Prior to AI, I found searches for questions like this one to be mostly unhelpful. In my patients’ hands, medical searches have often led to inaccurate and needlessly anxiety-provoking results. AI has changed the game. Well-constructed prompts can return genuinely useful answers in seconds. I described the noise in detail, and ChatGPT gave me a differential diagnosis. After several rounds of back and forth, the leading candidate was a faulty engine mount.
My mechanic called me that afternoon with a different diagnosis involving the axle. But because the noise was loudest with the car in park, I was dubious. He wondered if we were each hearing different noises. “Let me come first thing tomorrow morning,” I said, “and we can talk about this.”
As I sat with him in the car the next morning, I told him about my AI research. I was uncomfortable as a true amateur (I had no idea what an engine mount or a sway bar link was until ChatGPT informed me) disagreeing with an expert. But we had a relationship, and I asked if he would replace the engine mount first. If that didn’t fix the noise, he would investigate the axle. Two days later (the mount had to be ordered), I was back on the road and the noise had disappeared.
This could be a guide to how AI will affect my practice. As a generalist, I accept that there are many specialist physicians who know more about a particular aspect of my patients’ illnesses than I do. I expect to need help from them and other parts of the medical team (nurses, pharmacists, social workers, counselors, therapists, etc.). AI could be another member of the team. Since AI can be accessed from both directions – by the patient and the provider – it could also be a bridge to improve patients’ engagement in managing their chronic medical conditions. ChatGPT is imperfect, but often provides reasonable answers to well-written lay medical questions. Providers have access to an AI-powered tool called OpenEvidence that is even more reliable than ChatGPT.
I’m curious about how AI could alter my conversations with patients. I sometimes use OpenEvidence in the room with a patient and let the patient know what I’m doing. I haven’t yet used it to try to change a patient’s mind – for example, to urge acceptance of a vaccine of which the patient is skeptical. But it would provide an authoritative, neutral voice in that discussion.
I don’t perceive AI to be a threat. Nor do I believe primary care doctors could be replaced by AI. Human beings need other human beings to care for and about them. I hope that AI can be successfully incorporated into the doctor-patient relationship to better inform and connect both parties.
A version of this column appeared in the Nov. 14th edition of the Post and Courier-Pee Dee.