
Saturday, January 10, 2026

Beyond the Bots: How Clinicians Can Adapt to AI Lab Reviews

As patients increasingly turn to artificial intelligence (AI) technology for medical needs, ChatGPT has become a convenient second opinion on blood workups for many. It can simplify lab results, provide summaries, and point out irregularities. It can also make mistakes.

Zaid Fadul, MD, CEO of Bespoke Concierge MD, a private personalized healthcare service, told Medscape Medical News that one of his patients initially turned to AI for guidance on interpreting results from a private lab-testing service and delayed a doctor’s visit because the AI gave her only vague feedback.


“The AI likely included lupus on a broad list of possibilities, but it didn’t convey urgency or integrate the patient’s symptoms. That created a false sense of reassurance,” he said, adding that he later diagnosed her with lupus during an office visit. “Because AI only analyzed lab data — not the clinical presentation — the patient felt comfortable delaying care, which ultimately led to a 6-month delay in diagnosis.”

Fadul estimated that roughly half of his patients are using AI to review bloodwork or medical results before coming to see him. As quick ChatGPT consults become more common, physicians should know how to guide their patients in using this type of technology to review blood test results.

“Patients are curious, they have access, and they want to understand what’s happening in their bodies,” Fadul said. “That isn’t going away.”

Earl J. Campazzi, Jr, MD, a concierge doctor in Palm Beach, Florida, who wrote the new book Better Health with AI: Your Roadmap to Results, said about 10% of his patients “are using AI regularly and bringing us things.” That’s not overwhelming, for now, but Campazzi predicted a deluge is on the horizon. “In, say, 2 years, I don’t think you’re going to be able to say no AI, don’t talk to me about AI,” he said. “You are just going to be left behind.”


Experts said that apps such as ChatGPT often lack context, or the full picture of a patient’s health, when they review or summarize blood work. A partnership with a clinician is important for patients, said Barry Stein, MD, physician and chief clinical innovation officer at Hartford HealthCare in Connecticut, and so is caution when using the technology for healthcare purposes.


He has told patients, “It may give you incorrect information. It may give you correct information. It doesn’t have all your contextual information. Be careful about sharing information that’s private because once it’s out there, it’s not private.” He advises clinicians to educate patients and to add, “I’m available to discuss information, whether it’s conflicting information, whether it’s aligned information.”

Fadul said when patients have told him they’re turning to AI for advice, he keeps the conversation light to direct them to the real issues. He said he sees himself as a coach and that his role is to interpret information responsibly, helping them create a thoughtful plan for their health based on the full picture.

“I don’t want to sabotage trust by judging someone for looking things up, whether that’s Google or AI. When patients admit, often sheepishly, that they ran their labs through AI, I try to meet them where they are and incorporate what they learned into the conversation so their effort feels valid,” he said. “Sometimes I’ll joke that AI is great at telling you that you might have cancer based on a single abnormal value. That usually breaks the tension and opens the door to a real discussion. From there, we can quickly get to what actually matters: context, symptoms, and trends over time.”

Campazzi said if patients are bringing AI-generated material to their appointments, front office staff should help set expectations and tell the patients what’s helpful to present to the doctor. “If you can, you can take your symptoms or your wearable data and bring us trends, or if you can, you know, record your symptoms or your diet, and then we can read over the [AI] summaries. That is very useful, but again, voluminous or kind of gotcha data is not,” he said.

Campazzi said AI can help physicians in practice. In the book, he details how patients can use AI technology as a “health assistant” to help them summarize symptoms or labs in preparation for a doctor’s visit, among many other things. He writes that AI can spot meaningful shifts in bloodwork by comparing trends and flagging subtle changes if a patient provides it the right information.

Using the prostate-specific antigen (PSA) test for men as an example, Campazzi said some doctors, especially if they’re working quickly, will just look for numbers outside the normal range, whereas AI could pick up on trends if the right data are provided.

“Doctors that have to see four patients an hour; it breaks down to like 7 minutes with the patient, 7 minutes documenting, or 4 extra minutes an hour to go to the bathroom,” he said. “It’s an assembly line, and in that situation, things can be missed. So AI really can be useful.”

Fadul noted that context — including symptoms that a patient may or may not know are important — is key. If a patient using AI to interpret a PSA result doesn’t note that they’re having difficulty peeing or a weak urine stream because they don’t know these symptoms are of concern, apps such as ChatGPT may not raise a flag.

“If you don’t know that those symptoms are connected to your PSA and you just see PSA on a lab form, AI is not going to know what to tell you and say, ‘Hey, you might have benign prostatic hyperplasia. We need to do an evaluation,’” he said.

A full clinical picture is a critical component of healthcare that AI technology lacks. Stein wants clinicians to partner with patients who now have these new technologies at their fingertips.

“Doctors won’t get replaced by AI,” he said. “But they will get replaced by those doctors that are comfortable with AI.”

Fadul, Campazzi, and Stein reported having no relevant financial conflicts of interest.

https://www.medscape.com/viewarticle/beyond-bots-how-clinicians-can-adapt-ai-lab-reviews-2026a10000oi
