
Hospitals Are Deploying AI Chatbots. Doctors Are Divided

Healthcare systems across the U.S. are launching new artificial intelligence (AI) chatbots to help patients ask medical questions and access appointments faster amid growing demand for quick health guidance, even as doctors remain divided over the roll‑out.

Some see the upside of giving patients access to clinically focused AI platforms where they can ask questions, while others caution that strict standards must be maintained to limit potential risks.

Hartford HealthCare recently launched its Patient GPT, a chatbot created by clinical AI company K Health, for its patients in Connecticut, while California-based Sutter Health and Reid Health, which serves Indiana and Ohio, have rolled out Emmie, a chatbot built by the healthcare software company Epic.

Around 25 percent of Americans have used an AI tool or chatbot for health information or advice, mainly as a supplemental tool for their care, according to a Gallup poll, and users in “underserved rural communities” send an average of nearly 600,000 healthcare messages a week, according to a January report from OpenAI.

At the same time, hospital waiting times in the U.S. are worsening. Patients now wait roughly a month on average to see a doctor, according to a 2025 report by AMN Healthcare, a 19 percent increase from 2022.

OpenAI has said it is launching its own dedicated health bot, ChatGPT Health, integrated in ChatGPT, as demand for quick, accessible health guidance grows, and now hospitals are getting in on the technology, turning to in‑house AI tools designed specifically for clinical settings.

[Chart: How U.S. Adults Have Used ChatGPT for Healthcare-Related Questions in the Last Three Months — https://flo.uri.sh/visualisation/28649010/embed]

How The New Clinically-Created AI Chatbots Work

Emmie, launched by Sutter Health and Reid Health, is an AI assistant built into MyChart, Epic’s secure patient portal, designed to “help patients understand and manage their care,” Trevor Berceau, director of research and development at Epic, told Newsweek.

He said that Emmie addresses several challenges encountered by the millions of Americans using commercial AI chatbots to answer their medical questions, such as not having their medical history considered in responses, as well as concerns about how AI companies might use their medical data.

As Emmie is integrated with patients’ medical records, its answers “take the patient's history into account,” Berceau said. This also means that the information patients share is protected by the Health Insurance Portability and Accountability Act (HIPAA), a federal law protecting individuals’ medical information with strict standards for privacy and security.

Sutter Health’s chief digital officer, Laura Wilt, said patients will therefore have access to “answers specific to them and their health within Emmie,” unlike with other commercial AI assistants.

Emmie’s main uses are questions about lab results and follow-ups after consultations, general health inquiries and requests for advice, Berceau said. Though he added that, like any AI tool, “people shouldn't make treatment decisions based solely on Emmie's output.”

“What Emmie does is help them better understand their health so that they can prepare for informed discussions with their caregivers,” he said.

Reducing Appointment Wait Times

Allon Bloch, the chief executive officer of K Health, told Newsweek that Patient GPT also has access to a patient’s medical records. It allows patients to build a profile with their medications, recent test results and medical history organized in separate sections, a demonstration video of the platform shared with Newsweek showed.

Bloch also said the platform lets patients book an appointment online with a doctor 24/7. Depending on demand, he said, a patient could get an appointment in as little as 15 minutes, or more easily secure one in the evenings or outside clinical hours.

He said this is possible because “we do a lot of the work for the doctor,” as the platform can gather patient questions and pull from their medical record, making it easier for the doctor to review the patient’s health concern.

“A doctor might take 20 minutes just to read through a medical record, let alone do the medical intake,” he said. “Now it’s all there and ready for the doctor.”

He added that he thought this was a “global opportunity,” as the technology has the ability to streamline issues felt in healthcare industries worldwide, like coordinating patient documents, appointments and prescriptions.

Dr. Padmanabhan Premkumar, president of Hartford HealthCare Medical Group, told Newsweek that “unlike general chatbots that can lack clinical context or a definitive endpoint, our platform is designed with structured clinical pathways.” These pathways, he said, mean the chatbot understands the patient’s specific medical history within a “secure ecosystem” and can “triage them if needed toward a physician appointment or the appropriate clinical resource.”

 A screenshot of K Health’s Patient GPT software (left) and a stock image of a doctor showing a female patient something on a phone in a consultation room (right).

Hospitals Cite Early Positive Feedback

The roll-out is still in its early days, but Premkumar said that they have already received some feedback indicating that patients “value the increased accessibility and the ability to simplify complex healthcare navigation.”

“I have personally used the platform and found it very beneficial, and I have even recommended it to my family members,” Dr. Ajay Kumar, executive vice president and chief clinical officer at Hartford HealthCare, told Newsweek.

He said he believed Patient GPT could “add significant value for patients if privacy and HIPAA standards are fully upheld.”

“When implemented responsibly and securely, AI can help patients better understand their medical information and guide them to appropriate care,” he said. “Under those conditions, I support this type of innovation.”

What About Privacy, Hallucination Concerns?

Americans have become increasingly skeptical about AI, and more than half of the population is concerned about the risk of the technology spreading misleading content and information, according to some polls. In the healthcare space, misleading medical information could have serious implications, particularly if patients are asking for advice without opting to speak to a doctor as well.

When asked about these potential concerns, Bloch, the K Health CEO, said it’s important to remember that even “doctors make medical mistakes,” and that “we’re not talking about a system with zero mistakes,” but that if patients don’t have access to high quality medical information or avoid speaking to doctors about a problem because of long waiting times, that outcome could be “way worse.”

Berceau said that at Epic, “we put a lot of effort into prompt and context engineering for accurate outputs,” and that the company has “a robust automated testing suite that runs thousands of permutations of these tests.” He said these measures let them “monitor and validate Emmie's output on a daily basis and avoid issues like model drift that can affect accuracy.”

Regarding potential data privacy concerns, he added that HIPAA’s provisions protect the information patients share with Emmie, as under the law, healthcare systems are required to keep patient information safe, “just as they are with the rest of the medical record,” which sets it apart from consumer chatbots.

Kumar acknowledged that it “would be concerning” for these chatbots to be used to answer medical questions if “strong governance and appropriate safety measures were not in place.”

“However, our approach requires rigorous analysis through established research protocols, as well as oversight by a multidisciplinary AI governance structure,” he said. “This gives me confidence that risks are carefully evaluated and managed.”

What Doctors Think

Doctors appear divided over whether they see this as a positive step forward for the healthcare industry, or whether it raises concerns.

Nigam Shah, professor of Medicine at Stanford University, told Newsweek that he thought this was a “net positive” move for the healthcare industry.

“Care needs do not stop when the clinic closes, and the emergency room is not the most appropriate venue for a lot of care needs,” he said, adding that these tools “bridge the gap between when urgent care closes, and when the clinic opens the next day.”

Though he said these tools “are not perfect and make errors,” and that questions needed to be asked about how they would be effectively monitored.

Suchi Saria, a professor of computer science and health policy at Johns Hopkins University, told Newsweek that what also matters is the standard these tools are held to.

“The same technology can either meaningfully improve care or quietly introduce risk, depending on how rigorously it's built, validated, and monitored,” she said. “In healthcare, the question isn't whether errors happen but rather how quickly were these detected and corrected before harm could occur.”

This makes monitoring particularly important, and Saria said “we need to move away from the idea that patient-facing AI can operate under a lighter standard,” because “if a tool is influencing patient understanding, decisions, or behavior, it should be held to the same bar as any other clinical tool,” which she added is not always the case today.

Whether these AI chatbots ultimately ease pressure on an overstretched healthcare system or introduce new risks may depend less on the technology itself than on how carefully hospitals monitor, govern and integrate them into clinical care.


2026 NEWSWEEK DIGITAL LLC.

This story was originally published April 26, 2026 at 4:00 AM.
