Apr 26, 2026 · Updated 09:29 PM UTC
AI

Researchers warn against AI scribing tools in medical appointments

Researchers writing in a newsletter published on Buttondown argue that automated transcription systems in healthcare pose significant privacy and accuracy risks to patients.

Alex Chen

2 min read

Digital transcription software on a medical device

Researchers Emily M. Bender and Decca Muldowney are advising patients to decline the use of automated 'scribing' systems during medical appointments, according to a newsletter they publish on Buttondown.

These tools use audio recordings of patient encounters to generate draft medical notes for electronic charts. While vendors market the software as a way to reduce providers' administrative burden, the authors suggest the technology brings hidden risks to patient care.

In a recent account, Bender noted that a physical therapist requested permission to trial an automatic scribing system. The authors observed that such tools are appearing in settings ranging from small private practices to large healthcare conglomerates like Kaiser.

Risks to privacy and care

The report identifies several critical flaws in the deployment of AI transcription. A primary concern involves privacy, as these systems rely on third-party software to process sensitive audio and transcripts. Even if recordings are deleted quickly, the authors warn that software providers may lack sufficiently strong security protocols.

Informed consent also remains a major issue. The authors question whether patients are being given enough information regarding how their data might be used for future training of 'AI' doctors or quality assurance.

Beyond data security, the technology may fundamentally alter the doctor-patient relationship. The report notes that physicians using these systems often shift to a technical 'doctor-to-doctor' register to ensure the software captures specific details. This habit can confuse medical interpreters and alienate patients.

Automation bias presents another danger. The authors argue that doctors may struggle to verify the accuracy of a pre-generated note, particularly because it is difficult to notice what information is missing from a draft. This risk was dramatized in a recent episode of the television drama The Pitt, in which an AI transcription tool compromised a patient's well-being.

Finally, the authors argue the promise of efficiency is a false one. They suggest that instead of allowing doctors more time with patients, these tools will likely be used to increase patient volume in an underfunded healthcare system.

As Aliaa Bakarat wrote in Stat News, 'the writing of chart notes... is part of the care.' The authors conclude that skipping the reflective process of manual charting degrades the quality of medical care over time.
