Concerns raised over AI scribe tools for doctors

Angela Ballantyne. PHOTO: ODT FILES
Major concerns have been raised about the security of private patient information in New Zealand after new research found 40% of the doctors surveyed are using AI scribes to take patient notes.

Lead researcher and University of Otago (Wellington) primary healthcare and general practice bioethicist Professor Angela Ballantyne said AI was being rapidly taken up by primary care practices to transcribe patient notes during consultations, despite ongoing challenges around legal and ethical oversight, data security, patient consent, and the impact on the doctor-patient relationship.

"Most AI-scribes rely on international cloud-based platforms — often privately owned and controlled — for processing and storing data, which raises questions about where data is stored, who has access to it, and how it can be protected from cyber threats.

"There are also Aotearoa-specific data governance issues that need to be recognised and resolved — particularly around Māori data sovereignty."

Prof Ballantyne said 197 health providers working in primary care were surveyed in February and March 2024, providing a snapshot of the use of AI scribes in clinical practice.

Most of the respondents were GPs, but others included nurses, nurse practitioners, rural emergency care providers and practice managers.

Of those surveyed, 40% reported using AI scribes to take patient notes.

Only 66% had read the software's terms and conditions, and 59% reported seeking patient consent, she said.

Most of those surveyed who used AI scribes found them helpful, and 47% estimated that using them in every consultation saved between 30 minutes and two hours a day.

However, "a significant minority" said the software did not save time overall because it took so long to edit and correct AI-generated notes, she said.

"[Doctors] need to be vigilant about checking patient notes for accuracy.

"However, as many survey respondents noted, carefully checking each AI-generated clinical note eats into, and sometimes negates any time savings."

Many had concerns about the accuracy, completeness and conciseness of the patient notes produced by AI scribes, and some were concerned about the tools' inability to understand New Zealand accents, vocabulary and te reo Māori.

Others using an AI scribe felt it enabled them to focus more on their patients and build better engagement and rapport through more eye contact and active listening.

But there was concern among those surveyed about whether the use of an AI scribe complied with New Zealand's ethical and legal frameworks.

Prof Ballantyne said it could not be assumed that patients consented to the use of AI scribes.

"Patients should be given the right to opt out of the use of AI and still access care, and adequate training and guidelines must be put in place for health providers."

In July, the National Artificial Intelligence and Algorithm Expert Advisory Group at Health New Zealand Te Whatu Ora endorsed two ambient AI scribe tools, Heidi Health and iMedX, for use by its clinicians in New Zealand.

Prof Ballantyne said the Medical Council of New Zealand was expected to release guidance about the use of AI in health later this year, which was likely to require patients to give consent to the use of AI transcription tools.

And there was still a need to track and evaluate the impact of AI tools on clinical practice and patient interactions, she said.

john.lewis@odt.co.nz