Artificial Intelligence and Professional Regulation: Can AI Take Chart Notes?

This blog is Part 1 of our series, Artificial Intelligence and Professional Regulation

Artificial Intelligence (“AI”) and machine learning are hot topics that have been receiving increasing attention across many industries, including healthcare. The health professions have already integrated AI to increase speed and accuracy across many healthcare systems. AI algorithms that analyze medical imaging data (such as x-rays, MRIs, and CT scans) are frequently used to assist healthcare professionals in making fast, accurate diagnoses. AI is also being used to streamline administrative tasks such as scheduling in healthcare organizations, and can even be used to forecast patient admissions and optimize resource allocation.

There has also been a rise in “AI Scribe” services being marketed to, and used by, health professionals to summarize and transcribe patient conversations into detailed chart notes. However, health professionals should proceed with caution and cannot allow AI to take over their charting responsibilities entirely.

How Do AI Scribe Services Work?

Most AI Scribe services operate on a speech-to-text model: a conversation is recorded, and the AI then transcribes the recording into written text. In a healthcare setting, this can allow a physician or nurse to spend less time taking notes when meeting with a patient. Depending on the model used, the output is sometimes intended to be a verbatim transcription of everything said at the appointment, and other times a summary of the patient encounter.

Generative AI (such as OpenAI’s ChatGPT and Whisper) trains on data and generates human-like responses based on what it has learned. For example, Whisper analyzes audio signals to identify linguistic and acoustic patterns, and then compares the sounds that it “hears” against its knowledge base to determine the most probable sequence of words, thus generating a written chart note.
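For readers curious about the mechanics, the sketch below shows how a recording might be run through the open-source version of Whisper in Python. It is a minimal illustration only, not a depiction of any particular AI Scribe product; the file name and model size are assumptions chosen for the example.

```python
# Minimal sketch of speech-to-text transcription with the open-source
# "whisper" package (pip install openai-whisper).
# The audio file name and model size below are illustrative assumptions.
import whisper

# Load a pretrained speech-to-text model ("base" trades accuracy for speed).
model = whisper.load_model("base")

# Transcribe the recording: Whisper predicts the most probable sequence of
# words from the acoustic and linguistic patterns in the audio.
result = model.transcribe("patient_visit.mp3")

# The raw transcript still requires clinician review before it becomes
# part of the patient's chart.
print(result["text"])
```

Commercial AI Scribe services typically layer summarization and formatting on top of a transcription step like this, which is where the risks discussed below can compound.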

What are the Risks?

The generative aspect of the AI creates a risk that the AI will provide information that is false or inaccurate, which is commonly referred to as a “hallucination”. The Associated Press recently reported that in a University of Michigan study, hallucinations were found in eight out of every ten Whisper audio transcriptions. While this risk is greater when there is background noise or music in the recording, another recent study by computer scientists found 187 hallucinations in more than 13,000 clear audio snippets examined. This is a concern because over 30,000 clinicians in the United States, as well as many large hospitals, have started using Whisper to transcribe and summarize patient interactions.

In addition, the Canadian Medical Protective Association (CMPA) warns that AI can misinterpret information or introduce biases into chart notes. If AI-generated content is not reviewed and corrected, incorrect information can become part of a patient’s record and could then be relied upon in making future decisions that affect the patient’s health. This could have severe consequences for both the healthcare professional and the clinic or hospital where they work: an injury to a patient caused by an unreviewed, inaccurate AI-generated chart note could lead to a hospital complaint, a College complaint, a human rights complaint, or a lawsuit. Healthcare professionals can mitigate these risks by carefully reviewing all AI-generated chart notes for accuracy and bias.

Takeaways

AI has the potential to improve the quality of the services that healthcare professionals are able to provide to patients in profound and transformative ways, including by increasing efficiency and optimizing resource allocation. However, caution must be exercised. When using AI Scribe services to assist in charting, healthcare professionals should take care to closely review all chart notes, because hallucinations, misinterpretation, and bias can run rampant if left unchecked.

Stay tuned for Part 2 of our blog series on Artificial Intelligence and Professional Regulation, which will focus on best practices for healthcare professionals on incorporating AI into their charting practices.
