Ontario Audit Finds AI Medical Notetakers Generate Inaccurate Patient Records
Provincial audit reveals AI transcription tools used by healthcare providers are creating fabricated therapy referrals and incorrect prescription records.

An audit by Ontario provincial authorities has identified significant accuracy problems with artificial intelligence notetaking systems used in medical settings, finding that the technology frequently inserts false information into patient records.
The audit documented instances where AI transcription tools fabricated therapy referrals that were never discussed during patient visits, and recorded prescription information that did not match what physicians actually prescribed. Such errors pose potential safety risks for patients whose medical records may contain inaccurate treatment information.
The findings highlight growing concerns about the reliability of AI-powered documentation tools that have been increasingly adopted by healthcare providers to streamline administrative tasks. While these systems are designed to automatically transcribe and summarize patient encounters, the audit suggests the technology may be generating content that was not actually part of the medical consultation.
These tools have gained popularity as a way to reduce physicians' administrative burden, but the Ontario audit indicates that their accuracy problems could compromise patient care if healthcare providers rely on AI-generated records without verifying them against what actually occurred in the encounter.
The provincial government has not yet announced a specific regulatory response to the audit findings, though the results may prompt closer scrutiny of AI documentation tools across Ontario's healthcare system.