This FAQ is designed to support GP supervisors, registrars, and educators in navigating the use of AI scribes in clinical training. It summarises common questions and professional perspectives from recent discussions and training sessions.
AI scribes are digital tools that automatically generate clinical notes by recording, transcribing, and summarising doctor-patient consultations. The process typically involves:
Some tools also use real-time redaction and natural language processing to format content appropriately.
AI scribes offer several advantages:
Risks include:
Training organisations such as the RACGP and ACRRM do not currently endorse specific tools, but nor do they prohibit their use. Supervisors and registrars are encouraged to:
Recent college guidance strongly discourages use by GPT1 (first-term) registrars, so that core documentation and consultation skills are developed first.
Positives:
Cautions:
Supervisors should regularly:
Supervisors should apply the same principles as with any clinical note:
As supervisors are often not present during the consultation, it is crucial that registrars verify and edit AI outputs. Supervisors must reinforce that AI is an aid to clinical judgement, not a replacement for it.
Key areas of concern:
Failure to obtain consent may breach surveillance legislation and privacy law, with legal consequences.
Verbal consent is sufficient if:
Best practice:
Before adoption, evaluate:
Test tools during free trial periods and seek legal/MDO advice before signing user agreements.
Yes, including:
Guidance is evolving. Check college websites and newsletters for updates.
Recording supervision or teaching sessions raises similar medico-legal and privacy concerns. Supervisors must:
Currently, there is limited legal precedent. In one coronial case involving AI-generated notes, the absence of a stored transcript was noted, though not criticised.
Key point: AI notes must be verified. Courts may scrutinise errors, especially if the practitioner did not review or correct them, potentially undermining the entire record’s credibility.
AI may fabricate or misstate information, a failure known as "hallucination." Examples include:
Always review AI output. Inaccurate content, if unchecked, can result in serious consequences, including harm to patients and legal repercussions.
Yes. Under Australian law:
No. AI scribes are support tools, not replacements. The clinician remains responsible for:
Retention varies by state:
Trusted sources include:
AI scribes offer real promise, particularly in reducing administrative load and modelling high-quality documentation. However, they bring legal, educational, ethical, and clinical risks. Supervisors play a crucial role in guiding their use.
Golden rules:
Date reviewed: 07 August 2025