FAQ: AI Scribes in General Practice Training – Risks and Rewards

This FAQ is designed to support GP supervisors, registrars, and educators in navigating the use of AI scribes in clinical training. It summarises common questions and professional perspectives from recent discussions and training sessions.

What are AI scribes, and how do they work?

AI scribes are digital tools that automatically generate clinical notes by recording, transcribing, and summarising doctor-patient consultations. The process typically involves:

  • Recording audio of the consultation.
  • Transcribing speech to text.
  • Summarising the transcript into structured documentation (e.g. progress notes).
  • Integrating outputs into clinical software like Best Practice.

Some tools also use real-time redaction and natural language processing to format content appropriately.

What are the potential benefits?

AI scribes offer several advantages:

  • Reduced documentation time and administrative load.
  • More comprehensive and structured clinical notes.
  • Increased time for patient interaction.
  • Potential for improved registrar learning through exposure to high-quality documentation.
  • Improved clinician satisfaction and workflow efficiency.

What are the risks and limitations?

Risks include:

  • Inaccurate or incomplete notes if the AI output isn’t reviewed.
  • Over-reliance by registrars, impeding development of foundational skills.
  • Legal risks relating to privacy, consent, and documentation accuracy.
  • Ethical concerns in sensitive consultations (e.g. mental health, intimate exams).
  • Increased time needed to verbalise clinical observations during consultations.

What is the position of the training colleges?

Training organisations like the RACGP and ACRRM do not currently endorse specific tools, but they do not prohibit their use either. Supervisors and registrars are encouraged to:

  • Discuss scribe use during orientation and supervision.
  • Ensure foundational skills are established before introducing AI support.
  • Monitor use via regular review of notes.

Recent college guidance strongly discourages AI scribe use by GPT1 (first-term) registrars, so that core documentation and consultation skills are developed first.

How might AI scribes affect registrar learning?

Positives:

  • Serve as models of well-structured documentation.
  • Help registrars learn consultation flow and note format.

Cautions:

  • May reduce critical engagement if relied upon too early.
  • AI outputs may contain errors or hallucinations that registrars must detect and correct.

Supervisors should regularly:

  • Review documentation quality.
  • Discuss AI use during teaching sessions.
  • Emphasise the registrar’s responsibility for final note accuracy.

How should supervisors assess AI-generated notes?

Supervisors should apply the same principles as with any clinical note:

  • Is it accurate, contemporaneous, and sufficient for continuity of care?
  • Does it reflect what happened in the consultation?

As supervisors are often not present during the consultation, it’s crucial that registrars verify and edit AI outputs. Supervisors must reinforce that AI is an aid—not a replacement—for clinical judgement.

What are the key medico-legal considerations?

Key areas of concern:

  • Consent: Patients must be informed that a recording will occur. Verbal consent should be obtained and documented at the start of each consultation.
  • Privacy: Recorded data is classified as health information. It must be stored securely and used in compliance with the Privacy Act and state-specific health records laws.
  • Accuracy: AI outputs must be reviewed for correctness before being added to the patient record.

Failure to obtain consent may breach surveillance legislation and privacy law, with legal consequences.

Verbal consent is sufficient if:

  • The patient is informed that the consultation will be recorded.
  • They are given the option to opt out.
  • Consent is documented in the clinical record.

Best practice:

  • Obtain verbal consent at the start of each consultation.
  • Do not rely solely on signage or written consent forms.
  • Reception staff can assist but should not be the sole source of consent.

What should a practice evaluate before adopting an AI scribe?

Before adoption, evaluate:

  • Data security: Where is the data stored? Is it encrypted? Is offshore storage involved?
  • Privacy compliance: Does the tool meet Australian legal standards?
  • Integration: Can it integrate with practice management systems?
  • Consent workflow: Is it built into the tool?
  • Reviewability: Can clinicians verify and edit the output?
  • Vendor transparency: Are training, documentation, and support available?

Test tools during free trial periods, and seek advice from your lawyer or medical defence organisation (MDO) before signing user agreements.

What about recording supervision or teaching sessions?

Recording supervision or teaching sessions raises similar medico-legal and privacy concerns. Supervisors must:

  • Know if they are being recorded.
  • Confirm consent for any recordings, even in non-clinical settings.

Have AI-generated notes been tested in court?

Currently, there is limited legal precedent. In one coronial case involving AI-generated notes, the absence of a stored transcript was noted, though not criticised.


Key point: AI notes must be verified. Courts may scrutinise errors, especially if the practitioner did not review or correct them, potentially undermining the entire record’s credibility.

What are AI “hallucinations”?

AI may fabricate or misstate information, known as “hallucinations.” Examples include:

  • Incorrect medications or dosages.
  • Made-up referrals or diagnoses.
  • Fabricated citations in reports or submissions.

Always review AI output. Inaccurate content, if unchecked, can result in serious consequences—including harm to patients and legal repercussions.

Are there data privacy and sovereignty risks?

Yes. Under Australian law:

  • Cross-border data transfers require patient consent or confirmation that the destination has equivalent privacy protections.
  • Open-source or consumer-grade AI tools (e.g. ChatGPT) may expose data or reuse input for model training.

Use tools with local, secure data storage and clear privacy safeguards.

Can an AI scribe take responsibility for the notes?

No. AI scribes are support tools, not replacements. The clinician remains responsible for:

  • Accuracy and completeness.
  • Clinical appropriateness.
  • Legal compliance.

How long must records and transcripts be retained?

Retention varies by state:

  • VIC, NSW, ACT: 7 years after last consultation, or until age 25 for children.
  • Other states: Destroy or de-identify records when no longer needed.

Challenge: Some scribe tools delete transcripts within 7–30 days. If transcripts contain information not captured in the final notes, premature deletion may breach legal retention obligations.

Final Thoughts

AI scribes offer real promise—particularly in reducing administrative load and modelling high-quality documentation. However, they bring risks: legal, educational, ethical, and clinical. Supervisors play a crucial role in guiding their use.

Golden rules:

  • Always obtain and document patient consent.
  • Verify every AI-generated note.
  • Prioritise registrar learning and patient safety.
  • When in doubt, consult your college, MDO, or legal advisors.

Date reviewed: 07 August 2025
