Using AI to write clinical notes and reports

The ethics of using artificial or augmented intelligence (AI) to write clinical notes and reports has been raised in conversations with the Ethics Officer and members of the HPCCB Board. Although no complaints have arisen from the use of AI to date, this appears to be an emerging topic of interest to members.

You are responsible for the actions of any AI you use

The ethical perspective on the use of AI to write clinical notes and reports is clear:

As the healthcare professional, you remain responsible for clinical notes and reports developed using AI, which means that you will also be held responsible for the actions of any AI you use.

The American Medical Association (AMA) has published a range of resources on AI, including the publication ‘ChatGPT and generative AI: What physicians should consider’ (1).  This includes a summary of the known current limitations of Large Language Model (LLM) natural language processing tools like ChatGPT, namely:

  • Risk of incorrect or falsified responses.
  • Training dataset limitations.
  • Lack of knowledge-based reasoning.
  • LLMs are not currently regulated.
  • Patient privacy and cybersecurity concerns.
  • Risk of bias, discrimination, and promoting stereotypes.
  • Liability may arise from use.

The Australian Alliance for Artificial Intelligence in Healthcare's 2023 National Policy Roadmap for Artificial Intelligence in Healthcare (2) notes particular issues with AI used in clinical note-taking and report writing:

“While clinical AI is subject to TGA software as a medical device (SaMD) safety regulation, non-medical generative AI like ChatGPT falls into a grey zone, where it is being used for clinical purposes but evades scrutiny because they are general purpose technologies not explicitly intended for healthcare. Uploading sensitive patient data into a non-medical AI like ChatGPT hosted on United States servers is also problematic from a privacy and consent perspective.” (2)

Any one of these limitations could lead to multiple potential breaches of the Code of Conduct for audiologists and audiometrists, including, but not limited to, those relating to:

Standard 1 – Members must provide hearing services in a safe and ethical manner
Standard 2 – Members must provide hearing services in a respectful manner and not discriminate against anyone they interact with in a professional capacity
Standard 4 – Members must promote the client’s right to participate in decisions that affect their hearing health
Standard 16 – Members must comply with all relevant laws and regulations
Standard 17 – Members must adhere to appropriate documentation standards
Standard 18 – Members must be covered by appropriate indemnity insurance 

Before you use AI, you need to understand how it works and how it will impact your clinical note-taking and/or report writing. This means that before you use AI in your clinical practice you must:

  • acknowledge and accept the limited evidence on AI, and
  • put in place processes and systems to ensure that any potential risks are addressed.

Furthermore, you need to be able to explain to your clients how AI is used in your clinical practice and how this may affect your clinical decision-making processes (e.g. how it may affect the information in your clinical notes and/or reports and how this, in turn, may affect the advice given). This relates to a client's right to participate in decisions that affect their hearing health, as required under Standard 4 above.

There is limited evidence on the accuracy or potential risks of AI

A quick internet search returns numerous software products that claim to provide allied health professionals with AI-supported clinical note-taking and report-writing tools.

The New South Wales Government Agency for Clinical Innovation has a living table available on its website titled 'AI: automating indirect clinical tasks and administration: living evidence' (3). This living table is updated with relevant results from weekly PubMed searches. When assessing the publications on this site, it is important to consider the source, relevance, design and quality of each study, and the strength of its outcomes. This is similar to the approach that would be taken in a Cochrane Systematic Review. You can read more about assessing acceptable evidence in the context of promoting hearing services on the HPCCB website.

If you do not fully understand how AI works, how it stores and uses your data, and how the use of AI may impact your adherence to the Code of Conduct for audiologists and audiometrists, you should not use it in your clinical practice.

At this stage, it is likely that few audiologists or audiometrists practising in Australia currently have the skills, expertise or resources to address the significant risks explored above, given the lack of evidence on AI and its impacts.

“To prepare the sector for the increased use of AI, we will need to support the creation of national consensus on foundational clinical competencies, scopes of professional practice, and codes of professional conduct to use AI, and provide a basis for patient safety, service quality and practitioner credentialling.” (2)

References

(1) American Medical Association, 2023. ChatGPT and generative AI: What physicians should consider. Available from: https://www.ama-assn.org/system/files/chatgpt-what-physicians-should-consider.pdf

(2) Australian Alliance for Artificial Intelligence in Healthcare, 2023. A National Policy Roadmap for Artificial Intelligence in Healthcare. Available from: https://aihealthalliance.org/wp-content/uploads/2023/11/AAAiH_NationalPolicyRoadmap_FINAL.pdf

(3) Critical Intelligence Unit, Agency for Clinical Innovation, 2024. AI: automating indirect clinical tasks and administration: living evidence. Available from: https://aci.health.nsw.gov.au/statewide-programs/critical-intelligence-unit/artificial/automating-indirect-clinical-tasks