Using ChatGPT to help with paperwork could breach patient privacy

Avant media

Wednesday, 5 July 2023

Avant has warned doctors that typing patients’ names or medical details into ChatGPT could breach confidentiality or privacy.

ChatGPT, a free online AI program, has attracted global attention for its ability to quickly produce convincing answers to questions, including questions from medical fellowship exams such as the RACGP’s Applied Knowledge Test (AKT).

But there are concerns over how the program stores or reuses the information typed into it, says Avant’s General Manager of Advocacy, Education and Research, Georgie Haysom.

She is urging doctors to be careful about using ChatGPT for work, given the potential privacy and confidentiality risks. The program is developed by OpenAI, a Microsoft-backed company.

“Doctors’ duty of confidentiality to patients, and their privacy and security obligations, require them to protect patient information from unauthorised use and disclosure,” she says.

“Uploading information to an AI tool can breach these requirements.

“While patient details such as name or date of birth may not be uploaded, the patient may nevertheless be identifiable from the details of the medical and health information shared with the AI tool.”

It follows a warning from Perth’s South Metropolitan Health Service (SMHS) last month, sent to staff at five hospitals, to “cease immediately” using AI to help write up patient notes.

“It has recently come to our attention that some staff have started using [AI] bot technology, such as ChatGPT, to write medical notes which are then being uploaded to patient record systems,” said the directive from SMHS chief executive Paul Forden.

“Crucially, at this stage, there is no assurance of patient confidentiality when using AI bot technology, such as ChatGPT, nor do we fully understand the security risks.”

Ms Haysom added that doctors needed to know whether patient information uploaded to AI programs was being sent to overseas computer servers.

“If doctors wish to improve their efficiency in making good medical notes, they should explore other options, such as dictation software and effective use of medical records software, rather than using AI tools,” she said.

Following the Perth hospital directive, AMA WA said it was confident no privacy breach had occurred at the five hospitals.

But its president, Dr Mark Duncan-Smith, said the situation showed the importance of developing “proper protocols, ethics and guidelines” for doctors using AI.

In response to questions from AusDoc, a spokesperson for the Medical Board of Australia said doctors were welcome to use AI technology for simple administration.

But they warned that any doctor thinking about using ChatGPT to complete their continuing professional development (CPD) was taking a big risk.

“Using technology to make routine administrative tasks more efficient makes perfect sense,” they said.

“[However] using technology to cheat or wriggle out of professional development would be short-changing the doctor and their patients.

“Any doctor who did so would also need to be able to account for their behaviour to the board.”

This article was originally published in AusDoc on 13 June 2023.

Disclaimers

IMPORTANT:
This publication is not comprehensive and does not constitute legal or medical advice. You should seek legal or other professional advice before relying on any content, and practise proper clinical decision making with regard to the individual circumstances. Persons implementing any recommendations contained in this publication must exercise their own independent skill or judgement or seek appropriate professional advice relevant to their own particular practice. Compliance with any recommendations will not in any way guarantee discharge of the duty of care owed to patients and others coming into contact with the health professional or practice. Avant is not responsible to you or anyone else for any loss suffered in connection with the use of this information. Information is only current at the date initially published.
