Australian Medical Association calls for national regulations around AI in health care

Doctors in Perth have been ordered not to use AI bot technology, with authorities concerned about patient confidentiality.

In an email obtained by the ABC, Perth’s South Metropolitan Health Service (SMHS), which spans five hospitals, said some staff had been using software, such as ChatGPT, to write medical notes which were then being uploaded to patient record systems.

“Crucially, at this stage, there is no assurance of patient confidentiality when using AI bot technology, such as ChatGPT, nor do we fully understand the security risks,” the email from SMHS chief executive Paul Forden said.

“For this reason, the use of AI technology, including ChatGPT, for work-related activity that includes any patient or potentially sensitive health service information must cease immediately.”

Since the directive, the service has clarified that only one doctor used the AI tool to generate a patient discharge summary, and that there had been no breach of confidential patient information.

ChatGPT has become an everyday tool used by people in a variety of professions. (ABC News: Gian De Poloni)

Call for national regulations

However, the case highlights the level of concern in the health sector about unregulated AI models that continue to be released onto the market.

Australia’s peak medical association is urging caution and calling for national regulations to control the use of artificial intelligence in the healthcare system.

The Australian Medical Association’s WA President, Mark Duncan-Smith, said he did not think the use of tools like ChatGPT was widespread in the profession.

AMA WA President Mark Duncan-Smith said the use of ChatGPT by medical professionals was inappropriate. (ABC News: Keane Bourke)

“Probably there are some medical geeks out there who are just giving it a go and seeing what it’s like,” Dr Duncan-Smith said.

“I’m not sure how it would save time or effort.

“It runs the risk of betraying the patient’s confidence and confidentiality and it’s certainly not appropriate at this stage.”

That view is shared by Alex Jenkins, who heads up the WA Data Science Innovation Hub at Curtin University.

Alex Jenkins agreed ChatGPT was not a suitable platform for sensitive information. (ABC News: Claire Moodie)

Mr Jenkins said the developer of ChatGPT had recently introduced a way for users to stop their data from being shared, but that it was still not a suitable platform for sensitive information such as patient records.

“OpenAI have made modifications to ensure that your data won’t be used to train future versions of the AI,” he said.

“But still, putting any kind of data on a public website exposes it to some risk of hackers taking that data or the website being exposed to security vulnerabilities.”

