GPs use AI to write ‘fake apology’ letters to patients who complain, medical body warns

  • Have you been a victim of medical professionals using AI to communicate with you? Email john.ely@mailonline.co.uk

British doctors are using AI to draft responses to patient complaints in a bid to ease their workload, a medical defence organisation has warned.

A report from the Medical Defence Union (MDU), which provides legal advice to doctors, warns that ‘some doctors are using artificial intelligence programs such as ChatGPT to draft complaint responses for them’.

According to the MDU, doctors have been ‘seduced’ by the opportunity to ‘make everyday tasks easier’.

But in doing so, the MDU warns, doctors not only risk handing sensitive patient information to an AI; they could also introduce inaccuracies and further aggravate patients.

The group told MailOnline it had seen a ‘small number of cases’ of doctors using AI in this way, and was issuing a general ‘proactive’ warning to its members.


The MDU says doctors should be particularly wary of AI producing ‘fake apologies’, in which a letter responds to a complaint in generic terms, such as ‘I am sorry you feel the care was poor’, rather than addressing the specific points a patient has raised.

According to Dr Ellie Mein, medico-legal adviser at the MDU, it is understandable that doctors, such as GPs, are turning to AI as a potential time-saving tool.

‘Given the increase in complaints and the enormous pressure on healthcare services, it makes sense that healthcare professionals want to find ways to work smarter,’ she said.

‘AI technology is being used in many ways to improve the quality of patient care, for example in health screenings.

‘But when it comes to addressing patient concerns, there is no substitute for the human touch.’

She said the MDU was aware of cases where patients found out their doctor had used AI to respond to their complaint.

Before using such tools, doctors should consider how they would feel if a patient challenged them over it, she added.

‘There are known cases where recipients who were suspicious about the wording of a complaint response were able to reproduce near-identical text by asking an AI tool to compose a similar letter,’ she said.

‘Would you feel comfortable in this scenario, and would the patient feel you were taking their complaint seriously?’

Dr Mein added that while it is acceptable to use AI as a tool when responding to a complaint, there are pitfalls to avoid, chief among them relying on the AI’s answer without checking it.

She said doctors should watch out for inaccuracies and spelling errors. Many AI tools are American in origin, and US spellings can make it obvious that a doctor has not written the response themselves.

Furthermore, under no circumstances should doctors provide confidential medical information about a patient to an AI, as this could breach UK data protection law.

Dr Mein also cautioned that AI-generated responses could omit information a doctor is obliged to provide, such as who the patient can contact, a supervisor for example, if they feel their complaint has not been addressed.

This comes as the number of official complaints from patients to NHS doctors has reached a record high.

In 2022-23, almost 229,500 written complaints were received by the NHS, according to the most recent data.

This is an increase of 41 percent compared to the 162,000 recorded ten years earlier.

Of the 229,500 patient complaints recorded in 2022-23, the majority (around 126,000) were made against GPs or NHS dentists.