
GPs use ChatGPT to help them treat patients, Harvard study warns

A survey of British family doctors found one in five using AI software to suggest treatments or write letters

GPs have been using ChatGPT to treat patients, a Harvard study has warned.
Researchers at the American university found one in five family doctors in the UK had used artificial intelligence tools while treating patients, despite a lack of regulation.
The survey of 1,006 GPs found dozens were using AI to help diagnose conditions and find treatment options.
A quarter of the 205 who admitted using machine-learning tools to help them do their jobs said they had asked the software to suggest treatments.
Almost three in 10 said they had used AI to help diagnose a patient. Others admitted they had used it to write letters, generate documents after an appointment with a patient, or create patient summaries and timelines based on past records.
Experts warned that unregulated use of tools such as ChatGPT, Microsoft’s Bing AI or Google’s Bard could “risk harm and undermine patient privacy”.
The study, which involved disseminating a survey to family doctors through doctors.net.uk in February this year, was the largest of its kind to assess the use of AI in medical practice.
ChatGPT was the most commonly used AI tool, with 16 per cent of GPs admitting to using the chatbot, which launched in 2022.
AI is already being used in other NHS settings, for example helping radiologists to interpret scans or building personalised 3D images of tumours, as well as assisting with administrative tasks such as booking in patients.
But the researchers warned there was a “lack of guidance” and “unclear work policies” for AI in general practice. They cautioned doctors about the technology’s limitations because it “can embed subtle errors and biases”.
The study was conducted by an international team led by Dr Charlotte Blease, a healthcare researcher at Harvard Medical School and associate professor at Uppsala University in Sweden.
“These findings signal that GPs may derive value from these tools, particularly with administrative tasks and to support clinical reasoning. However, we caution that these tools have limitations since they can embed subtle errors and biases,” the authors wrote.
“They may also risk harm and undermine patient privacy since it is not clear how the internet companies behind generative AI use the information they gather.” 
The researchers said it was “unclear” how legislation to regulate AI in medical practice would work in reality and called for doctors to be trained about the benefits and risks.
Prof Kamila Hawthorne, chair of the Royal College of GPs, said AI “must be closely regulated to guarantee patient safety and the security of their data”.
She added: “For general practice, AI could help to solve a longstanding problem – high levels of unnecessary bureaucracy and administrative processes are a significant drain on GP time.”
Other studies have shown that GPs can spend a quarter of their time on admin, and so using AI to alleviate this could free up time for patients, Prof Hawthorne added. 
“Technology will always need to work alongside and complement the work of doctors and other healthcare professionals, and it can never be seen as a replacement for the expertise of a qualified medical professional,” she said.
The Harvard-led research was published in the journal BMJ Health & Care Informatics.
It comes after a separate study published yesterday revealed that GPs were contracted to work just 26 hours a week on average in 2022, based on analysis of NHS data.
The study in the British Journal of General Practice found that family doctors had been reducing their working hours despite a growing number of patients. 
The average contracted hours of a GP fell by 10 per cent between 2015 and 2022, and GPs worked fewer hours in total despite their numbers increasing by 5 per cent over the same period.
