One of NHS's biggest AI projects is halted after fears it used health data of 57 MILLION people without proper permissions
NHS England has paused a ground-breaking AI project designed to predict an individual's risk of health conditions after concerns were raised that data from 57 million people was being used without the right permissions.
Foresight, which uses Meta's open-source AI model, Llama 2, was being tested by researchers at University College London and King's College London as part of a national pilot scheme exploring how AI could be used to tailor healthcare plans for patients based on their medical history.
But the brakes were applied to the pioneering scheme after experts warned even anonymised records could contain enough information to identify individuals, The Observer reported.
A joint IT committee between the British Medical Association (BMA) and the Royal College of General Practitioners (RCGP) also said it had not been made aware that data collected for research into Covid was now being used to train the AI model.
The bodies have also accused the research consortium, led by Health Data Research UK, of failing to consult an advisory body of doctors before feeding the health data of tens of millions of patients into Foresight.
Both the BMA and the RCGP have asked NHS England to refer itself to the Information Commissioner over the matter.
Professor Kamila Hawthorne, chair of the RCGP, said the issue was one of 'fostering patient trust' that their data was not being used 'beyond what they've given permission for'.
She said: 'As data controllers, GPs take the management of their patients' medical data very seriously, and we want to be sure data isn't being used beyond its scope, in this case to train an AI programme.
'We have raised our concerns with NHS England, through the Joint GP IT Committee, and the committee has called for a pause on data processing in this way while further investigation takes place, and for NHS England to refer itself to the Information Commissioner.
'Patients need to be able to trust their personal medical data is not being used beyond what they've given permission for, and that GPs and the NHS will protect their right to data privacy.
'If we can't foster this patient trust, then any advancements made in AI – which has potential to benefit patient care and alleviate GP workload – will be undermined.
'We hope to hear more from NHS England in due course, providing definitive and transparent answers to inform our next steps.'
Katie Bramall, BMA England GP committee chair, said: 'For GPs, our focus is always on maintaining our patients' trust in how their confidential data is handled.
'We were not aware that GP data, collected for Covid-19 research, was being used to train an AI model, Foresight.
'As such, we are unclear as to whether the correct processes were followed to ensure that data was shared in line with patients' expectations and established governance processes.
'We have raised our concerns with NHS England through the joint GP IT committee and appreciate their verbal commitment to improve on these processes going forward.
'The committee has asked NHS England to refer itself to the Information Commissioner so the full circumstances can be understood, and to pause ongoing processing of data in this model, as a precaution, while the facts can be established.
'Patients shouldn't have to worry that what they tell their GP will get fed to AI models without the full range of safeguards in place to dictate how that data is shared.'
An NHS spokesperson confirmed that development of the Foresight model had been paused for the time being.