‘Big data’ can give new insights into population health and provide doctors with decision-making tools. It can also, if handled badly, lead to catastrophic breaches of privacy and shatter patient trust. Tim Tonkin assesses some of its successes and shortcomings, and how doctors can best navigate the digital future
‘Big data’ might not have many friends beyond the glass palaces of Silicon Valley. As the popular view of the tech giants has shifted from awe, to routine convenience, to hostility at their power and tax practices, many feel we have given too much of ourselves to people we should not trust.
The NHS collects a huge amount of data, and for it to fall into the wrong hands would lay bare the most intimate details of the entire population. So, it’s not surprising the NHS patient-safety strategy, brought out earlier this year, warns that ‘failure, design flaws or incorrect use’ of IT systems has the potential to cause patient harm.
‘Frail, older adults are never included in clinical trials’
On reading the strategy, however, this warning feels like a caveat. Generally, it treats data as a way of improving safety, not threatening it. For example, the NHS needs to ‘improve understanding of safety across the whole system by drawing intelligence from multiple sources of patient-safety information’.
It is not only self-evidently true that mass patient data could help the health service; large parts of it could not function or advance without such data. Discovering the link between smoking and cancer would not have been possible without whatever ‘big data’ was called back then, and nor would virtually any other epidemiological research.
Learning from patients
So how can the NHS harness the benefits without patients coming to harm?
Imperial College Healthcare NHS Trust in London is attempting to analyse and learn from a vast and previously under-utilised source of data.
Each month, Imperial sees thousands of records generated via the NHS’s Friends and Family Test – a feedback platform that allows those who have used the trust’s services to comment on their experiences.
Determined to learn from this data, a multidisciplinary team at the trust spent more than a year devising an algorithm able to analyse the free-text elements of the Friends and Family Test feedback, focusing in particular on what had been done well and what could have been done better.
Consultant urological surgeon Erik Mayer, who has been involved in the project, said that, through the algorithm, patient data could now be processed in near real time and – thanks to an associated dashboard – accessed by staff across the trust.
Speaking at a conference on patient safety hosted by the Westminster Health Forum last year, Mr Mayer said the algorithm had enabled clinical benefits through its analysis which would not have been possible to achieve manually.
‘We can determine what effects drugs that are commonly used in younger populations have on older populations’
‘We get 20,000 comments a month at Imperial; no one can get through all those and use them for quality improvement,’ he explained.
‘What we’ve done is trained an algorithm to analyse these comments in near real time for us, so we’re getting through these comments as they’re submitted – be that electronically or manually – then uploaded on to the system to assign a domain and a sentiment: whether it’s positive, negative or neutral.’
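As an illustration of the kind of automated triage Mr Mayer describes – not Imperial’s actual system, whose trained model and category scheme are not public – a toy rule-based classifier might assign a domain and a sentiment to each comment like this (all keyword lists are invented for the example):

```python
# Toy illustration of free-text comment triage. A production system such as
# Imperial's uses a trained machine-learning model, not keyword lists.
DOMAIN_KEYWORDS = {
    "staff attitude": {"rude", "kind", "friendly", "caring"},
    "waiting times": {"wait", "delay", "queue"},
    "environment": {"clean", "noisy", "signage", "parking"},
}
POSITIVE = {"kind", "friendly", "caring", "clean", "excellent", "helpful"}
NEGATIVE = {"rude", "delay", "noisy", "dirty", "wait", "poor"}

def triage(comment):
    """Assign a (domain, sentiment) pair to a free-text comment."""
    words = set(comment.lower().split())
    # First matching domain wins; anything unrecognised falls into "other".
    domain = next(
        (d for d, kws in DOMAIN_KEYWORDS.items() if words & kws),
        "other",
    )
    pos, neg = len(words & POSITIVE), len(words & NEGATIVE)
    sentiment = "positive" if pos > neg else "negative" if neg > pos else "neutral"
    return domain, sentiment

print(triage("the nurses were kind and friendly"))  # ('staff attitude', 'positive')
```

Run over 20,000 comments a month, even a scheme this crude shows why machine triage beats manual reading: every comment lands in a dashboard bucket the moment it is submitted.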
Driving up standards
He said the system had brought technical efficiencies to the way staff worked, saving time and identifying patient-derived improvement opportunities – from simple steps, such as improving signage on wards, to the introduction of a discharge checklist that saves patients from having to repeat information to different members of staff.
He said: ‘This is fully open to the whole organisation, so any ward sister can look at [the performance] of another ward and try to understand why they are doing well or doing badly compared with their peers.’
Clinical senior lecturer, honorary consultant and BMA medical academic staff committee co-chair David Strain says that big data is likely to play a growing role in the way healthcare is delivered and safety standards are improved.
Splitting his time between academic research and clinical practice, Dr Strain has conducted research into healthcare for frail and older patients – a group often bypassed by the pharmaceutical and charitable research sectors.
During a study into the use of proton pump inhibitors – medications that are generally safe and effective in younger patients – he was able to demonstrate that their use in older adults increased the risk of pneumonia and fractures.
‘Frail, older adults are never included in clinical trials,’ he says.
‘There are databases such as Clinical Practice Research Datalink that include anonymised patient data from across the country. From that data you can identify populations that wouldn’t normally be included in clinical trials. Using a prior-event rate-ratio analysis, we can then determine what effects drugs that are commonly used in younger populations have on older populations. A lot of medicine is very target driven, particularly chronic disease management [and] those targets were derived from studies done 30 years ago. [Since then] medicine has changed, the population has changed, and the drugs have changed so we now re-evaluate those targets in model patient populations.
‘As medicine becomes more complex and with patients with multiple co-morbidities, we are going to require either a whole series of new specialists that don’t exist today or AI [artificial intelligence]-supported physicians.’
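The prior-event rate-ratio analysis Dr Strain mentions can be sketched numerically. The idea is to compare event rates between drug-exposed and unexposed groups both before and after the exposed group starts the drug; dividing the post-exposure rate ratio by the prior one cancels out confounding that was already present between the groups. The figures below are invented purely for illustration:

```python
# Prior-event rate-ratio (PERR) sketch with made-up numbers.
# Rates are events per 1,000 person-years in exposed vs unexposed groups.
def rate_ratio(exposed_rate, unexposed_rate):
    return exposed_rate / unexposed_rate

# Pneumonia rates before the exposed group started the drug...
prior_rr = rate_ratio(12.0, 8.0)   # 1.5: this group was already at higher risk
# ...and after starting it.
post_rr = rate_ratio(27.0, 9.0)    # 3.0

# Dividing out the prior ratio removes time-invariant confounding,
# leaving the effect attributable to the drug itself.
perr_adjusted = post_rr / prior_rr
print(perr_adjusted)  # 2.0
```

A naive comparison would attribute the full threefold excess to the drug; the adjusted figure recognises that half of that excess pre-dated treatment – which is precisely why the method suits frail populations never enrolled in trials.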
Dr Strain says that handling patient data with the utmost privacy and security is essential to any big-data enterprise. He adds that transparency with patients about how their data is used is critical, as is ensuring that the results of data research are used for the benefit of patients.
The track record of big-data ventures in healthcare, however, is far from unblemished or without controversy. Six years ago, NHS England announced its care.data programme – an ambitious plan to collect patient data from GPs in England and link it with hospital data sets in a central database. The data would be held in pseudonymised form, and the intention was for the database to provide a valuable resource to which researchers could apply for access.
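Pseudonymisation, as planned for care.data, replaces direct identifiers with stable tokens so that records can still be linked across data sets without exposing identity. A minimal sketch of the general technique – a keyed hash, with the key held separately by the data controller – might look like this (the key value and NHS numbers below are hypothetical):

```python
import hashlib
import hmac

# Illustrative only: the secret key would be held by the data controller,
# never alongside the pseudonymised data set.
SECRET_KEY = b"held-by-the-data-controller"  # hypothetical key

def pseudonymise(nhs_number):
    """Derive a stable, non-reversible pseudonym from an identifier."""
    digest = hmac.new(SECRET_KEY, nhs_number.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

# The same patient always maps to the same token, enabling record linkage...
assert pseudonymise("943 476 5919") == pseudonymise("943 476 5919")
# ...while different patients map to different tokens.
assert pseudonymise("943 476 5919") != pseudonymise("123 456 7890")
```

The controversy around care.data turned less on this mechanism than on consent: pseudonymised records can sometimes be re-identified by combining them with other data, which is why transparency about access mattered so much.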
Despite its stated intentions, however, the project was regarded with suspicion by many doctors, primarily because patients had not been adequately informed about how the data would be used, who might have access to it and how rights of opt-out could be meaningfully exercised.
These concerns were further compounded by confusion surrounding whether data could be shared with third parties for commercial purposes.
Three years after its launch and following a review of data security by Dame Fiona Caldicott, care.data was cancelled.
Concerns about the protection of patient data were also raised around a big-data partnership between the Royal Free hospital in London and Google-owned technology company DeepMind.
In 2015, the Royal Free and DeepMind formed an agreement that saw the latter process around 1.6 million partial records containing patient-identifiable information, for the purpose of developing and safety-testing an app for detecting and diagnosing acute kidney injury.
The collaboration eventually led to the development of a mobile-based platform known as Streams, which was launched in early 2017.
However, an investigation by the ICO (Information Commissioner’s Office) published in July that year highlighted concerns around the development of the app.
The ICO found that patients had not been adequately informed that their data was being used in the clinical testing of the Streams app. It concluded that, in its partnership with DeepMind, the Royal Free had not fully complied with the requirements of the Data Protection Act.
Information commissioner Elizabeth Denham stated that, while the use of data had clear potential for clinical improvements and patient care, ‘the price of innovation does not need to be the erosion of fundamental privacy rights’.
In July this year, the Department of Health and Social Care published a framework, built around five principles, to help ensure that clinical innovations underpinned by data research are beneficial to patients.
‘No algorithm or computer system can replace the actual direct face-to-face contact between doctor and patient’
These include a requirement that any use of NHS data not otherwise available in the public domain must have the explicit aim of improving the health and welfare of patients.
Others require that NHS organisations entering into arrangements involving their data do so on ‘fair terms’, and that those arrangements are communicated clearly to the public to ensure transparency and, by extension, trust and confidence in the health service.
The health service will also need to consider possible unintended consequences of big data in healthcare; namely who will ultimately own NHS patient information that will power healthcare apps, and the potential financial costs posed by an increasing reliance on AI algorithms developed by the private sector.
Dr Strain says that, irrespective of the potential of big data, AI is not a magic bullet capable of compensating for the other challenges facing the health service, such as under-staffing and a lack of resources.
He adds that big data and the AI applications that can be derived from it are not infallible.
‘Big data offers an opportunity to do research much cheaper and for creating decision-support tools, but no algorithm or computer system can replace the actual direct face-to-face contact between doctor and patient.
‘One of the biggest problems with AI development is the information that is put in.
‘The same term can mean different things to different people. A great example would be the term “chronic pain”: to one patient it means severe pain; to another, it means pain that has been going on for a long time.
‘All of these [big data] systems are only as good as the information that’s put in.
‘Apps get it wrong, and when apps do get it wrong, they do so big style. Having the AI supporting the doctor can work perfectly well.’
It is indisputable that patient data is a valuable commodity, with enormous potential. The challenge in the years ahead is less likely to be one of technology than one of safeguarding whose commodity it is, and whose potential it serves.