In November 2015, the Royal Free NHS Foundation Trust in London began a collaboration with DeepMind, Google’s artificial intelligence subsidiary, to develop a new app for the detection, diagnosis and treatment of acute kidney injury. Data relating to 1.6 million patients, in identifiable form, was streamed across to DeepMind – initially for the purpose of safety testing the app, though additional functions were performed as the project evolved. Although recognising that patient data is personal data and therefore subject to the Data Protection Act regime, Royal Free and DeepMind believed that their collaboration and the transfer of data were permitted because, in their view, they were for the purpose of direct patient care.
The Information Commissioner’s Office (ICO) has now concluded its investigation and released its findings: it disagrees. In consequence, Royal Free (as data controller in the arrangement) has been required to sign undertakings covering future changes and compliance with the law.
What did they get wrong?
Data relating to patients undergoing hospital care are necessarily and inevitably shared between all members of the care team who need access – that is how direct patient care can be delivered at all. Although the patient is never asked whether she minds the triage nurse sharing her notes with the A&E doctor or the radiologists sharing their results with everyone consulted on the case, of course the patient does consent. She’s come to hospital precisely to have the best and most appropriate team look at her condition, not only the first person she meets at the door. Therefore implied consent to data sharing for patient care is universally recognised.
The question that Royal Free’s arrangement throws up is: what is the limit of that implied consent? At this point, the nature and extent of the data concerned have to be considered. Here, the data potentially included the most self-evidently sensitive information about individuals’ symptoms, treatment and responses. It is likely that most patients would want at least the chance to consider anyone outside the immediate treatment team being given access. We might agree, say, to a trusted consultant sharing observations with a consultant at another hospital to whom he thinks the information may be of use for another patient. But if the patient’s experience is that the consultant is distant, perhaps condescending, and keeps her in the dark about what is wrong and why one treatment is being selected over another, then even that extent of consent might not be forthcoming. More abstract degrees of sharing are more likely to be challenged; and the prospect of sharing data with whichever technology company submits the lowest bid to process data for the hospital, but has a questionable reputation for respecting privacy itself, might well cause a majority of patients to think twice.
Similarly, an arrangement permitting strictly essential data to be shared will elicit consent from all but the most paranoid of patients. However, an arrangement giving apparently open-ended access to millions of patients’ data, among which some records will in fact prove relevant to the question being asked but many will not, will cause people to respond differently. In other words, fact and degree have to be taken into account if the principles of data privacy are to be complied with.
What happens now?
The first stage of obtaining consent is providing information to the patients, and Royal Free has had to improve transparency through its website in response to the objections raised. Its assumption that ‘direct patient care’ included using data to develop future treatment aids was a step too far for the ICO, which, rightly, did not agree that patients would anticipate their data being used for such purposes unless told. The fact that the ultimate objective of improving diagnosis and care is universally laudable does not mean that an ordinary person has this in mind when dropping in for an appointment relating only to themselves.
The simplicity of the approach taken by Royal Free and DeepMind, of including all patients’ data in the stream, also needs review. It is true that computer processing power means that this is an efficient way to identify possibly relevant material, and ensures that, for instance, repeat incidents relating to the same patient are captured. However, efficiency is not the ultimate and only consideration: patient privacy is just as important. Patients have a right to opt out of such vast research programmes, and a mechanism needs to be found which enables them to do so.
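One way such an opt-out mechanism might work in principle is to apply the opt-out list before any record leaves the hospital’s systems, rather than streaming everything and filtering later. The following is a purely illustrative sketch; the field names and data structures are assumptions for the example, not the actual Streams pipeline:

```python
# Illustrative only: filter out opted-out patients before any data is
# transferred to an external processor. Field names ("patient_id",
# "creatinine") are hypothetical.

def filter_for_transfer(records, opted_out_ids):
    """Return only records belonging to patients who have not opted out."""
    return [r for r in records if r["patient_id"] not in opted_out_ids]

records = [
    {"patient_id": "A1", "creatinine": 110},
    {"patient_id": "B2", "creatinine": 95},
    {"patient_id": "C3", "creatinine": 180},
]
opted_out = {"B2"}  # patients who have exercised their opt-out

to_transfer = filter_for_transfer(records, opted_out)
# B2's data never enters the stream; only A1 and C3 are transferred.
```

The design point is where the filter sits: applied at source, the excluded patient’s data is never disclosed at all, which is a materially stronger privacy position than asking the recipient to discard it afterwards.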
Finally, the rush to get started processing the data led to corner-cutting, in terms of the documentation and controls in place during the first part of the project. Although those deficiencies have now been corrected, the ICO’s view is that a full privacy impact assessment should have taken place before the project even started.
How to do better
The ICO is at pains to stress that it shares the general desire to see health data fully exploited in the name of health research and improving treatment. Nothing in the report suggests a ‘computer says no’ attitude, even though to businesses tasked with complying with data privacy rules it can often feel as though they can do nothing right. However, the lessons to be learned from this report are simple:
- Test your assumptions, rather than letting unbridled enthusiasm for the possibilities sweep doubts aside.
- Consider the patients’ potential range of attitudes, and engage and communicate with them accordingly.
- Even though the preparatory stages may feel like no more than lengthening delays on the hoped-for timeline, take the time to get the project assessment and documentation right before transferring any data.