Ethics for healthcare data is obsessed with risk – not public benefits

January 5, 2018 by Tim Spector and Barbara Prainsack, The Conversation

How many times a year do we tick a website or phone app's box saying "read and approved" – without having read the terms of service at all? While a user's tick of the box is sufficient to allow businesses offering web services and smartphone apps to use "anonymised" customer data for their own purposes, the same doesn't apply to most health research.

Consider the difference between creating that tick-box by cutting and pasting a standard legal disclaimer and writing a 40-page research ethics submission that undergoes a dozen rigorous revisions. Ethics has a bad image among many scientists and, for some, it conjures up finger-wagging and obstacles to research projects.

Health researchers working with human participants – or their identifiable information – need to jump through lots of ethical and bureaucratic hoops. The underlying rationale is that such research poses particularly high risks to people, and that these risks need to be minimised. But does the same rationale apply to non-invasive research using digital data? Setting aside physically invasive research, which absolutely should maintain the most stringent of safeguards, is data-based health research really riskier than other research that analyses people's information?

Many corporations can use data from their customers for a wide range of purposes without needing research ethics approval, because their users have already "agreed" to this (by ticking a box), or the activity itself isn't qualified as health research. But is the assumption that it is less risky justified?

Facebook and Google hold voluminous and fine-grained datasets on people. They analyse pictures and text posted by users. But they also study behavioural information, such as whether or not users "like" something or support political causes. They do this to profile users and discern new patterns connecting previously unconnected traits and behaviours. These findings are used for marketing; but they also contribute to knowledge about human behaviour.

Unintended consequences

One of us recently applied to use individual-level data that had been collected by a personalised health company, where paying consumers had consented online for their information to be used for research (without any requirement for research ethics approval). But in order to use exactly the same anonymised data, academic health researchers had to apply for ethical approval first – a process that took six months of paperwork. Despite all the time and effort it took to obtain, the approval was never used, because of an extremely costly stipulation that participants had to reconsent.

Another example involved a UK national bioresource of 100,000 blood samples collected for NHS research, whose name and purpose were slightly changed when it became a biobank. The research ethics committee decided that every participant had to provide their consent again, or else their DNA and blood samples couldn't be used. In addition to the cost to the taxpayer, it's expected that the decision will result in the destruction of about 30,000 samples – some from a tiny number of people who wouldn't want their samples to be used, but the vast majority from people who couldn't care less.

Empirical research shows that many people are happy for their data and samples to be used for health research – if it creates public benefits. How many of those people who failed to reconsent would have agreed to the destruction of their samples? And in whose interest is this?

The institutionalisation of medical and research ethics has created a plethora of bodies and local governance groups, with increasingly onerous conditions required for research involving human participants or access to potentially identifiable personal information. Many jurisdictions, such as the US, understand research as systematic investigations designed to contribute to "generalisable knowledge".

Power imbalance

This means that most socially valuable health research carried out at universities and reputable institutions requires research ethics approvals, which is a significant obstacle. But corporations don't face the same scrutiny – ironically, because they're not seeking to spread knowledge.

The EU's General Data Protection Regulation, which comes fully into effect in May, will make some improvements by giving citizens more control over the use of their data – but it will not really change this imbalance.

In the UK, the 2016 Data Security Review by Dame Fiona Caldicott, the National Data Guardian for Health and Care, made a number of very important suggestions to improve data security, increase people's support for valuable health research and give patients more meaningful control over their data. Last year, the government approved all of the recommendations. It's a step in the right direction, but it fails to address key structural problems that health research faces in the digital era.

Research ethics committees have been key to protecting patients from greedy drug companies and invasive experimental research. But obstructing publicly valuable research or destroying samples was never part of their mission. Ethicists aren't to blame. As a society, we have allowed the idea of risk management to take on a life of its own. Rather than us managing risk, risk is now managing us – and a state of what we call "uber-ethics" has emerged. And, what's worse, the current frameworks for research ethics are unable to deal with one of the biggest ethical challenges in the era of digital health: the power inequalities between the corporations that use data, and those whose data are used – patients and citizens.

To address the quasi-monopolistic status of commercial corporations using personal data, ethics is more important than ever. But ethics must become political again – a project that supports all of us in systematically considering how specific policies, institutions, technologies and practices impact on the distribution of burdens and benefits within and across societies.

How do we get there? For academics, reminding research ethics committees of the importance of facilitating socially valuable research would help. But we also need policy changes to prioritise public benefits, especially where there is minimal risk. Regulators should pay more attention to whether or not data use has value for societies. If yes, it should receive public support and be freed from many of the onerous bureaucratic requirements that are in place. Research that does not have societal value besides lining the pockets of shareholders should be allowed to proceed, but with stricter safeguards. In addition, mechanisms must be put in place to ensure that some of the profits made with people's data come back into the public domain.
