Six months into the HomeSense project, Professor Nigel Gilbert outlined the “interesting ethics issues” that needed to be answered before the required data collection could responsibly get underway in participants’ homes.
As described in his presentation, ‘The Ethics of Sensors’, at the 2016 NCRM Research Methods Festival at the University of Bath, the ambition of HomeSense is to enable social researchers to use digital sensors alongside self-reported methods and observations. The project is also assessing the extent to which householders might accept sensors in their homes for research, and the final output will be a set of guidelines for use in such studies.
More refined or automated applications of digital sensors in social research could, it’s hoped, lead to more effective assisted-living and tele-care services, or more efficient use of energy. But before there could be any rush to install sensors in homes, some obvious, and less obvious, questions of ethics needed to be addressed.
Ethical issues that came to light
Nigel’s presentation (July 2016) set out five main categories of ethical issue, all of which required mitigation procedures to be drawn up to avoid potential risks to participants or researchers and to satisfy the University’s ethics committee.
- Informed consent
- Inferred meaning
- Data protection
Nigel discussed each of these in his presentation, highlighting the ethics issues that were then unresolved:
A standard requirement for studies on human subjects is obtaining informed consent, but, as Nigel described, “it becomes difficult to understand what informed consent means in this context, when participants also include other family members in homes, especially children. Then there are the issues concerning consent from visitors to these homes during the study, and how that is delegated to the householders.”
Even more problematic is the issue of describing how the data would be used. “Since the project is all about finding what we can do with data and for a range of purposes, how, in such circumstances, can we explain it?”
Then there is the question of whether, as the study progresses, a sufficient degree of informed consent can continue to apply. This implies that reminders about what has been consented to should be sent periodically, and that participants should retain a continual ability to withdraw freely from the study.
It was considered important that all collected data be encrypted on the devices placed in homes, that participants be reassured this promise was being met, and that the meaning of that assurance be made clear to them.
Another security issue concerns data retention: it was unclear how to square the Data Protection Act, which states that data should be retained for the minimum time necessary, with RCUK policy, which suggests data should be retained for 10 years.
As is normal in human subject studies, all households would be anonymised. However, it was highlighted that there could still be a risk of individuals within households identifying each other in anonymised data or discussions of the data. “So,” said Nigel, “there’s an issue we need to think about carefully there…”
“Also, as we’re collecting data 24/7 in people’s homes, there may be things going on that, perhaps, they wouldn’t be very keen on us knowing about, at all”.
“One obvious example of that is sex,” said Nigel. “An even more difficult category of risk is what happens if evidence is found of domestic abuse, or violence. This is something where social research has answers, but we need to think in advance about what we would do if we found data that indicated somebody was abusing somebody else in one of our households.”
“Should we ethically report such things?” asked Nigel, adding, “How sure can we be?”
According to the 1998 Data Protection Act, data should not be exported outside the EU. That would exclude the use of certain data storage services, such as Google Drive or Dropbox.
To get the ‘go-ahead’, the fieldwork in homes was subject to ethics approval, requiring that it operate within standard legal and ethical constraints. So, before the start of the data collection stage of the project, answers were required to all these questions and more.
Ethical concerns on the ground
HomeSense has three main research strands: adapting and developing sensors for social research purposes; developing data collection methods and trialling them in homes; and creating tools for analysing the data that these sensors generate.
Fixed and wearable sensors have already been installed in volunteer households, and the data is now being ‘triangulated’ with time-use diaries, open-ended interviews and questionnaire responses to form meaningful interpretations.
The static sensors detect noise levels, temperature, humidity, light, energy consumption and movement nearby, while the wearable sensors monitor participants’ location within their households.
According to lead researcher Dr Kristrún Gunnarsdóttir, “There were two stages to obtaining ethics approval. Firstly, consulting all relevant academic material on performing research ethically and all relevant university guidelines on data management.”
“Secondly, the unique circumstances of this work with potentially intrusive technology produces issues about how to explain it in an understandable way, and no one is going to want to read an essay.”
“So all the questions that arose presented quite a headache.”
A Participant Information Sheet was provided to all participants, designed to clearly describe the burden of participating, what would happen, and what ethical and technical issues can arise. It outlines how consent is to be obtained and how privacy and data security are to be ensured.
Ahead of actual installations in homes, the project team set up demonstrations of the sensor equipment to familiarise volunteer participants with what they could expect to happen, what sort of data is collected and what it looks like to the researchers. Here they had an opportunity to see data being recorded live and have a conversation about what can be inferred from the data streams.
“This conversation became a turning point”, said Kristrún.
“We had a sense that participants would want some sort of demonstration, and the degree of understanding we could achieve that way shows the power of demonstrating, or demystifying, the technology.”
“I got the sense, on my first visit to participants’ homes, that they were waiting for me to produce the technology from my bag.”
“They wanted me to show them our toys.”
“Technology is largely mysterious to people, and our thesis is that the more people know about what the technology is doing, the less concerned they will be.”
“Often the fear is that being too transparent about what the technology is doing will cause people to run away, but actually it was quite the opposite. When it was explained they found it fun.”
“Discussion helps people, and helps to build confidence. If you enter their environment it’s essential to risk a conversation and do a demonstration.”
By the new year of 2017, the HomeSense team had communicated all relevant ethics issues to the ethics committee and conducted a risk assessment, concluding that the level of risk to participants and researchers was ‘low’ in all cases, provided that all aspects of the fieldwork were carried out according to the fieldwork plan elaborated in the ethics review application.
All things considered, the project’s ethics application to the University of Surrey Ethics Committee weighed in at 54 pages, including a risk assessment tabulated in landscape format across six of them.
Within that, in terms of ‘Right to choice and self-determination’, participants were to be informed when seeking their voluntary consent, with appropriate assurances that they were free to make choices about researchers accessing their household and where to install sensors, to end an interview or questionnaire at any point, and to revoke consent to participate at any time up until a month after the trial. Children would also be given the opportunity to assent.
Additional measures were in place to confirm everyone’s consent at the time they participated in the hands-on demonstration, along with a conversation about the sort of information that can be gleaned, and to what extent this method is acceptable to them. Children living in the households would be involved at this stage so they had full opportunity to assent.
To address anonymity, it was decided that a random numerical household ID would be assigned at the point of collection of data from sensors, questionnaires and time-use diaries, and that participants taking part in open-ended audio-recorded interviews would be advised not to name others on tape. All transcripts, diaries, questionnaires and transmitted (encrypted) data would be tagged in the same way, and all data would be stored on a secure university data server.
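The pseudonymisation step described above can be sketched in a few lines. This is only a minimal illustration, assuming a Python-based pipeline; the function names, the six-digit ID format and the record fields are hypothetical, not HomeSense’s actual implementation:

```python
import secrets

def assign_household_id(existing_ids):
    """Draw a random six-digit household ID that is not already in use."""
    while True:
        hid = secrets.randbelow(900000) + 100000  # uniform in 100000..999999
        if hid not in existing_ids:
            existing_ids.add(hid)
            return hid

def tag_record(household_id, record):
    """Tag a data record (sensor reading, diary entry, transcript) with the
    household ID, stripping any directly identifying field first."""
    tagged = dict(record)
    tagged.pop("name", None)  # never persist identifying fields
    tagged["household_id"] = household_id
    return tagged

# Usage: every data stream from one household carries the same random ID,
# so datasets can be linked without storing who the household is.
ids = set()
hid = assign_household_id(ids)
reading = tag_record(hid, {"sensor": "temperature", "value": 21.5, "name": "Smith"})
```

The point of drawing the ID from a cryptographic source (`secrets`) rather than deriving it from an address or surname is that the mapping from ID back to household exists only in a separately held key file, which can be destroyed at the end of the study.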
To maintain confidentiality, researchers would be required to assure participants about the anonymity of the data and build trust for how the data and potentially sensitive information would be handled, while also ensuring participants felt free to change their minds about participating.
The risk assessment recognised that participants, on realising what has been monitored, might become distressed, and feel a heightened awareness about their general household situation. Participants would be given the opportunity to ensure that sensitive areas of the house were not included and all fixed and wearable sensors could be turned off at any time, or disconnected from data capture.
To safeguard ‘internal confidentiality’ of households the intimacy of household relations would be respected, and impaired autonomy in those relations recognised. Time would be taken to learn from preliminary observations to anticipate activities and interactions involving any member of a household that may not be known or appreciated by another member, including the key respondent.
Further, to ensure respect for the internal confidences of households, the researchers would not share data views with participants that involve the presence and activities of other household members. The researchers would also follow guidelines recommending never to communicate materials they would hesitate or not be willing to present to participants in person.
Risks involving publication and dissemination of results from analysis of the sensor-generated data were to be addressed by researchers considering other forms of discussing the data analyses, and using pseudonyms with care and consideration, and only if other means were exhausted.