At IF, we work with companies that want to use data in ways that are ethical, practical and creative.
We’re interested in the plans the Information Commissioner’s Office (ICO) has for creating a regulatory sandbox. The sandbox will mean that organisations can draw on ICO guidance and expertise to ‘develop products and services using personal data in innovative ways.’
Last week, we responded to the ICO’s consultation on the sandbox. It’s important that the sandbox provides opportunities for businesses to test some hard problems, and develop practical solutions.
How can legislation be a barrier to innovation?
The General Data Protection Regulation (GDPR) is a step in the right direction, but it’s not easy to understand. It’s too focused on compliance and wasn’t written with the needs of product teams in mind. As an engineer, designer or product manager, it’s hard to understand how GDPR applies to a product you’re developing. Some of the challenges we’ve found are:
GDPR does not reflect how technology works. The practical implementation of some parts of GDPR is difficult, particularly sections related to automated decision-making. For example, under GDPR individuals have a right to be forgotten, but exercising that right is not always possible where personal data has been used to train a machine learning model.
Data often describes multiple people. GDPR and the Data Protection Act (DPA) focus on individual rights in relation to personal data. There is no guidance on group rights, or how individual rights compete when data describes more than one person (for example, medical records can contain information about a whole family). This means that practical implementation of new rights like data portability is difficult.
Data can only be used for one purpose. ‘Purpose limitation’ prevents companies from using existing data for purposes beyond those it was originally collected for. If a company trains a machine learning algorithm with personal data, and then spots an opportunity to apply that algorithm in new and innovative ways, it may not be possible to go back and get consent from all the individuals represented in the data.
Which areas should the sandbox focus on?
The ICO should prioritise areas that challenge organisations across different sectors, for example:
Fixing terms and conditions: finding ways to build consent and develop understanding throughout the journey of a service.
Making automated decisions easier for people to understand: communicating how decisions are made over the lifetime of a product or service.
Making data collection by internet of things devices visible: helping people understand how data is collected by technology that is integrated into their environment.
Defining data trusts: how data trusts can be used to increase data access while protecting trust, and what the models of consent should be.
What kind of mechanisms would be useful?
The ICO made some suggestions about the kinds of mechanisms that could be helpful for the sandbox. We support the following:
Advice or ‘informal steers’ early in the development process addressing data protection and information rights issues: this would be helpful for tricky areas, like those identified above. Guidance should be practical, demonstrated through prototypes and design patterns.
Anticipatory guidance on addressing data protection challenges in specific areas of emerging technology and innovation: services are designed using repeating patterns. We have collected design patterns for data sharing. This collection needs expanding to include areas like automated decisions, group consent, data about multiple people and data collected in public spaces.
Adaptations to regulatory guidance, assurance and enforcement approaches: the impact of any adaptations on people’s rights and lives must be made clear. Any adaptations should be repeatable so others can also benefit from what is learned.
We’re looking forward to seeing how this work develops and how we can be involved. If you’re also interested in the opportunities the sandbox presents, or have feedback on what we’ve written, let’s talk. Drop us an email at email@example.com.
Co-written by Dr Eloise Elliott-Taysom and Grace Annan-Callcott