Last week I spoke at the #govdesign event on data and privacy. Here’s what I said (with a few additions from the discussion afterwards).
Showing the seams
This quote is from Arthur Balfour, British prime minister at the beginning of the last century. It’s also attributed to Geoffrey Howe who was a minister in Margaret Thatcher’s government.
“Democracy is government by explanation”
– A.J. Balfour / Geoffrey Howe
To me, it talks to the idea that in a democracy people need to have the opportunity to understand how their government works. For government to be a conversation.
While I firmly believe people should never need to understand the structure of government to access public services, not having to understand government should not mean obfuscating the workings of the system.
There are always times when people need to be able to understand the rules, for example when things go wrong. If things just magically work, how do we, as a society, know how to fix things when they break? As designer Matt Jones has said “magic is a power relationship”.
At IF, we think there are ways of designing services that can make that relationship a bit more balanced.
Design patterns that explain what’s happening
One way is through the use of design patterns that put people in control of how data is used. For example, giving people opportunities to prove facts about themselves, rather than sharing data automatically in the background.
A nice example of this is the UK Share driving licence service. It doesn’t predetermine how someone might want to share the fact that they are allowed to drive a particular type of vehicle. Instead, it gives them the means to share the data as they see fit.
That is not to say that people should have the final say over how all data about them is used. If someone is convicted of speeding, they don’t get to choose whether the resulting points are added to their driving licence. However, they should always be informed, and they should always be able to understand what is happening.
Accountability at the point of use
There is also an opportunity with digital services to help people understand more about how their democracy works and where accountability lies.
For example, this mockup shows how you might surface upcoming legislation changes relating to a service, who the minister ultimately responsible is and how the service is performing, all at the point of use.
This requires thinking beyond the immediate needs of someone using a service and considering what someone might need to know if things go wrong or how citizens might engage in helping improve the services they rely on over time.
Digital in public spaces
These questions of understanding, accountability and legibility get even more important in the context of public spaces, as our streets start to become more connected.
For example, how will people understand the sensor networks on their streets if they can’t see them or understand who operates them? How might they opt out?
The fairness paradox
The second thing I’d like to talk about is what I’ve been calling the ‘fairness paradox’.
Better use of data (and better quality data) can improve services, making them more responsive and real-time. It can also make it possible to deal with multiple parts of government at once, with services designed around needs that are broader than the activities of a single arm of government. Things like moving home, or a child starting school for the first time.
Data can also make services ‘fairer’ by allowing politicians to pursue policy objectives that target a particular group - for example families with small children on low incomes.
I’ve put ‘fairer’ in quotes because definitions of ‘fairness’ vary between parties and political contexts. For example, welfare policies can be framed as ‘fairer’ for taxpayers through better targeting of public funds, or ‘fairer’ for those in most direct need. The principle stands though - data gives politicians choices about how to target available resources.
The paradox is this though: collecting data to personalise or means-test a service comes at a cost to people’s time and privacy. Time, because there will probably be a longer form to fill out. Privacy, for reasons I’ll talk about now using an example from history.
The Household Means Test
In the 1930s, there was a thing called the Household Means Test which the government used to decide who got certain types of unemployment relief. It was a response to spiralling unemployment figures and limited resources during The Great Depression.
In some cases, the means test included government inspections of people’s homes to see if they qualified for additional payments:
“The discretionary element in the investigators’ roles, whereby they could authorize one-off payments for worn bedding, clothes, or kitchen utensils, led to unwelcome investigations into bedrooms and a discrimination against clean and tidy homes.”
– Stephanie Ward, ‘The means test and protest in 1930s south Wales and north-east England’
There were riots – the one in this photo happened at Old Market Street, Bristol.
Utility, privacy and safety
I think this illustrates that there is a trade-off when designing government services between utility, privacy and safety. What does it mean for today’s data-driven services?
While I’m not saying something as simplistic as universal basic income is inevitable, I do wonder if concerns about privacy might start to move policy makers away from today’s hyper-means-tested benefits like Universal Credit, towards services that are more universal and less data hungry.
The other risk to privacy from data-driven services comes from data breaches.
Earlier this year Aadhaar, the huge Indian identity platform, saw personal data being sold by government employees via an encrypted WhatsApp account.
In South Korea, the ID numbers and personal details of an estimated 80% of the country’s 50 million people have been stolen since 2004. And in 2014, a data breach at the Oregon Employment Department led to the employment records of 850,000 people being accessed.
It makes me think about UK datasets like the National Pupil Database, with its 20 million records. (Anyone who went to school after 2000 is probably listed on it.)
This will keep on happening, so data minimisation needs to become a core design principle of government services. If you don’t collect the data in the first place, it can’t leak.
Data access, not data sharing
The final thing I’d like to talk about is how government uses data at the macro level, and the language it uses to talk about it.
Last year, a ‘memorandum of understanding’ was set up between NHS Digital and the Home Office to share data in an effort to track down immigration offenders.
A memorandum of understanding is an administrative procedure between departments. There was no debate in parliament before the agreement between NHS Digital and the Home Office was set up (although questions were asked).
It was finally scrapped in January after a warning from Public Health England that “it could present a serious risk to public health” that was “comprehensively ignored”.
Putting aside that this was essentially government prioritising short-term chasing of immigration figures over a risk of epidemics, the key thing here is the lack of oversight that exists to weigh up the risks and benefits of joining data together in this way. It was just an administrative process.
I think we need to move from a world of ‘MoUs’ to a data infrastructure that is designed to put people and their representatives in the loop. We need to move from a world of data sharing to a world of data access.
Every policy area from getting people into work, to making them better when they are sick, to how prisoners’ sentences are decided is becoming a question of data access and data quality.
As government becomes more digital, the question of accountability and oversight of the use of data will come up again and again.
Who should be responsible for the stewardship of critical data? In turn, who should those responsible for data be held accountable to? Should it be a minister, as with the NHS/Home Office memorandum of understanding? Parliament? Citizens directly, as has been trialled with medical records in Manchester? New regulators or auditors? The answer is probably a mix of these things.
For these reasons, I believe that the question of how we should structure government to allow society to make the most of data, while also being accountable and safe, will become the public policy question.
Better use of data can give us better public services. But that is not enough. Those services need to have accountability, safety and trust designed in from the beginning.
Is the private sector that different?
Today, technology companies are, rightly or wrongly, starting to take on characteristics that have generally been the reserve of governments.
The Google account system is an identity system with a similar number of users to Aadhaar. Apple’s cut of App Store purchases has some of the qualities of a tax system. The impact of Facebook’s choices about what content is OK to publish is similar to that of a government’s. At the same time, digital products and services are collecting more data and using it to inform ever more important decisions.
Are the issues of accountability, trust and privacy in the public sector that unique? I would suggest not. In many respects, the issues the public and private sectors are facing are becoming indistinguishable. Many of the answers may be the same too.