When people talk about data protection or security, it's easy to slip into vocabulary associated with the military, such as breaches, hacks and attacks. In this context, it's difficult to see how an individual has any agency; the topic feels disconnected from the everyday reality of life today.

This puts a heavy burden of responsibility onto each individual’s shoulders. Taking responsibility for home security may be as simple as buying a burglar alarm, but taking on the challenge of information security is much more complex. We’re bombarded with pop-ups asking for consent as we use the internet, and it’s not practical to read the terms and conditions that come with every app. As Cambridge Analytica’s use of Facebook’s data made clear, it can be almost impossible for individuals to have a clear enough idea of how data about them is being used to be able to make meaningful decisions about what they share.

Changing the way we talk about security

I started thinking about this topic in early 2013, just before Edward Snowden threw a spotlight on issues of data privacy. I was designing a fictional alternative to the internet that would be created, controlled and owned by everyone who used it: the Alternet. As part of the Alternet, I imagined what a different power dynamic around data might look like.

Rather than requiring people to agree to each company's data policy, for example, I created data licences. Data licences let individuals set parameters for how data about them could be used, similar to Creative Commons licences, removing the need for multiple permissions. The idea was to build care, consideration and community into the systems that are used to protect information.
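To make the idea concrete, here is a minimal, hypothetical sketch of what a machine-readable data licence might look like. The field names and licence terms are my own invention for illustration; they are not part of the original Alternet design.

```python
from dataclasses import dataclass

# Hypothetical terms an individual might set once, Creative Commons-style,
# instead of agreeing to every company's policy separately.
@dataclass(frozen=True)
class DataLicence:
    holder: str                    # the person the data is about
    allow_analytics: bool = False  # may the data feed aggregate analytics?
    allow_resale: bool = False     # may the data be sold to third parties?
    retention_days: int = 30       # how long a service may keep the data

    def permits(self, use: str) -> bool:
        """Check whether a named use is covered by this licence."""
        return {
            "analytics": self.allow_analytics,
            "resale": self.allow_resale,
        }.get(use, False)

# A service would check the licence instead of showing a consent pop-up.
licence = DataLicence(holder="alice", allow_analytics=True)
assert licence.permits("analytics")
assert not licence.permits("resale")
```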

Router boxes on the Alternet network were designed with an individual fingerprint that users could scan to verify the box could be trusted. To change the router inside, the box would have to be physically broken. A broken router box was a physical indicator to the community that something was wrong. Photo: Sarah Gold, CC-BY.

Railway safety systems as a point of departure

The challenge of how we talk about data stayed on my mind over the years as I launched the technology studio IF. I've always been eager to look at how other types of systems protect their users and communicate relevant information about how they function. That's why I organised a team outing to a railway signal box in St Albans that had been preserved in the same state since the pre-digital era.

Photo: Levers at St Albans signal box, Sarah Gold/IF, CC-BY

On the visit we learned how the design of analogue mechanisms reduced the risk of human error and kept people safe. These included interlocking levers that made it impossible to signal two trains onto the same stretch of track from opposite directions, and physical tokens that were only released once two people had agreed on an action. We also heard how these safety measures had been adapted for the digital era without increasing the risk of failure. The IF team has previously looked at what digital services today can learn from the history of railway regulation.

Video: a train guard using a physical token, spotted on my holiday in Sri Lanka (Sarah Gold, CC-BY).
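In software terms, the single-line token is a mutual-exclusion mechanism: only the driver holding the token may enter the section of track. As a loose analogy, and purely my own sketch rather than anything shown at the signal box, the same guarantee looks like this in code:

```python
import threading

# Loose software analogy for the single-line token: a stretch of track is a
# shared resource, and the token is a lock only one train can hold at a time.
class TrackSection:
    def __init__(self, name: str):
        self.name = name
        self._token = threading.Lock()

    def enter(self, train: str) -> None:
        # A train may only enter once it holds the token; any other train
        # trying to enter blocks here until the token is surrendered.
        self._token.acquire()
        print(f"{train} holds the token for {self.name} and may proceed")

    def leave(self, train: str) -> None:
        print(f"{train} surrenders the token for {self.name}")
        self._token.release()

section = TrackSection("single line")
section.enter("Train A")  # Train A takes the token
section.leave("Train A")  # ...and gives it back
section.enter("Train B")  # only now can Train B proceed
section.leave("Train B")
```

The physical token made this guarantee visible to everyone on the platform; the lock makes it enforceable in code.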

Thinking in terms of safety, rather than security

While safety and security are closely related fields of practice, they have a subtly different emphasis. Talking about railway safety, for example, means considering systems that people don't need to understand on a deep level in order to trust them. We can learn what a flashing light at a railway crossing means without knowing the mechanism behind it. The responsibility for assessing safety measures and putting them reliably in place lies with the railway staff.

Similarly, airline passengers don’t need to understand engineering in order to trust that they will be protected during a flight. There are organisations, such as the United Nations’ International Civil Aviation Organization (ICAO), that employ experts and are able to go into this type of depth. The ICAO creates standards, recommends best practices and defines protocols that should be followed after an accident. In order to increase safety and passenger trust, airlines communicate the most important information for individuals to understand, for example, how to put on a seatbelt and what to do in the case of an emergency.

Focusing on language is just the first step

There are plenty of other systems that take a similar approach to safety, from nuclear engineering to aerospace. We don't expect the average person to understand all the systems involved, but we do make sure that there are expert organisations set up in the public interest, along with regulators, to protect everyone. Food packaging is another example: safety standards have been put in place by organisations and communicated transparently, using colour-coded labels to help people understand and make effective decisions. Responsibility has shifted from the individual to professional groups incentivised to do this work, and we use regulators to ensure those actors do their jobs well.

First, it's important to assess the concepts we use and how they can shape our thinking, in conscious and unconscious ways. The idea of safety carries connotations of society and care that can be lacking in security. It's a crucial piece of a complex puzzle, and a good place to start when imagining a better future for data protection. So rather than taking a narrow, individualistic approach to trust, the bigger question is: are there organisations that will help people to understand and trust the digital services they use?

Thanks to Luke Church for prompting me to share thoughts on this. Edited by Jess Holland.