People don’t trust the things we’re making for them. We don’t take people’s need for privacy into account when we design and produce connected products and services. As an industry, we keep showing people that we aren’t worthy of their trust.
Take My Friend Cayla, a doll marketed as a smart companion for children. Now banned in Germany, where it’s been classified as a surveillance device, Cayla records everything said to it and transmits it across the internet to be stored indefinitely on servers in the U.S. Along with i-Que, a similar ‘smart toy’, it kicked off the hashtag #toyfail, a campaign highlighting toys that violate European consumer law.
My Friend Cayla and i-Que, toys investigated by the Norwegian consumer council and later banned in Germany. (Photo: Forbrukerrådet)
None of that was visible at the point of use. The only way to find out what was really going on was to read the tome of terms and conditions. Even then, jargon and generalisations hide the specifics of what’s being collected, who it’s shared with, and what happens next.
Samsung Smart TV (Image: Smart Trends)
But this isn’t unique to children’s toys. ‘Anything you say in front of your Samsung TV may be recorded and sent to third parties for voice recognition purposes’. So you need to be careful what you say in front of your TV, because it’s going to gossip more than your next-door neighbour!
Of course, the option to change any of the default privacy settings is buried deep within the settings menu. None of this is obvious to someone who buys a TV to keep up with their favourite programme. The same goes for products like Alexa, or for Teslas. These have powerful roles in owners’ lives, and they’re impossible to interrogate. That’s not good enough.
We need to make trust as important to design as accessibility
As digital rights become mainstream consumer rights issues, and as more products access more data, we believe that every person in a product team needs to be thinking about trust.
Data, privacy and permissions need to be taken as seriously as accessibility and usability.
We do a lot of user research in our work. Every time we talk to people, they tell us they’re suspicious of the products they use. Partly that’s down to headlines like the ones above, but it’s also because the things we make hide so much from people.
As we found in our work on the broadband monitor, people want more control over the things they use, but they don’t have the means to exercise it.
The broadband monitor (Photo: Projects by IF CC-BY)
Regulation will force services to change
Things like the General Data Protection Regulation are a response to this lack of consumer trust. The rights granted by the GDPR pointedly attempt to redress this imbalance of power.
Rights given to people through the GDPR
As a studio, we’re nervous that these rights are just going to be buried by services in their Terms and Conditions agreements, signed away into obscurity. Instead, we think these rights show opportunities for companies to build innovative services. They need to become part of the way things work, not conditions services grudgingly comply with.
Terms and conditions are a broken design pattern. They’re longer than War and Peace, written in gobbledygook, and just don’t serve the purpose they’re made for. Can anyone really be said to have consented to something if they’re just clicking ‘agree’ on a 10,000-word document written in legal jargon? That’s a contemptuous way to treat people.
As the things we use get more complex, these documents are only going to get longer. It’s going to become even harder for people to understand what they’ve agreed to.
We need a convincing set of design patterns for data sharing, permissions, security and transparency. And we need your help to make them.
Our catalogue of design patterns for consent (Photo: Projects by IF CC-BY)
We need new patterns
Last year, we made an open source catalogue of design patterns for consent. It’s got around 36 different methods in there, some of them digital, some physical.
Active request, a pattern for consent in the catalogue (Image: Projects by IF CC-BY)
The patterns include things like active requests: granular agreements that people make to share specific data (like location) in the moment. Or having control of a physical object, like a YubiKey. Or quorum, where several people each have to contribute a key or token before something can happen. It also includes some examples of patterns that don’t exist yet, like data licenses.
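To make the active request pattern concrete, here’s a minimal sketch of what it might look like in code: each data scope is granted individually, in the moment, with an expiry, rather than signed away once in a blanket agreement. The `ConsentStore` class and its names are hypothetical, for illustration only — they aren’t from the catalogue or any real library.

```typescript
// Hypothetical sketch of the "active request" pattern: one grant per data
// scope, made in the moment, time-limited, and revocable at any point.

type Scope = "location" | "microphone" | "contacts";

interface Grant {
  scope: Scope;
  grantedAt: number; // epoch milliseconds
  ttlMs: number;     // how long this specific agreement lasts
}

class ConsentStore {
  private grants = new Map<Scope, Grant>();

  // Record an in-the-moment agreement to share one specific kind of data.
  grant(scope: Scope, ttlMs: number, now: number = Date.now()): void {
    this.grants.set(scope, { scope, grantedAt: now, ttlMs });
  }

  // A service checks consent at the point of use, not at install time.
  isAllowed(scope: Scope, now: number = Date.now()): boolean {
    const g = this.grants.get(scope);
    return g !== undefined && now < g.grantedAt + g.ttlMs;
  }

  // People can withdraw consent as easily as they gave it.
  revoke(scope: Scope): void {
    this.grants.delete(scope);
  }
}

// Example: a person shares their location for five minutes only.
const store = new ConsentStore();
store.grant("location", 5 * 60 * 1000, 0);
console.log(store.isAllowed("location", 60_000));      // → true (inside the window)
console.log(store.isAllowed("location", 10 * 60_000)); // → false (expired)
console.log(store.isAllowed("microphone", 0));         // → false (never granted)
```

The design choice that matters here is the expiry: consent is a moment, not a lifetime licence, so a grant that isn’t renewed simply lapses.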
The point is that terms and conditions don’t have to be the only pattern we rely on. There are others we can use to build services. We’re not saying these are all good patterns (there are problems with lots of them), but understanding what’s out there at the moment is a good starting point.
We can also make new ones. GDPR, and similar laws, give us a window of opportunity to come up with better, more empowering patterns. Because once they’re made, they’ll be copied. Just like the cookie banner, terms and conditions or card payment interfaces.
People making services need access to a whole set of patterns, a suite, to be used depending on specific contexts, specific risks, and to meet specific needs.
We need your help to make new patterns
Making new patterns needs to be a common endeavour. That’s why we’re starting the meetup – to discuss what the possibilities might be for data sharing, permissions, security and transparency.
But what we find can’t just stay in that room. We’re going to iterate the catalogue, and turn it into a set of tools for people working on products and services.
This isn’t just about the UI, it’s also about tools and code repositories – like when openpgp launched. That enabled teams to develop widespread browser cryptography tools for the first time, all because an open code repository existed. What are the code repositories we need for good quality, legible permissions to be mainstream?
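As one sketch of what such tooling might do, here’s a hypothetical machine-readable data-use manifest with a linter that flags the generalisations terms and conditions hide behind – unnamed “third parties”, indefinite retention. The `DataUse` format and `lint` function are illustrative assumptions, not a real standard or library.

```typescript
// Hypothetical sketch: a machine-readable declaration of what a service
// collects, who it's shared with, and for how long - something tooling
// could lint in CI, the way accessibility checkers lint markup.

interface DataUse {
  data: string;         // e.g. "voice recordings"
  purpose: string;      // e.g. "speech recognition"
  sharedWith: string[]; // named recipients, not vague categories
  retentionDays: number; // 0 = indefinite
}

const VAGUE_TERMS = ["partners", "affiliates", "third parties"];

// Flag declarations that hide the specifics people need to see.
function lint(manifest: DataUse[]): string[] {
  const problems: string[] = [];
  for (const use of manifest) {
    for (const recipient of use.sharedWith) {
      if (VAGUE_TERMS.includes(recipient.toLowerCase())) {
        problems.push(`"${use.data}" shared with unnamed "${recipient}"`);
      }
    }
    if (use.retentionDays <= 0) {
      problems.push(`"${use.data}" has no retention limit declared`);
    }
  }
  return problems;
}

// A Samsung-TV-style declaration fails on both counts.
const manifest: DataUse[] = [
  { data: "voice recordings", purpose: "speech recognition",
    sharedWith: ["third parties"], retentionDays: 0 },
];
console.log(lint(manifest)); // → two problems flagged
```

A shared, open format like this – whatever it actually ends up looking like – is the kind of repository that could make legible permissions a default rather than an afterthought.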
We don’t know what that’ll look like yet – it’s something we’ll start to work out after the first meeting – but we think the catalogue needs to be more than just a list. Accessibility has become a team sport, something designers and developers, policymakers and product people all have to take into account. With the right tools, we think we can do the same for trust.
Trust isn’t just about compliance – it’s about design
Security and privacy can’t be things we engage with at the end of a project. We can’t think of them as regulations we have to meet any more. We have to take a holistic approach to privacy and security, understanding what people need and building services that have a fundamental respect for users.
We need to build things that are worthy of people’s trust. We think that means making services that empower people, that ask for meaningful consent, and take digital rights seriously.
That’s what we’re here to do.