Over the last couple of years, stories about how Internet of Things products have undermined people’s trust have been in the news almost every week. If you’re looking for more inspiration, the classic @internetofshit account tweets a wealth of examples.
Take the brilliant investigations the Norwegian Consumer Council have carried out on children’s smartwatches and toys. They’ve shown that popular, mass-produced products aren’t clear about the data they collect. Not only that, they’ve also examined how these devices operate, and found that manufacturers haven’t taken steps to protect the data the devices collect. As a result, some of these products have been banned in Germany.
Even in flagship IoT devices, transparency and control are huge issues. When I walk into my friend’s house and find out they’ve got an Amazon Echo, there’s nothing I can do to stop it listening to the things I say. That brings up all kinds of issues around trust and security, when what I really want to be doing is chatting to my friend.
Challenges like these are fundamental to our work at IF. We’ve written about this before: trust is a design problem. Trustworthy products should be governed by strong principles about how data is used, and how that is communicated.
Addressing that means accepting that the current approach to permissions is broken. Patterns like data licences are much more proactive, and offer people a meaningful choice about how data about them is used.
It also means accepting that most devices in most places where people live and work are shared. One person shouldn’t set the terms for everyone around them. There’s never going to be a blanket permission model that everyone agrees to all the time. And yet, that’s how most products are built at the moment.
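As a loose sketch of what a per-person approach could look like (the names and structure here are hypothetical illustrations, not an IF pattern), each person in a shared space might hold their own licence for specific uses of data, and a shared device would act only on the intersection of what everyone present has agreed to:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: each person grants their own licence for specific
# purposes, rather than one account-holder setting a blanket permission.
@dataclass
class DataLicence:
    holder: str
    allowed_purposes: set = field(default_factory=set)

    def permits(self, purpose: str) -> bool:
        return purpose in self.allowed_purposes

def can_record(purpose: str, licences: list) -> bool:
    # A shared device records only if *everyone* present permits that purpose.
    return all(lic.permits(purpose) for lic in licences)

alice = DataLicence("alice", {"voice-commands"})
bob = DataLicence("bob", set())  # a guest who has granted nothing

print(can_record("voice-commands", [alice]))       # alice alone: True
print(can_record("voice-commands", [alice, bob]))  # with a guest present: False
```

The point of the sketch is the last line: the moment a guest walks in, the device’s behaviour changes, because the guest’s choices count too.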
Ultimately, it means addressing problems like these together.
Take updates. IoT devices are really poor at updating securely. On the web, shared standards and technologies make secure communication much easier; the IoT has no such common baseline, so startups keep reinventing the wheel.
I know these devices aren’t like internet browsers. They’re low-powered, with far less memory, which makes something like implementing Transport Layer Security (TLS) much harder. But that makes community-scale efforts to find ways of solving this problem even more important. It’s something we’ve been thinking about at IF, and we’ll be publishing more about it in the new year.
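Where a device does have a full operating system, the secure part of an update check is genuinely a few lines with standard libraries; the hard problem is getting something equivalent onto constrained microcontrollers. As an illustration of the former (the hostname and path are placeholders, not a real update service), using Python’s standard `ssl` module:

```python
import socket
import ssl

# Sketch: verify the update server's certificate and hostname before
# trusting anything it sends. "updates.example.com" is a placeholder.
context = ssl.create_default_context()  # checks certs against system CAs

def fetch_update_headers(host: str, path: str = "/firmware.bin") -> bytes:
    with socket.create_connection((host, 443), timeout=10) as sock:
        # Certificate and hostname checks happen during the handshake;
        # a mismatch raises ssl.SSLCertVerificationError.
        with context.wrap_socket(sock, server_hostname=host) as tls:
            request = f"HEAD {path} HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
            tls.sendall(request.encode())
            return tls.recv(4096)
```

On a microcontroller with a few hundred kilobytes of RAM, even this handshake and certificate store become a real engineering cost, which is exactly why shared solutions matter.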
Regulators are catching on to how important this is too. That’s why things like the EU’s General Data Protection Regulation are coming. People are having their rights strengthened, and for businesses that don’t respect those rights the fines will be catastrophic.
At IF we see the GDPR as a huge opportunity for businesses to demonstrate they respect people. Embracing these new rights and putting them at the heart of new services is an opportunity for designers to build innovative, empowering products. The regulation is a collection of rights that describe trust in the digital age. It has the power to change the services we use for the better. It’s also a direction of travel – this won’t be the last piece of legislation securing digital rights.
There are good examples out there. By keeping access local, IKEA’s smart lighting is more secure than many other entries into the smart home market. Hoxton Analytics have built real-time monitoring technology that obscures identity at source. Finally, Signal’s new approach to contact discovery steps towards a world where data is regularly handled in a verifiably secure way.
The products and services we use should respect our rights, now more than ever. We’re looking forward to helping more organisations do that in 2018 and beyond.
Huge thanks to Heather Corcoran at Kickstarter, Nat Hunter at Machines Room and to Alexandra Deschamps-Sonsino of the IoT London meet up for having me.