We’ve had a busy start to spring, with a couple of different client projects (under NDA for now), which have had us running workshops, carrying out user research and prototyping to work through some knotty problems.

María, George, David, Emily, Felix, Ella and I have been updating IF’s data patterns catalogue. IF’s work is pretty much in three categories: patterns, prototyping and production. Ethical guidelines are fine, but they increasingly seem to exist to make organisations look good without having to change. And less cynically: anyone who has worked in a software team knows that translating lofty notions into code and designs is…hard. Patterns are a way of giving teams the practical tools they need to design services that respect people’s privacy.

We’ve carried out research with the designers, developers and product managers who use the catalogue, to find out what works well and what’s missing. We’ve incorporated what we learned into the updates, as well as adding new patterns and more detailed analysis of each one.

We haven’t published the updates yet, but we have been testing them internally, including with a competition for the best use of the patterns.

Members of the IF team workshopping how to use the patterns catalogue for designing a healthcare app. Photo by David Marques / IF, CC-BY

Sarah’s also written the first in a series of posts about how to earn trust through design. She argues that companies need to know how to win users’ trust. It’s a moral imperative, and it’s also becoming an essential component of business success. Improving the UI of digital products is just the first step. Organisations must go further, and redesign their products in a fundamental way, so that trustworthiness is built into the technical architecture at every level.

That’s why we’ve recently built a tool to monitor data stored in Amazon Web Services. The tool is called ‘S3 Monitor’, and it uses open source software called Trillian, which produces datasets that are transparent and verifiable, meaning no one can break into the system and change or add anything without leaving footprints that alert others to their presence. Dave wrote about what we did and why, and Emily wrote up the technical details.
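Trillian is a Merkle-tree-based verifiable log, and the property we rely on is that a retroactive change to stored data breaks every later checkpoint. Here’s a minimal Python sketch of that idea using a simple hash chain. To be clear, this isn’t Trillian’s API or how S3 Monitor is built; the record format and names are made up for illustration.

```python
import hashlib

def leaf_hash(data: bytes) -> str:
    """Hash a single log entry."""
    return hashlib.sha256(b"leaf:" + data).hexdigest()

class AppendOnlyLog:
    """A toy tamper-evident log: each new head chains in the previous one,
    so editing an earlier record changes every later head."""

    def __init__(self):
        self.entries = []         # raw records, e.g. notes about S3 objects
        self.heads = ["genesis"]  # heads[i] commits to entries[0..i-1]

    def append(self, record: bytes) -> str:
        self.entries.append(record)
        new_head = hashlib.sha256(
            (self.heads[-1] + leaf_hash(record)).encode()
        ).hexdigest()
        self.heads.append(new_head)
        return new_head  # publish this value as the current checkpoint

    def verify(self) -> bool:
        """Recompute the chain from the raw entries and compare it to the stored heads."""
        head = "genesis"
        for record, expected in zip(self.entries, self.heads[1:]):
            head = hashlib.sha256((head + leaf_hash(record)).encode()).hexdigest()
            if head != expected:
                return False
        return True

# Illustrative records only
log = AppendOnlyLog()
log.append(b"PUT reports/march.csv sha256=abc123")
checkpoint = log.append(b"PUT reports/april.csv sha256=def456")

assert log.verify()
log.entries[0] = b"PUT reports/march.csv sha256=TAMPERED"  # retroactive edit
assert not log.verify()  # the published checkpoint no longer matches
```

Anyone holding an earlier checkpoint can spot the change, which is the footprint we mean: the data can still be altered, but not silently.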

Lots of people said nice things about the work too, which was great. Here are a couple of things from Twitter:

Earlier this month, Sarah was interviewed by Emma Tucker for an article on surveillance capitalism in Creative Review. Companies that are able to integrate emerging technology, transparency and accountability into their services now will stand out, and ultimately help shape the emerging ethical data market. The article is behind a paywall, but we’re excited about the opportunity to share our point of view with Creative Review’s big design audience.

Cath and I spoke at the Turing Institute about the uses and misuses of connected devices. One of the things we’ve been arguing at IF is that it’s more helpful to look at data protection through a lens of designing for safety. Sarah’s written more about this argument recently.

This month I also had the privilege of speaking at Afrotech, a brilliant tech festival by and for people of African and Caribbean heritage. I spoke with Akil, Head of Research at COMUZI, about the work we did together researching young people’s attitudes to data, privacy and AI. Here’s my favourite slide which sums up the main point we wanted people to leave with:

Looking ahead, we’re excited to kick off another project with Citizens Advice next week, and we’ll be continuing with the client projects that are under NDA.

Finally, a quick reminder that we’re hiring! We’re currently looking for six new colleagues, including a Business Analyst, a Senior Developer and a Researcher. All our jobs are available on Workable.

Edited by Ella Fitzsimmons