What gets measured gets changed

The tradeoff of privacy for societal value is a false choice. We can compute useful metrics without succumbing to abusive and illicit surveillance. This is more important now than ever before; to address systemic racism in products and services we need to know how people are using them, and we need to acquire that knowledge in a privacy-preserving way. Metrics that acknowledge the different experiences of Black people are one example of how we can help organisations move from a space where they consistently underserve Black people to one where they innovate to better meet their needs.

The public no longer have to trade privacy for societal value

At the moment, for organisations to see representation they need to collect sensitive data. In doing this they often ask the public to accept a tradeoff between digital privacy and societal value. This doesn’t have to be the case. There are ways for product teams to glean insights from data sets without being able to identify individuals; these are called privacy-preserving techniques. We’re working with Benchmark, an initiative exploring the ethical uses of location data, to show how to use privacy-preserving techniques in practice.

Johnny Miller’s photography captures inequality from above.

How privacy-preserving techniques work

There are different privacy-preserving techniques to choose from. We’re using randomised response, a method that adds noise to data so that patterns can still be seen in aggregate, but it’s impossible to tell which data points relate to which individuals. It’s a more effective way of protecting people’s privacy than methods like removing identifiers such as names or addresses, which still carry the risk that people can be re-identified. We’re also going to use synthetically generated data where possible. Given that Black people are already more surveilled than others, organisations need to put more effort into ensuring people are not further exposed.
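To make the mechanism concrete, here is a minimal sketch of randomised response in Python. It’s an illustration only, not the code we’re building with Benchmark: the function names and the 75% chance of answering truthfully are assumptions made up for this example. No single report can be taken at face value, but because we know how much noise was added, the true rate across the whole group can still be estimated.

```python
import random

def randomised_response(true_answer: bool, p_truth: float = 0.75) -> bool:
    """Report the true answer with probability p_truth, otherwise give a
    uniformly random yes/no. Any individual response is deniable."""
    if random.random() < p_truth:
        return true_answer
    return random.random() < 0.5  # coin-flip answer

def estimate_true_rate(noisy_answers: list, p_truth: float = 0.75) -> float:
    """Correct the observed 'yes' rate for the noise that was added:
    E[observed] = p_truth * true_rate + (1 - p_truth) * 0.5"""
    observed = sum(noisy_answers) / len(noisy_answers)
    return (observed - (1 - p_truth) * 0.5) / p_truth

# Hypothetical example: 10,000 people, 30% of whom would truthfully say "yes"
truths = [random.random() < 0.3 for _ in range(10_000)]
reports = [randomised_response(t) for t in truths]
print(estimate_true_rate(reports))  # close to 0.3, without trusting any single report
```

The tradeoff is tunable: a lower truth-telling probability gives individuals stronger deniability, at the cost of needing more responses to get a reliable aggregate estimate.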

We’re going to make prototypes to learn and will share our progress

Over the next couple of weeks we’ll make prototypes to show how privacy-preserving techniques could show up in services, and document our process. To make the prototypes easier to understand, we’ll ground them in a real use case and apply the techniques to actual data. But we don’t want to handle any sensitive data if we don’t need to. Arup’s City Modelling Lab has been working on code to generate synthetic city data. This gives us another opportunity to show how synthetic data can be used to develop tools without handling sensitive data about people from the start.

Synthetic transport data from Arup’s City Modelling Lab.
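To give a flavour of what developing against synthetic data can look like, here is a small, hypothetical sketch in Python. It is not the City Modelling Lab’s code: the field names and distributions are invented for this example. The point is that tools can be built and tested against records that describe no real person.

```python
import random

MODES = ["walk", "cycle", "bus", "car"]

def synthetic_trip(trip_id: int) -> dict:
    """Generate one fictional trip record. Every value is drawn from
    made-up distributions, so prototypes can be developed and tested
    before any sensitive data is touched."""
    return {
        "trip_id": trip_id,
        "mode": random.choice(MODES),
        "start_hour": random.randint(0, 23),
        "duration_minutes": round(random.lognormvariate(3.0, 0.5)),
    }

synthetic_trips = [synthetic_trip(i) for i in range(1_000)]
print(synthetic_trips[0])
```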

In the next post we’ll talk more about why it’s so important to see representation in city data, and more specifically in mobility data. Get in touch if you’re using privacy-preserving techniques or want to find out more about seeing representation in datasets. We would love to hear from you!

Lastly, here’s a selection from our reading list of references about antiracist design within technology: