Wrapping up 2018

2018: the year everyone started catching up with what IF’s been talking about for the last three years. Facebook got caught up in the Cambridge Analytica scandal, GDPR featured heavily in inboxes in May, and everyone started talking about AI and ethics.

Much of our work is under non-disclosure agreements, so I can’t share it all. Luckily, there are lots of other things we can talk about in public. Here are 10 of them.

1. Building new tools for increasing trust in digital services

We worked with DeepMind to explore how Trillian, a verifiable log, could be used to build trust in digital services, and wrote up what we learned in a blog post.

Hash

GIF showing hashing in Trillian. David Marques/IF: CC-BY.
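The core idea behind a verifiable log like Trillian is that each entry’s hash commits to everything that came before it, so any tampering with history is detectable. The sketch below illustrates that idea with a simple hash chain in Python; it is an illustration of the general technique only, not Trillian’s actual API or data structure (Trillian uses Merkle trees, which additionally allow efficient proofs).

```python
import hashlib


def entry_hash(prev_hash: str, data: bytes) -> str:
    """Hash an entry together with the previous hash, forming a tamper-evident chain."""
    return hashlib.sha256(prev_hash.encode() + data).hexdigest()


class HashChainLog:
    """Append-only log: each entry's hash commits to all earlier entries."""

    def __init__(self):
        self.entries = []          # raw entry data
        self.hashes = ["0" * 64]   # genesis hash

    def append(self, data: bytes) -> str:
        h = entry_hash(self.hashes[-1], data)
        self.entries.append(data)
        self.hashes.append(h)
        return h

    def verify(self) -> bool:
        """Recompute the chain; altering any entry breaks every later hash."""
        h = "0" * 64
        for data, expected in zip(self.entries, self.hashes[1:]):
            h = entry_hash(h, data)
            if h != expected:
                return False
        return True


log = HashChainLog()
log.append(b"consent granted: dataset A")
log.append(b"consent withdrawn: dataset A")
assert log.verify()

# Rewriting history is detectable, because later hashes no longer match
log.entries[0] = b"consent granted: datasets A and B"
assert not log.verify()
```

A service publishing the latest hash lets anyone check that past records haven’t quietly changed, which is the property that makes logs like this useful for building trust.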

2. Prototyping data and privacy in the humanitarian sector

IF started the year looking at a hugely important problem with Oxfam: consent and data minimisation for displaced people. With the rise in use of biometrics and other invasive data collection methods in this sector, building services with privacy, security and meaningful consent at the centre is more important than ever. We investigated collective consent and consent delegation in this work, shown in the prototype below. Collective decision making is something we’ll be looking at more in 2019 - more evidence that designing for the ‘furthest first’ leads to better services for everyone.

Consent Tablet

Delegating consent prototype. Oxfam/IF: CC-BY.

3. A new approach to consent in healthcare

Working with Connected Health Cities in Manchester, we showed new ways patients could give consent in order to build and maintain trust in how data is used. When and how data is asked for is important. People want time to consider their choices, and they want to be able to withdraw consent easily.

Giving Consent

Paper prototype showing how someone could give and withdraw consent. Image: IF CC-BY.

4. Making a big, bright orange exhibition explaining automated decisions

We worked with Dr Alison Powell and her team at the London School of Economics to combine academic and design research and look at how to make automated decisions understandable. Clarity about automated decisions is an area we’re interested in, so this was a great way to test how best to do that. We presented the prototypes from the project at an exhibition launch event in November. The team learnt loads from working together across sectors and we really enjoyed being able to show our work, in public, on a 12ft wall.


Understanding Automated Decisions Exhibition at LSE. IF: CC-BY.

5. Showing opportunities from GDPR and open APIs

We worked with the Open Data Institute on two reports that imagined future services built around Open APIs in the telecoms industry, and the new GDPR right to data portability.


Autoswap prototype from the [Open APIs report](https://openapis.projectsbyif.com/). ODI/IF: CC-BY.

6. Making rights more accessible

Even when people are aware of their rights around how data about them is used, the information they need to actually do anything about it is usually buried in privacy policies that few people read. Using a new, open source dataset of machine-readable privacy policies, we worked with Open Rights Group to build a tool that makes it easier for people to find out how organisations use data and make GDPR-related requests.

7. Making data ethics practical

Conversations about data ethics have increased exponentially this year. At IF, we’ve assessed companies using our data ethics principles, and made recommendations for how they can innovate with data in ethical and practical ways.


Data Ethics Posters. Image: David Marques/ IF CC-BY.

8. Looking back at privacy

This year we published our first policy report: the culmination of a research series to find inspiration for digital rights policy in historical events. From regulating railways, to digital identity in India, to the environmental movement of the 70s, we hope the research will inspire activists and advocates to continue to look to history for lessons for the future. The book is available in EPUB, Kindle Mobi and PDF formats.

9. Talking about how to build safety and accountability into learned systems

Over the year the team spoke about IF’s work to different audiences. Picking one highlight: Sarah spoke on the ‘Cutting Edge’ Stage at Cognition X about building safety and accountability into systems that use machine learning. It brings together a lot of IF’s thinking in this area. If you missed it, have a watch below.

10. And... Drivetime

Finishing this list the way we finish every week at IF: with Drivetime. For the last two hours on a Friday, we add songs to a weekly-themed Spotify playlist. Our year in review? Halloween, Girl Power and, of course, GDPR.

Bring it, 2019

I haven’t covered everything the team got up to here. We also wrote a report about securing the internet of things, researched data leaks from machine learning models and built a bot to cheer on the ICO. There’s more work we’ve just finished which isn’t public yet, but will be soon…

After the whirlwind of 2018, we’re excited to see what 2019 brings! We’ll be back on the 2nd January.

If you're celebrating, happy holidays.



If you’d like to hear more about our work, email us at hello@projectsbyif.com.

Come work at IF! Our open roles are here: https://projectsbyif.workable.com/