Last week I spoke over Zoom at IAM 2020, where I launched the manifesto for society-centered design. Here’s what I said.

Since the 1970s, our socioeconomic design principle has been to innovate by focussing on individual needs, in order to satisfy profit or ourselves. It’s a design principle largely shaped by patriarchy, the political ideology that has shaped much of Western society. Its characteristics of efficiency, speed, competition and profit worship have socialised the idea of individualism, which in itself holds hierarchy and exclusion.

A worker rides a shared bicycle past a huge pile of unused shared bikes in a vacant lot in Xiamen, Fujian province, China, on December 13, 2017. Image: Reuters.

Innovation through individualism

The innovation frameworks we use, like design thinking, human-centered design or jobs-to-be-done, are optimised to help teams find new concepts that will win in the marketplace. The problem is that these frameworks motivate teams to focus on user-centered design, which in practice tends to mean customer experience design. What will help this person work, buy, watch more? We’re not putting enough emphasis on the other actors who matter, or thinking about the times when people are not consumers. We’re helping deliver commercial interests.

“Sleep is our competition” – Reed Hastings, Netflix’s co-founder and CEO. Image: Ethan Miller / Getty Images.

Individualism has shaped our civic frameworks

Our data protection frameworks, like GDPR or CCPA, express our digital rights as rights of the individual. The problem is that data rarely represents one person; it usually describes multiple people. Data isn’t even valuable until it’s in aggregate. But this individualistic lens has shaped how we now design for data protection. Take how we ask for consent: our de facto model is one where consent is individually given, every time. It’s the ultimate divide-and-conquer attack on humanity.

A sea of cookie banners. Image: IF. CC BY-SA

The extractive business models

And then there are the business models, the blueprints of the market. They’re pretty limited: the predominant business model for digital products and services is to monetize data and use targeted advertising.

Mark Zuckerberg, smiling, walks through a crowd of people wearing VR headsets at Mobile World Congress 2015, Barcelona. Image: Mark Zuckerberg / Facebook.

We’re causing chaos

Over time, this growth in data collection has had a secondary impact: on the climate. Data centers account for 2% of worldwide CO2 emissions. Training a single AI model can emit as much carbon as five cars over their lifetimes. If the Internet were a country, it would be the third-biggest energy consumer in the world.
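The “five cars” comparison comes from the figures reported by Strubell et al. (2019), the study this claim traces back to. As a rough sketch (the numbers below are their estimates, quoted for illustration), the arithmetic works out like this:

```python
# Rough arithmetic behind the "five cars" comparison, using estimates from
# Strubell et al. (2019). These figures are illustrative, not our measurements.
model_training_lbs_co2 = 626_155  # one large NLP model trained with neural architecture search
car_lifetime_lbs_co2 = 126_000    # average US car over its lifetime, manufacturing plus fuel

cars_equivalent = model_training_lbs_co2 / car_lifetime_lbs_co2
print(round(cars_equivalent, 1))  # → 5.0
```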

Tanks containing coolant for servers at Google data center in Saint Ghislain, Belgium. Image: Yves Herman / Reuters.

At IF, we’ve had enough

We want to change the climate of ideas. We want to move beyond human-centered design to society-centered design. We must design for the collective. We must design for society.

Society as a sphere divided into layers. Next to the layers, a caption names the components of society: citizen empowerment, civic commons, public health, equity and the planet.

Together, these components make up society-centered design.

The principles of Society-centered design


If we put care first and at the center of our efforts, we can move away from delivering solely for individual and commercial interests. Care lets us deliver for public health and the planet through compassion and reciprocity.

IF worked with a drone insurance company to help them explain to their customers how their automated insurance policies were decided. We experimented with counterfactual explanations, at the point of use, to explain how the policy price was calculated. You can see the explanations in blue: here we’re showing what impact the weather has on the policy. What we’re doing is bringing greater transparency to the automated process, to show care for a specialist user group who want to know they’re being treated fairly.
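To make the idea concrete, here is a minimal sketch of a counterfactual explanation at the point of use. The pricing model, its weights and the wind-speed threshold are all hypothetical, invented for illustration; they are not the insurer’s actual model.

```python
# Hypothetical toy pricing model: base rate plus weather and usage loadings.
def policy_price(base: float, wind_speed_kmh: float, flights_per_month: int) -> float:
    weather_loading = 0.5 * max(0.0, wind_speed_kmh - 20.0)  # windy days cost more
    usage_loading = 1.2 * flights_per_month
    return base + weather_loading + usage_loading

def explain_weather(base: float, wind: float, flights: int) -> str:
    """Counterfactual explanation: what would the price be if the weather were calm?"""
    actual = policy_price(base, wind, flights)
    calm = policy_price(base, 20.0, flights)  # counterfactual world: wind at the threshold
    return (f"Your price is £{actual:.2f}. If wind speed were 20 km/h or less, "
            f"it would be £{calm:.2f}: today's weather adds £{actual - calm:.2f}.")

print(explain_weather(base=30.0, wind=36.0, flights=10))
```

The point of the counterfactual form is that the customer sees not just the price, but which factor moved it and by how much.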
A couple of years ago we were approached by Oxfam to work on consent and feedback mechanisms for displaced people in humanitarian crises. From our research with Programme Managers in a camp in Gaza, we learnt how manual the feedback process was, and how open it was to corruption and bribery. The feedback process also left vulnerable people exposed. So we made a series of prototypes to show how data minimisation techniques, and tools like speech and image recognition, could help Programme Managers report accurately on the state of the camp to Oxfam faster, and in a way that protected people’s privacy. We showed how a digital service could provide greater care, not only to the Programme Managers but also to the people and families giving feedback.
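One data minimisation technique such a service could use (a hedged sketch, not the actual Oxfam prototypes) is to report only aggregate counts, and to suppress any category with too few reports to protect the people behind it:

```python
# Illustrative data minimisation: aggregate raw feedback into topic counts,
# and drop any topic reported fewer than k times so no individual can be
# singled out. Thresholds and topic names here are invented for the example.
from collections import Counter

def minimised_report(feedback_topics: list[str], k: int = 5) -> dict[str, int]:
    counts = Counter(feedback_topics)
    return {topic: n for topic, n in counts.items() if n >= k}

raw = ["water"] * 9 + ["shelter"] * 6 + ["medical"] * 2  # hypothetical feedback
print(minimised_report(raw))  # → {'water': 9, 'shelter': 6}; 'medical' is suppressed
```

The camp-level report still tells Oxfam what needs attention, while the raw, identifying records never need to leave the camp.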


As more of our lives are connected, we need to create systems that earn trust with people. Products, services, and standards that can be open, resilient and promote citizen empowerment.

Last year we worked with Google AI on a project to find out how product teams could think about trust beyond individual interactions on a screen. We looked at how trust is created in industries like aviation, IVF and farming. We synthesised the characteristics of a trusted system, and how those characteristics could be applied to technologies like AI. These characteristics were designed to give teams a shared language about trusted systems. The key insight from the work is that it’s easy to overburden people with explanations, when really people understand what to trust from a whole set of different systems, like culture, and from how the people around them act. We need to build the structures, approaches and, above all, the actors that will provide ways of building and preserving trust in AI.


Empowering collective agency starts with radical inclusion of the most vulnerable. We should be creating a new civic commons by making economic opportunity for the many.


We can create new resources and standards that favor the civic commons and public health over commercial value and the success of the few.


Design is a political act and it’s our responsibility to design for people’s rights. Privacy is not a luxury for those that can afford it. Privacy is a human right. We must create systems that remove the imbalance of power and instead promote equity and citizen empowerment.


Without fairness and justice we cannot have equity. Too often, “the commons” is shorthand for “the majority”. So we need to place mechanisms for fair and just oversight inside our design systems, so society can hold the powerful to account.


The web is the greatest single distribution platform ever created. As a result, it has an outsized impact on everything, including causing harm to the most vulnerable and excluded. Design must seek to redistribute that power for citizen empowerment and equity.


AI and automation are rapidly changing the world. But currently they’re focused on commercial goals rather than societal needs. We have an opportunity to reshape AI and automation so they create equity and reinforce the civic commons. People must be in control, always.


The climate emergency is an existential threat to humanity. We need to shift narratives and focus away from abundance and scarcity to regeneration. We need sustainable and regenerative design and business models for society, for public health, and for the planet.

Last year we worked with a medical insurance company in the US. They asked us to create an audit programme for their AI technologies, and a plan for practically running a trial audit. One thing we wanted to highlight was the need for continuous assessment of an AI’s performance and impact. It’s not enough to have spot audits, because AI models change regularly. Continuous audits will need new interfaces with which teams can monitor AI performance, not only for ethics but for carbon too. It’s like driving a car: you can see how much petrol is being used, and how your decisions burn more or less fuel. We need the same for AI technologies.
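A continuous-audit check of that kind could be sketched like this. Everything here is an illustrative assumption (the metrics, the thresholds, the `AuditSnapshot` shape); it is not the audit programme we built, just the pattern of comparing a live model against its last audited baseline on both performance and energy:

```python
# Hypothetical continuous-audit check: alongside accuracy, track energy per
# prediction, so drift in either triggers a human review. All names and
# thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class AuditSnapshot:
    accuracy: float           # e.g. measured on a rolling holdout set
    wh_per_prediction: float  # measured energy cost per inference, in watt-hours

def needs_review(baseline: AuditSnapshot, current: AuditSnapshot,
                 max_accuracy_drop: float = 0.02,
                 max_energy_growth: float = 0.25) -> list[str]:
    """Compare the live model against its last audited baseline."""
    flags = []
    if baseline.accuracy - current.accuracy > max_accuracy_drop:
        flags.append("accuracy drifted below audited baseline")
    if current.wh_per_prediction > baseline.wh_per_prediction * (1 + max_energy_growth):
        flags.append("energy per prediction grew past budget")
    return flags

baseline = AuditSnapshot(accuracy=0.91, wh_per_prediction=0.8)
current = AuditSnapshot(accuracy=0.88, wh_per_prediction=1.2)
print(needs_review(baseline, current))  # both flags raised
```

Run continuously rather than at spot-audit intervals, a check like this is the fuel gauge: the team sees the cost of the model’s behaviour as it changes, not months later.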


The issues we face are intertwined, complex and ever-shifting. We live in radical times. And radical times require radical solutions. Doing nothing only favors a deadly status quo: we must act boldly and defiantly.

We stand on the shoulders of giants

The manifesto was not created without inspiration.

Cedalion standing on the shoulders of Orion. Blind Orion Searching for the Rising Sun (1658) by Nicolas Poussin

Here’s a selection of our references:

Co-sign the manifesto

These principles show what society-centered design must achieve, but we are not there yet. We are at the start of the journey. Join us by sharing this manifesto, and email us to co-sign it.

Thanks to Dan Harvey, who we’ve had the pleasure of working with and who co-authored this manifesto. And to David, for his incredible graphics.