In the past few weeks, Amazon announced a one-year moratorium on its facial recognition (FR) technology, IBM wrote a letter to Congress stating it ‘no longer offers general purpose IBM facial recognition or analysis software’, and Microsoft promised not to sell facial recognition tools to police departments until there is a national law in place to govern the technology. In addition, the Association for Computing Machinery (ACM) – the world’s largest scientific and educational computing society – has called for an immediate suspension of FR technology for all “current and future private and governmental use.”

Tough month for FR tech.

However, for over two years facial recognition technology has been known to misidentify people who aren’t men and aren’t white at error rates of up to 30%. While this moratorium has been issued to “allow time for responsible regulation to be put in place”, the failure isn’t only one of regulation or policy – it’s a failure of the technology’s creators.

Many have argued that facial recognition technology must be redesigned to be human-centered. We have said before that anti-racist design is an active pursuit – one that today requires proactive interventions at every stage of the human-centered design (HCD) process. In this series, we use FR technology as a catalyst to explore the shortcomings of HCD as an anti-racist design framework, and offer alternatives in its place.

HCD: A short history

For the last thirty years, tech innovation has been underpinned by design thinking – a cultish methodology that often goes hand in hand with its non-identical twin, human-centered design. Popularized by global innovation studio IDEO, HCD is used across sectors to problem-solve anything and everything – from shopping carts to AI.

“[HCD] aligns what your users and your team members want (desirability) with what is technically feasible and financially viable.”

[Figure: the desirability–feasibility–viability Venn diagram]

This focused approach can work for very specific products, but when applied to a dynamic material like data – one that is social and overlapping in nature – HCD magnifies the biases of our social interactions. It takes all our -isms – racism, sexism, classism, homophobia, xenophobia – and encodes them into our technology (see Airbnb, Facebook, FR technology).

Rethinking HCD for technological innovation

As such, HCD has been no stranger to industry critique. Over the next few weeks we’ll examine HCD and identify key interventions that could result in more equitable outcomes in the tech products it has created. Inspired by

  • ARTICLE 1: Acknowledging Social Biases. In HCD, we research the problem, then design its solution. But when we design with data, our research needs to acknowledge data’s social nature as a material. In this article, we identify key moments in the HCD process, and research questions that help us design and build technology for a systemically discriminatory world. We challenge designers and technologists to hold ourselves to a higher standard.

  • ARTICLE 2: The problem with user needs. Our relationship to the user needs to change. We need to be honest with ourselves about when flattening personas and lean testing for efficiency go beyond agile time planning and enter the territory of erasure. By building a more realistic picture of who is affected by our products and how they influence all stakeholders, we can create products that are future-proof and more resilient to unforeseen externalities.

  • ARTICLE 3: Redefine success. In practice, our metrics of success are one-dimensional, generic and rarely critically synthesised: conversion rates, net promoter score, time spent on site, task success. Design principles are translated into growth language, and that growth language fuels our shared definition of success. By thinking about product success more holistically, we can create better products that centre more ethical principles like public value and regeneration.

  • ARTICLE 4: How to use this in your work. Finally, some reflections on how to put these ideas and principles into practice.


Calling for legal reform and better government policy around facial recognition is sensible and necessary. However, the responsibility for creating fairer outcomes lies equally with design and product teams. Technology companies don’t need to wait for governments to enforce change; they are in a position to create it from within. We hope this series can serve as a catalyst and inspiration for a much-needed rethink of design thinking.