When you’re showing people how to improve their security, demonstrate what to do and why.

There’s a thin line between being frightening and being memorable when talking about security. Rather than relying on scare stories, it’s better to demonstrate how someone’s security can be breached and then explain what they can do about it.


You can see the data that leaks from apps on your mobile phone

Open source tools can expose the contents of data packets sent from Android apps to external services, even when using HTTPS. An app by the Chinese Internet company Baidu was shown to leak timestamped data about a user’s surrounding WiFi networks. The specifics of this are written up on the IFF Wiki.
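As a hedged illustration of what such an audit can surface, the sketch below scans a decoded JSON payload for sensitive field names. The payload and key names here are invented; the real Baidu traffic (documented on the IFF Wiki) differed, and capturing it in the first place requires an intercepting proxy such as mitmproxy with its CA certificate installed on the device.

```python
import json

# Hypothetical payload shaped like the leak described above: a
# timestamped scan of nearby WiFi networks. Field names are invented
# for illustration only.
payload = json.dumps({
    "timestamp": 1456789000,
    "wifi": [{"ssid": "CoffeeShop", "bssid": "aa:bb:cc:dd:ee:ff", "rssi": -41}],
})

SENSITIVE_KEYS = {"ssid", "bssid", "imei", "mac", "lat", "lon"}

def leaked_fields(raw: str) -> set:
    """Return the sensitive key names present anywhere in a decoded payload."""
    found = set()

    def walk(node):
        if isinstance(node, dict):
            for key, value in node.items():
                if key.lower() in SENSITIVE_KEYS:
                    found.add(key.lower())
                walk(value)
        elif isinstance(node, list):
            for item in node:
                walk(item)

    walk(json.loads(raw))
    return found

print(sorted(leaked_fields(payload)))  # ['bssid', 'ssid']
```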

The internet is still a medium of consumption for most of the world

In Africa, the majority of people access the Internet via mobile phones, which encourages a culture of content consumption rather than production. Zero-rated mobile Internet services have a stronghold across much of the continent, partly driven by a lack of interest in online activities beyond social media.

Improving digital literacy could help people understand what the web is beyond the narrow view offered by large Internet companies, and why it could be relevant to them. New teaching methods are needed to introduce the web as a tool that people can actively participate in, rather than just consume.

This is a small part of the larger challenges of connectivity in Africa and more work needs to be done to improve Internet infrastructure on the continent.

What feels secure to one culture can be extremely dangerous for others

The threat to LGBT communities in authoritarian countries means that privacy and security improvements in dating apps are becoming increasingly urgent. Some apps warn users about the dangers of using dating services in hostile places, for example with alerts when the app is opened, but more work needs to be done to make these apps secure.

  • Some apps obscure location reporting (e.g. “user is 10 miles away”), but only in the interface. When the data payload for the profile is sniffed, a user’s location can be found very easily.
  • Security practices like using burner phones have been shown to be ineffective, particularly in Bangladesh, where biometric data is registered against SIM cards.
  • Services like Truecaller can expose the identities of malicious callers, but the app simultaneously scrapes its users’ contact lists to mine data.
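The first point above can be sketched concretely: if an app reports only distances, an attacker who queries from three spoofed positions can recover the exact location by trilateration. The anchor positions and target below are invented for illustration.

```python
import math

def trilaterate(p1, r1, p2, r2, p3, r3):
    """Recover a 2D position from three (anchor, distance) pairs."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Linearise by subtracting the first circle equation from the others.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 - x1**2 + x3**2 - y1**2 + y3**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# A hypothetical target at (3.0, 4.0); the "app" only ever reports distances.
target = (3.0, 4.0)
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]  # three spoofed query positions
dists = [math.dist(a, target) for a in anchors]

x, y = trilaterate(anchors[0], dists[0], anchors[1], dists[1], anchors[2], dists[2])
print(round(x, 6), round(y, 6))  # recovers 3.0 4.0
```

Real attacks are noisier (distances are rounded or snapped to a grid), but repeated queries average the noise away, which is why obscuring the distance only in the interface offers no protection.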

Libraries are important information commons for the Internet age

Libraries are a brilliant place to talk about privacy and security because they are trusted institutions and reach a wide audience.

The Library Freedom Project works with the ACLU in the United States to help librarians incorporate privacy tools like Tor into their libraries. It gives librarians the support they need to know their rights when faced with information requests from law enforcement, and provides them with tools to engage patrons in privacy, using the library as a platform for civic rights.

It would be great to see something like this in the UK, and it would be interesting to see the dynamic between libraries that campaign in this area and the party-political councils that fund and run them.

The public should have a greater role in defining what makes a city “smart”

The panel argued that the “smart city” is a vague marketing term, used in the Global South to increase control and support gentrification. It assessed how people’s trust in technology and our obsession with big data have led to the proliferation of smart city concepts. Companies in this industry are known to exploit the stereotype of the “chaotic” Latin American city, promising to bring order to it through their technology.

The most worrying effect is the growth of police presence within disadvantaged communities, driven by historical crime data being used as an unaccountable predictor of future crime. To remedy this and other problems, the panel pushed for more public consent and participation in the development of city-scale technology, and for stronger legal frameworks for privacy.
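The feedback loop described above can be shown with a toy simulation (all numbers invented): a predictor that allocates patrols in proportion to recorded crime never corrects an initial recording bias, even when the underlying crime rates are identical.

```python
# Toy model: two districts with the SAME true crime rate, but district A
# starts with more *recorded* crime because it was historically over-policed.
true_rate = 0.3          # identical underlying rate in both districts
recorded = [30.0, 10.0]  # biased historical record: district A vs district B

for year in range(10):
    total = sum(recorded)
    # The "predictor" allocates 100 patrols in proportion to recorded crime...
    patrols = [100 * r / total for r in recorded]
    # ...and more patrols mean more of the same crime gets observed and recorded.
    for d in range(2):
        recorded[d] += patrols[d] * true_rate

share_a = recorded[0] / sum(recorded)
print(f"district A's patrol share after 10 years: {share_a:.2f}")  # stays at 0.75
```

Because both districts’ records grow by the same multiplicative factor each year, the initial 3:1 bias is preserved indefinitely: the model keeps sending 75% of patrols to district A with no mechanism to discover that the true rates are equal.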

We should talk more about the moral responsibility of developers towards security and privacy

A good anecdote from the beginning of this session was the tradition of engineering graduates in Canada receiving a steel ring, said to be made from a collapsed bridge, as a reminder to be ethical in their professional practice.

In digital, what is the equivalent of a steel ring? Where does responsibility for secure and private software lie? Should the moral responsibility of developers be legally enforced?

The session recalled Ralph Nader’s car safety advocacy in the 1960s, where public pressure led to the U.S. Government making seat belts compulsory. At the time, people didn’t ask for seat belts; they just wanted safe cars. Today, people don’t ask for encryption; they just want privacy.

Designing participatory algorithms is a difficult problem

Data is not neutral. A number of sociopolitical decisions have already been made by the time we collect data. Similarly, algorithms don’t just exist; they are purposefully designed and often reflect political or organisational principles.

However, services rarely expose their algorithms to their users, and this has a big impact. We rarely understand how a service or product works: we don’t know what data is collected, why it’s collected, or what that data reveals about us. We invent myths to explain how things work, so many of us are mistrustful of products and services that use algorithms.

Is there another way of designing with algorithms? What if the services we use surfaced their algorithms and made them participatory, inviting user involvement in a way that maintains user agency? Would participatory algorithms make us feel more empowered?
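As a thought experiment, a “participatory” algorithm might look something like the toy ranker below: the scoring weights are exposed to the user rather than hidden, each result carries its per-factor contributions so the user can see why it ranked where it did, and the user can re-rank with their own weights. All names and numbers are invented for illustration.

```python
# Weights are part of the visible, user-editable interface, not a server secret.
DEFAULT_WEIGHTS = {"recency": 0.5, "friends": 0.3, "popularity": 0.2}

def rank(posts, weights=DEFAULT_WEIGHTS):
    """Rank posts by weighted score, returning (id, score, contributions)."""
    ranked = []
    for post in posts:
        # Per-factor contributions let the user see *why* a post ranked highly.
        contrib = {k: weights[k] * post[k] for k in weights}
        ranked.append((post["id"], sum(contrib.values()), contrib))
    return sorted(ranked, key=lambda item: item[1], reverse=True)

posts = [
    {"id": "a", "recency": 0.9, "friends": 0.1, "popularity": 0.8},
    {"id": "b", "recency": 0.2, "friends": 0.9, "popularity": 0.4},
]

# The default weights favour post "a"...
print(rank(posts)[0][0])  # a
# ...but a user who cares most about friends' posts can say so.
print(rank(posts, {"recency": 0.1, "friends": 0.8, "popularity": 0.1})[0][0])  # b
```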

A couple of good resources mentioned in this session: