
title: Be Kind, Design - Nat Dudley
url: https://medium.com/@natdudley/be-kind-design-d28324b7c348
hash_url: f613e8e2b8

When you walk down the traffic-clogged streets of New York City, often considered to be one of the most liveable cities in the world, you might notice something about the cyclists. About 80% of them are men. But if you wander down to the banks of the Hudson River, where there is a protected cycleway, something very different happens. There are women. There are children. There are elderly people. In fact, over half of the cyclists using these spaces are not adult men.

Uptake of cycling is considered to be one of the signs of a liveable urban environment by almost everyone who rates and measures such things. But who is cycling is just as important as, if not more important than, sheer numbers. You see, women and children will only start to cycle when the perceived risk to their safety and wellbeing decreases. And as it turns out, that’s something you have to design for.

Here in Wellington, we’re currently discussing locations of cycleway paths. Some people have suggested that the cycleway go through the tree-filled town belt. At first glance, it seems the perfect solution. It would be beautiful and peaceful, there’d be less risk of cars harming cyclists, and of course, all the motorists would be happy that their precious roads and parking weren’t affected. But the town belt has regulations, including no lighting. A cycle path with no lighting, going through tree-filled areas in the Wellington winter, is not a cycle path that women, children, and elderly people will use, and thus, if you care about these people, you realise immediately this isn’t an option.

Gil Peñalosa calls urban design like this 8–80 cities. He believes that if everything we do in our cities is great for an 8 year old and an 80 year old, then it will be great for all people. And so he advocates for a design approach where you prioritise the needs of those people and measure their usage of the things you design and build. The 8 year old and the 80 year old, and other marginalised groups in our society like women, people with disabilities, queer people, and people of colour are considered to be indicator species who help us understand what is working in our cities and what is not. Instead of considering them edge cases, and designing for the majority of car-using urban dwellers, we design for them first.

Peñalosa talks about needing a sense of urgency when we address urban design because the number of people growing up, living, and retiring in cities is exploding. Every time we build things that don’t centre the needs of the 8 year old and 80 year old, we’re creating a world that systemically excludes and harms hundreds of thousands of people.

When I hear Peñalosa speak about designing for cities this way, I can’t help but think about the web and the things that we build. We talk loftily about disruption and changing the world, but in reality, we design and build tools and products that cater to the needs and risks of a smaller subset of people, and we exist in a system that drives us towards this mode of thinking. We build MVPs that are delivered via sprints and story points and personas, and in doing so, we average out people, blurring their corners, and relegating the needs of the 8 year old and the 80 year old to that worst of terms: edge cases. But people are not averages, and when we deny the reality of people’s existence, we make the web a hostile and unsafe place.

The result is an Internet where location details are unwittingly leaked.

One where women are banned from Facebook for saying ‘Men are scum’ after months of harassment, but where the harassers are not banned.

One where the dating profiles of queer people are used by oppressive governments to track them down and persecute them.

And one where even the most ubiquitous parts of our web experience still tell people who aren’t cis or western that this isn’t really made for them.

We need to design an 8–80 web. A web where our indicators for success are 8 year olds and 80 year olds and women and queer people and Māori people and people of colour.

We need to start thinking about inclusivity the same way as we think about mobile design. We realised with mobile that we have to put that experience at the centre of what we do; otherwise it won’t be successful and we’ll fail mobile users. We realised we have to design mobile-first.

The same is true for inclusivity. Instead of it being an afterthought if it’s thought about at all, it needs to be our first thought. It needs to be central to our strategy, embedded in how we analyse and solve the problems we encounter. Designing inclusive-first is the only way we’ll manage to serve and protect all of the people who use what we build.

So, what does it mean to actually be inclusive in our design?

Inclusive design is more than just making sure your stock photos and illustrations include more than just straight white men. You should absolutely do that, because representation and seeing yourself in the products you use matter, but you also need to make sure you’re addressing the deeper issues.

Inclusive-first design is design that:

  • Enables equivalent access for everyone
  • Enables equivalent experience for everyone
  • Is safe for everyone

And it does it as a priority, not an afterthought.

And it needs to be equivalent, not equal. You need to ensure that you’re providing everyone with the same outcomes, which means you need to understand the barriers they face.

I’m not going to go into the access part today; it’s vitally important, but the excellent Laura is covering that tomorrow. But it’s worth noting that access also includes performance. You can’t have equivalent access if your pages are so heavy that it costs someone in the developing world, or on a prepaid data plan in NZ, far too much to access them.

But we’re going to look at some examples of the other two points. They tend to be more subtle and may not be things you notice because they don’t impact you personally.

An equivalent experience means that nobody who uses your product is made to feel less welcome or included, and where everyone who uses it has the same opportunity for success. This can cover everything from your form design to predictive algorithms based on biased datasets.

Safety for everyone means that your product or service doesn’t put people, and especially marginalised people, in a position of additional risk because of the way you handle their privacy and data. This means you need to think about people who could be put at serious risk if information like their location, name, gender, sexuality, or marital status leaked.

A lot of the most important parts of this boil down to how we collect, use, and retain data about and for our customers.

Data Collection

When we collect data, especially data about identity, we have a tendency to simplify reality into something easy for a computer to understand. But identities are complex, and we often erase the reality of people’s lived experiences, making them feel unwelcome in the systems we create.

You might not think it’s a big deal. It’s just an abstraction, right? Surely they’ll understand? Well, micro-aggressions are small, often repeated instances of discrimination or harm, especially towards marginalised people.

You need to care about these in your designs because cumulatively, they take an emotional and cognitive toll on your marginalised customers. They impact the sense of belonging for people encountering your designs and can evoke past traumas.

2005 research from MacDonald and Leary on social exclusion observed that

“When individuals are accepted, welcomed, or included it leads them to feel positive emotions such as happiness, elation, calm, and satisfaction. However, when individuals are rejected or excluded, they feel strong negative emotions such as anxiety, jealousy, depression, and grief, and even physical pain.”

MacDonald, G., & Leary, M. R. (2005). Why does social exclusion hurt? The relationship between social and physical pain. Psychological Bulletin, 131, 202–223. doi:10.1037/0033-2909.131.2.202

Micro-aggressions are by their nature hard to spot if they don’t impact you personally. And they are everywhere in our designs, but it’s important to remember that just because it’s the norm doesn’t mean it’s right. We can apply critical analysis and do better.

So, let’s look at a couple of examples of data we routinely collect and routinely cause harm with: Name and gender.

Names

How hard can names really be, right? Well, we could ask Su Yin.

Our name is a cornerstone of our identity, so handling names with care and respect is super important. When someone’s name is not accepted or mishandled by our systems, it sends a message they aren’t welcome.

Decolonise name collection

We need to decolonise name collection. Much of the way we think about and approach names on the web is based on a western-centric view: that someone will have a one-word first name, maybe a middle name, and a surname.

That model doesn’t hold up when we want to include everyone. Like Su Yin, many people have two first names, and putting one into the middle name field isn’t going to cut it.

But it’s more than that. Mononyms, or single names, are common in parts of India, Indonesia, Tibet, Mongolia, Afghanistan, and Hollywood.

Building forms and database models this way means people either can’t sign up to your services, or have to kludge their name into your form fields, resulting in unexpected errors like Su Yin saw.

We also often make the assumption we can address people by whatever they put into the first name field. That’s based on Western name order. In Eastern name order, family name comes first, and whilst most people have become somewhat used to us forcing them into our patterns, it’s another example of colonisation. It also means we get mixed datasets where some people do it one way, and some do it the other way, which is a problem when we make assumptions about what we can do with the information we find in that field.

Not only that, in some cultures it is considered extremely impolite to address someone you don’t know by their given name.

So, what’s the alternative?

Just put one field. One single, unbounded field. No length restrictions. No restriction to alphabetic characters. Unless you have a compelling reason like compliance or horrific legacy systems integrations, enough with the multiple name fields already.

Then, if you need to address them directly anywhere in your app, whether that’s on screen or when you email them, ask them how you should address them.
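To make that concrete, here’s a minimal sketch of what the data model and validation can look like. The field and function names are my own illustrative assumptions, not something prescribed:

```typescript
// A single, unbounded name field plus a separate "how should we
// address you?" answer. Field names here are illustrative assumptions.
interface Person {
  // The name exactly as the person typed it: any script, any length,
  // one word or five, with whatever punctuation they use.
  fullName: string;
  // How they have asked to be addressed on screen and in email.
  // Collected by asking directly, never derived by splitting fullName.
  preferredName: string;
}

// About the only validation a name field needs: it isn't empty.
// No alphabetic-only checks, no length caps, no "must contain a space".
function validateName(input: string): string | null {
  const trimmed = input.trim();
  return trimmed.length > 0 ? trimmed : null;
}
```

The key design choice is that preferredName is asked for rather than computed: splitting fullName on whitespace would smuggle the Western name-order assumption straight back in.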

But non-western folks aren’t the only people impacted by our poor name design decisions.

Think about non-binary people and women’s needs.

People who are non-binary or trans may have a name they were given at birth that is not the name they use, and it can cause great distress to have to provide that or be identified using that name. It’s called deadnaming, and it’s tremendously offensive because it denies the reality of their identity. This can contribute to gender dysphoria, depression, and anxiety, which is a very real and significant problem given trans people, especially trans youth, face far higher rates of mental illness and suicidality.

The Trans Pathways study released in 2017 in Australia showed young trans people are ten times more likely to experience serious depression and anxiety than other young Australians. They also found that one in every two gender-diverse young people they heard from has attempted to end their life. Again, you may think that your name field isn’t that big a deal. But that’s not how micro-aggressions work. This stuff is cumulative.

So what does this mean for our name design? It means we need to be very, very careful when asking for a legal name. We should only ask for a legal name if we have a legitimate reason to do so. Things like government sites, banking, and airlines fall into this category. If we’re asking for a legal name, we should tell customers why we need it. If we don’t have a compelling reason that will benefit our customer, then we shouldn’t be collecting it.

We also need to ensure that it’s as painless as possible to update names we hold. If we didn’t verify someone’s photo ID when they started their account, we probably don’t need to view their name change documentation, for example. We also need to make sure that any updates are immediately propagated to all other systems — cards, ticketing, invoicing, emails, support systems.
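One way to keep that promise is to make the fan-out explicit in code, so a single update notifies every downstream system at once rather than waiting on batch syncs. A rough sketch; the event shape and the list of systems are assumptions for illustration:

```typescript
// A name-change event that fans out to every system holding a copy.
interface NameChanged {
  personId: string;
  fullName: string;
  preferredName: string;
}

type Subscriber = (event: NameChanged) => Promise<void>;

// Cards, ticketing, invoicing, email, and support would each register
// a subscriber, so no stale copy of the old name lingers anywhere.
const subscribers: Subscriber[] = [];

function onNameChanged(subscriber: Subscriber): void {
  subscribers.push(subscriber);
}

async function updateName(event: NameChanged): Promise<void> {
  // Persisting to the system of record is elided; this sketch only
  // shows the immediate notification of everything downstream.
  await Promise.all(subscribers.map((notify) => notify(event)));
}
```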

As an added benefit, you’ll make life a lot happier for the other large group of people who often change their name — people getting married.

Turns out collecting people’s names is a bit harder than you think when you care about making sure everyone is included and you’re not causing harm. So, you could just not. If you don’t actually need someone’s name, consider just collecting a username. For anything other than legal or compliance requirements, a consistent pseudonymous identity will almost always suffice.

Gender

And then there’s gender and sexuality. Like name, the way we collect and enforce our rigid ideas of gender and sexuality on people who use our sites is often hugely harmful to trans, non-binary, and queer people.

Almost everywhere, we enforce a cis-hetero-normative view of gender and sexuality.

In case you’re not up with the play, it’s widely acknowledged that gender is not a binary state or even a continuum. So, it’s not OK for us to design and collect data that reinforces the outdated notion that it is. This means no more forms that collect ‘female’ and ‘male’ which, by the way, aren’t genders anyway.

But Nat, I can hear you think, we need that information for marketing and reporting and feeding our sweet, sweet, machine learning algorithms.

Just say no. Aside from anything else, it’s plain lazy to fall back on the assumption that somebody’s gender tells you what you can sell to them. From technology to tampons, it is not a safe assumption that if someone ticks male or female, they want to know or care about a particular subset of products.

The best advice I can give you when it comes to collecting gender information is to not do it. Most of the time when you think you need it, you really don’t. But like anything else, there are some exceptions.

If you’re designing a product where it’s important for your customers to be able to share and communicate their identity, then use a freeform field and let them define it however they want. This also applies to surveys you might conduct to establish the demographics of your customers, with one huge difference — if you need demographics on your customers, conduct a survey, but don’t keep the individual results. You don’t need to retain this data, and storing it creates additional risk.
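If you do run that survey, one way to honour the ‘don’t keep the individual results’ rule is to aggregate at ingestion time and never write the raw answers to durable storage. A minimal sketch, with my own assumptions about the data shapes:

```typescript
// Turn freeform gender answers into counts, then keep only the counts.
function aggregateGenderResponses(responses: string[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const raw of responses) {
    const answer = raw.trim();
    if (answer.length === 0) continue; // the question was optional
    counts.set(answer, (counts.get(answer) ?? 0) + 1);
  }
  // Persist only `counts`; the individual responses are discarded, so a
  // breach can't pair anyone's gender with their identity. Note that very
  // small counts can still identify people, so consider suppressing them.
  return counts;
}
```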

There are some products and services where you may legitimately need to ask a more restrictive question. If you’re designing a medical product, for example, you may legitimately need to know someone’s sex assigned at birth as well as their actual gender. A great example of this is passports, where the issuing authority’s interest is in making sure they’ve correctly identified a person entering the country. So, passports in NZ have an ‘X’ gender determination, which indicates to border control that the passport holder’s gender presentation may be unconventional.

If your product or service is like this, communicate why you need this information and what you’re planning to use it for. This means you need to understand why as well.

Queer people will understand if you have system limitations. We’re used to it. But don’t present it as the default with no explanation. If you feel bad or stupid saying you can only collect gender as man and woman because of your system limitations, then consider sitting with that discomfort and maybe trying to fix the system instead. Consider also providing a non-mandatory free-text field where people can tell you their actual gender.

What we collect, and what we do with it.

So, collecting personal data is fraught with peril. Turns out trying to encode the complexity of humanity into something computers can understand is non-trivial. But there’s a deeper question about data collection when it comes to inclusive-first design, and that’s about how we use, retain, and share data about the people we serve.

At the moment, our industry treats personal data something like this:

We tend to collect as much as we can, and just store it away in case we need it later. Often it’s not data we actually need. But we collect it anyway, because data has become currency in our industry.

We’ve tended to be a bit cavalier about the risks this approach is creating. We’ve aggregated it and shared or on-sold it. We’ve used it to create predictive algorithms. And we’ve pretended we can store it, or anything else, securely.

That approach has enabled us to profit handsomely, but of course, it’s also put our customers, and particularly our marginalised customers, at greater risk.

Storing name and gender data, especially historical data or ‘real name’ data, about trans people can put them at huge risk if that data leaks. Storing identifying information for activists working under pseudonyms can put their safety at risk if that leaks. Storing information that can point to the location of people who escaped from or are experiencing domestic violence can put their safety at risk if it leaks. You cannot design an inclusive-first system if you’re not thinking about the threat models of your most vulnerable users.

The baseline is that if you have this stuff in your database and you leak it and someone gets hurt, the cops are going to come talk to you. So, don’t store anything you’re not 100% willing to go to jail for breaching. Make friends with your security team. Talk to them and understand the risks you’re taking.

But it gets worse. It’s not just about stuff that obviously identifies them. Our quaint notions of personally identifiable information and safe aggregation of data are almost irrelevant in a world where machine learning means seemingly innocuous data points can be used to identify and locate individuals, and where deaggregation of aggregate data is increasingly possible.

The time of reckoning for our approach to data collection and management is coming.

It’s entirely likely that at some point soon, we’re going to see legislation around this, but if you care about inclusivity and your customers, I’d like to encourage you to take a different approach right now.

  1. Only collect data you immediately need to provide a direct benefit to your customer.
  2. Only store that data for as long as you need it to provide those benefits (there’s a sketch of what this can look like after this list).
  3. Stop publicly releasing or selling aggregate data. It’s not safe. You don’t know what other data it can be combined with and what that will uncover.
  4. Stop pretending that doing any of this is OK because of a line in your terms and conditions. What we are doing has the potential to harm people. Just ask Strava.
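To make the first two points concrete, you can state retention in code instead of leaving it implicit in whatever accumulates in your database. This is only a sketch; the field names, purposes, and periods are invented for illustration:

```typescript
// Every piece of personal data must name the customer benefit it serves
// and how long that benefit actually requires keeping it.
interface RetentionRule {
  field: string;      // which piece of personal data
  purpose: string;    // the direct benefit to the customer
  retainDays: number; // how long the benefit requires; Infinity = account lifetime
}

const retentionPolicy: RetentionRule[] = [
  { field: "deliveryAddress", purpose: "ship the current order", retainDays: 30 },
  { field: "preferredName", purpose: "address the customer", retainDays: Infinity },
];

// Anything without a rule has no stated benefit, so it never gets collected.
function mayCollect(field: string): boolean {
  return retentionPolicy.some((rule) => rule.field === field);
}

// Used by a scheduled purge job: data older than its rule allows is deleted.
function isExpired(rule: RetentionRule, ageDays: number): boolean {
  return ageDays > rule.retainDays;
}
```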

But y tho?

You might be asking yourselves why we’re the ones who have to care about this. After all, everyone else is treating their customers poorly, so why should we be different?

It’s a matter of scale. Like Peñalosa’s urgency for good urban design in cities, we need to care because our work has reach. The work we do is part of every industry on the planet. We are defining or redefining the interaction models for every part of society, and we’re doing it at a scale we’ve never experienced before. Changes we make can affect millions of people in seconds without their knowledge or consent. Decisions we make can reinforce existing power structures and biases, or they can break them down.

That puts us in a position of a lot of responsibility, responsibility you may not have thought you were signing up for if you were just someone who liked playing with computers a lot.

But it’s also an opportunity for us to choose what kind of people we want to be and what kind of industry we want to be. Do we really want to turn a blind eye to the harm we cause people? Or are we better than that? MLK said that the arc of the moral universe is long and bends towards justice, but it only does so if people are actually doing the work to make it that way.

We’re in the perfect position to do something. We can choose to stop perpetuating systemic inequalities in the products we build. We can choose to build inclusive-first and create that better, kinder 8–80 web for the rest of the world, and in doing so, focus on the privacy and ethics that are so key to delivering on that promise.

The bad news for you is that there is no quick-fix way for us personally to be able to see all the ways our designs are excluding marginalised people. That doesn’t excuse us from learning — we can read and listen to the things that marginalised people say about their experiences and make sure we interview them as part of our research and testing, but we’ll never naturally spot the risks and issues the same way as someone who has experienced them will. Call it the byproduct of millennia of evolution making us pretty great at protecting our own skins.

Fortunately, we don’t build software in isolation, so what we can do is hire people into our teams who have different experiences from ours, and who are going to be good at spotting the things that impact people who share experiences with them.

But hiring them isn’t enough, because when we hire someone into our teams who is going to point out things no-one else sees, it turns out it can be pretty awful for them if the rest of the team don’t want to listen because it doesn’t align with their life experiences. We need to be aware that it’s not their job to teach our teams about diversity and marginalised identities. Their job is to be a designer or developer or QA person or product manager, not a diversity consultant. So when they raise things our teams don’t understand, we need to listen, believe them, and do our own research into why.

The truth is that real work, the real design, is about sacrifice, not glory. This stuff is hard, but it’s the tiny everyday things you do in your job to prioritise the needs of marginalised people that make a difference. It’s when you support your coworker when they tell you something is unsafe or that the language isn’t quite right or that you’re leaving people out with your design or engineering decisions. It’s when you advocate for those positions yourself to people in power within your organisation, leveraging your social capital so they don’t have to. It’s when you band together and refuse to implement something unethical because even though it won’t impact you, it will impact other people.

You might be hoping that I’m going to give you a business case that you can take back to work with you to convince your bosses why they should let you do this work. And you could make a business case for this work, like you can make ones for accessibility and diversity in tech in general. That business case would probably centre around increasing your customer base and reducing the risk to your reputation, or the risk that someone might sue you for discrimination.

Fuck your business case, with its perverse incentives and profit-at-all-costs, VC-investment-driven rush to exploit users. Our business models are broken and we focus on the wrong things.

So yeah, I’m not going to give you one, because honestly it makes me feel a little bit sick to even say those words — a business case for inclusion. We have to change our entire business models because we shouldn’t need a business case to do the right thing. To be ethical. To use our positions and privilege and power to do the right thing and build a more equitable world. Those shouldn’t be optional things. Those should be required things.

I believe we are better than this. Let’s prove it.