Sara Wachter-Boettcher discusses the inherent bias in the apps we use every day

Nov 30, 2017

There’s a serious bias problem with the tech we use every day. “I think tech is often designed to appeal to ‘average users’ or ‘normal people,’” author and digital strategy consultant Sara Wachter-Boettcher explains. “The problem is that ‘normal’ and ‘average’ often actually means ‘people like me.’ And if the team designing technology isn’t very diverse, then that’s a limited view of what people are like and what they might care about or be going through. As a result, you get a lot of tech that’s really designed for straight white guys in San Francisco with disposable income.”

Wachter-Boettcher’s recent book Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech confronts this problem head-on. Often, the tech you use every day wasn’t actually designed for or by anyone like you. “Despite a lot of talk about user-centered design, tech was missing the mark when it came to designing for real people,” she explains. “Over and over, I would see products that pushed boundaries, content that was insulting, and experiences that made huge assumptions about people.” It’s what led her to write a book on toxic technology, investigating why the apps that are supposed to cater to our every need so often don’t serve us at all.

“People often talk about technology as a mirror: It’s just reflecting back what’s already present in society,” Wachter-Boettcher says. “But it can actually be a magnifying glass. When a tech product presents a limited view of people—for example, if it demands that you identify as male or female, or it assumes that all its users are straight—then that reinforces that that’s all that exists. For the person using the product, this can feel alienating and frustrating.” Beyond that, when the people who design tech make these assumptions about their user base, they reinforce existing stereotypes in society.

In Technically Wrong, Wachter-Boettcher uses the example of the online craft site Etsy. In January 2017, Etsy sent a push notification to users that said, “Move over, Cupid! We’ve got what he wants. Shop Valentine’s Day gifts for him.” The designers of that message didn’t take into account the fact that many of their female users might be partnered with women or with people who are non-binary. It might seem like a small thing, but when every app is designed this way, it quickly makes users who don’t fit the mold feel like they don’t belong. And it doesn’t end there.

“This gets much, much more worrisome when we talk about all the ways tech companies are building bias into algorithms, where it’s invisible—but dangerous,” explains Wachter-Boettcher. “For example, if you train an algorithm to sort resumes and look for top candidates by showing it the people who have been hired for the role in the past, and all those people are men from certain schools and certain backgrounds, then you will train the system to bring those biases along into the future. And because the machine is making choices, people tend to trust it. Suddenly you’re saying, ‘Well, I guess it just so happens that all the top candidates are white’—and so the bias of the past is strengthened by the algorithm.”
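The resume-screening example she describes can be made concrete with a small sketch. The code below is not from the book, and the data is entirely hypothetical; it simply shows the mechanism she points to: a screener “trained” only on past hiring decisions ends up scoring new candidates by how much they resemble the people who were hired before, so the historical skew carries straight into the future.

```python
# Hypothetical illustration of the mechanism Wachter-Boettcher describes:
# a toy "resume screener" trained only on past hiring decisions.
# The historical data is invented and deliberately skewed -- every past hire
# came from "School A" -- so the screener learns the school, not the skill.

# Each past candidate: (school, years_of_experience, was_hired)
historical_candidates = [
    ("School A", 2, True),
    ("School A", 3, True),
    ("School A", 5, True),
    ("School B", 6, False),  # experienced, but never hired in the past
    ("School B", 4, False),
]

# "Training": compute the hire rate for each school in the historical data.
hire_rate_by_school = {}
for school in {c[0] for c in historical_candidates}:
    same_school = [c for c in historical_candidates if c[0] == school]
    hired = [c for c in same_school if c[2]]
    hire_rate_by_school[school] = len(hired) / len(same_school)

def score_candidate(school, years_of_experience):
    """Score a new candidate the way the toy 'model' learned to: mostly by school."""
    # The school term dominates because that is the only strong pattern
    # present in the biased training data.
    return hire_rate_by_school.get(school, 0.0) + 0.01 * years_of_experience

# A highly experienced candidate from School B still scores far below a
# junior candidate from School A -- yesterday's bias becomes tomorrow's ranking.
print(score_candidate("School B", 10))  # 0.1
print(score_candidate("School A", 1))   # 1.01
```

The point of the sketch is not the arithmetic but the feedback loop: the output looks neutral and machine-made, yet it is just the old hiring pattern wearing a new interface.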

The answer to this problem, Wachter-Boettcher believes, is a rethinking of these issues on a larger level and a serious discussion about ethics in tech. “What are the overarching values and priorities in tech that have allowed ‘engagement’ to trump all else? And what are the ethical principles that should be guiding all work in a tech company? How do those principles manifest as actual processes, practices, etc.? Most tech companies can’t answer that question. There might be some grand statement written in some corporate document somewhere about their vision to help people, but there’s typically little in terms of systems and patterns that the organization follows to ensure it’s actually serving people ethically.”

In the meantime, she advises people to be conscious of how the tech we’re using every day affects us. “Be aware of how a product makes you feel! Is it making you feel creeped out? Depressed? Overwhelmed? Maybe that’s the product’s fault.” Additionally, pay attention to how apps are using your personal data, as well as the default settings they come with. “For example, if an app has a default avatar that looks like a man, what does that say about who they thought would use the product or who they assumed they were designing for?”

The end goal is to recognize that this is a systemic problem, rather than individual occurrences of blindness. “When we can understand that biases, oversights, and unethical decisions exist as part of a system, we can start to look at systemic ways to address them.” That’s the only way to really solve these issues. “Think about them as individual things and you eventually surrender to exhaustion. Think about them as a system and you can get less invested in each individual occurrence and spend your limited cognitive and emotional resources figuring out what to do about the system.”