The Tech Industry Is Clueless About People. Let’s Debug It.

A veteran web consultant on the industry’s biggest flaws and how to address them.

In October, Google Maps rolled out an experimental feature that estimated the number of calories burned when people walked to their destinations. The feature had a little quirk, though: It used mini-cupcakes to illustrate the calorie counts.

The backlash was immediate. Users pointed out that calorie counts could act as a trigger for people with eating disorders. Others found the feature shaming and questioned how Google was calculating the counts. And the kicker? There was no easy way to turn it off. A week after it was introduced to iPhone users, Google ditched the feature, citing “strong user feedback,” according to TechCrunch.

Tech design failures similar to the Google cupcake fiasco are the subject of Sara Wachter-Boettcher’s new book, Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech. A web consultant who has worked in the industry for a decade, Wachter-Boettcher details countless mishaps, from algorithms that categorized black people as gorillas or failed to recognize names as legitimate to period-tracking apps designed by clearly tone-deaf dudes. Many tech products are full of “blind spots, biases, and outright ethical blunders,” she writes, and these oversights “exacerbate unfairness and leave vulnerable people out.” 

Wachter-Boettcher isn’t totally anti-tech, but she argues that coders, designers, and the industry at large need to do a better job of checking their biases. She brings us into the design process, showing how a lack of diversity can lead to very narrow ideas about who is using a product. Seemingly small things, like the default choices users are presented with in an online form, can have a tremendous impact on people’s lives—especially as tech moves further into fields such as predictive policing and personalized education. I asked Wachter-Boettcher about the biases she has observed and what a more inclusive tech product might look like.

Mother Jones: What was the first example of problematic tech that really stuck out for you?  

Sara Wachter-Boettcher: One of the first was when Eric Meyer, my co-author for Design for Real Life, had this tragic experience with Facebook’s year-in-review feature: At the end of 2014, he got this ad for a year-in-review [photo album] and Facebook made the cover one of his most popular photos of the year—one with the most comments, the most engagement, as they say. It was a photo of his daughter, Rebecca, who had died a few months before on her sixth birthday. Facebook had inserted it into their ad, surrounded her face with all these peppy illustrations of people dancing at a party with balloons and streamers and told him, “Hey, here’s what your year looked like.” He was just completely gutted by this—it broke his heart all over again.

Facebook made a lot of assumptions and decisions that had a profound emotional effect on him. And it really hit me how much tech has been so narrowly focused on metrics of engagement—by toying with emotions, or by making assumptions about what people wanted—without really thinking of the ramifications.

What’s really fundamentally going wrong in so many tech products—especially if you look at the investigation into Facebook ads and Russian meddling—is this prioritization of engagement, and the ad dollars it brings, at all costs. People have this “move fast, break things” mentality, where it’s assumed that anything you break isn’t going to be that big of a deal. But you see that mentality being taken to further and further extremes, and it’s starting to cause tremendous problems.

MJ: Do you think people in tech realize this?

SWB: Since the book came out, I’ve been hearing from a lot of people in tech that this is entirely new to them. Just like in any industry, you’re very focused on your work, on what’s happening day to day, and you haven’t necessarily zoomed out and connected the dots. There are certainly people wrestling with what the tech industry has created, but it’s not consistent.

I hear from a lot of people who aren’t in tech and often have a great love for tech but also have a sense of, “Oh, this is creepy, this makes me feel weird, I worry that this is too invasive.” But oftentimes they don’t have a lot of vocabulary to identify what’s going on, or why it’s making them feel that way.

MJ: In the book, you talk a lot about how the design process leaves certain people out. Will you elaborate?

SWB: Tech companies are used to designing for the “average user,” the normal person. You’ll see that in the use of personas, which are these fake profiles of people that you target in your design, asking questions like, “How did this persona perceive this feature?”

But personas are really flat. They don’t reflect the actuality of people. They can be very limited, they can fall back into stereotypes, and they can lead to assumptions like, “Oh, this persona is a 35-year-old mom, so what we should be targeting is this,” without really getting into the needs, desires, and patterns that are likely to drive decisions. We don’t stop and think about, well, how does this work for someone who’s different from what’s envisioned?

Eve by Glow, a period app designed by men.

MJ: Your book makes us think a lot more about default options. For instance, my name is too long for Twitter, and I never really thought about how maybe Twitter could do something to make sure the name field was long enough. 

SWB: Right. My name doesn’t fit either. This comes back to that profound lack of perspective on what the world really looks like, and that paternalism of, “We know what names look like; this is fine.” I don’t really think we want a group of people who are so nonreflective of the world making decisions about how identity should be categorized.

In my mind it’s like, okay, you can still use the product. But I do think of these things as digital microaggressions. They add up over time, and they’re definitely alienating. And they’re red flags for bigger underlying issues. If you haven’t thought about diverse names, then what else are you not thinking about?

MJ: You’ve really called out the tech industry for ignoring these issues. What has been their response?

SWB: So far, so good. I think there are definitely people who would say, “I don’t care; this isn’t important.” But I’ve also gotten a lot of positive responses. There are some people who are really hungry for this, and they’re realizing something needs to change.

I’ve been thinking a lot about this tweet from Zeynep Tufekci, where she says the tech industry has effectively been telling itself that what it’s doing is tech, but actually it’s in the people business, and it’s in way over its head.

I was up at Google’s Boston office the other day, and somebody mentioned that a lot of the time, product ideas come from engineers who see they could do something cool. I think that [mentality] has been prized for too long. That is not what’s going to make a sustainable, ethical tech product. That’s what’s going to make something that might seem cool on its face but could have tremendous ramifications that are not well understood.

MJ: So broadly, if a company were going to be more inclusive, what should it be doing? 

SWB: One thing would be looking at not just, “Do we have a diverse team?” but also, “Do we have people who understand how what we’re creating could impact people—whose job it is to think about that?” Then I would think a lot about process and priorities: How do you decide you’re going to build a new feature? Who’s there to provide checks and balances? What are the ethical guidelines?

A Snapchat filter that many users called out as racist.

MJ: What do you say to the people who claim not to care about any of this? Like, “Yeah, whatever—it’s just an app.”

SWB: I’ve definitely gotten that response. I remember a while back, I wrote about a study [showing that] smartphone assistants weren’t equipped to help during a crisis. I put it on Medium, and it blew up. I got hundreds of comments, and a lot of them were like, “First-world problems!” Or, “Only an idiot would go to their smartphone in a crisis.” I looked at that and thought, “Well, you know what, though? People are doing that. You can think they’re stupid, but you’re not going to change the fact.” So it does nothing for me to judge whether it’s a good or bad idea to turn to a smartphone assistant in a crisis. What is important to me is figuring out: If that’s true, how do we react to that as an industry?

If we can spend so much time and energy adding mini-cupcakes to things, can we spend a minute thinking about how people are actually using this product and trying to avoid harming them? The idea that the tech industry is going to get to decide for everybody what should be important or what we should care about is pretty ridiculous, and I don’t think anybody actually wants to live in that world.

MJ: So what can we do about it? 

SWB: That’s the No. 1 question people ask. This isn’t a scenario where you can vote with your dollars—most of the time you’re not paying these companies any money. You could basically opt out, which is a perfectly fine choice, but I don’t really think that’s the answer. What we’re dealing with is systemic, and the only way it can be fixed is at that level.

What you can do as an individual is understand what’s going on: know how tech companies operate and what their incentives are. I think that awareness, even if nothing else about your behavior changes, creates a much better opportunity to have these critical conversations about technology, our boundaries with it, and the role it’s playing in our lives. The big problem is that tech has gotten a free pass. And there’s no reason it has to be that way.

