Lizard Brains Still Control Us All

Over the past few years Amazon has been experimenting with new software to help them make better hiring decisions:

Automation has been key to Amazon’s e-commerce dominance, be it inside warehouses or driving pricing decisions. The company’s experimental hiring tool used artificial intelligence to give job candidates scores ranging from one to five stars — much like shoppers rate products on Amazon, some of the people said.

Hmmm. I’m not sure that machine learning is yet at a stage where it can really help much with this. On the other hand, it can be useful for ferreting out existing hiring patterns to see what Amazon’s managers seem to value most. So what did they find?

By 2015, the company realized its new system was not rating candidates for software developer jobs and other technical posts in a gender-neutral way. That is because Amazon’s computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry.

In effect, Amazon’s system taught itself that male candidates were preferable. It penalized resumes that included the word “women’s,” as in “women’s chess club captain.” And it downgraded graduates of two all-women’s colleges, according to people familiar with the matter. They did not specify the names of the schools…. The Seattle company ultimately disbanded the team by the start of last year because executives lost hope for the project, according to the people, who spoke on condition of anonymity.
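The mechanism here is worth making concrete. We don't know anything about Amazon's actual model, but the failure mode in the quoted passage can be sketched with a toy classifier: train a simple logistic regression on synthetic "historical" hiring decisions in which a proxy feature (think: the word "women's" on a resume) was penalized by past human reviewers, and the model dutifully learns a negative weight on that feature. Everything below is invented for illustration.

```python
import math
import random

# Toy sketch (NOT Amazon's system): a linear model trained on biased
# historical outcomes learns to penalize a proxy feature on its own.
# Features per candidate: [experience (0..1), proxy flag (0/1)],
# where the proxy stands in for something like the word "women's".

random.seed(0)

def make_biased_dataset(n=2000):
    """Simulate past hiring decisions in which the proxy was penalized."""
    data = []
    for _ in range(n):
        exp = random.random()                      # skill signal
        proxy = 1 if random.random() < 0.3 else 0  # e.g. "women's" keyword
        # Historical (biased) decision rule: skill helps, proxy hurts.
        score = 2.0 * exp - 1.5 * proxy
        label = 1 if score + random.gauss(0, 0.3) > 0.5 else 0
        data.append(([exp, proxy], label))
    return data

def train_logreg(data, lr=0.5, epochs=200):
    """Plain-Python SGD for logistic regression; returns weights and bias."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            z = w[0] * x[0] + w[1] * x[1] + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - y                               # gradient of log loss
            w[0] -= lr * g * x[0]
            w[1] -= lr * g * x[1]
            b -= lr * g
    return w, b

data = make_biased_dataset()
w, b = train_logreg(data)
print(f"experience weight: {w[0]:+.2f}")  # positive: skill is rewarded
print(f"proxy-term weight: {w[1]:+.2f}")  # negative: the bias was learned
```

The point of the sketch is that nothing in the training code mentions gender at all; the bias lives entirely in the historical labels, and the model recovers it from the correlation. That is exactly why "just remove the protected attribute" doesn't fix systems like this.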

This is yet another confirmation—as if we needed one—that even the best-intentioned of us are bursting with internalized biases. Most of Amazon’s managers probably had no idea they were doing this and would have sworn on a stack of C++ manuals that they were absolutely gender neutral in their hiring decisions. In fact, I’ll bet most of them thought that they bent over backward to give female candidates a break. But down in the lizard part of their brains, it was the same old story as always: they preferred hiring men to women.

There’s a limit to how much you can take away from this. It’s another example of how implicit biases can affect us all, and a warning that any system we’re responsible for training—whether it’s fellow humans or digital computers—will pick up those biases. We all know we need to be careful about passing along our biases to the next generation, and it turns out we have to be equally careful about passing them along to the software we build.
