Amazon’s AI Hiring Bot Turned Out to Be Biased Against Women
According to a recent Reuters report, tech giant Amazon quietly shut down a “sexist” AI recruiting tool it had been tinkering with for years. The bot, meant to sift through thousands of resumes and handpick the cream of the crop, ended up favoring men over women after learning from a decade’s worth of applications that skewed heavily male.
What the Team Was Trying To Build
Starting back in 2014, Amazon’s recruiting team set out to build a “holy grail” of recruitment: an engine that could take any number of resumes, crunch the numbers, and spit out the top five candidates ready for interview.
- Vision: “If I throw 100 resumes at it, I want the top five for a quick hire.”
- Approach: AI‑powered resume screening.
- Data source: 10 years of applicant information, mostly from men.
The Unintended Outcome
The algorithm unintentionally learned that resumes resembling those of past male hires had a higher probability of success; per the Reuters report, it reportedly even penalized resumes containing the word “women’s.” Ordinary bias in a data pipeline had become a discriminatory hiring shortcut. By the start of last year, Amazon concluded the project couldn’t be salvaged and disbanded it.
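The mechanism is easy to reproduce in miniature. Here is a minimal sketch, using entirely made-up toy data (not Amazon’s system or dataset): a naive scorer counts which resume words co-occurred with past hires, and because the fictional history is skewed male, a gendered word alone changes the score.

```python
# Sketch only: how a naive resume scorer trained on historically skewed
# hiring outcomes can absorb gender as a signal. All data is hypothetical.
from collections import Counter

# Toy training history: (resume text, was the candidate hired?).
# The history is skewed: male-associated tokens co-occur with "hired".
history = [
    ("captain men's chess club, python, sql", True),
    ("python, java, led men's robotics team", True),
    ("c++, distributed systems", True),
    ("captain women's chess club, python, sql", False),
    ("python, java, led women's robotics team", False),
    ("go, machine learning", True),
]

hired, rejected = Counter(), Counter()
for resume, was_hired in history:
    for token in resume.replace(",", " ").split():
        (hired if was_hired else rejected)[token] += 1

def score(resume: str) -> int:
    """Naive score: hired-token count minus rejected-token count."""
    return sum(hired[t] - rejected[t] for t in resume.replace(",", " ").split())

# Two resumes identical except for one gendered word:
a = score("captain men's chess club python sql")
b = score("captain women's chess club python sql")
print(a, b)  # the "women's" resume scores lower despite identical skills
```

No one writes a rule saying “prefer men”; the skew in the training labels is enough, which is why fixing the model without fixing the data is so hard.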
What’s Next?
So far, Amazon has kept quiet in response to the report. But one thing is clear: even the smartest bots go sideways when they’re fed skewed data. Hopefully the company treats this as a learning moment.
