Amazon has just landed in hot water over its facial recognition technology. And the American Civil Liberties Union is rubbing the company's nose in it.
It's not the first time a big name in tech has weathered criticism over this particular type of technology. Facebook was sued, and Microsoft scrubbed mentions of its work with U.S. Immigration and Customs Enforcement, or ICE. But this is the first time the embarrassment has directly involved members of Congress.
Not in a hearing, but as subjects of a face-matching experiment with some significant mistakes, including confusing the face of famous civil rights leader John Lewis with a mug shot.
Amazon licenses its facial recognition software -- called Rekognition, which has many applications in marketing, as you might assume -- to law enforcement agencies across the country. There are long-standing concerns about the accuracy of facial recognition and how use by police and federal agencies could potentially wrongly associate people with crimes.
The Congressional Black Caucus wrote a letter of concern to Amazon. Systems from many companies have often been particularly inaccurate in identifying people of color.
The ACLU used the publicly available software from Amazon, spending $12.33 on the experiment. It compared a database of some 25,000 publicly available mug shots with "public photos of every current member of the House and Senate." That is 535 people.
Overall, 5 percent of the members of Congress, 28 in all, were wrongly matched to people in the mug shot database. And while people of color make up 20 percent of Congress, they accounted for 39 percent of the false matches, including the quite famous John Lewis.
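The arithmetic behind those figures is easy to verify. A quick sanity check, using only the numbers the ACLU reported:

```python
# Sanity check of the ACLU experiment's reported numbers.
total_members = 535   # every current member of the House and Senate
false_matches = 28    # members wrongly matched to mug shots

false_match_rate = false_matches / total_members
print(f"False match rate: {false_match_rate:.1%}")  # prints "False match rate: 5.2%"

# People of color are about 20% of Congress but 39% of the false matches.
share_of_congress = 0.20
share_of_false_matches = 0.39
print(f"Overrepresentation: {share_of_false_matches / share_of_congress:.2f}x")
```

Roughly a two-to-one overrepresentation, which is the core of the Congressional Black Caucus's concern.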
Of course, any pattern recognition technology will be imperfect, producing both false positives, where a match is reported that doesn't really exist, and false negatives, in which actual matches are missed.
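The tradeoff between the two comes down to where you set the match threshold. A minimal sketch, using entirely hypothetical similarity scores rather than anything specific to Rekognition:

```python
# Hypothetical (score, is_actually_same_person) pairs from a face matcher.
results = [
    (0.95, True), (0.88, True), (0.83, False),  # a lookalike scores high
    (0.72, True),                               # a true match scores low
    (0.40, False), (0.30, False),
]

def tally(threshold):
    """Count false positives and false negatives at a given match threshold."""
    false_pos = sum(1 for score, same in results if score >= threshold and not same)
    false_neg = sum(1 for score, same in results if score < threshold and same)
    return false_pos, false_neg

# A permissive threshold admits lookalikes; a strict one misses real matches.
print(tally(0.80))  # (1, 1): one lookalike passes, one true match is missed
print(tally(0.90))  # (0, 2): no lookalikes pass, but two true matches are missed
```

For marketing applications, a false positive is a wasted ad impression. For law enforcement, it can be an accusation.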
Nothing invites the unwelcome kind of attention Congress can hand out quite like matching its members to arrestees.
The Silicon Valley mentality has an unfortunate inclination to ship things before they're really finished: build fast, get feedback, create the next iteration.
That's fine when the problems that crop up aren't too large. When your product could help send people to jail for things they didn't do, you have a significant business and moral problem.
Amazon has, naturally, rejected the findings, telling TechCrunch that there wasn't anything wrong. The company noted that such technologies are used to narrow down results, not to make arrests. "We remain excited about how image and video analysis can be a driver for good in the world," it said in a statement.
Uh huh. All is fine, until they or someone close to them ends up on the wrong side of an interview desk in a police station. Or until some members of Congress decide to hold hearings.