A growing list of technology companies is banning the use of facial recognition tools by law enforcement agencies. Microsoft and Amazon have joined IBM, which previously said it was exiting the business of making the technology altogether.

Facial recognition software has become an increasingly routine part of daily life: smartphones, tablets, and many laptops use the technology to secure and unlock themselves. It has also become an area of focus for companies wanting to leverage the power of artificial intelligence (AI) to automate processes that once required large amounts of slow, manual work.

As facial recognition's use increases, privacy advocates have expressed concerns that it violates privacy and contributes to discrimination. Until now, neither of those concerns was enough to persuade tech companies to reconsider how they made the tools available to government and law enforcement agencies.

Amazon's software, called Rekognition, allows police departments to upload photos (from security footage, for example) and compare them against a database of known images such as mug shots. That's not all that different from what a person might do by comparing the photos manually, but using AI and machine learning, the facial recognition tool can do the same task much faster. It's also relatively inexpensive, making it an attractive option for government agencies.
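Under the hood, systems like this typically reduce each face to a numeric "embedding" and compare embeddings rather than raw pixels. The sketch below is a simplified, hypothetical illustration of that idea, not Amazon's actual implementation: it matches a query embedding against a small database of labeled embeddings using cosine similarity, with the labels, dimensions, and threshold all invented for the example.

```python
import numpy as np

def best_match(query, database, threshold=0.9):
    """Return the label of the closest face embedding, or None if no match.

    query: 1-D embedding of the face pulled from, say, security footage.
    database: dict mapping a label (e.g., a mug-shot ID) to its embedding.
    threshold: minimum cosine similarity required to report a match.
    """
    best_label, best_score = None, -1.0
    q = query / np.linalg.norm(query)  # normalize so dot product = cosine similarity
    for label, emb in database.items():
        score = float(q @ (emb / np.linalg.norm(emb)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= threshold else None

# Toy example with 4-dimensional "embeddings"; real systems use hundreds of dimensions.
db = {
    "mugshot_001": np.array([0.9, 0.1, 0.2, 0.4]),
    "mugshot_002": np.array([0.1, 0.8, 0.7, 0.3]),
}
print(best_match(np.array([0.88, 0.12, 0.22, 0.41]), db))  # prints "mugshot_001"
```

Because every comparison is just a vector operation, matching one face against millions of records takes a machine fractions of a second, which is exactly the speed advantage the paragraph above describes, and also why a biased error rate scales so quickly.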

At the same time, studies have shown that the technology has a much higher error rate when comparing Black and Asian male faces, which could lead to the arrest of the wrong person. This major flaw has come to the fore as tech companies find themselves responding to the public outcry around cases of police brutality against minorities. As a result, they've decided that providing the technology to police departments is bad for business. At least for now.

"We will not sell facial-recognition technology to police departments in the United States until we have a national law in place, grounded in human rights, that will govern this technology," said Microsoft's president, Brad Smith.

In a statement, Amazon said it hopes that a "one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested." Amazon has said it will continue to provide the tool to organizations that use it to fight human trafficking and to help find missing and exploited children. It isn't entirely clear whether Amazon is halting its sale of the technology to the federal government.

On one hand, it's fair to ask tech companies why it took this long to find their way to the right side of this issue. It shouldn't have taken public outcry for them to realize their tools could be used in harmful ways.

It's also particularly interesting that the move comes as Congress is taking up the issue. With lawmakers poised to regulate the use of facial recognition, the companies that make it are understandably anxious to see where both public sentiment and government policy land.

At the same time, I'm willing to give tech companies at least some credit for realizing that while they may be producing products with positive intentions, there are collateral effects that contribute to structural, societal problems. Taking action to do the right thing, even when it costs you money, is still the right thing.

If nothing else, that's a lesson we could all learn right now. It's long past time to do the right thing, and even when giant tech companies might have mixed motives, we all benefit when we encourage them to do the right thing anyway.