This past May, New York City mayor Bill de Blasio assembled a task force to review how city departments use algorithms to make decisions. In a report due to be released in late 2019, the task force will determine whether the algorithms are "fair," "accountable," and "equitable."
Regardless of whether the New York City report finds that AI can be trusted, bias is a known problem, especially for algorithms used for critical public and private functions. Algorithms decide who is approved for a home loan, who receives a job interview, and even who is granted parole, and soon AI could decide what students learn. If biases hiding within basic functions of society continue to go unchecked, communities that are already underprivileged could be locked into a cycle of disadvantage.
At present, public organizations that find bias in their algorithms have few options other than discontinuing their use. The private sector, however, has hit on some promising solutions.
You don't have to know AI to create a great AI business, just as you don't have to know diagnostics to create a great med-tech company. Many health tech companies are simply improving digital systems and processes. The same will be true in AI: guiding and directing its use can be your next business.
Here are three advances in machine learning that are opening up opportunities where they have traditionally been unavailable.
You don't have to be a data scientist to leverage AI.
In the U.S., just 38 percent of small businesses are owned by women, and 28 percent are owned by minorities. Kabbage, an online lending platform for small businesses, uses algorithms that look exclusively at the real-time business performance of applicants to qualify them. Because the company's algorithms don't consider protected-class information, such as race and gender, Kabbage serves a higher percentage of women- and minority-owned businesses than those national averages.
Kabbage sidesteps the "bias in, bias out" problem with an application process designed to never let bias in. This illustrates that you don't have to be a data scientist to create better uses of AI.
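That "exclusion by design" approach can be sketched in a few lines. This is a minimal illustration of the general technique, not Kabbage's actual system; all field names here are hypothetical.

```python
# Exclusion by design: the model reads only business-performance
# fields, so protected-class attributes never enter the pipeline.
# Field names below are hypothetical, not Kabbage's actual schema.

PERFORMANCE_FEATURES = ["monthly_revenue", "cash_flow", "invoice_volume"]
PROTECTED_ATTRIBUTES = ["race", "gender", "age"]

def build_feature_vector(applicant: dict) -> list:
    """Select only business-performance fields; protected attributes
    are never read, so they cannot influence the decision."""
    assert not set(PERFORMANCE_FEATURES) & set(PROTECTED_ATTRIBUTES)
    return [applicant[field] for field in PERFORMANCE_FEATURES]

applicant = {
    "monthly_revenue": 42_000,
    "cash_flow": 8_500,
    "invoice_volume": 120,
    "race": "(present in the raw record,",
    "gender": "but never touched by the model)",
}
print(build_feature_vector(applicant))  # [42000, 8500, 120]
```

The key design choice is that fairness here comes from the data contract, not from a post-hoc correction: the protected fields may exist in the raw record, but the feature builder simply never reads them.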
Reverse engineer the problem you are looking to address, whether that is in finance, healthcare, or elsewhere. The best advice I can give about starting a business is this: don't focus on a problem that exists; address a problem that will exist. That's the entire point of this article. AI is going to grow, and in turn it is going to need systems in place to make it more efficient in any organization.
High-tech bias recognition will still need low-tech solutions.
There are also high-tech ways this is being addressed. In March 2016, Microsoft entrusted the online public with a different kind of algorithm: Tay, a Twitter chatbot that it hoped might learn to interact with other users in organic ways. But because of the vitriol directed toward it, Tay quickly learned to pump out hate speech ranging from Holocaust denial to anti-feminist tirades.
Since then, Microsoft has begun building a tool to identify bias in a range of AI technologies, turning a misstep into a positive. Although Microsoft has been relatively tight-lipped about the program's inner workings, senior researcher Rich Caruana says the tool aims to automate the detection of unfairness at a time when few in the field can evaluate the transparency, intelligibility, or explanations AI algorithms provide.
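To make "automated detection of unfairness" concrete, here is one simple check such tools typically automate: comparing decision rates across groups, often called the demographic parity gap. This is an illustrative metric under my own assumptions, not Microsoft's actual implementation.

```python
# A basic fairness audit: compare approval rates across groups.
# A large gap flags a model for human review; it does not by
# itself prove the model is biased.

def demographic_parity_gap(decisions, groups):
    """decisions: parallel list of 0/1 outcomes; groups: group labels.
    Returns the gap between the highest and lowest approval rates."""
    counts = {}  # group -> (approved, total)
    for decision, group in zip(decisions, groups):
        approved, total = counts.get(group, (0, 0))
        counts[group] = (approved + decision, total + 1)
    rates = {g: approved / total for g, (approved, total) in counts.items()}
    return max(rates.values()) - min(rates.values())

decisions = [1, 0, 1, 1, 0, 0, 1, 0]
groups    = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(demographic_parity_gap(decisions, groups))
# Group A is approved at 0.75, group B at 0.25, so the gap is 0.5.
```

An audit tool would run checks like this across many slices of the data automatically, which is exactly the kind of evaluation Caruana notes few practitioners can do by hand today.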
This shouldn't scare you away from trying to address AI. Thousands of industries will need to implement AI better, and there aren't enough companies addressing this from a consulting perspective. It's similar to how many data companies are now run by people who aren't programmers; they simply identified an industry need and pooled expertise.
This is a business opportunity with purpose.
Biased algorithmic outputs might seem harmless in a vacuum, but their consequences are very real. A deserving small business owner may not get a loan; a single mother trying to support her family may be denied an interview. No one knows when the first organization, public or private, will entirely eliminate bias from its processes. In the meantime, private companies are working diligently with AI to reduce bias.
But the next step in that evolution is for low-tech startups to emerge that address the human side of AI. How can a chiropractor's office use AI for scheduling? How can a realtor maximize their reach with AI? How can a financial planner reduce the dreaded cold call with AI? How can municipalities make it easier for low-income people to get transportation to jobs?
Addressing these issues and more might seem small, but doing so can create both a large consulting business and a great impact.