Big data has been a business buzzword for years now, with entrepreneurs and the media endlessly chewing over the promise and perils of collecting massive quantities of data.

Few are better placed to wrestle with these sorts of questions than Cathy O'Neil. She's a Harvard-trained mathematician who went to work as a hedge fund quant after completing her PhD. She recently wrote a book, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, about her experience.

In it, she raises a rarely discussed worry about big data: not only is it subject to our biases, it can also legitimize and amplify them. An algorithm, she argues, can be racist.

Big data, big misconceptions

In a fascinating interview with the Harvard Gazette, O'Neil explains how her experience of working in finance during the financial crisis opened her eyes to the fact that data, which she'd previously viewed as neutral and apolitical, can be "weaponized." Math, she learned, can be made to lie.

But the problem goes deeper than bad-faith actors knowingly manipulating algorithms to get the results they want (such as higher ratings than were justified for mortgage-backed securities). The larger issue is that even quants and their employers who act in good faith can end up doing profound harm. O'Neil explains:

Big data essentially is a way of separating winners and losers. Big data profiles people. It has all sorts of information about them - consumer behavior, everything available in public records, voting, demography. It profiles people and then it sorts people into winners and losers in various ways. Are you persuadable as a voter or are you not persuadable as a voter? Are you likely to be vulnerable to a payday loan advertisement or are you impervious to that payday loan advertisement?

OK, fine, you might say: no one likes to be labeled a loser, but describe the process in less inflammatory terms and you can see how sorting customers could spare businesses a lot of wasted effort and spare customers a lot of annoying, irrelevant marketing. But this misses the truth that big data isn't just being used to decide which coupons to offer shoppers, or which flyer to mail to a particular voter.

The public, O'Neil asserts, doesn't "quite understand how pernicious [algorithms] can be and, often, that's because we're not typically subject to the worst of the algorithms: the ones that keep people from having jobs because they don't pass the personality test, the ones that sentence criminal defendants to longer in jail if they're deemed a high recidivism risk, or even the ones that are arbitrary punishments for schoolteachers."

This sorting process isn't always a win-win for the sorted and the sorter. It can be adversarial, benefiting the company or institution and harming the person profiled. Many people, O'Neil suggests, gloss over this reality.

Disguising bias

Worse yet, quants and companies often bake bias into the algorithms used to sort people in these adversarial and high-stakes situations. The factors used to separate the 'winners' from the 'losers' (or whatever kinder labels you'd like to use) can include characteristics like gender and race that we know are highly susceptible to bias; so susceptible, in fact, that you are legally prohibited from using them to make many of the most consequential decisions.

You can't say you didn't give a family a loan because they were black without facing a lawsuit. But many algorithms, even those employed by peer-to-peer lending marketplaces, make decisions based, in part, on race all the time. We accept them because they're dressed up in a veneer of math.
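To make the mechanism concrete, here is a minimal, entirely invented sketch of how a "race-blind" scoring model can still sort applicants by race. Nothing here comes from any real lender: the ZIP codes, default rates, group labels, and the `approval_score` function are all hypothetical. The point is simply that when a proxy feature (here, ZIP code) is correlated with a protected characteristic, a model that never sees race can still produce racially skewed outcomes.

```python
import random

random.seed(0)

# Hypothetical: two ZIP codes with different historical default rates,
# whose populations are racially segregated. Race is never a model input.
ZIP_DEFAULT_RATE = {"10001": 0.05, "10002": 0.25}

def make_applicant():
    # Invented segregation pattern: group A mostly lives in 10001,
    # group B mostly in 10002.
    group = random.choice(["A", "B"])
    if group == "A":
        zip_code = "10001" if random.random() < 0.8 else "10002"
    else:
        zip_code = "10002" if random.random() < 0.8 else "10001"
    return {"group": group, "zip": zip_code}

def approval_score(applicant):
    # The "race-blind" model scores only on the ZIP code's default history.
    return 1.0 - ZIP_DEFAULT_RATE[applicant["zip"]]

applicants = [make_applicant() for _ in range(10_000)]
approved = [a for a in applicants if approval_score(a) >= 0.9]

for g in ("A", "B"):
    total = sum(1 for a in applicants if a["group"] == g)
    ok = sum(1 for a in approved if a["group"] == g)
    # Approval rates diverge sharply by group, even though the model
    # never looked at group membership directly.
    print(f"group {g}: approved {ok / total:.0%}")
```

In this toy setup, roughly four in five group-A applicants clear the approval threshold while roughly one in five group-B applicants do, purely because ZIP code stands in for group membership. That is the "veneer of math" O'Neil describes: the disparity is real, but no forbidden variable ever appears in the model.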

With her critiques, O'Neil says, she wants to start a conversation about "what it means for an algorithm to be racist."

Big data, she concludes, holds tremendous potential. Used thoughtfully, algorithms can actually strip human bias out of decision-making. But you have to be aware of the problem to correct for it. And most of us, including many quants, are not.

Want more details of O'Neil's thinking? Check out the complete interview.

What do you make of O'Neil's critique?

Published on: Nov 2, 2016
The opinions expressed here by Inc.com columnists are their own, not those of Inc.com.