It's worth noting that the overarching sentiment here regarding personal data use (i.e., that this is the end) isn't watered down by any qualifications. Companies that traffic in personal information while using the term "privacy" as a cover for the complete absence of it are about to hit a brick wall.

Lost in much of the nonstop media coverage of the Cambridge Analytica privacy grab is exactly what happened: The curtain was drawn back and we got to see the Great and Powerful "Us" being used by information brokers to make a king--or rather, a president.

Until we settle the way privacy is understood, and perhaps legislated, what happened with Cambridge Analytica is going to happen again.

What Happened

Let's be clear. This was not a data breach.

Facebook knew for years that third-party apps could access vast amounts of user data, and it knew for at least two years that Cambridge Analytica possessed data associated with more than 50 million users, almost all of it used without the knowledge or consent of the people it described.

While it's tempting to say the Cambridge Analytica story never would have seen the light of day had it not been for a 27-year-old whistleblower, that's not exactly accurate.

Christopher Wylie came forward about his role in building the "psychological warfare tool" used by former White House insider Steve Bannon during Donald Trump's successful 2016 presidential campaign. Wylie revealed that the project had used an enormous amount of information scraped from Facebook by a third party exploiting a backdoor that has since been sealed. There was sufficient data to create a master attack plan of never-before-seen proportions, and, as it turned out, that plan helped win the election. But the real reason the story exploded is simpler: we've reached a tipping point.

For sure, there was a major violation of consumer trust, but the reason this story shook us to the core is that it provided a missing puzzle piece that revealed the big picture of the uses and (depending on your view) the abuses of personal data in our surveillance economy.

And this is why things are about to change. The emperor is naked.

What We Learned

Perhaps the most frightening aspect of the Cambridge Analytica story was the revelation that you didn't (and still don't) have to have bad cyber hygiene--or even necessarily use social networking--to get caught in the sprawling data gill nets collecting personal information for corporate gain.

The information in question was originally harvested by a psychology researcher from Cambridge University. He had direct access to only about 270,000 people, all of them users of his "thisisyourdigitallife" app. But because of a flaw in Facebook's API (the interface through which third-party apps request user data), the researcher was able to pull data associated with 50 million more users.

A critical view might hold that these Facebook users opted in to a not-so-subtle form of data-snatching, trading valuable personal information for a few minutes of app-based navel-gazing, and got what was coming to them.

The number of affected people ballooned into the tens of millions because the Facebook flaw allowed any third party that knew what it was doing to grab information about friends, friends of friends, and so on. The very fact of those friendships, along with anything that could be gleaned from the posts those friends appeared in, liked, or hid from their timelines, was marketable. Cambridge Analytica proved that.
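For readers who want to see the mechanics, the sketch below illustrates the access pattern described above. It is hypothetical: the endpoint and permission names reflect Facebook's long-deprecated Graph API v1.0, the access token is a placeholder, and none of it works against today's API. The point is simply how one consenting user's credentials could unlock data about everyone that user knew.

    # Hypothetical sketch of the pre-2015 Graph API v1.0 access pattern.
    # The token is a placeholder; these calls no longer work today.
    import requests

    GRAPH = "https://graph.facebook.com/v1.0"
    TOKEN = "EAAB..."  # access token granted by ONE consenting app user

    # With that single user's token, enumerate the user's entire friend list.
    friends = requests.get(
        GRAPH + "/me/friends",
        params={"access_token": TOKEN},
    ).json()

    # For each friend, none of whom installed the app or consented,
    # pull whatever fields the old friends_* permissions exposed.
    for friend in friends.get("data", []):
        profile = requests.get(
            GRAPH + "/" + friend["id"],
            params={"access_token": TOKEN, "fields": "name,likes,location"},
        ).json()
        # Repeat across roughly 270,000 app users, each with hundreds of
        # friends, and the data set balloons past 50 million people.
        print(profile.get("name"))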

In the surveillance economy, everyone is "gilt" by association: so much gold waiting to be mined. We've always known it, but this story brought that fact to life.

What to Do?

The personal information of millions of Facebook users went walkabout because no one respected it enough to make sure it was handled correctly.

It is no longer possible to give "surveillance economy" companies the benefit of the doubt when it comes to the handling of sensitive personal information. The data sets these companies traffic in are highly marketable, and presumably they are protected from theft by third parties for precisely that reason. But when the data is sold, there is no commensurate protection for the consumer.

When we talk about handling personal information correctly here, it's not only about keeping it safe from a breach. That should be a given.

The basic rules of marketing and advertising were at work here, but it seems clear that data needs more protection than it currently gets. Consumers may be fair game, but they should at least have some say in how their personal information is used.

Had it been radioactive material, everyone handling the (sort of) pilfered personal data that Cambridge Analytica used would have taken a litany of carefully designed precautions. Federal and international guidelines would have been in place, with agreed-upon best practices for notification in the event of a "spill" or leak. Those rules would have been the subject of well-funded government and non-governmental-organization studies.

In the surveillance economy, many stand to win, but the fallout when data goes sideways (or missing) can profoundly affect your life, and there are as yet no meaningful guidelines in the United States.

The benefit to consumers from all this data being scooped up and converted into predictive marketing is an added layer of convenience and perhaps access to more creature comforts that are "exactly right." But the downside may be something as big as the manipulation of our democratic process.

Published on: Mar 28, 2018