Last week, in an announcement about "building a more private web," Google made it clear that it has no interest in actually protecting user privacy online. The blog post, which announces a set of Chrome changes called Privacy Sandbox, sounds good: it opens with words that suggest Google is ready to make dramatic changes both to its web browser and to its overall business model.
"Privacy is paramount to us, in everything we do," begins the post written by Justin Schuh, Director of Chrome engineering. "So today, we are announcing a new initiative to develop a set of open standards to fundamentally enhance privacy on the web." Except that's not true at all. Google could make the internet respect your privacy by implementing third-party cookie blocking, but it won't. I reached out to Google but did not immediately receive a response.
Worse still, the company claims it "can't" take such measures because doing so would ruin the web. According to Google, the company is committed to the "vibrant web," but we'll get to that part in a minute.
Tracking users online.
There are two primary forms of tracking what individuals do online. The most common is cookies: small pieces of data that websites store in your browser, which can be read back to tell advertisers about your browsing activity. Cookies are also used by websites to remember you so that you don't have to log in every single time.
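As a rough sketch of the mechanism: an ad server sends a Set-Cookie header containing a unique ID, the browser stores it, and the browser sends that ID back on every later request to the same ad domain. The header value and domain below are made-up examples, and the parsing is deliberately simplified compared with what a real browser does.

```javascript
// Minimal sketch of a third-party tracking cookie. The ad server responds
// with a Set-Cookie header carrying a unique ID; the browser stores it and
// echoes it back on every subsequent request to that server's domain.
// The header below is an illustrative example, not a real ad network's.
const setCookieHeader = "uid=a1b2c3d4; Domain=.ads.example; Max-Age=31536000";

// Parse the header roughly the way a browser would store it (simplified).
function parseSetCookie(header) {
  const [pair, ...attrs] = header.split("; ");
  const [name, value] = pair.split("=");
  const attributes = Object.fromEntries(
    attrs.map((attr) => {
      const [k, v] = attr.split("=");
      return [k.toLowerCase(), v ?? true];
    })
  );
  return { name, value, attributes };
}

const cookie = parseSetCookie(setCookieHeader);

// On every later request to *.ads.example, the browser attaches the ID,
// letting the ad server recognize the same user across unrelated sites.
const cookieRequestHeader = `${cookie.name}=${cookie.value}`;
console.log(cookieRequestHeader); // "uid=a1b2c3d4"
```

Because the ID rides along on requests to the ad domain no matter which site embedded the ad, the ad server can stitch those visits into a browsing profile. That is exactly what third-party cookie blocking cuts off.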
Device fingerprinting is different. It identifies users by certain characteristics of the device they're using, and monitors when that device interacts with various sites across the internet. Fingerprinting is a far more invasive form of tracking: not only does a user rarely know it's happening, but it reveals a deeper level of information than cookies do.
Google's Schuh goes on to say that "large scale blocking of cookies undermine people's privacy by encouraging opaque techniques such as fingerprinting." Essentially, Google is making the argument that protecting privacy is bad for... protecting privacy.
A compromise between privacy and profit.
The real problem is that blocking cookies prevents targeted advertising based on a user's web activity. And that's bad for Google. "Blocking cookies without another way to deliver relevant ads significantly reduces publishers' primary means of funding, which jeopardizes the future of the vibrant web," writes Schuh.
Except it's not even clear that's true. Sure, retargeted ads are effective, but they aren't the only way to make money online. As The Wall Street Journal's Keach Hagey reported in May, a recent academic study found that using cookies adds only 4 percent more revenue per ad impression. Contextual ads (which show products related to the content on the page you're already viewing) can be just as effective without the privacy concerns.
That's not to say that advertising is itself bad. Most sites on the web are supported by advertising (including this one), and in many cases, that results in a lot of high-quality content that wouldn't otherwise exist.
The reality is that there will always be a compromise. You literally can't have a private web experience with personalized ads. You can have privacy, or you can have personalization. And since personalization means profits, Google has consistently opted for profits over privacy.
In fact, Google's idea of privacy is, "we'll help you protect your information from bad guys as long as you don't mind us using it to make a lot of money."
Google could make changes today.
Here's the thing: Apple and Mozilla (along with others) are already doing this. Apple has implemented technology to fight device fingerprinting, and Safari blocks third-party cookies by default. Firefox has taken similar privacy-focused steps. Both browsers work just fine, and publishers are still able to show ads. Then again, neither company is in the business of selling web advertising.
Google could block third-party cookies and device fingerprinting technology today. Sure, someone would start looking for other, more nefarious ways to track users online, but I'm pretty confident the incredibly smart people at Google would find a way to protect us against that as well. The problem itself isn't that complicated.
The only thing that's complicated is saying goodbye to the large pile of cash that comes from selling ads and trying to figure out ways to replace it with something that doesn't require the compromise between privacy and profit. That's the struggle Google is having, and it has made clear which side it plans to stick with.
Google could make these changes today, but it won't. The reason has little to do with protecting privacy, and everything to do with profit. When the company says it's for protecting the vibrant web, the truth is that it needs a vibrant "ad-supported" web to continue to make money.
Or, as privacy researchers at Princeton put it in response to the announcement, it's all just "privacy gaslighting."