I've often wondered what it would take to get Mark Zuckerberg's attention. More specifically, I've wondered what it might take for him to finally understand what people really think about Facebook and its position and responsibility in the world. That covers a lot of ground, but for the purpose of this column, mostly I'm referring to how the company handles content and the benefit it gets by encouraging users to post and share extreme and incendiary content on its platform. 

See, every time I've ever heard Mark Zuckerberg speak, there's always been this disconnect between what he--or really anyone from Facebook, for that matter--says the company believes, and what the rest of us observe on a regular basis. On matters ranging from free expression and hate speech to fake news and privacy, I've become increasingly convinced that Zuckerberg simply doesn't hear anything that doesn't reinforce the beliefs he already holds.

Well, it turns out that it's not at all complicated. A 7 percent drop in the company's stock price, which equated to a $7 billion drop in Zuckerberg's net worth, finally did the trick. 

Whether it was the brutal awareness of how much money his refusal to take responsible action was costing him and other investors, or the simple recognition that Facebook isn't actually invincible, Zuckerberg finally did something. Though maybe not as much as it seems, and certainly not enough.

Facing a growing list of advertisers that have agreed to boycott Facebook's platforms, including big brands like Verizon, Unilever, Honda, Starbucks, and Coca-Cola, Zuckerberg responded on Friday in a Facebook post by saying the company will take steps to add labels to some content on the site. 

Most notably, Facebook will mark posts from politicians that would, if posted by other users, violate its content guidelines. Here's how Zuckerberg described it:

We will soon start labeling some of the content we leave up because it is deemed newsworthy, so people can know when this is the case. We'll allow people to share this content to condemn it, just like we do with other problematic content, because this is an important part of how we discuss what's acceptable in our society -- but we'll add a prompt to tell people that the content they're sharing may violate our policies.

Just so we're all on the same page, this paragraph is the textbook definition of trying to have it both ways. Which isn't all that surprising, since that's always been Zuckerberg's m.o.: Do the least amount possible to mollify the greatest number of people. That way, Facebook can continue to do its thing without upsetting anyone who might actually be able to exert pressure on the company to truly change.

Which, honestly, now that I think about it, simply reinforces what I said at the start: Mark Zuckerberg doesn't get it.

Take, for example, this gem from Zuckerberg's Facebook post: "If we determine that content may lead to violence or deprive people of their right to vote, we will take that content down. Similarly, there are no exceptions for politicians in any of the policies I'm announcing here today."

Did you catch the "we" in that first sentence? How exactly does Facebook determine that content may lead to violence? What criteria will it use? And if Facebook decides a post doesn't "may lead" to violence, does that mean it will simply leave it up, regardless of how offensive or inappropriate the post may be?

There's a reason that activists and advertisers are saying enough is enough, and it's precisely because Facebook isn't doing enough. Regardless of what you think about President Trump, it's not as though he's abusing Facebook. He's using it exactly as intended, and in a way that directly benefits Facebook. The problem is the way Facebook's algorithm and business model incentivize incendiary content. 

Listen, Facebook says it doesn't want to be the "arbiter of truth," and honestly, I don't think any of us should want it deciding who gets to use its platform and what they can say. Except, it's Facebook's platform, and it absolutely gets to decide both of those things. That means it has a responsibility to do so in a way that is both reasonable and accountable to its users and advertisers.

Advertisers have made it clear they plan to hold Facebook accountable for the way bad actors use the platform until Facebook takes real action. Zuckerberg has been equally clear: He just doesn't get it.