A pair of stories is emerging this week that aren't good if you're Facebook. Actually, they aren't really all that good if you're a Facebook user either. Both represent problems that a lot of people already believe are true about the company-- that it can't be trusted, and that there's no room for anyone's ideas other than those of the company's founder and CEO, Mark Zuckerberg.

I'll look at them one at a time, but more importantly, I want to talk about what these two stories really mean for the company.

A founders' problem.

The first is that with the departure of Nate Mitchell, who co-founded Oculus and headed up virtual reality initiatives for Facebook, the founders of the company's three biggest acquisitions-- WhatsApp, Instagram, and Oculus-- have all now left the company.

Why is it that Facebook can't keep these founders around after it buys up their companies and starts to integrate the technology into Facebook?

Look, it's not uncommon for founders to depart for other things when their company is acquired, but Facebook had-- at least at first-- kept the founders of those companies on to run them as largely independent teams. 

That appears to be changing, and it's not great for those companies, or for users.

In fact, just last week it was revealed that Facebook plans to add "From Facebook" to the descriptions of WhatsApp and Instagram in the App Store, for no clear reason other than that the company wants users to be fully aware of who's in charge.

Facebook actually is listening.

Second, and probably an even more significant story, Bloomberg's Sarah Frier reports that Facebook has admitted it uses contractors to transcribe users' interactions with the company's products and devices. 

While the company told Bloomberg it has stopped the practice-- at least for now-- Facebook joins a long list of tech companies engaged in similar practices without really letting anyone know what's going on.

I reached out to Facebook but didn't immediately receive a response.

I've now written about every major tech company listening to recorded interactions of its users without clearly disclosing that it does so. As Mom used to say, just because everyone's doing it doesn't make it okay.

For Facebook, it's actually an even bigger problem, since it confirms something the company has long tried to refute-- the widespread suspicion that it listens to users' conversations in order to show them ads.

In fact, last year, Zuckerberg tried to dispel what he referred to as a conspiracy theory that the company listens using a device's microphone, telling Congress "We don't do that."

The company also told Bloomberg that the users affected "chose the option in Facebook's Messenger app to have their voice chats transcribed," though it was never made clear that it would be humans who would review them, as opposed to artificial intelligence (AI) transcription.

While it's not exactly clear why the company is using a team of contractors to transcribe portions of conversations primarily recorded by Facebook Messenger, it says it has nothing to do with ads, and that the recordings are anonymous.

That's not likely to matter to most users, who already don't trust the company after a series of privacy scandals and increased scrutiny by regulators and Congress.

In fact, just last month, the company agreed to pay a $5 billion settlement over concerns that it repeatedly violated users' privacy and failed to adequately protect personal information.

Things could get worse.

Here's why things are likely to get even worse, both for the company and for users:

Facebook seems to operate in some kind of alternate reality where it thinks that the normal rules of engagement don't apply. That's really the only explanation that makes sense when you piece together this latest news.

Those rules are really quite simple: be honest with your customers, respect their personal information, don't mess with what's working just so you can get credit for it. Most of us learned some version of all three early in school. 

Facebook, on the other hand, seems to think it can simply do what it wants, without regard to whether it's good, or right, or reasonable. It seems genuinely not to get why users have a hard time trusting it-- or maybe it just doesn't care.

Both of these stories just reinforce what most people already believe-- that the company has a "we know best" mentality that doesn't have its users' best interests in mind, but instead uses their personal information to make a profit without regard for people's privacy.

Either way, it's not a good sign-- either for the company, or for the 2 billion people who use the social network regularly.