We knew that other companies were recording interactions with their smart devices and using some of those recordings for quality checks by having real people listen in. But until now, we thought Apple was different.
Apple samples a small number of recordings of interactions with Siri, the voice assistant used by millions of Americans on their iPhones, Apple Watches, and other iOS and Mac devices. According to the whistleblower who revealed the practice, those samples include "countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters, and so on."
I accidentally wake up Siri all the time on my watch or phone when I have no intention of interacting with her, and I've never given any thought to the fact that she's listening to whatever goes on in the background. In fact, I have a daughter whose name apparently sounds close enough to "Hey, Siri" that I'm frequently reminded by my iPhone that "You need to clean up your room" and "Come set the table" are not commands yet supported by voice assistants (though that would be awesome).
In all seriousness, though, I think we all understand that perfecting the artificial intelligence behind voice assistants requires some level of human involvement. But in our minds, there's some magical balance that allows companies to do this without ever needing to review actual interactions that contain our information -- or even our voice.
Apple says that the recordings aren't associated with a specific Apple ID, meaning that no one listening would actually know who was on the other end -- but the device does send along information like its location.
A disconnect between a privacy promise and practice
The bigger problem for Apple is that the company has positioned itself, in contrast to its rivals, as the one that actually puts your privacy first. It has made a huge point of the fact that your information isn't the product it sells, and that it has countless privacy protections in place.
Here's what Apple's privacy policy says on the subject: "We may collect and store details of how you use our services, including search queries. This information may be used to improve the relevancy of results provided by our services."
It's not entirely clear whether Siri interactions count as "search queries," and scenarios where Siri is inadvertently activated certainly aren't queries at all. The statement also neglects to mention that "may be used to improve the relevancy of results" actually means "might be listened to by another human."
That's really the main issue here. Not only is the company engaged in practices that other companies are criticized for, it also completely failed to acknowledge that this was going on. That's a massive violation of trust for a company that wants you to believe it treats your personal information and privacy differently than its competition does. That claim is hard to sustain when it turns out that -- in this case -- it not only doesn't, but also isn't transparent about it.
Your brand promise is everything
I've said this before, but it perfectly sums up why this matters to your business. When your brand makes a promise, the worst thing that can happen is for your customers to have an experience that fails to live up to that promise. That disconnect destroys trust.
Apple's brand promise is privacy, but the company isn't delivering on that promise, at least in this instance. That reality will naturally cause people to wonder in what other areas the company is falling short of its promise. It will lead them to reasonably ask, "Is it all for show, or does the company really mean it?"
Trust is your brand's most valuable asset
Trust is a very difficult thing to regain once it's lost. Apple is a big company, and while it hasn't yet responded to my request for comment, I'm pretty sure it will absorb this hit to its brand and make a change to at least be up front about what's going on.
Your business, on the other hand, doesn't have Apple's size and scale, which means it's worth evaluating right now whether you're living up to your brand's promise. Can you afford to take a hit if you aren't?