A few months ago, Apple and Google did something extraordinary. They agreed to work together on a standard for technology that could be used for Covid-19 exposure notification apps. The two tech giants compete fiercely across a range of areas, so it was notable to see them collaborate on something so important.

There was just one problem: As the technology has rolled out to devices through updates to iOS and Android, there's been some confusion. Many people believe the companies had no business adding anything that could "track" them without their explicit permission. As a result, there's a lot of misinformation about what exactly was added and--more importantly--what it actually does.

There are really two parts to this, and both are important to understand. The first is that Apple and Google created a common standard for exchanging anonymous Bluetooth keys between devices, so that if someone tests positive for Covid-19, that information can be shared with people they may have come in contact with.

Except--and this is the most important thing--in order for any of that to happen, an individual has to have downloaded an app that uses that technology. That means that if you've never downloaded an exposure notification app from the public health organization in your area, nothing Google or Apple has done will have any effect on your phone or your life.

There's actually a third component, and I suppose it's at least as important. Apple and Google built their Covid-19 exposure notification technology in a way that doesn't share any personal information, and no information is ever uploaded to either company's servers. Essentially, phones with one of these apps installed exchange anonymous "keys" when their users are in close proximity for at least a set amount of time. Your device keeps a log of the keys it has encountered.
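For the technically curious, here's a rough sketch, in Swift, of what that logging step might look like conceptually. To be clear, the names, the types, and the ten-minute threshold are my own illustrative assumptions--this is a toy model, not Apple or Google's actual framework.

```swift
// A rolling key: random bytes with no personal or location data attached.
func makeRandomKey() -> [UInt8] {
    (0..<16).map { _ in UInt8.random(in: 0...255) }
}

// Each phone keeps a local log of keys it has "heard" over Bluetooth.
struct EncounterLog {
    private(set) var observedKeys: [[UInt8]] = []

    // Record a key only after sustained proximity. The real threshold is
    // configurable by public health authorities; 10 minutes is an assumption.
    mutating func record(key: [UInt8], proximitySeconds: Double) {
        if proximitySeconds >= 600 {
            observedKeys.append(key)
        }
    }
}

var log = EncounterLog()
log.record(key: makeRandomKey(), proximitySeconds: 900) // logged
log.record(key: makeRandomKey(), proximitySeconds: 30)  // too brief; ignored
```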

When a user later tests positive for Covid-19, they confirm their test result with the app, and their key is uploaded. That key doesn't include any identifying information--not even a location.

All of the other users who have the app then receive the list of uploaded keys, and when one matches the log on their device, they're notified that they may have been exposed and may want to get tested. What they are not told is who they were exposed to, or even where. That isn't the point. This isn't contact tracing.
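Here's an equally simplified sketch of that matching step, using made-up sample tokens in place of real keys. The thing to notice is that the comparison happens entirely on your device--no record of a match is ever sent anywhere.

```swift
// Keys this phone has logged from nearby devices (illustrative values).
let localLog: Set<String> = ["token-a", "token-b", "token-c"]

// Keys published after their owners confirmed a positive test.
// Note there are no names, locations, or timestamps attached.
let publishedPositiveKeys: Set<String> = ["token-c", "token-x"]

// The check runs locally: the phone simply looks for any overlap.
if !localLog.isDisjoint(with: publishedPositiveKeys) {
    print("You may have been exposed. Consider getting tested.")
} else {
    print("No matches found.")
}
```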

Maybe the most important thing to know is that if you do nothing, well, nothing has changed. Sure, there is a new technology standard in the most recent versions of iOS and Android, but unless you download an app and turn it on, nothing happens.

It does, however, highlight an important lesson: Tech companies have a huge trust deficit. People simply don't trust them, or at least they say they don't. I don't know anyone who is actually giving up their iPhone over this, but there are definitely a lot of people complaining on social media. That goes deeper than just one new feature. It's a reflection of overall brand reputation, and in many cases that reputation could use some work.

I've said it enough times that it's starting to become a cliché, but it's still true--trust is your most valuable asset. The pushback the two companies are getting over something with the potential to be a huge benefit to everyone is the perfect illustration of that. And if it can happen to huge tech companies, it's worth considering how the things you build affect your reputation as well.