As Facebook founder and CEO Mark Zuckerberg testifies before the House Energy and Commerce Committee on April 11, users should be listening to find the answer to a key question of their own: Why should they trust Facebook?
The trust issue exploded for Facebook after it acknowledged that data from as many as 87 million users was accessed by Cambridge Analytica, a political strategy and communications firm tied to the 2016 Trump presidential campaign. Cambridge Analytica is alleged to have mined user data for voter insights, in an incident that has been called the largest known leak in Facebook history.
Zuckerberg will now face questioning from the House committee on how Facebook uses and protects user data. For Facebook's 239 million monthly users in the U.S. and Canada, and roughly 2.2 billion monthly users globally, it's not enough to expect lawmakers to ask the tough questions. In 2011, the Federal Trade Commission (FTC) ordered Facebook to focus more on privacy issues, with seemingly little effect. Instead, consumers need to be more vigilant: learning how their information is gathered, for example, and reading the fine print of privacy policies. This is time-consuming and takes effort. But given the amount of time people spend with Facebook, and given how regulators have struggled to keep up with an ever-changing internet landscape, personal due diligence is the most reliable way to make wise decisions about whom to trust.
Researchers understand trust in terms of three dimensions, and, leading into Zuckerberg's testimony, Facebook users should carefully consider each one.
The first dimension is competence, which in this case speaks to whether Facebook reliably delivers an enjoyable experience. Few question whether Facebook consistently provides engaging information and helpfully connects users with others.
The second is benevolence: Does Facebook have the user's best interests at heart? Benevolence is particularly relevant for Facebook users, because the company is especially good at learning what people love and despise, and its business model depends on monetizing this user information. Users watching Zuckerberg's testimony should carefully consider whether Facebook has their back, and therefore whether their personal information is truly safe.
The third is honesty, meaning how transparently Facebook communicates with users and how diligent the company is at keeping promises. During his testimony, Zuckerberg will likely offer assurances about how Facebook is planning to address data privacy. It is up to users to remain vigilant about the extent to which those promises are kept.
Distrust is a problem for Facebook because the data it collects is what allows the company to create the personalized experience that users truly value. To the extent that the demands of distrustful users restrict the company from collecting this information, privacy may be enhanced but the benefits of the Facebook experience will be undermined.
Why, after a number of Facebook privacy scandals over the past decade, are users finally thinking about why they trust Facebook? One explanation is a phenomenon I call "trust complacency." Without personally experiencing the negative effects of faulty privacy policies (or exploding mobile phones, or deceptive automotive technology), people become cocooned in a trust bubble in which they presume that their trust has been well placed. The problem, however, is that complacent users give companies the latitude to take greater and greater advantage. If there is no cost for not caring about user interests, can you blame Facebook for pushing the boundaries?
Facebook has faced a backlash, including the #DeleteFacebook campaign. While a few high-profile users have abandoned the platform, it's not clear how much broader impact this has had. Facebook borders on ubiquity: the perception is that "everyone" is on the platform, which in turn encourages the assumption that it can be trusted. The analogy I use is a mountain climbing company that has been around for many years and posts photos of previous expeditions on its site. Seeing this, other climbers are likely to assume the company can be trusted, even if it actually has a poor safety record. Popularity encourages trust.
But no matter how much a company like Facebook wants to appear to be our friend, consumers must do the hard work of discerning whether trust is well placed in a company--and why.