Trusting thy neighbor: It's the supposed premise of so many startups in the "sharing economy." Renting your cozy bed to a complete stranger on Airbnb, jumping in a non-professional's car for an UberX ride, or dropping your pooch off at a nearby home via DogVacay while you're out of town. Each of these acts seems on its surface to require a hearty scoop of trust.
Even the names this recent spate of fast-growth companies, mostly out of San Francisco, has coined for itself seem crafted to channel trust: "collaborative consumption" and "peer to peer" are both nearly as common as saying a company is part of the sharing economy. And it's all so friendly that it's tempting to claim these companies have the potential to restore something that's dwindled in modern urban America: trust in fellow humans.
It's a dreamy thought. But let's be real: These companies are digital marketplaces designed to scale quickly, easily, and with their only size limit being the population of Earth. A recent Wired cover story seemed poised to explore how companies in the sharing economy foster compassion, community, and trust, but it actually surfaced a more complex and compelling argument, one with which I agree: These companies have built such a sturdy backbone of protections for users (certainly these fail from time to time, but they exist) that they've eliminated much of the need for that elusive trust on the part of customers. Wired's Jason Tanz writes:
Indeed, for the time being the boundaries of the sharing economy are protected fairly rigidly. If you’ve ever been caught driving more than 20 miles over the speed limit, you can't rent a car on RelayRides. Aspiring Lyft drivers must pass a background and DMV check and get approved by a mentor, who judges applicants not just on driving ability but on personality. DogVacay hosts go through a five-step vetting process that includes training videos, quizzes, and a telephone interview.
This sort of institutionalized trust can be, as Tanz notes, "in tech-industry parlance, a high-friction affair." Consider the case of eBay, arguably the first major peer-to-peer marketplace. The company had no desire to force every individual who wanted to sell a pair of once-worn Louboutins to become a licensed shopkeeper. So it created another way of regulating members of its community: peer-to-peer ratings and reviews. The beauty of the system was that it didn't rely on eBay employees to constantly monitor every account. Instead, the company could rely on light monitoring for problematic activity, along with algorithms that helped flag fraud. Individual buyers and sellers moved the process along by rating each other and providing feedback to the system. This crowdsourced quality control would become a hallmark of the sharing economy.
Eventually, based on a combination of these safeguards, eBay could guarantee every transaction. "In so doing, eBay evolved from a passive host to an active participant in every transaction," Tanz writes.
Last week I spoke to John Zimmer, the co-founder and president of Lyft, and mentioned the article. He said that protecting customers by providing safeguards that make them feel safe to open up and trust is "a real responsibility."
"We need to be doing everything we can to provide a platform for users that's as safe as it can be," he said. "We take that really seriously."
In comparing the evolution of the modern marketplace economy to the explosion of institutional banking and insurance in the early 20th century, Tanz continues: "this new system acted as a trust proxy; it didn't require people to trust one another, because they could rely on a centralized system to protect their interests."
The digital marketplace as a trust proxy: Now this makes a lot more sense.
In a book he co-authored called The Trusted Advisor, speaker and author Charles H. Green says: "The sharing economy is itself a play in a much grander fundamental shift from an infrastructure that protects people from each other to an infrastructure that helps people trust each other."
Over at Forbes, journalist Kashmir Hill argues that much of that infrastructure is in one place: Facebook. That's because it's the preferred sign-in and identity verification of the sharing economy. Therefore, she hypothesizes that "eventually, we will all have star ratings attached to Facebook profiles or something like them, with reviews from many different contexts. This will have some significant (and potentially horrifying) privacy implications, much as Google footprints have in the digital world."
Zimmer, the Lyft president, told me he believes his ride-sharing company would function just as well without using Facebook as an identity-verification platform.
Facebook sign-in or no, I'll buy that small interactions between users of these companies can be meaningful, and can therefore exhibit, and maybe even help foster, interpersonal trust in communities. And perhaps all those little experiences, compounded, do mean the industry as a whole is nudging trust forward. But because so much of what feels like trust is so carefully designed by any given peer-to-peer startup, any real, organic trust in the grander sense still feels like a hologram.