Not long ago, a Silicon Valley CEO told me something that made my ears perk up. She was talking about a move earlier in her career from McKinsey & Company to Google. It was a natural transition, she said, because both are "mission-driven companies."
McKinsey, mission-driven? That will come as news to anyone who has been present when its consultants came into their company and did what they're famous for: recommended a deep round of layoffs. But it's true, technically. McKinsey has a mission, which boils down, basically, to helping its clients succeed in their missions. And it carries out that mission even when the client is, say, a repressive Middle Eastern kingdom interested in jailing and torturing subjects perceived as disloyal. (McKinsey says it doesn't think any of its work for Saudi Arabia was used to target dissidents and would be "horrified" if it proves otherwise.)
I thought about this exchange again after Google's CEO, Sundar Pichai, recently made his first public remarks about Project Dragonfly, the company's controversial plan to introduce a search engine in China that will allow the country's ruling regime to censor its citizens' searches. While Google doesn't love censorship, he said, "We are compelled by our mission [to] provide information to everyone, and [China is] 20 percent of the world's population." By serving "well over 99 percent of the queries," Google will be able to more nearly fulfill its mission--"to organize the world's information and make it universally accessible and useful"--than it would via the alternative, staying out of China.
In Silicon Valley, there's probably no proposition more widely accepted than the idea that mission-driven companies are better companies. Influential venture capitalists like Marc Andreessen and Fred Wilson agree: Founding teams with a mission other than making money work harder to achieve their goals and have an easier time hiring and inspiring great employees. Announcing that you have no agenda other than to generate some profits and a nice exit will get you turned down by every investor on Sand Hill Road.
That makes perfect sense in the context of startups. But as they become big, established companies--or Goliath world-eaters--weird things happen. That noble, useful mission gets stretched to encompass new areas of a diversifying business. Maybe it even comes completely untethered from its original meaning. Or the attainment of national or global scale creates unforeseen dynamics that make it harder to execute on the mission. Having monopolized a populace's attention span, say, a platform suddenly becomes a target for outside interests seeking to weaponize that attention.
You can think about "return on mission" as a logarithmic growth curve. In its early days, the curve is steep as the company, doing something no one has ever done before, makes rapid gains. As the years go by, however, the company is operating in a world it has already largely transformed, competing with its own past success. Working the flat part of the mission curve, it's increasingly grappling with edge cases and unforeseen consequences. That's how a company devoted to making the world's information accessible goes from inventing great web search to inventing censored web search to ... helping target drone strikes?
The vaguer and more grandiose the mission, the more easily it warps into a funhouse version of itself. If Facebook's mission were "to collect users' data and use it to target them with advertising," you'd have to agree it's doing a pretty good job. Instead, it's "to bring the world closer together." Yet, in spite of its impressive contributions to political balkanization in the U.S. and outright ethnic cleansing abroad, Facebook forges on--because, after all, the more divided we become, the greater the need for closeness, right?
There's a scenario A.I. researchers talk about wherein a machine entity using recursive learning algorithms could, with no malice, wipe out humanity. Elon Musk, who worries about this sort of thing, put it like so: "Let's say you create a self-improving A.I. to pick strawberries and it gets better and better at picking strawberries and picks more and more and it is self-improving, so all it really wants to do is pick strawberries. So then it would have all the world be strawberry fields. Strawberry fields forever."
Depopulating the planet to make room for more strawberry fields--that's what it looks like when a mission escapes its original context and just keeps going, with no limits other than its own internal logic.
At the same event where Pichai pinned Google's China plans on its mission, Jeff Bezos talked about Amazon's equally controversial dealings with the U.S. government, which include a bid for a $10 billion cloud-services contract with the Department of Defense. Noting that Google had declined to renew its contract with the Pentagon in the face of opposition by its employees, Bezos said he disagreed with that decision. "If big tech companies are going to turn their back on U.S. Department of Defense, this country is going to be in trouble," he said. "I like this country."
You don't have to like Amazon's government work, which also includes a facial recognition system it has pitched as a border-security tool, to appreciate how Bezos framed the issue: as a matter of loyalty and responsibility. He easily could have punted, claiming Amazon's pledge to be "the Earth's most customer-centric company" means the customer is always right, even when it's taking toddlers away from their parents. He would have been in good company, had he done so, right there alongside Google, McKinsey, and Facebook. But Bezos knows that when it comes to the really hard decisions, the mission isn't always the answer. Sometimes, it's just an excuse.