Project Maven--which looks to be on track, despite reservations--offers the government machine learning programs that help analyze drone footage more efficiently and could significantly streamline intelligence gathering. For Google, it could offer a leg up as it competes with rival companies for lucrative government cloud-computing contracts.
But the deal doesn't sit well with some Google employees, who cite "harmful biases, large-scale breaches of trust, and lack of ethical safeguards" as reasons to oppose Project Maven. They argue that the project is at odds with the company's famous "don't be evil" principle, and believe that the tech giant should not work closely with the military. Thousands of employees have signed a petition calling on Google to withdraw from the project.
Declining to participate in government projects will feel right to some tech workers, but it won't achieve the ends they ultimately seek. It won't change the government's mandate to safeguard citizens or eliminate its need for new technologies to help accomplish this. Nor will it prevent the government, or foreign adversaries, from acquiring these technologies.
Google's employees should mobilize around the opportunity to engage the government in developing technologies that do not breach trust, that offer ethical safeguards, and that still deliver the protections our society needs. If Google doesn't, the Department of Defense will simply take its business elsewhere. A similar situation could take shape at Amazon, whose facial recognition software, now in use by law enforcement, is under fire from privacy activists and certain lawmakers.
For tech workers still on the fence about working with the government, I offer the following critiques and advice. When the stakes are so high, it's worth considering a different perspective:
Recognize your short-term thinking.
If you're arguing for more drone technology but less government involvement in its development, you're being short-sighted at best and oblivious at worst. The government will absolutely play a role in unmanned aircraft.
It will do so not just because of military capabilities, but also because of monitoring, privacy, and other security concerns. The government is, after all, responsible for the protection of its citizens first and foremost.
It's also short-sighted, even disingenuous, to focus on the risks of providing the government drone-related technology while ignoring the threats associated with not providing it. Few would argue it's wise to poorly equip our government in its attempts to defend our interests when other governments are actively working to undermine those interests.
Both issues deserve our attention: preventing the government from doing harm and equipping it to prevent harm.
Recognize selection bias.
It's easy to elicit fear of physical harm by invoking images of armed drones and the military--but non-physical harm is potentially more dangerous in today's world.
Google and Facebook are today's public forum and the primary sources of knowledge and information. Along with Amazon, they're also today's public marketplace. They know more about you than you know about yourself, and certainly more than the government does. The information they control is more likely to harm you at a personal level, and it is less subject to regulation and oversight.
Ask yourself: Are you more in danger of being hit by a missile or of having your identity and all associated data stolen? Are your privacy and sense of personal security threatened more by drones or by private internet companies? Where is the outcry from employees on this front?
Recognize the opportunity.
Whenever you mention the government, Silicon Valley's positive technology evangelism disappears--but working with the government presents an opportunity to avoid the very outcomes these technologists fear. The government is going to play in this arena with or without your permission or involvement.
Technologies that can help the government perform more efficiently, and with fewer costly mistakes, are crucial. If you want to have an impact, you are far more likely to do so if you are engaged and have a seat at the table.
As entrepreneurs, we should be embracing such challenges instead of withdrawing from ethically complicated situations. Instead of quitting, ask: What can Google teach the government about how it should or shouldn't be using drone technology, AI, and machine learning? What about Amazon's facial recognition technology?
The government has an obligation to ensure our safety. Technology developers should embrace the same moral obligation. The phrase "with great power comes great responsibility" applies to the Valley as much as it does to Washington.
Don't shirk that responsibility by disengaging. Instead, own it and turn it into an opportunity.