This is a story with a common beginning, an atypical outcome, and potentially enormous stakes.
It involves a former Google engineer who worked on one of Google's most controversial projects: Project Maven, the company's contract to build artificially intelligent technology that interpreted images from military drones.
After leaving that project--and Google altogether--computer engineer Laura Nolan is now warning of a looming technological danger.
And the phrasing that she and a group she's become active with use to describe the problem is attention-grabbing, to say the least: "killer robots."
As in, "the Campaign to Stop Killer Robots," which is a group dedicated to promoting an outright ban on autonomous military weapons, much as the use of chemical weapons was banned by international agreement.
I know this can sound alarmist, especially if you haven't spent a lot of time monitoring the growth of lethal autonomous weapons over the past decade or so.
That's part of the problem, according to Nolan, an engineer based in Dublin who is described as having been one of the "top software engineers in Ireland" by The Guardian newspaper, which wrote about her this week.
Her concern is that artificially intelligent robots, not directly controlled by human beings, aren't limited by any of the human attributes that some of us admire in the military--leadership, courage, judgment, discipline--and that they could become a recipe for disaster as a result.
"There could be large-scale accidents because these things will start to behave in unexpected ways," Nolan said, as quoted by the Guardian. "Which is why any advanced weapons systems should be subject to meaningful human control."
In fact, at least some members of the U.S. military seem to share Nolan's concerns, if not her idea of banning the technology outright.
"Autonomy makes a lot of people nervous," Steve Olsen, deputy branch head of the Navy's mine warfare office, told Military Times recently, adding, "The last thing we want to see is the whole 'Terminator going crazy' [scenario]."
Now, I find this whole issue fascinating, and not just because it's the real-life manifestation of the plot of some of the best science fiction.
It's also interesting because Nolan began to talk about this issue after resigning from Google.
Even if you're in a far less controversial industry, what do you do if an employee objects to your business--especially one who ultimately quits and starts protesting loudly against you?
I can imagine that situation happens with some regularity. It's a big part of why people leave jobs--either they don't like their bosses, or they don't like the mission.
Tough situation. In Google's case, the company actually got out of the business, letting the Project Maven contract expire after more than 3,000 Googlers complained about it.