In 2014, when the Aloft Cupertino hotel first debuted Relay--a robot butler that delivers food and fresh towels to guests--they probably didn't imagine that two years later a crew of frat brothers would kidnap Relay, hide him in the closet, and send him back to the front desk wearing a bra. If Relay were a human employee, Aloft would have had to consider legal action against its abusive guests. But a robotic staff member? "I don't think we ever considered giving him [Relay] rights," concedes Andy Evers, Aloft Cupertino's general manager.

Enter the brave new world of employing robots. Granting employee "rights" or legal protections to robots might sound ridiculous, but there is a community of professors, legal scholars, and futurists who are thinking about what policies and social mores should be in place to help humans better interact with robots in the workplace.

As robot technology moves to more offices and factory floors, business owners will have to grapple with legal and ethical questions--should robots be treated like any other technology? Are there certain ways humans should and shouldn't treat the robot? "The rationale for robot 'rights' is not a question for 2076, it's already a question for now," says Peter W. Singer, a political scientist and the author of Wired for War.

Companies across many industries already have robots in their workforce. DHL, Toyota Research Institute, and Panasonic Corp. use autonomous robots from Fetch Robotics to help fulfillment center and warehouse employees. California nursery Altman Plants uses Harvest Automation's robots to transport its plants. Security guard firm Securitas is using Knightscope security bots to patrol the grounds of malls, banks, and office parks.

Most current workplace rules around robots were established to protect the humans working alongside them. In 1979, a Ford Motor employee named Robert Williams became the first human killed by a robot, when he was struck in the head by a robotic arm. His death, and a $10 million lawsuit the Williams family won, motivated factories to adopt stricter safety protocols. "Rules around robots--like granting factory robots the right of way over human workers--are made to keep humans safe," says Andy Sellars, the director of the technology and cyberlaw clinic at Boston University.

But the new conversation around robot rights goes beyond practical safety precautions. Robots hardly possess the characteristics that would require actual rights: they are not self-aware, they are not alive, and they have no will to survive, says Ryan Jenkins, an assistant professor of philosophy at California Polytechnic State University who co-wrote a paper about robot ethics.

The larger issue is how employee behavior toward robots affects the rest of the workplace. Ryan Calo, a law professor at the University of Washington who specializes in robot law, says if the boss or another employee treats a robot in a disturbing manner--say, sexually harassing a female-looking robot--employees who witnessed that act could sue the company for creating a hostile work environment.

Many experts draw a parallel between robot rights and animal rights. Just as being cruel to animals can become a pathway to abusive behavior at large, says Wired for War's Singer, so can "abusing" a robot. As a result, employers might want to establish formal rules of engagement between humans and robots in an effort to set a company-wide moral precedent.

"If you saw your kid yanking the legs off a ladybug or kicking a dog, you'd say, 'Wow, kid, that's not a nice thing to do to a living thing,'" says Singer. "If you saw someone kicking a humanoid robot that looks like a small human, you wouldn't say, 'Well, it doesn't matter unless the robot's owner cares.' We may endow machines with certain rights or protections not because they feel pain, but because of what the action says about the person conducting it."

However, there are robot entrepreneurs who think this conversation has gone too far. Dr. Steven Cousins, founder of Google Ventures-backed Savioke--the San Jose-based company behind Aloft's Relay--says it doesn't help anyone to pretend robots are more human than they really are. Which is to say, they're not. "We added blink-y eyes to Relay to imbue character," says Cousins. "I think there is value in imbuing character, but just because the equipment has character doesn't mean it should have rights."