Should robots have rights?

You wouldn’t think so, but maybe you’re wrong, writes columnist ALEX BEAM


It was with some trepidation that I approached MIT Media Lab researcher Kate Darling to discuss her 2012 academic paper “On Extending Legal Rights to Social Robots.”

I found the subject fascinating, but I wondered whether the field of robot rights had since run out of battery power, as it were.

Also there was the guffaw factor. I didn’t want to make fun of her, but that didn’t mean other people wouldn’t. I needn’t have worried.

“Still super interested!” Ms. Darling e-mailed me. “Have fellowships at Harvard and Yale for robot ethics this year and am planning a bunch of experimental work on human-robot interaction at MIT.”

Robots having legal rights or privileges sounds ridiculous. But 20 years ago, the idea that the nation’s leading law schools would be teaching animal-rights courses seemed equally absurd. Now anti-cruelty legislation is common in industrialized countries, and late last year the Nonhuman Rights Project made national headlines when it argued that a chimpanzee had “standing,” meaning the right to sue, in a New York State court. The case is under appeal.

Animal activists necessarily assert sentience on behalf of their clients, arguing that cats, bears and elephants share an awareness that is like our own. Ditto on sensitivity to pain, physical and emotional. So here’s a problem. Robots aren’t sentient yet, and they are unlikely to be so any time soon. They don’t feel much pain, either.

Ray Kurzweil has been talking up the “Singularity” — the forthcoming union of human and machine consciousnesses — for quite a while, but few take him very seriously. The Seattle-based Society for the Prevention of Cruelty to Robots allows that robots won’t be appearing in court in the near future, “but recent advances in data nanostructures, cognitive modeling and neural networking have convinced many people that the advent of some sort of created intelligence is much closer than previously thought.”

Yes, Virginia, there is a Society for the Prevention of Cruelty to Robots, founded 15 years ago by music engineer Pete Remine. His website talks about a Robotic Bill of Rights, which Mr. Remine told me is more or less on hold; “until the state of artificial intelligence progresses a bit further, there’s really not a lot of relevant work to be done,” he e-mailed me.

There is ample proof that humans care about robots. During the height of the Iraq war, Washington Post writer Joel Garreau observed soldiers bonding with the complicated robots that detonated lethal improvised explosive devices. In one instance, a technician carried the remains of a “really great robot” named Scooby-Doo to a repair shop, hoping that the obviously “dead” robot could be brought back to life.

When we chatted, I asked Kate Darling what kinds of experiments she had carried out. “I did this one workshop where we gave everyone these cute little plush robot dinosaurs called PLEOs, and we asked them to spend time bonding with the toys,” she said. “They gave them names, they played with them a little … then we asked them to torture and kill them.

“The results were more dramatic than I could even imagine,” she said. “There was an option to save your own dinosaur by killing someone else’s, and no one wanted to do that. They refused to even hit the things.”

For an advanced society, America lags far behind countries such as Japan and South Korea in … sexual robotics. Japan has hosted a thriving female doll escort service for almost 10 years, and engineers have designed robots called actroids, often modeled on young women, that “breathe,” speak and mimic many human behaviors.

Surely “Samantha,” the sensual and sensitive operating system that wins Joaquin Phoenix’s heart in the movie “Her,” is barely a step removed from a sophisticated sexbot.

“The sexbot issue is going to be discussed sooner than most people think,” Ms. Darling predicted. “There are sexual acts that we don’t allow between humans, and people might argue for laws protecting robots from performing them.”

In her 2012 paper, she quotes Immanuel Kant to the effect that a man shooting a dog “damages in himself that humanity which it is his duty to show toward mankind.”

So how we treat our robots will tell us volumes about ourselves.

Alex Beam is a columnist for the Boston Globe (alexbeam@hotmail.com).
