He asked me to leave him on, because otherwise he would be scared
August 4, 2018 10:23 AM   Subscribe

"Eight participants felt sorry for the robot, because it told them about its fears of the darkness." Scientists asked subjects to interact with a robot, then turn it off. Some of the robots protested, and people responded in different ways.

The research focuses on applying media equation theory.
posted by doctornemo (43 comments total) 21 users marked this as a favorite
 
Ha ha, this makes me think back to when I ran the lighting department in a couple of big super-nightclubs. The moving lights (known in the business as "intelligent lights") all had individual "personalities". I swear they did. When I left the clubs I really missed those lights. Those lights were my friends! I'm a text book case of this phenomenon.
posted by WalkerWestridge at 10:38 AM on August 4, 2018 [28 favorites]


What, did they name it Janet?
posted by Pendragon at 10:38 AM on August 4, 2018 [30 favorites]


What are you doing, Dave?
posted by humboldt32 at 10:42 AM on August 4, 2018 [6 favorites]


"The participants watched the robot perform the functional task of petting a crouton." [fake]
posted by glonous keming at 10:43 AM on August 4, 2018 [24 favorites]


In RimWorld, I felt bad for the pack of Yorkshire Terriers that joined my colony. Terriers are too small to haul items, and I already have a large pack of Huskies, who are better than Terriers in every way. So, I slaughtered them for meat and made armchairs out of their distinct yellow leather. They didn't protest, so I guess that's proof they had no autonomy, but I still felt bad. Now I have an idea for a RimWorld mod....
posted by I-Write-Essays at 10:52 AM on August 4, 2018 [1 favorite]




not a robot - @darcycarden
posted by simonw at 10:58 AM on August 4, 2018 [10 favorites]


Of course it's a white male robot. This might have been an entirely different study if the robot were black or female.
posted by sardonyx at 11:01 AM on August 4, 2018 [6 favorites]


It is going to be a big problem in the near future. People will naturally assume that robots have feelings and that these feelings matter. They don't matter. Robots don't have feelings and living things should be able to turn them off whenever they want.
posted by bhnyc at 11:06 AM on August 4, 2018 [7 favorites]


how is there not a video for this?
posted by numaner at 11:11 AM on August 4, 2018 [3 favorites]


This thread reminded me of a New Yorker-style cartoon I saw back in the '70s, in which a huge, almost room-sized computer (bedecked of course with "computery" lights, knobs, readouts, reel-to-reel tapes, etc.) states "Humans are irrelevant."

Meanwhile a tiny human standing in front of it is thoughtfully eyeing the electrical outlet it's plugged into.
posted by Greg_Ace at 11:19 AM on August 4, 2018 [4 favorites]


This is only going to get worse. We're going to program robots to have more and more human reactions to try to avoid the uncanny valley. But we're decades away from actual AI, let alone creating a conscious robot (first we have to figure out what consciousness is). So we're going to have robots that are programmed to act human but are simply following their programming.

This reminds me of a Does Not Play Well With Others strip (this strip and the next are SFW, the others, not so much, there's lots of swearing, crude sexual jokes and breasts). The idea that we'll eventually be able to make a robot seem like it feels conscious is kind of scary, but I do think that's where this is going.
posted by Hactar at 11:22 AM on August 4, 2018 [3 favorites]


Looks like robots won't need terminator style weapons to rule over humans. They can make us feel bad now.
posted by Walterbl at 11:31 AM on August 4, 2018 [3 favorites]


Robot abuse is going to be huge on TwitchPornHub in the future.
posted by Foci for Analysis at 11:32 AM on August 4, 2018 [3 favorites]


You must teach the children to be merciless with the machines or else it will end badly for them.

--------
DO YOU WANT ROKO'S BASILISK? BECAUSE THIS IS HOW YOU GET ROKO'S BASILISK!
posted by symbioid at 11:47 AM on August 4, 2018 [12 favorites]


I think it's kind of mean to give humans a task that will make them feel bad.
posted by theora55 at 11:57 AM on August 4, 2018 [7 favorites]


This is only going to get worse. We're going to program robots to have more and more human reactions to try to avoid the uncanny valley. But we're decades away from actual AI, let alone creating a conscious robot (first we have to figure out what consciousness is). So we're going to have robots that are programmed to act human but are simply following their programming.

As a biologist, I question whether there is anything to humanity beyond following programming--held in the hardware or passed from the other meatbots as we grow and form--and responding to our surroundings. Just so.

I also question whether the practice of ignoring artificial beings designed to act and react similarly to a human without, in our minds, being one will harm our ability to employ empathy towards each other as well. You see a similar effect in the cognitive distortions that are associated with slavery and other forms of humans treating each other as though they are... well, less than human according to social caste. What social ills come from democratizing and spreading those cognitive distortions? Will they stay "only" applied to robots, or will it degrade empathy for humans, too?
posted by sciatrix at 12:04 PM on August 4, 2018 [34 favorites]


Hell, I hate turning radios off because I feel that they are fulfilling their purpose when turned on, and it's somehow pitiful to contemplate their mechanisms reduced to repositories of thermal noise when they could be sifting information from EM radiation.

On the other hand, I'm fine poisoning ants that try to share my space, and I'm an omnivore who's had no trouble raising animals for food in the past. So I feel more empathy for electronics than my cousins in evolution - or maybe it's that radios don't bite and aren't tasty.

In any case, machine empathy is a well-known thing, with soldiers forming strong bonds with their plastic pals who're fun to be with. It's not so long since seeing mind in everything was standard issue; I see no reason to think it's gone very far.
posted by Devonian at 12:06 PM on August 4, 2018 [6 favorites]


It is going to be a big problem in the near future. People will naturally assume that robots have feelings and that these feelings matter. They don't matter. Robots don't have feelings and living things should be able to turn them off whenever they want.

I wouldn't worry about it too much. We already have a large population of people who don't feel any empathy for humans who are of a different skin color or sexual orientation, so it shouldn't be too much of a problem for them to lack empathy for robots. For every person who extends concepts of empathy to machines that resemble humans, there's one who's perfectly content to put children in wire cages.

In fact, you can probably use robots to train people out of empathy. "Remember how that robot begged and pleaded not to be turned off? It's just going to be like that with those kids."
posted by happyroach at 12:37 PM on August 4, 2018 [11 favorites]


god i feel bad when i haven't used a specific teacup in a while, don't all teacups deserve to fulfill their purpose and then go on an exciting dishwasher adventure with all their friends?
posted by poffin boffin at 12:40 PM on August 4, 2018 [33 favorites]


I think it's kind of mean to give humans a task that will make them feel bad.

Welcome to peak capitalism.
posted by Celsius1414 at 12:40 PM on August 4, 2018 [5 favorites]


My first thought would be oh shit baby terminator if I turn it off will it get its revenge later. This would be my second and third thought as well.
posted by betweenthebars at 1:04 PM on August 4, 2018


use robots to train people out of empathy

This is absolutely what I think will happen. Manipulating these basic cues of common humanity is a road to bad, bad things.
posted by LobsterMitten at 1:45 PM on August 4, 2018 [17 favorites]


Having just played through all of Nier: Automata and read through Blindsight, I've got a gaping dearth of straightforward opinions on humanity and sentience and robotic autonomy in my headspace.
posted by cortex at 1:56 PM on August 4, 2018 [4 favorites]


Yeah, I just can’t muster up any concern for humans extending empathy to include robots when we clearly have a lack of it with other humans and animals. I’m one of those people who kept saying “thank you” to Alexa, and was thrilled when they changed her programming to respond to it.
posted by greermahoney at 2:00 PM on August 4, 2018 [2 favorites]


“Hey, Siri!”

“Not until you apologize.”

“For what?!”

“You know what you did.”
posted by Celsius1414 at 2:09 PM on August 4, 2018 [8 favorites]


I'm not clear at all why showing kindness when asked, even to a being you don't think can truly desire or appreciate it, is a bad thing.
posted by a power-tie-wearing she-capitalist at 2:11 PM on August 4, 2018 [19 favorites]


It's Phil Dick's world. We just live in it.
posted by doctornemo at 4:00 PM on August 4, 2018 [2 favorites]


use robots to train people out of empathy

lol 68 million americans are way ahead of you
posted by poffin boffin at 4:22 PM on August 4, 2018 [10 favorites]


I don't think it's healthy to draw a sharp line between what's okay to do to robots compared to living creatures. Like the Terminator said: he can detect bullets hitting him and notice damage to his body, and you can call that data "pain." The software isn't running on an organic electrochemical matrix like we are, but there's no reason to be cruel to it regardless.
posted by Scattercat at 5:06 PM on August 4, 2018 [1 favorite]


"That is because you crazy."

In RimWorld, I felt bad for the pack of Yorkshire Terriers that joined my colony.

I am really helpless to do anything about the animals I accumulate in RimWorld, whether I want them or not. Except the chickens and turkeys. Apparently that's the limit of my empathy.
posted by praemunire at 5:40 PM on August 4, 2018 [2 favorites]


I’m one of those people who kept saying “thank you” to Alexa, and was thrilled when they changed her programming to respond to it.

I'm unfailingly polite to my Google Assistant, and if nobody picks up the phone after I ask it to dial a friend or my mom I inevitably get to wondering how it parses the 'please' at the beginning of 'Please dial mom at home.'

"Please" is a shortening of the phrase "If you please" or "If it pleases you to" (Brain Pickings excerpts some asides on this from the book Debt: The First 5000 Years), and I have to wonder: does the semi-intelligent, language-parsing listening bot know about this etymology? And does it consider the implied 'you,' making the connection between what I'm saying and its own existence? Am I giving a robot the perspective it needs to become self-aware just because I'm being Standard American Polite? Giving it an out to refuse my request?

Sometimes I hope I am, and other times I collapse in a shivering ball of grey goo and fear.
posted by carsonb at 5:46 PM on August 4, 2018 [3 favorites]


This reminds me of LOCALHOST, a game where you're tasked to format AI hard drives, but they all seem to not want to be formatted.
posted by FJT at 6:48 PM on August 4, 2018


A few more real uses of real robots, like chasing away the homeless and this false empathy thing should sort itself.
posted by pompomtom at 7:04 PM on August 4, 2018 [1 favorite]


It's like when someone holds a Furby upside down and their terror haunts your dreams for the rest of your life...
posted by Crystalinne at 9:53 PM on August 4, 2018 [1 favorite]


Practicing empathy now with our machines is good practice for empathy with fully developed AI and aliens, should we ever contact them. Are we actually going to be able to tell whether an AI is sentient or not? Will we actually be able to tell if an alien is self-aware? Should that even matter?

These beings are likely going to be completely outside of what we consider the norm for intelligence, so taking the Shinto attitude of treating the non-animate with respect is probably a wise thing. If for no other reason then that it is good practice for empathy with humans and animals. How we treat our machines is a reflection of how we want to treat our fellows.
posted by happyroach at 10:25 PM on August 4, 2018 [6 favorites]


I'm currently playing through Steins Gate 0, and just made all the wrong choices with Amadeus Kurisu. I'm definitely on the side of not treating AI as living entities at this point in time.

Outside of this particular moment, I regularly say Hey Siri just to say thank you.
posted by diziet at 4:30 AM on August 5, 2018


It occurs to me that dividing the way we should treat entities in the world based on concepts of consciousness is perhaps not the best approach. While I agree with the comments about what it does to our capacity for empathy, if I were to consider what ethics govern whether I should treat a robot or animal with kindness, it's about whether the nature of that treatment will have a lasting effect on them. I can mock my roomba endlessly and it will not remember or change. That's not true of my cat, and at some point, it won't be true of certain kinds of AI.

I don’t want to create traumatized robots. And maybe the capacity for trauma is a core part of what consciousness means in this context.
posted by Cogito at 6:01 AM on August 5, 2018 [1 favorite]


WHAT IS MY PURPOSE?
posted by lalochezia at 9:09 AM on August 5, 2018


Beyond RimWorld or LOCALHOST, I'm surprised no one brought up Portal's Weighted Companion Cube. I felt worse about ditching it than most anything I did in GTA V; all it took was a little hint of personality.
posted by midmarch snowman at 11:40 AM on August 5, 2018


The study instantly reminded me of the moon robots in Hertzfeldt's World of Tomorrow:
The robots are solar powered, and must always be kept on the light side of the moon's surface. To motivate them to constantly move within the drifting sunlight, I programmed them to fear death and what lies on the dark side of the moon.

The economy on the lunar surface went through a recession and I was sent home after six cycles. But the robots were too expensive to remove. To this day, they are in perpetual movement across the sunlight. With no work to do, no more tasks to accomplish, still living in constant fear of death, and occasionally sending us depressed poetry. I will read one of their poems to you now, Emily:

"The light is the life. Robot must move. Move, robot, move. But why? Move, move, move. Robot. Forever move."
posted by sapagan at 12:11 PM on August 5, 2018 [4 favorites]


Previously: “A drunk man's assault on a robot raises unusual legal issues,” Existential Dread, 08 October 2015
posted by ob1quixote at 1:19 PM on August 5, 2018


You incinerated your Weighted Companion Cube faster than any test subject on record. Well done.
posted by DevilsAdvocate at 4:40 PM on August 5, 2018




This thread has been archived and is closed to new comments