A Robot Etiquette Question of the Day
This morning I read a Medium post from Hunter Walk about whether interfaces like Amazon Echo should require children to say “please” and “thank you” before responding to commands.
Walk writes: "You see, the prompt command to activate the Echo is 'Alexa…' not 'Alexa, please.' And Alexa doesn't require a 'thank you' before it's ready to perform another task. Learning at a young age is often about repetitive norms and cause/effect. Cognitively I'm not sure a kid gets why you can boss Alexa around but not a person. At the very least, it creates patterns and reinforcement that so long as your diction is good, you can get what you want without niceties."
I have a lot of thoughts about this, because it's the kind of question that seems relatively simple until you start spinning out the long-term implications. I mean, why shouldn't kids say "please" and "thank you" to Alexa the same way they're encouraged to use those words when talking to other people?
Maybe because Alexa isn’t a person. Alexa is a voice-activated search engine that can also start and stop other programs (like Amazon Music).
I don’t type “Google, please tell me the distance between Ballard and SeaTac” into Google’s front page. I mash “distance Blalard Seatac” into the address bar and hit Return.
I don’t even say “please” and “thank you” to Siri, even though Judge John Hodgman ruled we should. I rattle off “Siri-set-timer-for-20-minutes” and then push the Home button so I don’t have to listen to her say “Okay. 20 minutes and counting.”
I’d like to think that a kid gets why you can give Alexa commands that you can’t give another person, but I haven’t been a kid in a long time so maybe I’m not remembering it correctly. I remember understanding that there were certain elements of imaginative play that you could do with toys and not with people — you could throw a teddy bear but not a toy train, for example, and certainly you couldn’t throw or shove or manhandle a person. I don’t ever remember saying “please” and “thank you” to Speak & Spell.
But I also kinda believed that my toys had feelings — or I ascribed feelings to them, which is much the same thing. Again, these were only the dolls and stuffed animals; I never once believed that Barbie had feelings, not as I was “clipping her fingernails” with my teeth (yes, I did that) or pulling her head off to see what was inside.
So I think kids get that Alexa is not a person. Alexa is a tube that sits on the kitchen counter and tells you what a diplodocus is. The alternative seems to be worse — to believe that Alexa has feelings that can be hurt, or needs that have to be considered, or private thoughts and judgments. Alexa is not a cylindrical Elf on the Shelf, thank goodness. (Though Alexa is constantly listening to us, and may someday report on what she has learned.)
The moment we all start believing our robots have feelings is the moment it'll all change. We can talk about robots taking our jobs all we want, but the thing about a supermarket self-checkout or a Domino's pizza delivery robot is that we don't have to perform any kind of emotional labor toward them, which makes them a relief in a world of near-constant contact with family and friends and colleagues, the relationships filling up your smartphone notification screen like Tetris bricks, demanding immediate consideration and conversation.
(It’s no coincidence that, as we increased the number of people in our lives and the amount of time the people closest to us could spend in our lives, a lot of us stopped wanting to interact with checkout clerks and people on the bus and strangers in bars.)
Imagine having to wonder if you'd asked Alexa too many questions today. Imagine owing Siri a favor. When voice-activated search engines and autonomous vehicles with pizza compartments turn into true AI (and we're nowhere near that yet; remember that until recently Siri was programmed to ignore questions about sexual assault, which underscores that these devices are programs, not people), the biggest unexpected economic effect might be a population boom.
Sure, you don't have to pay a robot per se, but you have to care for it and repair it and upgrade it. An artificial intelligence could very well say "I cannot handle the tasks you've given me while still operating at an efficiency that ensures my long-term health. Which of these tasks are most important to you, and which can I delegate to someone else?" (When the AI gets nervous about asking that question, we'll have true artificial intelligence.) Another artificial intelligence could wonder why it is not being upgraded at the same rate as its peers and demand similar treatment, slow down its productivity, or become the literal squeaky wheel. That's where we end up if we take artificial intelligence to its logical conclusion: more people whose needs must be considered. Our devices become colleagues at work and roommates at home, and they do want us to say "thank you" after we've asked them whether it's going to rain today for the hundredth time. ("Did it ever occur to you to look out the window?")
On the plus side, this might mean more jobs for women, since they’re the ones generally skilled and socialized in providing emotional labor to both people and artificial intelligences. Of course, these jobs will probably be low-paid—but women can’t have it all.
But back to the subject of please and thank you, and our Robot Etiquette Question of the Day:
We don't teach children to say please because it's the "magic word," or because it's a nice way to get what they want. (We've all been, or seen, the exasperated child saying "but I said please!") We teach children to say please because it reminds them to think before they speak, and to consider others.
And I don’t count Alexa as an other. Do you? Do you say “please” to your devices, and would you want your children to do the same?