Looks Like Our Helper Robots Will Just Buy Anything

Remember that girl who asked Alexa for a dollhouse? The story gets better.

Photo credit: Sarah Ackerman, CC BY 2.0.

I have an amazing follow-up to the “child asks Alexa to bring her a dollhouse and cookies, Alexa follows orders and purchases those products from Amazon” story:

Amazon’s Alexa started ordering people dollhouses after hearing its name on TV

The story could have stopped there, had it not ended up on a local morning show on San Diego’s CW6 News. At the end of the segment, anchor Jim Patton remarked: “I love the little girl, saying ‘Alexa ordered me a dollhouse.’” According to CW6 News, Echo owners who were watching the broadcast found that the remark triggered orders on their own devices.

Patton, who told The Verge that the station received a handful of reports of viewer devices attempting to order a dollhouse after hearing his remarks, didn’t think that any of the devices went through with their purchases.

I followed that link and listened to the clip, and Patton does in fact say “Alexa ordered me a dollhouse,” not “Alexa, order me a dollhouse”—which means Alexa is not yet able to distinguish a past-tense description from a direct command.

She is, however, very capable of ordering dollhouses.

Here’s where I mention that I only discovered this after reading The Atlantic this morning and learning that there are all kinds of ways for hackers and thieves to steal your money (and/or buy dollhouses) via voice-activated command, including embedding commands into static:

The Demon Voice That Can Control Your Smartphone

Hidden voice commands can cause more damage than just a false text or silly tweet. An iPhone whose owner has already linked Siri to a Venmo account, for example, will send money in response to a spoken instruction. Or a voice command could tell a device to visit a website that automatically downloads malware.

At this point I’m wondering why we don’t get the opportunity to rename our devices, as a sort of loose password against this type of hacking. If we could teach Siri and Alexa to answer to new names, then saying “Alexa ordered me a dollhouse” wouldn’t activate every Alexa in town.

Of course, we’d very quickly have those “most common alternate Siri names” lists, and enough people would start calling their devices “Samantha” that hackers would plan for that.

Or we could all agree that having to pay for the occasional surprise dollhouse—or, you know, having someone use static to steal money from our Venmo accounts—is the price of progress.

I’m very curious to see what happens next.

Support The Billfold

The Billfold continues to exist thanks to support from our readers. Help us continue to do our work by making a monthly pledge on Patreon or a one-time-only contribution through PayPal.