A machine called the cops on a human the other day.
In its defense, it was merely responding to the command of a human. Strike that: it was responding to what it perceived the human’s command to be, but it interpreted it in an unintended way. I’ll leave it up to you to decide which scenario, absent the facts, is worse.
Twenty minutes outside of Albuquerque, New Mexico, during a domestic dispute over an alleged infidelity, a man allegedly beat his girlfriend. Next, he allegedly pulled a gun and asked, among other things, “Did you call the sheriffs?” What Alexa, an Amazon smart speaker attached to a landline, heard was “call the sheriffs,” so it did. Alexa, authorities say, probably saved the woman’s life.
After the SWAT team was called, 28-year-old Eduardo Barros (not to be confused with Bezos) was taken into custody and is being held without bail on charges of aggravated battery, false imprisonment, and felony possession of a firearm.
To be fair, Alexa doesn’t call cops… people call cops, even if they do so inadvertently, through Alexa.
Perhaps in the future, people will set up safe words, like “purple penguin steak,” to call the police when they can’t do so openly. Perhaps these speakers are at risk of being hacked by teenagers in basements or government agents in data centers. Perhaps in the far (or not so far) future, such machines will be required in every home and will automatically record conversations containing specific key words like “explosives,” “jihad,” or “I want pineapple on my pizza.” Perhaps they’ll call the cops even when they hear phrases like “pass the bowl” or “I don’t care if the tag says ‘do not remove.’” Maybe GPS could inform cops of speeding and other traffic violations.
I don’t know what the future holds, but Alexa may have saved a life in Albuquerque.