On April 25, Alexa Machine Learning’s Ruhi Sarikaya announced upcoming changes to Amazon’s Alexa Brain platform. Sarikaya said that Alexa’s ultimate purpose is “to remove friction in [customers’] interactions with the digital and physical world,” and that the new changes to the platform exist to support that goal.
A few of the changes are straightforward improvements, such as launching skills with natural language and better handling of “multi-turn utterances.” In other words, users could ask how to remove a stain and immediately be connected to the Tide Stain Remover skill, or ask “How long does it take to get to New York?” followed by “What’s the weather there?” without naming the locale twice.
Sarikaya also mentioned a new memory feature, in which “Alexa can remember any information for you so that you never forget.” This feature is more likely to raise a red flag for those who study technological ethics: “memory” at its core is a uniquely human faculty, despite the fact that the artificial information processors we already use every day are adept at storing and “remembering” information.
Consumers and media reports typically focus on accidents or misuse of a technology when discussing technological ethics. The conversations around Amazon’s Echo/Alexa and other smart home devices usually revolve around privacy and surveillance, for example. But focusing all attention there can blind even conscientious ethics researchers to the questions that matter most: namely, “How will this technology change the infrastructure of our lives, minds and bodies once it’s an integral part of existence?” The gradual effects of innovation are much easier to miss than a few highly publicized privacy breaches or catastrophes. It’s the difference between boiling a frog and quickly slicing it in half with a butcher’s knife.
For example: how will Alexa having a memory affect our own human memories? The vast majority of people in the United States already use devices – smartphones, portable computers, etc. – that can remember for them. But will an Alexa memory become so pervasive that human neurochemistry somehow changes in response?
Consider the fact that Alexa is essentially an in-home slave to the home’s occupants. In January, U.K. research group Childwise published a report finding that voice-recognition programs were “teaching” children to become more demanding. Childwise’s concern is that digital assistants like Alexa, Siri and Cortana will respond regardless of manners, leading children (and adults) to simply bark orders at the device.
Alexa developers responded by incorporating a “Magic Word” feature in their upcoming May 9 software update, which (merely) acknowledges and praises children for using words like “please” and “thank you.” It’s still worth thinking about, however: how would people – adults and children alike – speak differently to an actual human being, or behave differently as opposed to doing the task themselves? And how will adapting to barking orders rather than asking politely change a user’s personality or communication strategies in everyday life?
I have tremendous respect for the Amish and other groups that show a reluctance to adopt modern conveniences. These groups live in small communities around the United States and can be spotted traveling in horse-drawn buggies alongside modern cars. It’s easy to dismiss them as Luddites or simply backward in their thinking, but in reality they’re likely more tech-savvy than the average consumer.
Far from simply dismissing all technology as modernist evil, the Amish consider a new technology and carefully weigh it against their core values of rural living, manual labor and humility. If they determine that the community can employ the technology without compromising core values, they adopt it to some degree. This is the ultimate tech-savviness: giving each new innovation careful consideration instead of jumping in and immediately considering it “good” for themselves and their families.
It’s also the reason that many see the Amish as inconsistent or hypocritical. It’s sometimes difficult to understand why Amish communities embrace tractors but shun passenger cars, or embrace batteries but not mains power, for example. But in each case they scrutinize the technology carefully and decide according to local need, without compromising their values.
But the ultimate difference between the Amish and the outside world is not merely questioning how new tech will affect and change everyday life. Rather, it’s a matter of acting on the answers, not just asking the questions. In his 2012 book Living into Focus, Arthur Boers relates the following anecdote:
An Amishman [was brought on a tour bus and] asked how the Amish differ from other Christians. First, he explained similarities: all have DNA, wear clothes (even if in different styles), and like to eat good food.
Then the Amishman asked: “How many of you have a TV?”
Most, if not all, the passengers raised their hands.
“How many of you believe your children would be better off without TV?”
Most, if not all, the passengers raised their hands.
“How many of you, knowing this, will get rid of your TV when you go home?”
No hands were raised.
“That’s the difference between the Amish and others,” the man concluded.
The field of tech ethics has grown substantially in the last decade or so, and it’s now relatively normal to question innovations and worry about what societal changes they bring. With this in mind, there comes a point when only action suffices.
Many people closed their Facebook accounts following last month’s Cambridge Analytica scandal, for example, but given the severity of that breach, the number should have been much higher. The point is that Facebook is so entrenched in modern communication and society that many people likely find it impossible to quit, even when they know they should.
It seems safe to say that digital assistants and smart home devices will become as ubiquitous as the automobile at some point. And in the case of a major breach of privacy or security, which will undoubtedly occur, it’s up to the individual user to decide whether to continue on that path or veer off on the more difficult – but ultimately right – one instead.