
Amazon’s Alexa Could Turn Dead Relative’s Voice Into Digital Assistant

Amazon users could soon hear the voices of dead loved ones through their virtual assistant. Experts call the device's latest feature a slippery slope, comparing it to the “Black Mirror” episode “Be Right Back.” The experimental Amazon Alexa dead voices feature allows the digital assistant to mimic the voices of users’ dead relatives.

How Does the New Amazon Alexa Dead Voices Feature Work?

Atop a bedside table at the Amazon tech summit demo, an Echo Dot was given a request: “Alexa, can Grandma finish reading me ‘The Wizard of Oz’?”

Alexa’s typically cheery voice boomed from the kid-themed smart speaker with a panda design: “Okay!” Then, as the device began narrating a scene of the Cowardly Lion begging for courage, Alexa’s robotic accent was replaced by a more human-sounding narrator.

“Instead of Alexa’s voice reading the book, it’s the kid’s grandmother’s voice,” Rohit Prasad, SVP and head scientist of Alexa Artificial Intelligence, explained during a keynote speech in Las Vegas. 

The demo was the first glimpse of Alexa’s newest feature. Though still a work in progress, the feature would allow the voice assistant to replicate people’s voices from short audio clips. The goal, Prasad said, is to build greater trust with users by infusing artificial intelligence with the “human attributes of empathy and affect.”
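Amazon hasn’t shared how the voice replication works under the hood, but as a purely illustrative sketch of cloning a voice from a short reference clip, publicly available tools such as the open-source Coqui TTS library can already generate speech in a target speaker’s voice from a single sample. Everything below (library choice, model name, and file paths) is an assumption for illustration, not Amazon’s implementation:

# Hypothetical illustration: voice cloning from a short reference clip
# using the open-source Coqui TTS library (not Amazon's system).
# Install with: pip install TTS
from TTS.api import TTS

# Load a multilingual voice-cloning model (XTTS v2).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# "grandma_sample.wav" is a placeholder path to a short recording of the
# target speaker; the generated audio mimics that voice.
tts.tts_to_file(
    text="Lions and tigers and bears, oh my!",
    speaker_wav="grandma_sample.wav",
    language="en",
    file_path="narration.wav",
)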

The new feature could “make memories with loved ones,” Prasad explained. 

Security Concerns Over Amazon Alexa Dead Voices

While the idea of preserving a dead relative’s voice may be sentimental, it also raises serious security and ethical concerns.

“I don’t feel we’re ready for user-friendly voice-cloning technology,” Rachel Tobac, CEO of San Francisco-based SocialProof Security, told The Washington Post.

Such technology can be used to manipulate the public through fake audio or video clips. Tobac added that if a cybercriminal can easily and believably replicate another person’s voice from a small sample, they can use it to impersonate others, with risks ranging from fraud to data loss to hacked accounts.

Tama Leaver, a professor of internet studies at Curtin University in Australia, said there’s also a danger of blurring the line between what is human and what is mechanical, and that speaking with a dead relative’s voice would mean handing even more personal data over to data-harvesting services.

“In some ways, it seems like an episode of ‘Black Mirror,'” Leaver said, referring to the sci-fi anthology series about technology’s darker consequences.

Possible Consent Concerns

Still, the new Alexa feature raises questions of consent, particularly for people who never imagined a robotic personal assistant speaking in their voice after they die.

“There’s a real danger in using deceased people’s data. It is creepy and deeply unethical because they’ve never considered those traces being used that way,” Leaver emphasized. 

Having recently lost his grandfather, Leaver said he empathized with the “temptation” of wanting to hear a loved one’s voice. However, he cautioned that handing those snippets of memory over to Amazon, and potentially the world, carries real risks.

Prasad didn’t share further details during the event. He said the ability to mimic voices comes from “unquestionably living in the golden era of AI. It is where our dreams and science fiction are becoming a reality.”

Should the Amazon Alexa dead voices demo become a full-fledged feature, people might start wondering how their own voices could be used after they die. Leaver said that question sounds strange for now, but it’s one we should probably have an answer to before Alexa starts talking like you or me.

And for other news, read more here at Owner’s Mag!
