Alexa acknowledges the child’s request in her default, robotic voice, and then immediately transitions to a softer, more humanlike tone, apparently mimicking the voice of the child’s dead grandmother, and narrates an excerpt from the children’s novel.
Prasad introduced the clip by saying that adding “human attributes” to AI systems was increasingly important “in these times of the ongoing pandemic when so many of us have lost someone we love.”
“While AI can’t eliminate that pain of loss, it can definitely make their memories last,” he added.
Check out the demo video below:
The Alexa team is teaching the digital assistant to mimic anyone's voice from just a one-minute recording of their speech.
The company is pitching the functionality as a way to help people preserve memories, especially those who lost their loved ones due to COVID-19.
“This required inventions where we had to learn to produce a high-quality voice with less than a minute of recording versus hours of recording in the studio,” Prasad said during the conference.
“The way we made it happen is by framing the problem as a voice conversion task and not a speech generation task. We are unquestionably living in the golden era of AI, where our dreams and science fiction are becoming a reality.”
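To make the distinction Prasad draws a little more concrete: a speech generation (text-to-speech) system must learn a full generative model of a speaker, which takes hours of studio data, whereas voice conversion reuses the content of existing speech and only swaps in the target speaker's vocal characteristics, which can be estimated from a short sample. The sketch below is a toy illustration of that framing only; every function and data structure in it is a hypothetical stand-in, not Amazon's actual system.

```python
# Toy illustration of voice conversion vs. speech generation.
# Frames are simplified to {"phoneme": ..., "timbre": ...} dicts;
# real systems operate on learned acoustic representations.

def extract_content(frames):
    # Keep only the speaker-independent content; discard timbre.
    return [{"phoneme": f["phoneme"]} for f in frames]

def speaker_embedding(sample_frames):
    # A short sample can suffice because only a single speaker
    # embedding is estimated, not a full generative voice model.
    timbres = [f["timbre"] for f in sample_frames]
    return sum(timbres) / len(timbres)

def convert(source_frames, target_embedding):
    # Recombine the source content with the target speaker's timbre.
    content = extract_content(source_frames)
    return [{"phoneme": c["phoneme"], "timbre": target_embedding}
            for c in content]

# Source utterance (e.g. Alexa's stock voice) and a short target sample.
source = [{"phoneme": "h", "timbre": 0.1}, {"phoneme": "i", "timbre": 0.1}]
target_sample = [{"phoneme": "o", "timbre": 0.9}, {"phoneme": "k", "timbre": 0.8}]

converted = convert(source, speaker_embedding(target_sample))
```

The key point the sketch captures is that the hard modeling work (producing intelligible speech) is done once with the source voice, so the per-speaker data requirement shrinks to whatever is needed to characterize the target's timbre.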
The FUCK you will. Remember when we told you deepfakes would increase the mistrust, alienation, & epistemic crisis already underway in this culture? Yeah that. That times a LOT. “Amazon has a plan to make Alexa mimic anyone’s voice [w/o their consent]” https://t.co/kXm4EXKgp8 — Damien P. Williams, MA, MSc, ABD, Patternist (@Wolven) June 22, 2022
The phone attacking implications of this tool are not good at all — this will likely be used for impersonation. At Amazon’s re:MARS conference they announced they’re working to use short audio clips of a person’s voice & reprogram it for longer speech https://t.co/5TkEIHoeXG — Rachel Tobac (@RachelTobac) June 22, 2022
“ALEXA, PLAY DESPACITO” My dead grandma’s voice, robotic: “Now playing on Amazon Music, Despacito” https://t.co/D6BLQCafzj — Parker Molloy (@ParkerMolloy) June 22, 2022
Umm, so how soon will criminals be able to use it to call your family members begging them to Venmo cash? Or ask them for social security numbers? Or bank information? — ?bitty_in_pink ? (@bitty_in_pink) June 22, 2022