Amazon uses kid’s dead grandma in morbid demo of Alexa audio deepfake

The 4th-gen Amazon Echo Dot smart speaker. (Image credit: Amazon)

Amazon is figuring out how to make its Alexa voice assistant deepfake the voice of anyone, dead or alive, with just a short recording. The company demoed the feature at its re:MARS conference in Las Vegas on Wednesday, using the emotional trauma of the ongoing pandemic and grief to sell interest.

Amazon’s re:MARS focuses on artificial intelligence, machine learning, robotics, and other emerging technologies, with technical experts and industry leaders taking the stage. During the second-day keynote, Rohit Prasad, senior vice president and head scientist of Alexa AI at Amazon, showed off a feature being developed for Alexa.

In the demo, a child asks Alexa, “Can grandma finish reading me The Wizard of Oz?” Alexa responds, “Okay,” in her typical feminine, robotic voice. But next, the child’s grandmother’s voice comes out of the speaker to read L. Frank Baum’s story.

You can watch the demo below:

Amazon re:MARS 2022 – Day 2 – Keynote.

Prasad only said Amazon is “working on” the Alexa capability and didn’t specify what work remains or when/if it will be available.

He did provide minute technical details, however.

“This required invention where we had to learn to produce a high-quality voice with less than a minute of recording versus hours of recording in a studio,” he said. “The way we made it happen is by framing the problem as a voice-conversion task and not a speech-generation task.”

Prasad very briefly discussed how the feature works.
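Amazon hasn’t published how its system actually works, but the reframing Prasad describes can be illustrated at a very high level. Below is a minimal, purely conceptual Python sketch of the voice-conversion framing; every function in it (synthesize_with_stock_voice, encode_speaker, convert_voice) is a hypothetical placeholder standing in for a trained model, not Amazon’s actual pipeline.

```python
import numpy as np

SAMPLE_RATE = 16000  # illustrative audio sample rate

def synthesize_with_stock_voice(text: str) -> np.ndarray:
    """Stand-in TTS: any existing high-quality stock voice reads the text."""
    return np.zeros(SAMPLE_RATE * max(1, len(text) // 10))  # dummy waveform

def encode_speaker(reference_clip: np.ndarray) -> np.ndarray:
    """Stand-in speaker encoder: a short clip -> fixed-size voice embedding."""
    return np.resize(reference_clip, 256)

def convert_voice(audio: np.ndarray, embedding: np.ndarray) -> np.ndarray:
    """Stand-in converter: would reshape the audio's timbre toward the target."""
    return audio  # identity here; a real model would condition on the embedding

def read_in_target_voice(text: str, one_minute_clip: np.ndarray) -> np.ndarray:
    # Voice-conversion framing: generate speech with a stock voice first, then
    # convert only its timbre using a brief reference clip. The alternative,
    # speech-generation framing, would instead require training a full TTS
    # model on hours of studio recordings from the target speaker.
    base_audio = synthesize_with_stock_voice(text)
    target_embedding = encode_speaker(one_minute_clip)
    return convert_voice(base_audio, target_embedding)
```

Under this framing, the heavy lifting of producing fluent speech is done once by the stock voice, which is plausibly how less than a minute of the target speaker’s audio can suffice.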

Of course, deepfaking has earned a controversial reputation. Still, there has been some effort to use the tech as a tool rather than a means for creepiness.

Audio deepfakes specifically, as noted by The Verge, have been leveraged in the media to help make up for when, say, a podcaster messes up a line or when the star of a project passes away suddenly, as happened with the Anthony Bourdain documentary Roadrunner.

There are even cases of people using AI to create chatbots that work to communicate as if they are a lost loved one, the publication noted.

Alexa wouldn’t even be the first consumer product to use deepfake audio to fill in for a family member who can’t be there in person. The Takara Tomy smart speaker, as pointed out by Gizmodo, uses AI to read children bedtime stories with a parent’s voice. Parents reportedly upload their voices, so to speak, by reading a script for about 15 minutes. Notably, though, this differs from Amazon’s demo in that the owner of the product decides to provide their vocals, rather than the product using the voice of someone likely unable to give their permission.

Beyond worries of deepfakes being used for scams, rip-offs, and other nefarious activity, there are already some troubling concerns about how Amazon is framing the feature, which doesn’t even have a release date yet.

Before showing the demo, Prasad talked about Alexa giving users a “companionship relationship.”

“In this companionship role, human attributes of empathy and affect are key to building trust,” the exec said. “These attributes have become even more important in these times of the ongoing pandemic, when so many of us have lost someone we love. While AI can’t eliminate that pain of loss, it can definitely make their memories last.”

Prasad added that the feature “enables lasting personal relationships.”

It’s true that countless people are in serious search of human “empathy and affect” in response to emotional distress initiated by the COVID-19 pandemic. But Amazon’s AI voice assistant isn’t the place to satisfy those human needs. Alexa also can’t enable “lasting personal relationships” with people who are no longer with us.

It’s not hard to believe that there are good intentions behind this developing feature and that hearing the voice of someone you miss can be a great comfort. We could even see ourselves having fun with a feature like this, theoretically. Getting Alexa to make a friend sound like they said something silly is harmless. And as we’ve discussed above, there are other companies leveraging deepfake tech in ways similar to what Amazon demoed.

But framing a developing Alexa capability as a way to revive a connection to late family members is a giant, unrealistic, problematic leap. Meanwhile, tugging at the heartstrings by bringing in pandemic-related grief and loneliness feels gratuitous. There are some places Amazon doesn’t belong, and grief counseling is one of them.
