4th-generation Amazon Echo Dot smart speaker. Image: Amazon

Amazon is working on a way to make its Alexa voice assistant deepfake the voice of anyone, living or dead, from just a short recording. The company demonstrated the feature at its re:MARS conference in Las Vegas on Wednesday, leaning on the emotional trauma of the ongoing pandemic and the grief it has caused to drum up interest.

Amazon's re:MARS focuses on artificial intelligence, machine learning, robotics, and other emerging technologies, with technical experts and industry leaders taking the stage. During the second-day keynote, Rohit Prasad, Amazon's senior vice president and head scientist for Alexa AI, showed off a feature being developed for Alexa.

In the demo, a child asks Alexa, "Alexa, can Grandma finish reading me The Wizard of Oz?" Alexa confirms the request with "Okay" in her typical feminine, robotic voice. But then the voice of the child's grandmother comes out of the speaker to read L. Frank Baum's tale.

You can see the demo below:

Amazon re:MARS 2022, Day 2 keynote.

Prasad only said that Amazon is "working on" the Alexa feature, without specifying what work remains or when, or if, it will be available.

He did, however, share a few technical details.

"This required invention where we had to learn to produce high-quality audio with less than a minute of recording, versus hours of recording in a studio," he said. "The way we made it happen was by framing the problem as a voice-conversion task rather than a speech-generation task."

Prasad very briefly explained how the feature works.
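Amazon hasn't published its implementation, so the following is only a rough illustration of the voice-conversion framing Prasad described: a short reference clip is distilled into a compact speaker representation that steers a generic converter, rather than training a speech-generation model on hours of the target speaker's audio. The `speaker_encoder`, `tts_synthesize`, and `voice_converter` components below are hypothetical stand-ins, not Amazon's actual models.

```python
import numpy as np

def clone_voice(reference_clip: np.ndarray, text: str,
                speaker_encoder, tts_synthesize, voice_converter) -> np.ndarray:
    """Render `text` in the voice heard in `reference_clip` (under a minute).

    Conceptual sketch only: all three model components are assumed
    interfaces, not Amazon's implementation.
    """
    # 1. Distill the short reference recording into a fixed-size
    #    speaker embedding (a numeric "fingerprint" of the voice).
    speaker_embedding = speaker_encoder.embed(reference_clip)

    # 2. Generate the utterance once in a generic "source" voice.
    #    The expensive speech-generation model is trained once and
    #    shared across all target speakers.
    source_audio = tts_synthesize(text)

    # 3. Voice conversion: re-render the source audio with the target
    #    speaker's vocal characteristics. Only the small embedding is
    #    needed here, not hours of studio recordings of the target.
    return voice_converter.convert(source_audio, speaker_embedding)
```

Framed this way, the hours of studio data are spent once, on the shared generation and conversion models; each new voice then costs only the embedding step, which is consistent with Prasad's "less than a minute of recording" claim.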

Of course, deepfakes have a controversial reputation. Still, there have been some attempts to use the technology as a tool rather than a means for creepiness.

Audio deepfakes, as The Verge points out, have notably been used in the media to help fill gaps when, say, a podcaster flubs a line or when the star of a project dies suddenly, as happened with the Anthony Bourdain documentary Roadrunner.

Some people use AI to create chatbots that communicate as if they were lost loved ones.

Alexa wouldn't even be the first consumer product to use deepfake audio to fill in for a family member who can't be there in person. As Gizmodo points out, the Takara Tomy smart speaker uses AI to read children bedtime stories in a parent's voice. Parents reportedly upload their voices, so to speak, by reading a script for about 15 minutes. This notably differs from Amazon's demo, though, in that the owner of the product chooses to provide their voice, rather than the product using the voice of someone who likely can't grant permission.

Beyond worries about deepfakes being used for scams, fraud, and other malicious activity, there are already some troubling things about how Amazon is framing the feature, which doesn't even have a release date yet.

Before showing the demo, Prasad said Alexa gives users a "companionship relationship."

"In this companionship role, human attributes of empathy and affect are key to building trust," the executive said. "These attributes have become even more important in these times of the ongoing pandemic, when so many of us have lost someone we love. While AI can't eliminate that pain of loss, it can definitely make their memories last."

Prasad added that the feature "enables lasting personal relationships."

It's true that countless people are in serious search of human "empathy and affect" in response to the emotional distress brought on by the COVID-19 pandemic. But Amazon's AI voice assistant is not the place to satisfy those human needs. Nor can Alexa enable "lasting personal relationships" with people who are no longer with us.

It's not hard to believe there are good intentions behind this developing feature, or that hearing the voice of someone you miss can be a great comfort. In theory, a feature like this could even be fun. Getting Alexa to make a friend sound like they said something silly is harmless. And, as discussed above, other companies are leveraging deepfake tech in ways similar to what Amazon demoed.

But framing a developing Alexa capability as a way to revive a connection with deceased family members is a giant, unrealistic, and problematic leap. Meanwhile, tugging at the heartstrings by invoking pandemic-related grief and loneliness feels gratuitous. There are some places Amazon doesn't belong, and grief counseling is one of them.
