A hot potato: Amazon is developing capabilities that could allow its Alexa voice assistant to imitate any human voice after listening to it speak for less than a minute. Setting aside the inherent creepiness of the feature, some are concerned about the potential for abuse.
Rohit Prasad, who leads the Alexa team at Amazon, said the goal of the project is to "make the memories last" after "so many of us have lost someone we love" during the pandemic.
Alexa can be trained to mimic a voice using pre-recorded audio, meaning the person doesn't need to be present – or even alive – to serve as a source. In a video segment shown during a conference this week, a child asked Alexa if grandma could finish reading The Wizard of Oz. Sure enough, Alexa changed voices to mimic the child's grandmother and finished reading the story.
Prasad said during the presentation that Alexa now receives billions of requests per week from hundreds of millions of Alexa-enabled devices across 17 languages in more than 70 countries around the globe.
The potential for abuse seems high. For example, the tool could be used to create convincing deepfakes for misinformation campaigns or political propaganda. Fraudsters could also leverage the capability for financial gain, as in 2020 when scammers used a cloned voice to trick a bank manager into transferring $35 million to fund an acquisition that didn't exist.
What are your thoughts on the matter? Is Amazon taking the concept of voice cloning a bit too far here, or are you intrigued by the idea of having a "conversation" with someone from beyond the grave?
Image credit: Jan Antonin Kolar