Generative AI has already been misused in plenty of ways, from fabricating academic papers to ripping off artists' work. Now it appears to be playing a role in state-backed influence operations.
A recent report from Massachusetts-based threat intelligence firm Recorded Future says commercial AI voice generation products, including technology released publicly by the buzzy startup ElevenLabs, "very likely" aided a recent influence campaign.
The report describes a Russian-backed operation, dubbed "Operation Undercut," that aimed to undermine European support for Ukraine, in part by using AI-generated voiceovers on fake or misleading "news" videos.
The videos targeted European audiences and covered topics such as alleged corruption among Ukrainian politicians and the questionable value of sending military aid to Ukraine. One video, for example, claimed that "even jammers can't save American Abrams tanks," a reference to the devices US tanks use to deflect incoming missiles, driving home the argument that supplying high-tech armor to Ukraine is pointless.
The report says the video creators "very likely" used AI voice generation, including ElevenLabs' technology, to make their content appear more credible. Recorded Future's researchers checked this by submitting the clips to ElevenLabs' own AI Speech Classifier, a public tool that lets anyone "detect whether an audio clip was created using ElevenLabs," and got a match.
ElevenLabs did not respond to requests for comment. Recorded Future noted that several commercial AI voice generation tools were likely used, but ElevenLabs was the only one it named.
The campaign's operators inadvertently demonstrated how useful AI voice generation can be when they released some videos with voiceovers recorded by real people that had "a discernible Russian accent." The AI-generated voiceovers, by contrast, spoke English, French, German, Polish, and other European languages without any foreign-sounding accent.
AI also enabled the misleading videos to be released quickly in multiple languages spoken across Europe, Recorded Future says, including English, German, French, Polish, and Turkish, all of which ElevenLabs supports.
Recorded Future attributed the activity to the Social Design Agency, a Russia-based organization the U.S. government sanctioned in March for "running a network of over 60 websites that impersonated real news organizations in Europe and then used fake social media accounts to spread the false information on the spoofed websites." The U.S. State Department said at the time that this was all done "on behalf of the Government of the Russian Federation."
Ultimately, Recorded Future found that the campaign had little effect on public opinion in Europe.
This isn't the first time ElevenLabs' products have been accused of misuse. A voice fraud detection company told Bloomberg that the startup's technology was used to produce a robocall impersonating President Joe Biden that urged people not to vote in a January 2024 primary election. In response, ElevenLabs said it rolled out new safety features, including the ability to automatically block the voices of politicians.
ElevenLabs prohibits "unauthorized, harmful, or deceptive impersonation" and says it uses a range of measures to enforce this, including both human and automated moderation.
Also Read: Voice Cloning of Politicians is Still as Easy as Pie
Since its founding in 2022, ElevenLabs has grown rapidly. Parhlo World reported that it may soon be valued at $3 billion, with annualized revenue climbing from $25 million to $80 million in less than a year. Its backers include Andreessen Horowitz and former GitHub CEO Nat Friedman.
What do you think of this story? Visit Parhlo World for more.