Some examples include people making it sound like Joe Biden is announcing that the US will send troops into Ukraine, celebrities reading excerpts from Mein Kampf, and all manner of racist and offensive messages.
ElevenLabs' AI speech tool, VoiceLab, lets you "clone" someone's voice from just a one-minute clip of them speaking; once cloned, the voice can spit out up to 2,500 characters of whatever you type into a text-to-speech interface.

Given the state of discourse and anonymity on the internet, it's no surprise that people have been abusing the tool to create objectionable material and spread it online.
Now ElevenLabs is taking steps to stop "bad actors" from using its tool for "malicious purposes." In a Twitter thread, the company outlines safeguards you'd expect to have been there on day one - like a user verification system and a way to confirm whether a given audio clip was generated by AI.
"We've always had the ability to trace any generated audio clip back to a specific user," writes ElevenLabs. "We'll now go a step further and release a tool which lets anyone verify whether a particular sample was generated using our technology and report misuse."
For the immediate future, VoiceLab will only be available to paying users, who will be limited in the number of characters they can turn into audio - which puts a price on internet shitposting using the tool.
"This will keep our tools accessible while allowing us to fight potential misuse," EventLabs explains. "Payment details won't always prevent abuse, but they make VoiceLab users less anonymous and force them to think twice before sharing improper content."