Many voice artists have discovered that their voices are being cloned and used without their consent. In some cases, these cloned voices are even made to say offensive or harmful things.
New AI tools have made voice cloning fast and easy. Companies like Speechify and Resemble AI are trying to help affected artists, but updating existing voice-protection laws may be the most effective way to address the problem.
What’s the Story?
Voice artists face a growing problem: their voices are being cloned.
Using tools like ElevenLabs, Uberduck, and FakeYou.ai, people are replicating well-known voices and sharing the results online.
In response, artists have compiled an online list requesting that certain names be removed from these cloning tools.
But there's a legal challenge. The law doesn't protect a voice the way it protects a song or a film: copyright covers only the actual recording, not the voice itself.
This is also becoming a labor issue. Some artists say companies are using these tools to cut costs and turnaround time, at the artists' expense.
Groups like the National Association of Voice Actors and SAG-AFTRA are pushing for new rules to protect voices from unauthorized cloning.
So how successful have artists been at stopping the cloning? FakeYou.ai has agreed to remove voices on request, and Uberduck recently followed suit.
However, ElevenLabs, the dominant player in voice generation, has been less cooperative, denying verified use of its technology in specific cases while acknowledging the broader problem of non-consensual cloning. As a countermeasure, the company introduced a "voice verification" feature for user authentication.