YouTube is developing new tools to help creators find videos that use AI-generated versions of their voices or faces without permission, the company announced on Thursday. The tools are meant to protect creators from deepfake content.
Two tools are in the works, though YouTube hasn't said when they will be available. The first will focus on detecting AI-generated singing voices and will be added to YouTube's existing Content ID system, which currently identifies copyright infringement such as unauthorized use of songs or videos. The new feature will let musicians detect when AI has been used to mimic their voices in new songs. It's unclear how well this will work for lesser-known artists whose voices aren't as easily recognized, but it will likely help major record labels and famous artists like Drake, Billie Eilish, or Taylor Swift take down channels that post AI-generated songs mimicking them.
The second tool will help public figures, such as influencers, actors, athletes, or artists, identify and flag AI-generated videos that use their faces. YouTube has not confirmed whether this tool will also automatically detect deepfakes of people who aren't famous. When asked, a YouTube representative said the company's privacy policy allows anyone to request the removal of deepfake or AI-generated impersonation content, but individuals will need to actively search for those videos themselves.
YouTube also did not confirm whether it plans to use the tool to remove AI-generated scam videos, which often impersonate famous people like Elon Musk and have frequently appeared on hacked YouTube accounts. While YouTube's Community Guidelines prohibit spam, scams, and deceptive content, viewers still need to manually report these videos for removal.
As AI technology becomes more accessible, creating deepfake media has become easier. A recent study found that the number of deepfake videos online has increased by 550% since 2021, with over 95,000 deepfake videos tracked. Notably, 98% of these videos were pornographic, and 99% of the impersonated individuals were women.
Source: PCMag