The number of nonconsensual deepfake porn videos online has exploded since 2017. As the harmful videos have spread, thousands of women—including Twitch streamers, gamers, and other content creators—have complained to Google about websites hosting the videos and tried to get the tech giant to remove them from its search results.
A WIRED analysis of copyright claims against websites that host deepfake porn videos reveals that thousands of takedown requests have been made, and that the frequency of complaints is increasing. More than 13,000 copyright complaints—encompassing almost 30,000 URLs—have been made to Google concerning content on a dozen of the most popular deepfake websites.
The complaints, which have been made under the Digital Millennium Copyright Act (DMCA), have resulted in thousands of nonconsensual videos being removed from Google's search results. Two of the most prominent deepfake video websites have been the subject of more than 6,000 and 4,000 complaints each, data published by Google and Harvard University's Lumen database shows. Across all the deepfake platforms analyzed, around 82 percent of complaints resulted in URLs being removed from Google, the company's copyright transparency data shows.
Millions of people find and access deepfake video websites by searching for deepfakes, often alongside the names of celebrities or content creators. WIRED is not naming the specific websites to limit the exposure they receive. However, lawyers and companies combating deepfakes online, including by systematically making DMCA complaints, say the number of copyright complaints and high percentage of removals are a sign that Google should take more action against the specific websites. This should include removing them from search results entirely, they say.
“If the sole purpose of these websites is to abuse and manipulate a person’s personal brand, or take their autonomy away from them, or host simple revenge porn, they shouldn’t be there,” says Dan Purcell, the founder and CEO of Ceartas, a firm that helps creators remove their content when it is being used without permission.
For the biggest deepfake video website alone, Google has received takedown requests for 12,600 URLs, 88 percent of which have been taken offline. Purcell says that given the large volume of offending content, the tech company should be examining why the site is still in search results. “If you remove 12,000 links for infringement, why are they not just completely removed?” He adds: “They should not be crawled. They’re of no public interest.”
In the five years since nonconsensual deepfake porn videos first emerged, tech companies and lawmakers have been slow to act. At the same time, machine learning improvements have made it easier to create deepfakes. Today, explicit deepfake content takes a few main forms: videos in which a person's face is put onto existing consensual pornography, apps that can "undress" a person or swap their face onto a nude image, and generative AI tools that can create entirely new deepfake images, such as the images of Taylor Swift that spread online in January.