Technology

Google Works to Reduce Non-Consensual Deepfake Porn in Search

(Bloomberg) -- Google is making adjustments to its search engine to reduce the prevalence of sexually explicit fake content in its results, responding to the explosion of non-consensual imagery that people have created using generative artificial intelligence tools.

When that AI-generated content features a real person’s face or body without their permission, that person can request its removal from search results. Now, when Google decides a takedown is warranted, it will filter all explicit results on similar searches and remove duplicate images, the company said Wednesday in a blog post. The Alphabet Inc. unit also said it had improved its search ranking systems so that explicit fake content would not appear as top results — a change that Bloomberg reported in May was already in the works. 

“We’ve long had policies to enable people to remove this content if they find it in search, but we’re in the middle of a technology shift,” said Emma Higham, a product manager who spearheads protections for Google’s generative AI technology in search and other apps, in a briefing with reporters. “As with every technology shift, we’re also seeing new abuses.”

In 2023, Bloomberg found that Google Search was the top traffic-driver to websites hosting deepfakes: sexually explicit, AI-generated imagery of real people. On Google, searches for many well-known celebrities paired with the word “deepfake” pointed users to MrDeepfakes.com and other sites that largely exist to trade in pornographic imagery. One year ago, Google Search accounted for 44% of the 4 million desktop visits to MrDeepfakes.com, according to data from Similarweb.

Google has been adjusting the results for queries specifically seeking deepfake content tied to someone’s name. Instead of showing those images, the search engine will be trained to surface high-quality, non-explicit content, like news articles, when those results are available, the company said. So far, such changes have reduced exposure to explicit image results on these types of queries by over 70%, Google said.

Between April and May, US-based search traffic to the top two deepfake pornography websites plummeted, according to data from Similarweb published in a May Bloomberg report. 

Google is also demoting websites that feature a high volume of pages that have been removed from search because they violated policies against explicit fake content, the company said in its blog post.

In a separate interview, Higham, the product manager, said that Google is facing new challenges in deciding when to take action on non-consensual explicit imagery and when to leave results on its search engine untouched, citing the need to balance users’ ability to find information with their online safety. Advocates have criticized Google for de-ranking, rather than completely de-listing, the deepfaked content.

“We have to be careful about not taking too blunt an approach and having unintended consequences on access to information,” Higham said, referencing adult performers who may not want explicit, consensual content to be de-ranked on the search engine. “But when we’re seeing high-risk queries and a high risk of fake explicit content showing up non-consensually, we are taking strong action now.”

©2024 Bloomberg L.P.