PETALING JAYA: Tech giant Google has announced significant updates to its search engine to combat the rising issue of non-consensual deepfake pornography.

In a blog post, Google’s product manager, Emma Higham, explained that the company has developed new systems to simplify the removal of explicit fake content and to demote websites known for hosting such material.

“When someone successfully requests the removal of explicit non-consensual fake content featuring them from Search, Google’s systems will also aim to filter all explicit results on similar searches about them.

“In addition, when someone successfully removes an image from Search under our policies, our systems will scan for – and remove – any duplicates of that image that we find,” she said.

These measures are designed to provide individuals with greater reassurance, especially if they are concerned about similar content appearing in the future.

“The updates we’ve made this year have reduced exposure to explicit image results on these types of queries by over 70%. With these changes, people can read about the impact deepfakes are having on society, rather than see pages with actual non-consensual fake images,” Higham explained.

Google is also updating its ranking systems to prioritise “high-quality, non-explicit content” and to reduce the risk of explicit fake content appearing in search results.

“So we’re demoting sites that have received a high volume of removals for fake explicit imagery. This approach has worked well for other types of harmful content, and our testing shows that it will be a valuable way to reduce fake explicit content in search results,” she added.

Higham also highlighted the importance of differentiating between consensual explicit content, such as an actor’s nude scenes, and explicit deepfake content.

“While differentiating between this content is a technical challenge for search engines, we’re making ongoing improvements to better surface legitimate content and downrank explicit fake content,” she said.

According to NewsNation, the proliferation of user-friendly generative AI tools has significantly increased the number of deepfakes in recent years, with a 2021 study finding that approximately 96% of these deepfakes involve non-consensual pornography.