
To prevent a repeat of the Taylor Swift "indecent photos" incident, Google cracks down on deepfake content

2024-08-01


IT Home reported on August 1 that Google published a blog post yesterday (July 31) announcing that it will take a more proactive approach to cleaning up fake content in its search engine.

Google said that as AI technology becomes more widespread, the growing volume of AI-generated images and videos has also given rise to harmful content such as "deepfakes", causing serious negative impacts.

As IT Home reported in January this year, global pop star Taylor Swift became a victim of deepfakes: a flood of fake "indecent photos" spread across social platforms such as X and Facebook, racking up tens of millions of views.

Google is rolling out new online safety features that make it easier to remove explicit deepfakes from Search at scale and prevent them from ranking highly in search results in the first place.

After a user successfully requests the removal of deepfake content from Search, Google's systems will also filter related search results and remove duplicate copies of the same image.
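Google's blog post does not describe the mechanics of this deduplication, but the general idea of suppressing copies of an already-removed image can be illustrated with a fingerprint-based filter. The sketch below is purely hypothetical and not Google's implementation: it uses exact byte hashes for brevity, whereas a production system would rely on perceptual or ML-based image matching, and the class and method names are invented for illustration.

```python
import hashlib

class RemovedImageFilter:
    """Hypothetical filter that drops search results whose image bytes
    match content a user already had removed (exact-hash version only)."""

    def __init__(self) -> None:
        self._removed_fingerprints: set[str] = set()

    def register_removal(self, image_bytes: bytes) -> None:
        # Fingerprint the removed image; a real system would use perceptual
        # hashing so near-duplicates (resized, re-encoded copies) also match.
        self._removed_fingerprints.add(hashlib.sha256(image_bytes).hexdigest())

    def filter_results(self, results: list[dict]) -> list[dict]:
        # Keep only results whose image content is not a known removed copy.
        return [
            r for r in results
            if hashlib.sha256(r["image_bytes"]).hexdigest()
            not in self._removed_fingerprints
        ]

# Usage: once one copy is reported and removed, duplicates are filtered too.
dedup = RemovedImageFilter()
dedup.register_removal(b"<bytes of the removed fake image>")
results = [
    {"url": "https://example.com/a", "image_bytes": b"<bytes of the removed fake image>"},
    {"url": "https://example.com/b", "image_bytes": b"<unrelated image bytes>"},
]
print([r["url"] for r in dedup.filter_results(results)])  # only example.com/b remains
```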

Google has also updated its ranking system so that when a query specifically seeks deepfakes of a named person, search results instead surface "high-quality, non-explicit content", such as related news reports.
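The post does not detail how this ranking adjustment works. The following sketch only illustrates the general idea of demoting explicit results and boosting news coverage for such queries; the query-detection heuristic, result labels, and scoring weights are all invented for illustration and are not Google's actual ranking logic.

```python
# Hypothetical re-ranking sketch: for queries that seek explicit deepfakes of a
# named person, demote explicit results and boost high-quality news coverage.
DEEPFAKE_TERMS = {"deepfake", "deepfakes", "fake nudes"}

def seeks_explicit_deepfakes(query: str, known_names: set[str]) -> bool:
    # Crude heuristic: the query mentions both a deepfake term and a known name.
    q = query.lower()
    return any(t in q for t in DEEPFAKE_TERMS) and any(n in q for n in known_names)

def rerank(results: list[dict], query: str, known_names: set[str]) -> list[dict]:
    if not seeks_explicit_deepfakes(query, known_names):
        return results

    def adjusted_score(r: dict) -> float:
        score = r["score"]
        if r.get("explicit"):           # push explicit imagery far down
            score -= 100.0
        if r.get("type") == "news":     # surface reputable news coverage instead
            score += 10.0
        return score

    return sorted(results, key=adjusted_score, reverse=True)

# Usage: the explicit result drops below the news report for this query.
results = [
    {"url": "https://example.com/fake-image", "score": 9.0, "explicit": True},
    {"url": "https://news.example.com/coverage", "score": 7.5, "type": "news"},
]
print([r["url"] for r in rerank(results, "taylor swift deepfake", {"taylor swift"})])
```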

Google said that earlier updates had already reduced exposure to explicit deepfake image results by more than 70% on queries specifically searching for such content.