
"AI stripping" hunts down female internet celebrities. Experts in the industry chain suggest promoting photo anti-counterfeiting technology

2024-08-27


"Some time ago, someone sent me a private message saying that he saw my photo and asked me if I would accept a private call. I looked carefully and found that the so-called photo was synthesized by AI." Talking about his experience of being "spoiled by pornographic rumors", blogger Xiao K (pseudonym) still has lingering fears.

In recent years, AI technology has steadily entered more people's lives, but some with ill intent have also spotted a business opportunity. A Beijing News Shell Finance reporter found numerous groups on black and gray market platforms selling "AI undressing" photos of internet celebrities and actresses. According to people working in these underground operations, 5 yuan buys an "undressed" photo and 20 yuan buys a "video face swap". The industry chain also includes operators who draw traffic with eye-catching photos and then profit from paid "membership group" services. In addition, video creators on platforms such as Bilibili and Xiaohongshu attract attention with titles like "AI undressing tutorial".

In this regard, Zhao Hu, a partner at Beijing Zhongwen Law Firm, said that selling AI undressing or AI face-swapping pictures violates the Public Security Administration Punishment Law; perpetrators face administrative penalties of detention or fines and may also bear criminal liability. Even if the pictures are not sold but merely disseminated, serious cases still constitute the crime of disseminating obscene materials and will be prosecuted in accordance with the law. Teaching the technique may constitute aiding and abetting a crime. In addition, AI undressing and AI face-swapping also infringe on citizens' personal information.

"There is no essential difference between using AI for deep fakes and traditional technical fakes, but the efficiency is much higher than traditional fake methods, and the technical threshold is much lower, making it easy for ordinary people to achieve. Deep fake 'works' will further flood the Internet." Pei Zhiyong, director of the Industry Security Research Center of Qi'anxin Group, told the Beijing News Beike Financial reporter that we should actively explore and promote anti-counterfeiting technology for photos and videos.

Internet celebrities across multiple platforms hit by "undressing" used to attract traffic: customized photos for 5 yuan

After a brief investigation, Xiao K finally found the source of the leaked photos: someone was using "AI undressed" versions of their photos to attract traffic on foreign social platforms and pornographic websites, and selling AI image-editing services.

Xiao K's experience is not an isolated case. As early as March last year, a photo a Xiaohongshu blogger took on the subway was "AI undressed" and circulated in a WeChat group. In the blogger's comment section, many women recounted similar experiences; behind the shared empathy, they could not hide the shadow it had cast over them.

Following these leads, the Shell Finance reporter found a seller offering "AI undressing". The seller said that for 5 yuan they would "undress" any photo provided, and if the result was unsatisfactory the picture could be redone. A "video face swap" was also available for 20 yuan.

The seller said, "There are too many 'undressing' orders recently; I don't have time to do face swaps."

Screenshot of the chat record between the Beijing News Shell Finance reporter and the “AI undressing” seller.

The abuse of "AI stripping" technology in pornography can be traced back to a software released overseas in 2020. Due to controversy, it was quickly removed from the shelves by the developer. Despite this, this technology is still used by some criminals to make money.

In June this year, Beijing police cracked a case in which a technician who had worked at an internet company advertised that "computer AI can remove clothes at a bargain price" and that "people around you, internet celebrities, and stars are all doable", earning extra money by selling photos for as little as 1.5 yuan each. He sold nearly 7,000 such pictures to 351 people online, and was ultimately prosecuted by the Haidian District Procuratorate of Beijing for the crime of disseminating obscene materials for profit.

Shell Finance reporters found that "AI undressing" has now formed an industry chain covering the sale of AI-deepfaked pictures, paid AI "editing" of ordinary photos, and the teaching of "techniques" for undressing images.

The black and gray market has also set its sights on female internet celebrities, female stars, and other well-known figures, and "AI undressing" has become a reliable formula for attracting traffic.

On a black and gray market platform, a Shell Finance reporter saw a group called "Internet Celebrity XX's Undressing Exchange Group". The group has nearly 15,000 members, and "undressed" pictures of most of the well-known female internet celebrities on Douyin, Bilibili, Weibo, and Xiaohongshu circulate within it.

Shell Finance reporters noticed that the group initially applied AI undressing only to photos of internet celebrities, then gradually expanded to figures from the entertainment and sports worlds. The profit model of this kind of "exchange group" is simple: operators first post sample photos in the group to entice members to buy access to a paid "membership group", with a fee of about 70 yuan.

"AI undressing tutorials" still circulate on Bilibili and Xiaohongshu; lawyer: teaching the technique may constitute aiding and abetting a crime

The reporter found that after AI image-generation models matured, netizens "developed" a corresponding "one-click undressing" capability from a well-known open-source image-generation model from abroad, and it gradually spread. Although large models have restricted prompts such as "undress" as regulatory rules improve in various countries, Shell Finance reporters searching Bilibili and Xiaohongshu still found "AI undressing tutorial" videos. Some of these actually demonstrate AI "outfit changing", but their presence shows that this kind of technique is still spreading unchecked.

The reporter searched for "AI undressing" and other related keywords on a certain video website and obtained the results.

Many domestic internet celebrities feel helpless about this. In May last year, a female internet celebrity with more than 500,000 followers complained in a post: "Many of us have had our pictures stolen and posted online to attract traffic and 'sell videos', or used in the crude rumor-mongering tactic of pairing them with pornographic images. Anyone with a discerning eye can tell they are fake. I usually ignore these things, but when private messages suddenly surge and I find someone is stirring things up, I have to come out and refute the rumors."

Zhao Hu told Shell Finance that selling AI undressing or face-swapping pictures violates the Public Security Administration Punishment Law. Charging to "customize" AI undressing pictures may constitute the crime of producing and selling obscene materials for profit. Even if the pictures are not sold but merely disseminated, serious cases still constitute the crime of disseminating obscene materials. Teaching the technique may constitute aiding and abetting a crime.

"Charging for 'customized' AI undressing pictures may constitute the crime of producing, copying, publishing, selling, and disseminating obscene materials for profit." Zhao Hu said that AI clothing removal and AI face replacement also involve infringement of citizens' personal information. Once this information is leaked or abused, it will pose a serious threat to personal privacy and security.

"It must be clear that the use of AI tools for forgery is much more efficient than traditional forgery methods, but the technical threshold is much lower. As the saying goes, it takes only one mouth to spread a rumor, but it takes a lot of effort to refute it." Pei Zhiyong told Shell Finance reporter that at present, AI deep fake videos and pictures can be identified through professional technical means or expert appraisal, but the cost is also very high. "With the continuous advancement of technology, it is inevitable that AI-generated or synthesized videos will eventually be unable to be authenticated."

In his view, although it is unrealistic to stop people from using AI to generate fake videos and pictures, that does not mean there is no way to deal with them. "China has very mature methods for governing online pornographic videos and pictures. Even if we cannot use technology to determine whether a picture is authentic, we can stop pornographic content from spreading at the platform level, and locate and crack down on the source of its dissemination."

Pei Zhiyong said that internet platforms must not single-mindedly chase traffic; they should actively take part in network governance under the guidance of relevant laws, effectively limit the spread of such material, and label unverified but important information as "unverified" to moderately reduce the risk of ordinary people being misled.

Pei Zhiyong believes that photo and video anti-counterfeiting technology should be actively explored and promoted. "For example, when a photo is taken with a phone of a specific brand, model, and serial number, the photo file can carry an encrypted verification code that is invisible to the naked eye but machine-readable. If anyone modifies the photo, whether with AI or by hand, the verification system can recognize that it is no longer the original. This is not a new technology but mature cryptography; all it needs is promotion and wider adoption."
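As a rough illustration of the kind of verification Pei describes, the minimal sketch below signs a photo's bytes at capture time and checks them later. It is only a sketch under stated assumptions: a hypothetical per-device secret key and a verification code stored alongside the file; a real deployment would more likely embed a public-key signature in the image metadata.

```python
# Minimal sketch of capture-time photo verification (assumptions: a shared secret
# key provisioned to the camera, and the code stored alongside the photo file).
import hmac
import hashlib

DEVICE_KEY = b"secret-key-burned-into-this-camera"  # hypothetical per-device key


def sign_photo(photo_bytes: bytes) -> str:
    """Compute the verification code over the original photo at capture time."""
    return hmac.new(DEVICE_KEY, photo_bytes, hashlib.sha256).hexdigest()


def verify_photo(photo_bytes: bytes, code: str) -> bool:
    """Return True only if the photo is byte-for-byte the signed original."""
    expected = hmac.new(DEVICE_KEY, photo_bytes, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, code)


original = b"\xff\xd8 ...jpeg bytes..."          # stand-in for a real photo file
code = sign_photo(original)                      # stored with the photo

print(verify_photo(original, code))              # True: untouched original
print(verify_photo(original + b"edit", code))    # False: any change, AI or manual, breaks the check
```

The point of the scheme is not to detect how a picture was altered, only that it is no longer the file the device originally produced, which is exactly the property Pei says existing cryptography can already provide.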

Sun Yue, founder and CTO of Beijing Xindun Times Technology Co., Ltd., told the Shell Finance reporter that with deepfake technology running rampant, biometric features such as voice and face have lost their former value as high-trust credentials. To guard against the resulting harm through technology, he recommends that major platforms strengthen identity authentication in their application systems and adopt strong authentication measures to resist identity forgery. Multi-factor authentication, terminal risk perception, and big-data analysis, combined with user behavior modeling and knowledge-graph causal inference, can effectively resist the identity risks posed by deepfakes.
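To make the idea of multi-factor verification concrete, here is a minimal, hypothetical sketch in the spirit of Sun Yue's recommendation; the signal names, thresholds, and weights are illustrative assumptions, not any vendor's actual product logic.

```python
# Hypothetical sketch: combine several authentication signals so that a convincing
# deepfaked face alone is never enough to pass identity verification.
from dataclasses import dataclass


@dataclass
class AuthSignals:
    face_match_score: float  # similarity from face recognition, 0.0-1.0
    otp_valid: bool          # one-time password from a device bound to the user
    device_trusted: bool     # terminal risk perception: known, uncompromised device
    behavior_anomaly: float  # behavior-model anomaly score, 0.0 (normal) to 1.0 (suspicious)


def verify_identity(s: AuthSignals) -> bool:
    # The biometric factor is necessary but never sufficient on its own.
    if s.face_match_score < 0.9:
        return False
    # Require a strong second factor tied to the user's device.
    if not s.otp_valid or not s.device_trusted:
        return False
    # Reject sessions whose behavior deviates sharply from the user's model.
    return s.behavior_anomaly < 0.5


print(verify_identity(AuthSignals(0.97, True, True, 0.1)))   # True: all factors agree
print(verify_identity(AuthSignals(0.99, False, True, 0.1)))  # False: a perfect "face" alone fails
```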

"In the future, for the healthy development of AI, in order to deal with the risks brought by deep fake technology, first of all, the government and industry regulatory agencies should actively improve relevant laws and regulations. Secondly, increase investment in the research and development of deep fake detection, digital watermarking and other technologies; thirdly, relevant departments and news media should also increase anti-fraud publicity efforts, enhance the safety awareness of the people, and make joint efforts to protect the safety of people's life and property." Sun Yue told the Shell Financial reporter.

Reporter contact email: [email protected]

Beijing News Shell Finance reporter Luo Yidan; editor Wang Jinyu; proofreader Yang Li