Spread of sexual deepfake images created by generative AI growing in Japan

Deepfake images, partially modified, are seen on a computer

Sexual deepfake images and videos created by misusing generative artificial intelligence and targeting children and women are rapidly spreading across Japan.

In some cases, fake nude images of individuals are created without their knowledge and exposed online along with their real names, addresses and school names. The Mainichi Shimbun looked into the real damage caused by generative AI, which can turn children not only into victims but also into potential perpetrators.

Sumire Nagamori, head of the volunteer organization Hiiragi Net, which patrols the internet and reports content to platform operators and the police, scrolled through her computer screen with a grim expression.

She was researching an overwhelming volume of sexually explicit fake images and videos posted on social media, including school group photos that had been edited to make the female students appear naked, and yearbook pictures in which the bodies beneath the faces had been sexually altered.

Many schools use a system where professional photographers upload event photos to dedicated websites, allowing families to select and purchase the images they want. It appears that these pictures are also being leaked and misused online.

There are even exchanges on the internet, such as, “Let’s trade login IDs and passwords for each school’s photo sales website.”

Previously, creating sexually fake images such as “idol collages,” known as “aikora,” which combine a person’s face with pornographic content, required some degree of editing knowledge and technical skill.

But now, with apps and websites that incorporate generative AI, users can simply upload an image and generate a deepfake in less than a minute. The lowered barrier to creation is contributing to the rapid rise in damage.

The Tokyo-based nonprofit Organization for Pornography and Sexual Exploitation Survivors, or PAPS, which supports victims of digital sexual violence, began actively calling on victims of pornographic deepfakes to seek help around six months ago.

One victim became unable to attend school after fake images of them were spread online along with their school name, out of fear that "a stranger might be waiting in front of the school." PAPS chairperson Kazuna Kanajiri said, "The fear experienced by victims is immeasurable," and urged those affected to seek support.