South Korea is grappling with a rise in deepfake pornography crimes, where AI-manipulated explicit images of women are spread online without consent. Victims are demanding stronger action from law enforcement, while activists infiltrate online spaces to expose perpetrators.
Deepfake Porn in South Korea: How AI is Fueling a Disturbing Digital Crime Wave
In recent years, South Korea has faced a troubling surge in digital sex crimes involving deepfake pornography—AI-generated explicit content created without the victim’s knowledge or consent. These digital manipulations can be shockingly realistic, leaving victims emotionally shattered and often without clear legal recourse. Now, activists, lawmakers, and victims are speaking out, calling for stronger protections and justice.
The Nightmare Begins with a Notification
For “Ruma,” a university student in Seoul, the ordeal started during a seemingly ordinary lunch in the summer of 2021. Her phone erupted with messages, revealing something horrifying—her face had been superimposed onto pornographic images and shared in a Telegram chatroom. Accompanied by degrading comments and direct harassment, the images were circulated widely. The anonymous perpetrator seemed to know personal details about her life, intensifying her fear.
Although Ruma had never shared intimate photos, that didn’t stop the attacker from using her public social media images to fabricate the exploitative content. It’s part of a growing trend in which AI deepfake tools are weaponized to create highly convincing pornographic media, often targeting unsuspecting women and girls.
A Country Already Battling Digital Sex Crimes
South Korea has struggled with online sexual exploitation for years. From hidden cameras in public bathrooms to coercive Telegram chatrooms that blackmailed minors, the nation has faced multiple high-profile digital crime scandals. Now, deepfake technology has added a dangerous new dimension—one that’s reaching into schools.
According to South Korea’s Ministry of Education, more than 900 students, teachers, and school staff reported being targeted by deepfake pornography between January and November of last year alone. Universities are facing similar problems, prompting emergency task forces and legislative action.
A Shift in Law, But Not Yet in Action
In September, South Korean lawmakers passed new regulations making it a crime to simply possess or view deepfake pornography. The penalties include up to three years in prison or hefty fines. For creators and distributors, the maximum sentence increased to seven years.
Despite these legal advancements, the gap between policy and enforcement remains wide. Out of nearly 1,000 reported deepfake sex crime cases in 2024, only 23 led to arrests. Critics say investigations are slow, and police are ill-equipped to handle the complexity of digital forensics.
When Authorities Fall Short, Victims Take Action
Frustrated with the police’s slow progress, Ruma decided to take matters into her own hands. She teamed up with activist and journalist Won Eun-ji, who had uncovered one of South Korea’s largest digital sex crime rings in 2020.
Won created a decoy Telegram account and spent almost two years undercover, infiltrating the same group that had targeted Ruma. The effort paid off—two students from Seoul National University were arrested. One received a nine-year sentence for creating and sharing deepfake pornography; the other was sentenced to three and a half years.
Still, Ruma sees this as just the beginning. “I’m happy, but not fully relieved,” she said, acknowledging that one victory doesn’t erase the broader issue.
Teachers and Students Among the Vulnerable
The technology’s realism has traumatized other victims as well. A high school teacher, referred to as Kim, discovered explicit deepfake images of herself circulating online. The original photo, taken without her consent in the classroom, had been altered to create pornographic content.
Kim and a fellow teacher identified the culprit themselves—a student—after realizing the legal process would take too long. Despite charges being filed, Kim said her trust in people and her sense of safety have been permanently altered.
Public reaction has been mixed. “Why is it such a big deal if it’s not your real body?” is a sentiment often expressed online, reflecting a troubling lack of empathy and understanding around the issue.
Tech Platforms Under Pressure to Respond
Platforms like Telegram and X (formerly Twitter) are now facing increasing scrutiny. Telegram, long seen as a haven for privacy and encrypted communication, has been slow to assist law enforcement. But in a recent shift, Telegram agreed to cooperate more actively with South Korean authorities, even removing nearly 150 pieces of illegal content.
Won welcomed the move but remained skeptical. She believes that unless Telegram continues to make progress, governments should consider removing it from app stores altogether.
In January, South Korea achieved a milestone when police accessed Telegram user data for the first time during a deepfake investigation. Fourteen people, including six minors, were arrested for sexually exploiting over 200 victims.
A Long Road Ahead
While some perpetrators are now facing jail time, many victims still feel justice is far off. The psychological toll is immense, and the societal indifference often compounds the trauma.
“There are still far more victims suffering silently, while their abusers walk free,” Ruma said. “We’ve made progress, but real change still feels distant.”
Source: CNN