As you are reading this, a pornographic video bearing your likeness may be spreading online and racking up clicks. Recently, South Korean female netizens have claimed that women of all ages in South Korea are threatened by deepfake pornography, with underage students particularly targeted, and have turned to international social media platforms such as X, Weibo, and Instagram for help. Deepfake porn is pornographic content created by using AI to digitally swap someone’s face onto another person’s body. Those targeted by deepfake manipulation never give their consent, as they are usually unaware of the violation. The reality of this phenomenon is terrifying, and given the nature of deepfake pornography and its violation of human rights, it should be punished as a first-degree felony, like rape.
Korean women have sought help on foreign social media to gain support and raise awareness. @nosunae_520, writing from her perspective as a South Korean girl, revealed a list of schools whose students participated in spreading deepfake pornography (Weibo). The schools listed range from high schools down to primary schools, suggesting that minors, including very young children, are among the victims. South Korean President Yoon Suk Yeol has stated: “The victims are often minors and the perpetrators are mostly teenagers” (Mackenzie). Teenage students, the platform’s primary users, upload photos of acquaintances, including classmates and teachers, to Telegram; other users then manipulate these images into deepfake content. Beyond the pornographic videos themselves, victims also suffer harassment after their personal information is leaked.
Creating deepfake videos is not the first way perverted criminals have used encrypted platforms as a shield to violate women’s human rights. Between 2018 and 2020, the “Nth Room” case drew global attention: victims’ personal information and creepshots were used to blackmail them, forcing them to submit to sexual exploitation and satisfy the voyeurism of 60,000 users in Telegram group chats (“[DEBRIEFING] “Nth Room”: A Digital Prison of Sexual Slavery”).
To understand why deepfake videos and the “Nth Room” case should be considered as morally reprehensible as rape, one must first be familiar with the concept of “sexual exploitation.” Sexual exploitation is defined as the abuse of a vulnerable position, power dynamic, or trust for sexual gain (UNHCR). The power dynamic is the key factor in determining exploitation. In the “Nth Room,” victims were called “slaves” and threatened into carrying out violent and degrading “missions,” including carving the words “baksa” or “noye” (meaning slave) into their skin and fully exposing their faces while raising their little fingers as a symbol (“[DEBRIEFING] “Nth Room”: A Digital Prison of Sexual Slavery”). These actions clearly violate women’s rights. The threat of men dominating and subjugating women exploits the gender power dynamic and directly affronts the autonomy and dignity of women.
The negative impacts of deepfake videos are severe, affecting victims’ mental well-being, personal boundaries, and overall sense of security. When women’s personal information is leaked and their faces are swapped into pornography using deepfake technology, their dignity is degraded and their lives are sacrificed to entertain male voyeurism. Just like rape, producing deepfake porn bypasses consent, exploiting women and disregarding their dignity and their right to say no. It is surely unjust to prioritize men’s right to sexual fantasy over the rights of women and girls to sexual autonomy and integrity (Durham University). The behavior is non-consensual and infringes on the rights of the individual whose image is used. When deepfake videos target underage children, the psychological harm is even greater: privacy violations, a decreased quality of life, and the development of fear related to assault.
Clearly, the effects of deepfake pornography are devastating and unacceptable; deepfake abuse should therefore be recognized as a form of violence against women, and its criminalization should be urgently pursued. While some argue that creating deepfake pornography is a harmless sexual fantasy, akin to imagining it in one’s mind, creating a digital file that can be shared online without consent is a violation of privacy and can be used for malicious purposes. Producing a deepfake video costs almost nothing, yet the harm it inflicts on victims is irreparable, and the low cost and minimal threat of punishment encourage more people to commit these crimes. Beyond banning deepfake abuse, it is essential to educate people in order to prevent gender-based harm and dismantle the patriarchal views that sustain such behavior; the degradation of women’s dignity further promotes hyper-masculinity and ideas like “women are born to entertain men.”
Statistics indicate that 96% of deepfake content is pornographic, and almost all of it features women (McGill). Also, according to Annenberg School for Communication doctoral candidate Sophie Maddocks, the technology, although capable of removing men’s clothes, is typically trained on images of women, many of which are shared non-consensually (“Home | Annenberg”). This can be read as evidence that deepfake technology, by its very nature, lends itself to infringing on women’s rights. The high prevalence of pornographic deepfake content underscores the urgent need to address the implications of this technology and to develop strategies to combat the spread of harmful and exploitative deepfake content, particularly when it targets women.
The mishandling of the “Nth Room” case, with its inadequate punishments and insufficient government action, has allowed similar incidents to recur in South Korea and elsewhere around the world. If this issue is not properly addressed, the cyber-hell experienced by South Korean women could become a global reality.