Brief news summary
An analysis has found a 550 percent increase in the number of deepfake videos discovered online this year compared with 2019, the majority of them deepfake pornography. The surge is attributed to advances in technologies such as generative artificial intelligence (AI), which can generate or alter video, image, and audio content to mimic real individuals or situations. Victims of deepfake pornography have described its devastating impact on their lives; one called it a "lifelong sentence". Machine learning algorithms produce hyper-realistic deepfake content that malicious actors can use to target victims, for blackmail, and for criminal or political manipulation.

According to a comprehensive 2023 report on deepfakes, deepfake pornography accounts for 98 percent of all deepfake videos found online, and 99 percent of the targeted victims are women. The report was produced by analysts at homesecurityheroes.com, a website that aims to protect people from online identity fraud. Their study examined 95,820 deepfake videos, 85 dedicated online channels, and more than 100 websites associated with the deepfake ecosystem, culminating in the 2023 State of Deepfakes report. Among the key findings: it now takes less than 25 minutes, at no cost, to create a one-minute deepfake pornographic video of an individual from a single clear image of their face. The analysis also found that the majority of deepfake pornography videos targeted South Korean women.
After South Korea, the nationalities most targeted by deepfake pornography were the United States, Japan, and the United Kingdom. The report's analysts attribute South Korea's position at the top of the list to the global popularity of K-pop: among the ten most targeted individuals were three of the four members of Blackpink, South Korea's biggest girl pop group. Chris Nguyen, head of research analysis for the report, explained that K-pop idols have extensive visibility and fan bases both domestically and internationally, making deepfake pornography created from their likenesses likely to reach larger audiences. Demand for content featuring K-pop idols is also exceptionally high, and some exploit this by generating explicit deepfake content, particularly on dedicated adult websites, to attract attention and increase traffic. Nguyen further suggested that South Korea's strict regulations on pornography likely drive the creation and distribution of such content through the Streisand effect, in which attempts to conceal or censor information end up increasing awareness of and interest in it.

The analysts also discovered that seven of the ten most visited pornographic websites hosted deepfake content, indicating how widespread this material has become. They emphasize the need for discussion of attitudes towards deepfake pornography, with a focus on the ethics of its creation. The researchers attribute the sudden rise in deepfake content to two factors: the emergence of generative adversarial networks (GANs), a type of machine learning framework for generative AI that enables the production of deepfake content, and the easy accessibility of tools built on top of GANs, which allow almost anyone to create deepfake content quickly and inexpensively.