Content Note: This article makes references to nonconsensual image abuse
I'm sure most of us have by now seen the entertaining deepfake video in which Barack Obama calls Donald Trump a "total and complete dipshit." The video sparked an interesting debate about vigilance toward online sources, and many pondered the ways politics could suffer in the face of this new technology. What was not at the forefront of this debate, however, was the other 90-95% of deepfakes on the web: mostly nonconsensual pornography, 90% of which portrays women.
For those less familiar with "deepfake porn," the term refers to the practice of transposing a person's face onto an image or video of a sexual nature, thereby producing a pornographic video that the person who appears to be in it never consented to. As AI technology in this field unfortunately evolves, only one photo of a person is now needed to produce a video. To gauge the scope of this epidemic, an AI bot that gained particular popularity across Russia in 2020 is said to have created nonconsensual deepfake porn using the faces of over 680,000 women. These fake videos have very real and widespread consequences: they completely violate the consent of those affected and open a massive new avenue for violence against women.
As expected, the pornographic industry is not one to self-regulate, especially when it comes to protecting women's rights and safety. Instead, the industry has taken advantage of deepfakes' potential. One website that uses AI technology to undress women in clothed images had more than 38 million visits in 2021. Other horrific pages have emerged that generate videos for users within minutes of their uploading a photo of whomever they want deepfaked. Although pages focused on celebrity deepfake porn get millions of hits daily, which is problematic in and of itself, the disturbing thing about this new technology is that anyone who has ever taken a digital photo could be next. Deepfake porn may also prompt reflection on broader social attitudes: the apparently popular desire to impose the face of a celebrity, or of a woman you know, onto a (probably violent) porn video reflects patriarchy and rape culture. The debate surrounding the ethics of pornography is already complex without this additional nonconsensual violation.
Although mainstream sites such as Pornhub banned deepfakes back in February 2018, the internet is unfortunately vast, and deepfakes are only growing on underground sites. In fact, 94% of deepfake pornography is found on specially dedicated websites. For celebrities especially, it is near impossible to know of every video, and even harder to claim copyright on all the photos used. It is also challenging to legislate the point at which a video or image appears to depict an individual who has not consented to it. The pornographic videos used are often found on mainstream sites, published with consent, and the photos imposed on them have also often been published on the internet with consent. At what point does joining these videos and photos together violate that consent? The law therefore falls short even for "normal people" whose photos have been abused in this way. Under current legislation, a photo shared without consent must include the person's genitals or breasts to be considered revenge porn; but what happens when an image is edited so that it appears this way but, in actuality, isn't? The failure of the law to catch up with contemporary issues allows too many cases to fall through the cracks.
To end on a more hopeful note, some states, as well as the UK, have taken steps to legislate against deepfake porn. Since 2019, California has had two bills targeting deepfakes, one of which allows individuals to sue if their images are used in pornographic material. Virginia's revenge porn law has also covered deepfakes and photoshopped images since 2019. Since November 2022, the UK has likewise discussed amending its revenge porn laws to include deepfake content. Until the laws catch up to reflect the contemporary ways technology is being used in violence against women, we can only continue to educate and stay educated about nonconsensual pornography and how to report it.
Written by Ellen Gisto.