During my year abroad in France, I learned not only French but also a great deal about French culture, and about French sexism. One particular subset of cyber-sexism strikingly common in France, Belgium, Italy and Turkey, yet unfamiliar to me before I attended a conference organized by my host university, is "Fisha" accounts. Although hearing about yet another gender-based threat posed by the internet is never pleasant, the courage and determination of those who spoke made me feel a little less like that one excavator digging out the massive ship stuck in the Suez Canal. The excellent panel was composed of members of the organization Stop Fisha, as well as members of the European, Belgian and French Parliaments.
The word "Fisha" derives from the French verb "afficher," meaning to display publicly, and carries connotations of public shaming. The act is a type of nonconsensual image abuse in which social media accounts on Twitter, Snapchat, Telegram and similar content-based platforms, with usernames such as FishaParis or Fisha followed by an area code, distribute nude images without consent. These images were originally shared in confidence with an intimate partner, and either that partner or someone else who found them then submitted them to these larger pages. These pages also often dox the girls and women pictured alongside posting their intimate images. A deeper investigation by Stop Fisha even found that these photos were being traded and sold internationally between accounts. With 38% of women having experienced online violence and 85% having witnessed online violence against another woman, cyber violence, and more specifically nonconsensual image abuse, is a gendered issue.
We must, of course, consider this issue intersectionally as well. Studies show that gendered violence disproportionately affects women and girls from less economically privileged backgrounds, women of color, young people, and LGBTQ+ people. These groups are also among those least able to afford or access legal and structural support.
Stop Fisha is an organization based in Paris, composed of students and lawyers, that aims to educate the public and empower those who have fallen victim to this type of violation by tracking down the accounts and reporting them to Pharos and e-Enfance, which both work to tackle cybercrime. The movement started in 2020, during the pandemic, when these accounts began cropping up at an incredibly rapid rate. Run and organized by young people, the generation that has mastered the internet, the organization acknowledges that nude photos will be taken and that realistic measures to educate people about how to do this safely are more effective than simply advising against it. The organization also provides legal support to those who have experienced cyber-sexism or online gender-based violence and hopes to reform the legal system in the process.
One of the main challenges presented by this type of abuse is regulation and accountability. Under Section 230 of the Communications Decency Act, a social media platform is not liable for content published by third-party users. Legislated before the rise of social media, this act allows platforms to avoid taking action to make their services safer. With no direct monitoring of content, the responsibility falls on victims and other users to report inappropriate material, by which time the damage has already been done. Pornhub faced a related scandal, explored in the Netflix documentary Money Shot, in which videos uploaded without the consent of the people depicted remained online for weeks despite being reported. The platform is particularly difficult to regulate because many creators make a living posting their own content, which should not be censored. Both Fisha content and the Pornhub scandal reflect how the traffic and advertising these platforms run on are directly at odds with a model that would protect women by taking down popular content. Incentivising platforms to take responsibility for what is posted on them is one of the main ways to ensure that regulating inappropriate and nonconsensually uploaded content is not left to NGOs (non-governmental organisations) and victims of nonconsensual image abuse. Putting pressure on even one platform can have a significant impact on the free market of social media, where reputation is critical for user engagement.
The fact that these acts take place in a liminal space between the analogue and digital worlds perhaps makes the consequences feel less tangible. Harmful comments or direct messages are presented as an almost inevitable consequence of being online, while similar comments made in person would never be brushed under the rug. This can also be considered through a gendered lens, by which gender-based violence is often dismissed and normalised. As a result, the consequences of online behaviour are minimal. The vastness of the internet, coupled with the normalisation of online harassment, means that harassment and violence online are rarely reported, and even fewer cases end up being punished. The international reach of the internet and the current lack of an internet-wide law further complicate the enforcement of consequences. With few consequences for sexist comments and inappropriate direct messages, social media platforms and wider society are also condoning macro behaviours such as nonconsensual image abuse and Fisha-like accounts.
Written by Ellen Gisto.