Speaking in Code: Algospeak and Its Implications in Sexual Health

What Is Algospeak, and Why Are We All Speaking It?

Picture this: you're sitting with something difficult–maybe a health experience you've never talked about out loud, a part of your identity that you feel passionate about sharing, or a question about your sexual health that you've been quietly carrying for years. You decide to post about it. You choose your words carefully. You hit share. And within hours, the post is gone, or worse, it's still there, but nobody's seeing it, quietly buried by a system that flagged it before it ever reached anyone who may have key insight to share, or could have benefited from what you had to say.

This is the everyday reality for millions of people navigating content moderation on social media platforms, and it's the reason a whole new kind of language has emerged: algospeak.

Short for "algorithm speak," algospeak refers to the coded substitutions people use online to talk about sensitive topics without triggering automated content moderation filters. Instead of writing "sex," someone types "seggs" or "se//." The meaning is preserved, but the word itself slips past the algorithm. "Unalive" substitutes for "suicide." "Corn" could refer to "porn." "Cornucopia" stands in for "homophobia." There are plenty more examples! The ingenuity is genuinely impressive: these workarounds aren't accidental typos or lazy shorthand. They're a living, evolving vocabulary born in response to censorship.

Here at the Sexual Health Alliance, we know this firsthand. Many of our own social media pages–including our Instagram and TikTok–rely on algospeak regularly. When we want to post educational content about sexual health, we'll write "se//ual" instead of "sexual," or find other creative ways around filters that weren't designed with health educators in mind. 

So, let's dive in! A fair warning, though: what follows is just my opinion as a content writer and everyday media consumer who sees algospeak in action both on and offline.

The Case for Algospeak in Sexual Health

There's something worth appreciating in algospeak's existence, even setting aside the frustration of needing it in the first place. It has become a lifeline for communities that content moderation systems have historically treated as collateral damage, like LGBTQ+ creators, mental health advocates, harm reduction workers, and sexual health educators, among others. Not because they're doing anything harmful, but because automated systems lack the nuance to distinguish a post glorifying dangerous behavior from one trying to prevent it.

Some of the clearest benefits (again, in my opinion):

  • It reaches people who need it. A harm reduction worker discussing drug safety, or a sexual health educator explaining consent, can often get that information to their audience through coded language even when plain-text versions get removed.

  • It protects marginalized voices. Communities that are disproportionately shadowbanned (LGBTQ+ users, sex workers, mental health advocates, and so on) can continue talking about their own lives without constant fear of deplatforming.

  • It builds community. Shared coded language creates in-group solidarity. The word "unalive," for example, has become embedded enough in certain online mental health spaces that it carries its own texture. It has become more than a quick, coded alternative, now encompassing a whole context of shared experience.

Language has always evolved in response to social pressure. Algospeak is just a very contemporary example of that process. I have to acknowledge and admire the creativity of it.

The Problems It Creates

That said, algospeak is not a clean solution. It trades one set of problems for another, and it's worth being honest about that.

The most fundamental issue is that it creates a two-tiered communication system: those who are fluent in the code, and everyone else. And "everyone else" includes people who matter quite a bit.

  • It excludes outsiders. Older generations, parents, researchers, and journalists trying to understand what's being discussed online can find algospeak genuinely opaque. That gap has real consequences, especially in mental health contexts where a parent might miss warning signs entirely. As a researcher, I see this firsthand. Academic research and publications are constantly trying to keep up with new language to understand broader patterns of human behavior, but the rigorous research and dissemination processes will never be fast enough to stay current.

  • It doesn't actually solve censorship. If content can only reach its audience by disguising what it's about, it's still effectively censored. It just slips through less efficiently for a while. 

  • It's a race with no finish line. The same techniques that help a sexual health educator reach her audience also help bad actors evade detection. Platforms update their filters. The language shifts again. How long before "se//" gets flagged too? How many layers of substitution until the original meaning is unrecognizable?

  • It can obscure harm as well as health. Using coded language to discuss really important topics–especially sexual and mental health–can send the wrong message. Some people may internalize it as something that should be hidden or disguised, while others may become desensitized, taking these topics less seriously than I would argue they should.

  • It fosters distrust in platform policies. When everyone knows the workaround, the rules start to feel arbitrary, which makes it harder to build the kind of trust that good moderation actually requires.

Where This Leaves Us as Sexual Health Professionals

There's no clean verdict on algospeak. Whether it's net positive or negative depends almost entirely on who's using it and why.

What it points to, ultimately, is a problem coded language alone can't fix. Automated moderation systems need to get better at distinguishing between communities discussing difficult topics in good faith and content that actually causes harm. Until they do, people will keep finding workarounds, and the language we use online will keep evolving in ways that leave some people in the loop and others completely lost.

For us in the sexual health space, I would argue that algospeak is a tool we use because we have to. The goal has always been open, clear, accurate information about sexual health for everyone. Algospeak gets us part of the way there. 

Want to become an in-demand sexual health professional? Learn more about becoming certified with SHA!


Written by Jesse John, B.S. 

Jesse is a clinical psychology doctoral student at Rowan University in New Jersey. Their research focuses on sexual decision-making, sexual violence, and relationship experiences. The author identifies as a Queer, neurodiverse, white, non-binary person, which informs the way they write and see the world!