Before the phrase ‘fineshyt’ was everywhere this summer, Lazer Dim 700 was using it in his videos. The American rapper has said he first heard it where he’s from in Georgia. The spelling – or intentional misspelling – is a product of algospeak, the language people develop to avoid content moderation by social media algorithms that automatically detect and censor certain words or phrases, like using “unalive” instead of “kill” and “getting cracked” or “segs” for “sex”. Where once you could have called Amaarae’s summer anthem “Fineshyt” a “bop”, the word now means something entirely different online.

“Right now, the word ‘bop’ is trending as a substitution for the phrase OnlyFans creator or online sex worker,” says linguist Adam Aleksic, author of Algospeak. “People use that as a slang term, but they don’t understand that it emerged and got popularised because people were trying to circumvent social media guidelines.”

While each algospeak adjustment may seem small, the continued attempt to prevent content from being flagged online is changing our relationship to language. It’s a fluid process of popularisation and adoption: the more people use particular phrases, words or spellings, the more the algorithm rewards them, and, in turn, the more people see them (and use them). It’s also changing how we discuss so-called taboo content and ideas. “It’s an example of real-time survivorship bias, because you build a language of what’s represented,” says Aleksic. “People just start to adopt the censored versions as part of their everyday vocabulary, probably in many cases where they don’t need to censor themselves.” This, of course, doesn’t have major ramifications for many trending words like ‘fineshyt’. But what does it mean to engage with political ideas through algorithms that reward self-censorship?

Linguistic change has always been inevitable – poets and philosophers have been worried about the decline of the English language for decades – so the conversation about whether algospeak is “ruining” vocabulary seems futile. Language has always evolved, and even if the internet is speeding that process up, most complaints come down to a preference for a past way of doing or saying things.

What is worth considering with algospeak, however, is the way it’s shifting political action, especially in the context of the political climate today. As President Trump purges the federal government of “woke” initiatives and words (like DEI, transgender and Black), and moves to clear the path to sell TikTok’s US assets to a group of American investors, the reality of a media-by-oligarchy has become even more extreme. On platforms run by figures on the global right, like Elon Musk’s X, Aleksic suggests that some people are adopting algospeak sardonically, using self-censorship as a form of power. “Putting an asterisk in the middle of the words ‘Trump’ or ‘men’ is also an act of reclamation,” he says.

Ask people across political lines who “owns” or dictates language online or in popular culture, and you’ll get different answers. Those on America’s right have claimed that “woke liberals” are obsessed with censorship, inhibiting free speech through cancel culture. Those on the left might point to the people who own the popular platforms – like Musk and Mark Zuckerberg (who has been cosying up to Trump) – and to Trump’s scare tactics, which purposely deter people from associating with left-wing groups and ideas. More recently, however, after Trump’s DEI word bans, the suspension and expulsion of students protesting the genocide in Gaza at universities like Columbia, and the suspension of “Jimmy Kimmel Live!” after backlash over Kimmel’s remarks on the assassination of Charlie Kirk, there’s been more talk about “the woke right”, who want to limit or ban liberal speech.

Nancy Costello, director of the First Amendment Clinic at Michigan State University College of Law, consults with students about speech and press issues on campus. She says that, while universities have always been conscious of protests, there has recently been more outrage among students witnessing Trump’s clampdown on LGBTQ+ issues and DEI programs. “After Donald Trump started running for president, First Amendment issues became front and centre,” she says. “It started with people with more conservative voices saying we feel like the liberals have shut down our voices, then he was suggesting that if someone wants to do something about it, he’d have their back. The question became whether he was inciting violence or pushing the size of the envelope a lot in terms of free speech, which trickles right down into schools.”

Costello says that, for speech on campus to be shut down, it has to be considered disruptive to the school’s educational environment – a standard that also covers speech that could foreseeably be disruptive, judged partly on past history. “I’ve had some schools shut down pretty passive protests, like dressing in the colours of the Palestinian flag,” she adds.

Online, where the watermelon emoji has come to stand for solidarity with Palestine, the potential acts of suppression are even more blurry and confusing: Aleksic describes what the algorithm does and doesn’t do as a “black box”. “Nobody really understands what is happening, so people tend to hypercorrect and engage in speculative self-censorship,” he says. “We always try to accommodate our communication for the audience we think we’re getting, so we’re performing for several things at once.” There’s a 2017 research paper that calls this the “algorithmic imaginary”.

Right now, amid the shift away from DEI speak, Aleksic says content guidelines have loosened on platforms like Meta, to the point that there’s more racist content coming through. “You can just say the N word on Instagram reels right now and it doesn't suppress you,” he says. “That’s not something that was the case before, but these small platform tweaks, which are responding to political shifts, actually do dramatically impact the reality we perceive on social media.”

At the same time, many marginalised communities are forced to use algospeak because their identities are treated as inherently political or tied to “controversial” topics. For example, lesbians on TikTok would call themselves “le dollar bean” in early 2020, until another form of algospeak took over: “WLW”. In countries that censor conversations around LGBTQ+ rights, like Russia, or amid Trump’s current war on America’s transgender community, this form of self-censorship can feel like a necessity for finding community online without potential repercussions. This spreads across the internet and, therefore, the world. After all, TikTok’s local moderation guidelines have been found to ban pro-LGBTQ+ content, even in countries where homosexuality has never been illegal.

Language serves as both a tool for asserting and reflecting identity, which means it easily becomes a tool through which people seek to gain political power. “The act of who gets to use a word and what context it gets to be used in is a deeply political thing,” says Aleksic. “But the people who control the platforms don’t even care about politics as much as staying relevant to the cultural moment, so they can keep making money.”

Here’s where it ultimately comes down to attention. “Algorithms are changing our language; they’re also incentivising attention-grabbing mechanisms that are then replicated as language,” adds Aleksic. Since algorithms work better with more labels, they reward us for using ever more algorithmically minded categorisations, which can end up reinforcing restrictive ideas around identity. They also often reward self-censorship with more engagement and views.

Lal Zimman, an American linguist and associate professor of linguistics at UC Santa Barbara, says that over the past decade they’ve noticed a shift in public awareness of and interest in how language relates to trans people. “It’s only when trans people’s critiques and creative ways of using language that are more affirming have started to be taken seriously by people outside of trans communities, that we’ve seen a really strong push back and people feeling the need to potentially censor themselves,” they say. This year, a memo was released on which words Democrats should avoid using, as they apparently turn voters off. Still, the focus remains on singular words alone, without context or education. “It’s almost a search-and-replace thing instead of thinking about the message behind it,” says Zimman. “There are a lot of populations, like working-class people, whose use of language is denigrated, who are not listened to as a result. So we need a much broader approach to recognising the impact that language has in people’s lives.”

Ban a word on social media and people will create a misspelt, emoji-riddled hashtag for it, because ideas and movements can and will transcend any singular word. “Language has always been part of targeting social justice movements,” says Zimman. “But I think there’s been a real failure by some progressives to really explain why we care about language in the ways that we do.”

How we engage with language, including new and seemingly silly words, can impact people’s lives and inform how we understand reality. Algospeak may be a tool to play into or bypass social media algorithms – self-censoring unknowingly to be part of a fun new trend, or intentionally to find community or spread the word about a belief or cause – but all of this only exists within the algorithmic imaginary, which is controlled by the ultra-rich. Our imaginations, however, have the potential to be much more expansive. As Zimman says: “The whole conversation is really focused on words, right? And that shapes how we think about language.”
