There’s growing evidence that Russia, China, and the US are harnessing the power of social media influencers for political gain
If you look up #RussianLivesMatter on TikTok, you’ll see that the tag currently sits at 19.2 million views. The more widely used #RLM has racked up 31.2 million. In one video, a woman in pigtails does the Bella Poarch nose scrunch while forming the letter ‘Z’ with her hands, a symbol of support for Russia’s military. Another shows a man on one knee, holding up a sign in English denouncing “Russophobia”, “info wars” and “nationalism”. Many feature identical speeches and on-screen text – in the letter ‘Z’ videos, for instance, female users proclaim that’s “how real women pose.”
These videos were surveyed in a report published by the left-leaning watchdog Media Matters in early March, which uncovered a highly coordinated pro-Russia propaganda campaign involving almost 200 Russia-based TikTok influencers. Posted by accounts ranging from micro-influencers to popular beauty gurus and prank channels, the videos push the gist of the Kremlin’s messaging: that Russia is defending its own people against western-incited aggression in Ukraine and is now being unfairly persecuted in the mainstream media.
TikTok itself has banned posts from Russian accounts in response to the country’s ‘fake news’ law, under which spreading false information about the war is punishable by up to 15 years in prison – yet users have found new ways to bypass the restrictions.
Additional investigative research published by Vice uncovered an anonymously run Telegram channel soliciting Russian TikTok influencers to publish propaganda. In since-deleted posts, the channel’s operators directed influencers on which emojis, text and audio tracks to use, and gave step-by-step instructions on how Russian accounts could circumvent posting restrictions. The operators specified that posts had to go live on a required date and meet a minimum view count, and asked those interested to state their rates.
It’s hardly surprising that the Kremlin has swept TikTok into its attempt to build support for its invasion of Ukraine. The platform, with over one billion users, has already demonstrated its capacity to sway public opinion – just look at the proliferation of ‘debates’ over the COVID vaccine and mask mandates on the app – and Russia is by no means the first state to weaponise social media in this way. China has built a network of social media personalities who promote government policies via their content and parrot the state-approved stance on world affairs. It’s also been reported that the White House briefed 30 TikTok influencers over Zoom last week on the US approach to the Russia-Ukraine war – but according to Zhang Tengjun, deputy director of the Department for Asia-Pacific Studies at the China Institute of International Studies, “the White House’s real purpose is to brainwash those influencers and make them serve US narratives against Russia.”
Though it’s tempting to go back to our daily doom-scrolling without considering that we’re consuming nationalist propaganda, social media theorists and researchers warn that our mindless content consumption amid the ongoing conflict carries a greater weight, and perhaps even a sort of ethical responsibility. Cade Diehm, the founder of the technology research organisation New Design Congress, says that as we engage with propaganda – regardless of whether we agree with a video or not – it’ll be boosted by TikTok’s algorithm and pushed out to more viewers. “Because social media algorithms interpret almost any user actions in very simple terms, where a tap equals positive engagement and a like, react, or share equals stronger engagement, it is easy to automate and manipulate a piece of content’s metrics, essentially ‘training’ a social media algorithm to consider the piece of content as relevant.”
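To make that concrete, here is a deliberately simplistic sketch in Python of the kind of weighted engagement scoring Diehm is describing, and of how coordinated activity can inflate it. The weights, function name and numbers are invented purely for illustration – this is not TikTok’s actual ranking system.

```python
# A toy model of the engagement weighting Diehm describes: every interaction adds to a
# score, with "stronger" signals (likes, shares) counting for more than a passive view.
# The weights and names are hypothetical; real recommender systems are far more complex,
# but the manipulation principle is the same.

ENGAGEMENT_WEIGHTS = {"view": 1, "like": 3, "share": 5}

def relevance_score(interactions: list[str]) -> int:
    """Sum weighted engagement signals for one piece of content."""
    return sum(ENGAGEMENT_WEIGHTS.get(action, 0) for action in interactions)

# Organic audience: mostly passive views, a handful of likes.
organic = ["view"] * 100 + ["like"] * 5
# Coordinated boosting: a small bot farm repeatedly liking and sharing the same post.
boosted = organic + (["like", "share"] * 50)

print(relevance_score(organic))  # 115
print(relevance_score(boosted))  # 515 – the system now treats the post as far more "relevant"
```

In a model like this, a modest amount of automated liking and sharing is enough to multiply a post’s apparent relevance, which is exactly the ‘training’ of the algorithm that Diehm warns about.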
[Embedded TikTok by @hanna.montanas: “Russian Lives Matter #RLM”]
When it comes to world news, the algorithmic effect is amplified. Videos are designed to appeal to emotions such as anger and grief to grab a viewer’s attention and generate more likes and shares. “For content that has a relationship to a major world event, this process is even more accelerated because its spread is supported both by deliberate manipulation of said content’s metrics, and the organic engagement and sharing, driven by the userbase’s interest in the larger news context,” Diehm says. “Unlike other user interests, world events tend to flatten and shrink the proximity between users. All of this means that content has an opportunity to spread rapidly.”
Since TikTok – like social media platforms in general – is incentivised by engagement metrics, maximising usage is prioritised over virtually every other concern, including combating propaganda and the circulation of deepfakes. Diehm points to scale and uniformity as playing a major role in what he calls the “weaponised design of social media systems.” With propaganda and deepfakes displayed alongside reputable news, the blurring of what’s real and what’s fake “creates a crisis of comprehension, where everything is easy to use but making sense of the world through these interfaces becomes difficult and overwhelming.”
TikTok users encounter political content because it’s trending, because it’s been shared with them, or while they’re searching out factual news; at the same time, Russian influencers may feel compelled to join trends simply because they know their videos will be pushed to the top of users’ feeds and easily amass views. “Influencers need to tap into trending content to remain competitive in an attention economy. If they stay neutral they will quickly sink to the bottom of the newsfeed,” researcher Joshua Citarella says.
The popularity of turning to your favourite influencer for information operates in tandem with what Citarella, who studies online political subcultures, has observed to be a growing lack of public trust in mainstream media organisations and institutions. “Brittle narratives and economic anxiety now drive vast numbers of people out onto the web in search of answers,” he explains. “Young people are especially likely to trust disinformation or ‘hidden truths’ encountered over social media as opposed to an increasingly illegitimate mainstream.”
So what are TikTok users left to do but bid Charli D’Amelio farewell and obliterate the app from their iPhones? Well, there are a few precautions they can take first. On and off TikTok, Diehm reminds us, the fundamentals of propaganda are basically the same: “being sensitive to outlandish claims, spectacular footage and tales of extreme heroism should invoke a degree of skepticism.”
“Creators and influencers who renounce their positions or seem contradictory are also subtle cues for skepticism,” he adds. Beyond that, “building media literacy, and identifying and supporting journalists who practice rigorous reporting are all things that empower users to rely less on the appeal of authenticity offered by social media accounts or influencers.”
If you need legit information right then and there, content validation can be as quick and simple as a reverse-image search. That alone, he says, is a “disruptive and simple technique that can jam information warfare efforts.”
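In practice that means dropping a screenshot into a tool like Google Images or TinEye. As a rough code-level illustration of the same idea – not a method described in the article – a perceptual hash comparison can flag when supposedly new footage closely matches older imagery. The library choice, threshold and file names below are assumptions made for this sketch.

```python
# Illustrative sketch only: perceptual hashing approximates what a reverse-image search
# does under the hood – matching visually similar images even after re-encoding or resizing.
# Requires Pillow and the third-party imagehash package; the file paths are placeholders.
from PIL import Image
import imagehash

def looks_recycled(frame_path: str, known_paths: list[str], max_distance: int = 8) -> bool:
    """Return True if a video frame closely matches any previously seen image."""
    frame_hash = imagehash.phash(Image.open(frame_path))
    return any(
        frame_hash - imagehash.phash(Image.open(path)) <= max_distance
        for path in known_paths
    )

# e.g. looks_recycled("suspicious_frame.jpg", ["old_conflict_photo_2014.jpg"])
```

Dedicated reverse-image search engines do this kind of matching at web scale, which is why a single search can quickly expose recycled or miscaptioned footage.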