Logan Paul filming a dead body for views is a predictable outcome of shock-for-clicks culture
If we are to learn anything from YouTube’s recent Logan Paul controversy, it’s that something like it was inevitable. Content creators know that shock and disgust are what get them trending, and when views are high, so are the earnings.
After the influencer uploaded a vlog that showed a dead body in Japan’s Aokigahara (aka the “suicide forest”), the company, after several days of inaction, has now cut business ties with Paul, meaning he is no longer one of its promoted creators. But while YouTube appears to have made good on its promise to explore “further consequences” for Paul, he is still free to continue creating, and making money, on the site. (YouTube declined to comment for this piece.)
Critics have wondered whether the company has lost control, or whether it is simply reluctant to exercise its power. Not only has Paul avoided any real penalty, but the week of the scandal was actually his most successful since joining the site. On the day the suicide forest video went live, he gained over 17,000 subscribers; the next day, that figure jumped to more than 81,000. Even though he reportedly chose not to monetise the original video, his apology vlog, which amassed 28 million views (four times his normal average), cashed in. It’s estimated it could have earned him between $12,000 and $97,000, and YouTube benefits financially too. This is due, in part, to the algorithm that pushes widely shared videos to the forefront, but he can’t amass views through algorithms alone. He needs us for that.
Nathaniel Tkacz studies the politics of technology and teaches Digital Media and Culture at Warwick University. He thinks the problem runs far deeper than Paul, and lies in the ethos and mechanics of the site itself. “Not only is our media designed to take advantage of our cognitive weaknesses, but there is an active science devoted to studying and measuring what works and what doesn't,” explains Tkacz. “This behavioural design is now working at the level of large populations and through automated techniques. We're directed through 'user journeys', and 'funnelled' this way and that, often without our knowledge. Ours is the age of the invisible nudge.” ‘Nudging’ is shorthand for the ways designers and developers encourage you to behave in a certain way online; it isn’t exactly force, but you are probably largely unaware of it happening.
For content creators like Logan Paul, being rewarded by the algorithm means playing on intense emotions to stay ultra-visible on YouTube. Creators see how well happy vlogs do, how people respond to sadness, and how lucrative disgust is. Paul himself gestured at this in the suicide forest vlog: when he said, “That's the life, this daily vlog life guys. I have chosen to entertain you guys every single day,” he effectively blamed the content cycle for his need to push extreme boundaries. “Platform media is built on harnessing emotion and affect, on channelling desires. We shouldn't be surprised that there is a darker side to all this,” explains Tkacz.
However, Tkacz thinks that blaming extreme content on the need to be seen and shared is a cop-out. “One of the things algorithms have done is complicate the notion of responsibility. In the coming years, we will need to figure out a better way to hold algorithms and their makers to account – especially since they do play a major role in the circulation of content, and therefore in the unfolding of contemporary culture.”
“Not only is our media designed to take advantage of our cognitive weaknesses, but there is an active science devoted to studying and measuring what works and what doesn’t. Ours is the age of the invisible nudge” – Nathaniel Tkacz, Digital Media and Culture expert
There’s certainly evidence to support the thesis that, as views become a more valuable currency than artistic merit, the trajectory of internet scenes is changing. Controversial “SoundCloud rappers” like 6ix9ine have garnered millions of views on YouTube in their explicit quest to go viral; in 6ix9ine's case, even as news buzzes about a sexual assault allegation. The Paul scandal exhibits the same pattern: controversy begets virality, and an increase in views leads to reaction videos and monetised spin-off vlogs from creators looking to stay visible and game the algorithms.
In recent years, concerns have been raised about how YouTube responds to, and even benefits from, this shock-for-clicks culture. Take the lucrative “sharenting” trend, in which children are exploited as the stars of YouTube channels. One example is Michael and Heather Martin, parents who lost custody of their children last May and were sentenced to five years’ probation over abusive prank videos. The Martins are still vlogging, and have even made a special Logan Paul forgiveness vlog, wishing him “the best of luck”. PewDiePie also chimed in to call Paul “a sociopath”, despite his own history of blatant antisemitism and open use of the word “nigger”.
Despite all of these controversies, YouTube has repeatedly failed to take meaningful action. Of course, the company’s refusal to shut down the channels of controversial creators is tacit support. But what about those of us who keep watching?

We, as users, may not be fully aware of how we interact with content, but popular sites certainly are. In early 2012, Facebook data scientists quietly manipulated the news feeds of nearly 700,000 users: some people were shown happier content and positive words, others saw sadder-than-average posts. By the end of the experiment, the researchers found that users were more likely to post especially positive or negative words themselves, depending on what they had seen. This “emotional contagion” study sparked outrage from critics when it came to light in 2014, but it showed how deeply platforms study and manipulate human behaviour to increase engagement.
Jessa Lingel, a professor studying digital culture at the Annenberg School for Communication, adds that it’s not as simple as “blaming” YouTube in these circumstances. “YouTube and Facebook have committed to hiring more people to monitor offensive content. Shock and disgust are very subjective emotions and responses, so it's difficult to come up with a firm set of rules around what constitutes [them]. As a general rule, shock and disgust result from violating social norms. But even within a single country or city, social norms can vary by age and background,” she says. “Social media companies are definitely aware of this problem and actively struggling with how to cope.”
“We need to figure out a better way to hold algorithms and their makers to account – especially since they do play a major role in the unfolding of contemporary culture” – Nathaniel Tkacz
Ultimately, we should all be aware that watching videos we find disgusting or reprehensible only rewards disturbing behaviour. “Responsibility lies with different parties: the users uploading the content, for example, and the platforms gaining directly from this content while doing too little to educate their users,” affirms Tkacz. “We can also ask more difficult questions, such as why we are attracted to watching live beheadings, horrific car crashes, or suicides.”
Calls to clean up the internet are almost as old as the internet itself, dating back to the days of rotten.com and early internet porn. “Initially, I don't think platforms like YouTube cared that much,” explains Tkacz. “But a lot has happened in the last decade. Think Gamergate, the alt-right, the Silk Road, #MeToo, and so on. People are no longer willing to see the platforms as neutral mediators of social life. Whether it's ‘fake news’ or the knowledge of widespread behavioural targeting during elections, I think there will be much more of a push to hold the platforms to account.”
How YouTube responds to this flurry of scandals will set the tone for user-generated content across the web. Yet it’s a mammoth task to discourage the macabre human urge to watch the shocking and absorb the taboo. Every time the company fails to act, we’ll only see more outlandish vlogs and copycat Logan Pauls. In the meantime, it’s up to all of us to recognise our role in the mechanism. Throwing ourselves at the mercy of algorithms without questioning our own part in the site (as viewers or creators), and without monitoring our own behaviour, allows our culture to be driven to senseless extremities by soulless code.
Update: YouTube has temporarily suspended ads on Logan Paul’s channels after he released videos showing him tasering dead animals and giving CPR to a fish. “In response to Logan Paul's recent pattern of behaviour,” the platform tweeted on Friday, “we've temporarily suspended ads on his channels.”