Anyone who’s used a dating app knows that it can sometimes be a cesspit of unsolicited dick pics, pretentious softboy musings, and derogatory comments. Now, Bumble is set to tackle the latter, with a new ban on users who body shame their matches.
The app, which gives women control over who they talk to, will block users who write derogatory comments about people’s physical appearance in their profile details or in conversations with their matches.
Bumble will use an algorithm to flag certain terms, including language deemed fatphobic, ableist, racist, homophobic, or transphobic. Moderators will then review the flagged accounts and determine whether further action is needed. Accounts that use body shaming language will first receive a warning, but repeated incidents will result in the user being permanently removed from the app.
“At Bumble, we have always been clear on our mission to create a kinder, more respectful, and more equal space on the internet,” Naomi Walkland, the app’s head of UK and Ireland, said in a press release. “Key to this has always been our zero-tolerance policy for racist and hate-driven speech, abusive behaviour, and harassment.”
“With these changes, we’re making it clear that body shaming is not acceptable on Bumble,” Walkland continued. “We always want to lead with education, and give our community a chance to learn to recognise this language and improve. However, we will not hesitate to permanently remove someone from the app.”
New research by Bumble found that 23 per cent of British people have been body shamed on dating apps or social media. 87 per cent said they feel dating is a space where you’re judged on your physical appearance more than in other areas of life, while 71 per cent said people are more likely to make unsolicited comments online.
The body shaming ban is Bumble’s latest move to make its app a safer space for users. In 2016, it banned shirtless bathroom mirror selfies, while in 2019, it launched an AI feature that automatically detects and blurs unsolicited nudes.