Dating can be hard enough at the best of times, let alone when it has been pushed further into the virtual world by the seemingly never-ending lockdowns of 2020. Always with women in mind, Bumble has announced that it has banned body-shaming from its app—a welcome move to make the dating experience safer and kinder for all.
Bumble has updated its terms and conditions to explicitly ban unsolicited and derogatory comments about someone’s appearance, body shape, size or health, and the app will use an algorithm to filter out language that can be deemed fat-phobic, ableist, racist, colourist, homophobic or transphobic.
“We have always been clear on our mission to create a kinder, more respectful, and equitable space on the internet, and our zero-tolerance policy for racist, harassing and hate-driven speech is an important part of that,” says Lucille McCart, Associate Director, APAC PR + Comms at Bumble. “We believe in being explicit when it comes to the kind of behaviour that is not welcome on our platforms and we’ve made it clear that body-shaming is not acceptable on Bumble.”
The move comes amid growing concern about abuse on dating apps. In a recent Bumble survey of 1,400 single Australians, 64 per cent of users said that people are more likely to make unsolicited comments about their body online, and 45 per cent stated that someone they have dated has made an unsolicited comment about their body, either in person or online.
“Our moderation team will review each report and take the appropriate action,” McCart said. “We always want to lead with education and give our community a chance to learn and improve. However, we will not hesitate to permanently remove someone who consistently goes against our guidelines.”
The app uses automated safeguards to detect comments and images that go against its guidelines and terms and conditions, which can then be escalated to a human moderator to review. Bumble users can also report someone for body shaming within the app’s Block + Report tool.
It’s also not the first tool Bumble has introduced to help protect its users.
In 2017, the company partnered with the Anti-Defamation League to ban all forms of hate speech, hate symbols and harassment. This was followed by a ban on photos with guns for people who are not law enforcement or veterans after a number of U.S. shootings.
In 2019, Bumble introduced Private Detector, a feature that uses artificial intelligence to automatically detect and blur unsolicited nude images. The feature then alerts the recipient who can choose to view, delete, or report the image.
“Find something else about their profile to talk about. Or, if you’re not interested in someone, you can swipe left,” the company added in a blog post this week. “If you’re not sure if a message will come across as body shaming, a good rule of thumb is simply not to comment on another user’s body or health at all.”