Bumble Open-Sources Its AI Tool to Fight Unwanted Cyberflashing

Bumble's Private Detector, its artificial intelligence tool, protects users from lewd photos. It screens images sent by matches to check for inappropriate content. The tool was designed to catch unsolicited nudes and to flag shirtless selfies and images of guns.

(Photo: ERIC BARADAT/AFP via Getty Images)

In a recent blog post, the dating app announced it will open-source Private Detector and make the framework available on GitHub. The goal is for the wider tech community to adopt it and help make the internet a safer place.

The Rise of Unwanted Cyberflashing

Unsolicited explicit content has become increasingly prevalent in the last couple of years. According to one study, almost two-thirds of people admit to sending provocative content, and the same researchers found that 25% of people reported receiving a sext.

Websites like Reddit and Twitter have been cracking down on online harassment and unwanted sexual content, but the problem persists, and the digital dating world is especially susceptible.

Also Read: Bumble Security Flaw Exposed | Stalkers Can Find Victims' Locations

No More Unwanted Nudity with Artificial Intelligence

Bumble's Private Detector has been widely regarded as an effective tool for removing explicit material from chats.

In the announcement of its open-sourcing, Bumble said it already uses a mix of machine learning, natural language processing, and human moderation to keep all users safe.
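The layered approach Bumble describes, automated screening backed by human moderation, can be illustrated with a minimal sketch. The function name, thresholds, and routing labels below are illustrative assumptions, not Bumble's actual system.

```python
# Illustrative sketch of layered content moderation: a machine-learning
# classifier scores content, confident detections are handled
# automatically, and borderline cases are escalated to human moderators.
# All thresholds and labels here are hypothetical.

def route_content(ml_score: float,
                  auto_block: float = 0.95,
                  needs_review: float = 0.6) -> str:
    """Route a piece of content based on a model's risk score in [0, 1]."""
    if not 0.0 <= ml_score <= 1.0:
        raise ValueError("score must be between 0 and 1")
    if ml_score >= auto_block:
        return "blocked"        # model is confident: act automatically
    if ml_score >= needs_review:
        return "human_review"   # borderline: queue for a moderator
    return "allowed"            # low risk: deliver normally

print(route_content(0.98))  # -> blocked
print(route_content(0.70))  # -> human_review
print(route_content(0.10))  # -> allowed
```

Keeping a human-review tier between the two thresholds is a common way to trade off false positives against moderator workload.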

As Bumble has shown, the best technology, put in the right hands, can have a positive impact on the safety and well-being of everyone online.

Bumble is aware that people are still being harassed on dating apps. In an effort to make the platform safer, the company is working on new safety tools. Even though the dating app has built a community dedicated to removing all forms of disrespect, there is still more to be done.

Since the technology is open-source, outside developers can contribute algorithms that identify problematic photos more accurately, helping the dating app further improve Private Detector.

Additionally, with more developers contributing, the technology could become more sophisticated: the code might, for example, learn finer-grained definitions of nudity and better differentiate between innocent and explicit photos.
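A classifier like Private Detector ultimately produces a probability that an image is explicit, and the app must decide what to do with it. The sketch below shows that decision step only; the threshold and action names are assumptions for illustration, not Bumble's real configuration.

```python
# Minimal sketch of acting on a lewd-image classifier's output.
# The 0.9 threshold and the action names are illustrative assumptions.

LEWD_THRESHOLD = 0.9  # hypothetical confidence cutoff

def screen_image(lewd_probability: float) -> str:
    """Decide how to handle an incoming image given the classifier's
    estimated probability that it is explicit."""
    if not 0.0 <= lewd_probability <= 1.0:
        raise ValueError("probability must be in [0, 1]")
    if lewd_probability >= LEWD_THRESHOLD:
        return "blur_and_warn"  # hide the image until the recipient opts in
    return "deliver"            # show the image normally

print(screen_image(0.97))  # -> blur_and_warn
print(screen_image(0.12))  # -> deliver
```

Blurring rather than deleting lets the recipient decide, which matters when the model is uncertain.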

Bumble and other dating apps can continue to innovate in the realm of online privacy and safety. Their efforts to address cyberflashing and other forms of cyberbullying are part of a larger movement to protect their users.

Users are concerned about the privacy of their online activity, and with the help of Bumble and companies like it, users can expect better protection online.

In the end, the goal is to keep the internet safe and to promote the respectful treatment of one another across the world. Do you think open-sourcing Bumble's Private Detector is a good idea? Let us know what you think in the comments below!

Related Article: Dating App Guide: How to Change Your Location in Tinder, Bumble, and Hinge [2022]

This article is owned by Tech Times

Written by April Fowell

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.