Good News in Lewd Land: Dating App Bumble is Open Sourcing its AI That Detects Unsolicited Nudes

If you’ve ever spent time on social media or a dating app, you may have experienced the dread of receiving an unsolicited nude.

Dating app Bumble uses an AI tool to detect unsolicited nudes and automatically blur them, letting users on the receiving end decide whether or not to view them uncovered.
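Bumble hasn’t published its pipeline in this article, but the detect-then-blur-then-decide flow it describes can be sketched in plain Python. The classifier score and threshold below are hypothetical placeholders, not Bumble’s actual model; only the blur step is real arithmetic.

```python
def box_blur(image, radius=1):
    """Naive box blur over a 2D grid of grayscale pixel values (0-255)."""
    h, w = len(image), len(image[0])
    blurred = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0, 0
            # Average each pixel with its neighbors inside the radius.
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += image[ny][nx]
                        count += 1
            blurred[y][x] = total // count
    return blurred


def moderate(image, lewd_score, threshold=0.5):
    """Blur the image when the (hypothetical) classifier score crosses the
    threshold; the recipient can still choose to see the original."""
    if lewd_score >= threshold:
        return box_blur(image), True   # delivered blurred, with an "uncover" option
    return image, False                # delivered as-is
```

The key design point mirrored here is that the image is never deleted or withheld, only obscured, so the final decision stays with the recipient.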

And now, as part of a larger commitment to combat “cyber-flashing,” Bumble is sharing its success.

Bumble is open-sourcing Private Detector (which is a hilarious pun of a name), a tool that first debuted in 2019, on GitHub. This will open the program up for commercial use, distribution, and modification.

While it’s not exactly the most complicated software in the world, open-sourcing Private Detector (still laughing about that one) gives smaller companies with fewer resources access to the technology so they can protect their own users.

Since first releasing Private Detector in 2019, Bumble has worked closely with US legislators to establish legal consequences for people who send unsolicited nudes.

Bumble said in a press release, “Even though the number of users sending lewd images on our apps is luckily a negligible minority – just 0.1% – our scale allows us to collect a best-in-the-industry dataset of both lewd and non-lewd images, tailored to achieve the best possible performances on the task.” The company added, “There’s a need to address this issue beyond Bumble’s product ecosystem and engage in a larger conversation about how to address the issue of unsolicited lewd photos – also known as cyber-flashing – to make the internet a safer and kinder place for everyone.”

Bumble claims Private Detector offers an impressive 98% accuracy.
