Bumble is excited to share a version of the company’s premier safety feature, Private Detector™, with the wider tech community. Now available on GitHub, the release allows other tech companies to adapt it and build features that enhance safety and accountability against online abuse and harassment.
Since the early days of Bumble, we have cared deeply about community safety and invested heavily in providing the tools people need to have a safe experience across Bumble Inc.’s family of apps.
In 2019, Bumble launched Private Detector across the Bumble and Badoo apps in response to the rise of cyberflashing, the digital sending of unsolicited lewd photos. Private Detector is an AI-powered feature that automatically detects and blurs lewd images and warns users about the photo before they open it.
Rachel Haas, Bumble’s VP of Member Safety, stated: “At Bumble, safety is at the heart of everything we do, and we want to use our product and technology to help make the internet a safer place for women. In 2019, we led the charge by designing Private Detector and taking a clear stance against online harassment. Open-sourcing this feature is about remaining firm in our conviction that everyone deserves healthy and equitable relationships, respectful interactions, and kind connections online.”
Outside of the product ecosystem, Bumble has also been a strong advocate for passing bills that ban cyberflashing. When Private Detector launched in 2019, Bumble teamed up with legislators across the aisle in our home state of Texas to pass HB 2789, which made sending unsolicited lewd photos a punishable offense. This year, Bumble reached another milestone in public policy by helping to pass SB 493 in Virginia and, most recently, SB 53 in California, adding another layer of online safety in the most populous state in the United States.
As Bumble continues to help curb cyberflashing and online abuse, we look forward to creating a ripple effect of positive change across the internet and social media at large by open-sourcing a version of our Private Detector™ feature.
You can learn more about the open-source release of Bumble’s Private Detector here.
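For teams exploring the open-source release, here is a minimal sketch, in Python, of how a classifier like Private Detector could be wired into a detect, blur, and warn flow. The model path, input resolution, and probability threshold below are assumptions made for illustration, not part of Bumble’s published interface.

```python
# Illustrative sketch only: one way an app might use an image classifier such as
# the open-sourced Private Detector to blur flagged photos before a recipient sees them.
# The model path, input size, and threshold are assumptions, not Bumble's published API.
import numpy as np
import tensorflow as tf
from PIL import Image, ImageFilter

MODEL_DIR = "private_detector_savedmodel/"  # hypothetical path to an exported classifier
LEWD_THRESHOLD = 0.8                        # hypothetical probability cut-off

model = tf.keras.models.load_model(MODEL_DIR)  # assumes a Keras-compatible saved model

def lewd_probability(image: Image.Image) -> float:
    """Return the classifier's probability that the image is lewd (assumed single score)."""
    resized = image.convert("RGB").resize((480, 480))  # assumed input resolution
    batch = np.asarray(resized, dtype=np.float32)[np.newaxis, ...] / 255.0
    return float(model.predict(batch).ravel()[0])

def prepare_for_recipient(path: str) -> Image.Image:
    """Blur a flagged photo so the recipient is warned and can choose whether to view it."""
    image = Image.open(path)
    if lewd_probability(image) >= LEWD_THRESHOLD:
        return image.filter(ImageFilter.GaussianBlur(radius=30))
    return image
```

The key design point mirrored here is consent: the photo is never deleted or shown outright; it is blurred and the decision to reveal it stays with the recipient.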
Payton Iheme, Bumble’s Head of Public Policy for the Americas, said: “At Bumble, we are focused on our mission to help create healthy and equitable relationships through kind connections and build a space where women feel safe and empowered online. Bumble was one of the first apps to address cyberflashing by giving our community the power to consensually decide if they would like to see certain photos, and by creating a safety standard if not.
“We’ve been working to address cyberflashing and to help create more online accountability for years, but this issue is bigger than just one company, and we cannot do this alone. We are thrilled to be able to open source a version of our Private Detector AI model and allow others to apply our tools to help combat online harassment at large.”