Swipe Left: Bumble Faces Allegations of Illegally Collecting User Biometric Data

A recent class action lawsuit has been filed against Bumble Inc., the popular online dating and networking application, alleging that the app profits from facial scans collected from its users without proper consent.


In the realm of online dating, users are acutely aware of the need to safeguard their personal information from potential romantic interests. What users might not anticipate, however, is the misuse of their data by the very dating apps they entrust. The lawsuit, filed in an Illinois federal court, contends that Bumble fails to adequately inform users of the purpose and duration for which their biometric information is collected, stored, and used during the photo verification process.

According to the lawsuit, Bumble violated the Biometric Information Privacy Act (BIPA) by collecting, storing, and using biometric information without obtaining a written release from users. The legal action argues that this not only infringes on user privacy but also exposes users to “serious and irreversible privacy risks.” Bumble is accused of collecting and utilizing facial scans during its photo verification process without transparently informing users of the specific purpose and the length of time for which their biometric information is processed.

This revelation is particularly alarming as it involves the capture of a highly sensitive aspect of users’ identities – their facial features. The lawsuit highlights the potential misuse of this data, indicating that Bumble allegedly exposes its users to privacy risks by collecting, using, and disseminating their biometric data to “further enhance” itself and its online app-based platform. Users who had their biometric data stored on the app are left vulnerable to identity theft, unauthorized tracking, and other improper uses.

This class action seeks a jury trial and demands declaratory and injunctive relief, along with both actual and statutory damages for the affected users. This lawsuit follows previous legal actions against Bumble, such as a $1.3 million settlement in 2021 addressing claims of gender discrimination within the app.

The case underscores the growing challenges of user data protection in the digital age. With the increasing reliance on facial recognition technology, it becomes imperative for companies like Bumble to prioritize transparency and obtain proper consent from users. The potential consequences of failing to do so, as outlined in this class action, highlight the importance of stringent regulations and accountability in the evolving landscape of online privacy, and the need for users to stay vigilant about how the apps they use handle their personal data.

The illicit use of stolen biometric data by cybercriminals adds another layer of concern. If this sensitive information were to fall into the wrong hands, cybercriminals could exploit it for identity theft, unauthorized access, or other malicious activities, posing a direct threat to the affected users. This underscores the urgency of robust security measures and ethical data practices to safeguard users against cyber threats associated with the mishandling of biometric information.


