A serious data breach has just been discovered in the Tea dating advice app, exposing the personal information of more than 33,000 women online and making them the target of a series of harassment campaigns.
American media reported that the leaked data was quickly exploited to create harassment maps on Google Maps, on which thousands of pins directly displayed victims' addresses.
Google promptly removed these maps, saying the behavior violated its anti-harassment policies. However, the damage had already spread, as the data continued to appear on numerous forums, apps and social media platforms.
Cybersecurity researchers also discovered an extreme case in which the data was turned into an online game, where players were asked to rate women's selfies for attractiveness.
Within three weeks, the Tea app became the focus of discussion on forums such as 4chan, with more than 12,000 related posts, many of them insulting or harassing.
The consequences of the incident do not stop in cyberspace. More than 10 women have filed a class-action lawsuit against Tea's parent company, accusing it of negligence in data protection and of putting their lives in danger.
This is not the first time Tea has been involved in controversy. The platform was previously accused of deliberately infiltrating Facebook groups to advertise itself, and even of impersonating women to track men's behavior.
These actions sparked a strong wave of criticism over the app's transparency and social responsibility.
Launched with the goal of becoming a new kind of dating advice platform, Tea has instead become a cautionary example of the risks of poor data management.
The incident once again highlights the fact that women are often the most vulnerable to online security breaches.
Tea's data breach not only shakes confidence in dating apps but also raises urgent questions about legal liability and the need for stricter security standards to protect personal information in the digital age.