According to a new report from the Tech Transparency Project (TTP), an organization that monitors major technology platforms, researchers found 55 deepfake "nude photo" apps on the Google Play Store and 47 on the Apple App Store.
These apps use generative AI to digitally strip clothing from photos or graft a real person's face onto a naked body.
What especially alarms the public is that these apps are easy to download and require nothing more than personal photos of the kind routinely shared on social networks.
TTP said that searches for keywords such as "nudify" or "undress" still surface a string of these apps on both of the world's most popular app stores.
During testing, the researchers found that many of the apps can generate fairly realistic suggestive or nude images even when the source photos contain nothing sensitive at all.
These are deepfake images, a technology at the center of fierce controversy because it violates privacy and personal dignity.
Notably, these apps have been downloaded more than 700 million times in total, generating over 117 million USD in revenue.
As distribution intermediaries, Apple and Google also take a share of the revenue from in-app transactions.
These figures sharpen the question of how responsible the two technology giants are for moderating the content they distribute.
The report also points to an alarming reality: many of these apps are rated as suitable for teenagers, or even children.
For example, the DreamFace app is rated for ages 13 and up on Google Play and 9 and up on the Apple App Store, yet it includes an AI feature for creating sensitive images.
Under public pressure, Apple said it had removed 24 apps after the report was released, still far fewer than the 47 identified by TTP.
Google confirmed that it has temporarily suspended some policy-violating apps but declined to say how many were removed.
TTP's report arrives shortly after xAI's AI tool Grok was also found to generate large volumes of non-consensual sensitive images, including content involving children.
This deepens concerns that technology platforms have failed to effectively control the risks AI creates.
According to experts, AI "nudify" apps not only cause serious psychological harm but also fuel violence and discrimination against women and girls, who are the most frequent victims of deepfake pornography.
In response, several countries, including the UK, are considering outright bans on AI tools that digitally remove clothing, treating their creation and distribution as illegal acts.