The explosion of generative artificial intelligence (AI) models is fueling growing concerns about privacy and data ownership.
While technology companies quietly update their terms of service to collect and use user data for AI training, individual users often lack the information or tools to protect their data.
Recently, the file-sharing service WeTransfer faced a wave of protests after adding a clause allowing it to use users' uploaded files to "improve machine learning models".
In response to the backlash, the company was forced to withdraw the clause. The incident nonetheless shows that public awareness of personal data being harvested for AI is increasing.
From tweets and blog posts to photos on Instagram, anything posted publicly on the Internet can become training material for AI models.
Some artists and creators have filed lawsuits to protect ownership of their work, while ordinary users can still take small steps to limit how their data is collected:
Adobe: Individual users can go to the account privacy page, find the content analysis option for product improvement and turn it off.
Google Gemini: In the Gemini interface, go to Activity, open the drop-down menu and turn off Gemini Apps Activity.
Grok (X.com): Go to Settings > Privacy and safety > Grok and turn off data sharing.
LinkedIn: Go to your profile > Settings > Data privacy and turn off the option that allows your data to be used for AI training.
ChatGPT & DALL·E ( OpenAI): Go to Settings > Data Control > remove the "improving the model for everyone" option; if you want to remove the image from the DALL-E training dataset, you need to submit the form to OpenAI.
However, opting out does not guarantee that your data has not already been used. Many companies keep their training datasets confidential to avoid legal problems.
Some platforms, such as Meta, collect public data from users over the age of 18, unless they live in the EU, where personal data protection laws are more stringent.
As AI continues to develop rapidly, users need to stay alert and proactively learn about privacy settings to keep control of their personal information.