Ars Technica | April 30, 2026 at 4:00 AM

Google says it respects user privacy in AI, but the reality is not so black and white. As Gemini seeps into every nook and cranny of the Google ecosystem, the amount of data Gemini retains depends heavily on how you access the AI, and opting out of data collection can mean running straight into so-called "dark patterns" (UI elements that work against the user's interest).

Google claims it doesn't use the content of your Workspace data to train foundational models, but Gemini can use tools to connect to Workspace and other Google products. Gemini outputs can include summaries and snippets of email or files, and that data can then become fodder for AI training. Google says its AI models can be trained on Gemini inputs and outputs, and while it tries to "filter and reduce" personal information going into AI training datasets, there's no way for users to know how well this automated process works. To fully block AI training on your data, you need to turn off a feature called Gemini Apps Activity, but doing so means losing your chat history, forcing users to choose between privacy and convenience.
