These are AI tools integrated into specific platforms or designed for particular professional uses. Because they operate within an organizational context, their data handling practices tend to emphasize privacy and security.
Gemini for Google Workspace:
Organizational Data Isolation: Interactions within Gemini in Google Workspace remain within your organization. Google will not share your content outside your organization without your consent.
Usage Monitoring: Google Workspace administrators can view organizational and user-level usage data of Gemini within their domain. This includes the number of active users, feature usage in different apps (Gmail, Docs, Slides, Sheets), and chat interactions with the Gemini app.
No Training on Prompts: Gemini for Google Workspace does not use your prompts or its responses as data to train its models.
Code Customization: If you use code customization features (like in Gemini Code Assist), your private code is securely accessed and stored only to deliver the customization you've requested.
Data Residency Controls: Gemini Advanced offers options for IT managers to enforce data residency and privacy policies, allowing control over where data is stored and processed.
Encryption: Data is encrypted in transit when you submit prompts to Gemini.
Gemini Code Assist:
Network Access Control: Network administrators can configure their networks to restrict access to Gemini Code Assist based on user domains for enhanced security.
Monitoring Usage: Usage metrics are automatically collected in Cloud Monitoring, allowing organizations to track daily and 28-day active users and the number of responses sent.
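As a rough illustration of how an organization might query such usage metrics programmatically, the sketch below builds a Cloud Monitoring filter string and a 28-day query interval. The metric type shown is a hypothetical placeholder, not a documented identifier; in practice you would look up the exact metric name in Metrics Explorer and pass these values to the google-cloud-monitoring client.

```python
# Sketch: assembling the pieces of a Cloud Monitoring time-series query
# for Gemini Code Assist usage. The metric type is an assumption for
# illustration only -- check Metrics Explorer for the real name.
from datetime import datetime, timedelta, timezone

def build_usage_filter(metric_type: str) -> str:
    """Return a Cloud Monitoring filter string for the given metric type."""
    return f'metric.type = "{metric_type}"'

def build_interval(days: int = 28) -> dict:
    """Return a [start, end] interval covering the last `days` days,
    e.g. for a 28-day active-user window."""
    end = datetime.now(timezone.utc)
    start = end - timedelta(days=days)
    return {"start_time": start.isoformat(), "end_time": end.isoformat()}

# Hypothetical metric name (an assumption, not from official docs):
flt = build_usage_filter("cloudaicompanion.googleapis.com/active_users")
interval = build_interval(28)
print(flt)
```

The filter and interval would then be supplied to a `list_time_series` request; the surrounding client setup is omitted here since it depends on project credentials.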
Paid AI Services:
Paid AI programs often offer enhanced features, better performance, and, importantly, more robust privacy and data control options compared to free versions.
Enhanced Privacy Policies: Paid services typically have more stringent privacy policies and provide users with better control over their data. Enterprise-level agreements may include specific clauses to protect data privacy and ensure regulatory compliance.
Data Ownership: Paid AI services generally offer more control over data ownership, often restricting the provider's ability to use or share data without explicit consent.
Security Measures: These services frequently come with enhanced security features, including adherence to regulations like GDPR and HIPAA, advanced encryption options, and continuous monitoring.
Support and Accountability: Paid services usually include dedicated support and clear accountability structures for data breaches or misuse.
Transparency and Compliance: Paid AI services are more likely to offer greater transparency in data handling practices and comply with industry standards and regulatory requirements.
Free ChatGPT:
Data Collection: Free ChatGPT collects your conversations (prompts and responses), account information (email, name, contact details), device data (device type, operating system), usage data (location, time, version), and log data (IP address, browser).
Training: By default, the conversations you have with the free version of ChatGPT can be used to train and improve their models. This helps the AI become more accurate and better at solving problems.
Opting Out: You can opt out of having your data used for training through their privacy portal by clicking "do not train on my content." Additionally, using the "Temporary Chat" feature ensures your conversations are not saved in history, used to create memories, or used for training.
Data Usage: OpenAI states that they do not use your content to market their services or create advertising profiles of you. The primary use of your data is to improve the models.
Data Retention: OpenAI retains certain data from your interactions to understand user needs and preferences, which helps the model become more efficient over time. They take steps to reduce the amount of personal information in their training datasets.