Generative AI tools are powerful, but they’re not private. When you enter data into tools like ChatGPT, that information may be stored, processed, or used to train future models. As faculty, you are responsible for protecting student privacy and institutional data. Think of it this way: if you wouldn’t post it publicly, don’t put it into AI.
From North Carolina State University's AI Guidance and Best Practices:
AI tools like ChatGPT store every piece of info you share – don’t include anything you don’t want in the cloud! (e.g. Personal ID details, client / stakeholder info, proprietary research, accounts, discriminatory language, etc.)
Faculty must not input any personally identifiable student information into ChatGPT or other public AI tools. This includes:
Student names
Family members’ names
Addresses and email addresses
NSHE or student ID numbers
Indirect identifiers (information that could be used to identify a student when combined with other data or based on context)
Even copying and pasting a student’s email into ChatGPT is a potential FERPA violation if it includes identifying information.
Faculty may legally enter student data only into FERPA-compliant platforms that have an active contract with the institution.
AI tools often process and store data externally, putting that data at a higher risk of unauthorized access or misuse.
Before putting anything school-related into an AI tool, ask: Did I remove all names, specifics, and identifiers?
Tip: Instead of pasting a full email, use an anonymized summary such as: “How should I respond to a student who missed class for a funeral?”
Do not put sensitive college documents or files into ChatGPT, including:
Internal memos or reports
Unreleased policies or drafts
Meeting minutes
Unredacted student submissions or evaluations
Any file stored on shared drives or Canvas with access restrictions
If it’s meant for internal use only, it should not be processed by an AI tool.