Audience: staff and students who are about to paste something into an AI tool.
Purpose: tell you what’s OK to upload, what needs checking/anonymising, and what you should never put into a public AI.
OK to upload:
Your own writing (to improve clarity)
Assignment briefs (without internal staff notes)
Publicly available texts you’re analysing
Generic interview/STAR/Gibbs prompts
Practice questions
Check or anonymise first:
Student work you are marking
Peer feedback / discussion board posts
Internal teaching materials
Case studies with some context
Drafts of research ideas
Placement scenarios (fictionalised)
👉 Prefer an institutionally approved / private AI for these, or strip names and identifiers first.
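Purely as an illustration of what "strip names and identifiers first" can mean in practice, here is a minimal Python sketch. The patterns, placeholder tags and example note are assumptions for illustration, not an approved anonymisation tool.

```python
import re

# Hypothetical sketch only: mask obvious identifiers in a piece of text before
# pasting it into an AI tool. The patterns below are illustrative, not exhaustive,
# and the result still needs checking by a person.
REPLACEMENTS = {
    r"\b(?:Mrs?|Ms|Dr|Prof)\.?\s+[A-Z][a-z]+\b": "[NAME]",     # title + surname
    r"\b\d{2}/\d{2}/\d{4}\b": "[DATE]",                        # dates like 03/05/2024
    r"\b[A-Z]{1,2}\d{1,2}[A-Z]?\s*\d[A-Z]{2}\b": "[POSTCODE]", # UK-style postcodes
    r"\b\S+@\S+\.\S+\b": "[EMAIL]",                            # email addresses
}

def make_teaching_version(text: str) -> str:
    """Return a copy of the text with obvious identifiers masked."""
    for pattern, placeholder in REPLACEMENTS.items():
        text = re.sub(pattern, placeholder, text)
    return text

if __name__ == "__main__":
    note = "Met Mrs Khan on 03/05/2024 at the LS2 9JT clinic; email a.khan@example.com."
    print(make_teaching_version(note))
    # Met [NAME] on [DATE] at the [POSTCODE] clinic; email [EMAIL].
```

Even after a pass like this, re-read the text yourself: simple find-and-replace misses nicknames, job titles, rare conditions and other details that can identify someone, and genuinely sensitive records stay out of public AI entirely (see below).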
Never put into a public AI:
Patient / service-user / school pupil data
Real placement / clinical notes
Personal data about named individuals
Commercially sensitive partner information
Unpublished research data with consent limits
👉 Use your university’s secure routes instead, or don’t use AI for that part at all.
Public/consumer AI tools may store or reuse what you type.
You may be under GDPR, ethics, PSRB or placement agreements that forbid sharing client/patient data.
Students often don’t realise their reflective pieces contain identifiable details.
So: if in doubt, don’t paste it.
When asking students to submit a process pack (prompts + AI output + reflection), tell them to anonymise any real-person/context data first.
If you want AI to help with feedback, paste your marking notes, not the full student script.
If your School has an enterprise / institutionally approved AI, link to it here and recommend that first.
Create a teaching version of your case before using AI (change names, locations, dates, identifiers).
Never share real patient/client/school records with a public AI.
If the task is assessed, ask your tutor which version to submit.
If AI helped you structure a reflection, say so (see How to acknowledge GenAI).
Students:
AI may generate text/images that look original — you still need to cite real sources.
If AI “suggests” a source, check if it actually exists in the library/databases.
Saying “AI wrote it” does not make plagiarism OK.
Do not upload your taught material to public GenAI tools without staff permission.
Staff:
Model best practice: if you generate teaching materials with AI, check them for copied or attribution-required material.
AI can be very helpful for:
simplifying assignment briefs (“explain this for a first-year student”)
rephrasing feedback into clearer English
giving alternative explanations for the same concept
drafting alt-text or stem sentences for students who need more structure
But the official/assessed version of the brief or feedback is still the one from your module.
This page is general UK HE advice. If your university / your School issues stricter rules, follow those.
See Core guidance → Local / institutional policy first for advice.