This framework outlines four color-coded levels of how students may use AI tools in their schoolwork. It explains what AI use is allowed at each level while promoting honesty and academic integrity. ("AI" here means generative AI tools such as Google Gemini, ChatGPT, or AI image generators.)
The burden of proof is on the student. If a teacher cannot verify that the work was done by the student, and the student alone, the student may not submit it. The student must provide evidence (citations, annotations, verbal interviews, drafts, etc.).
Please see the Academic Integrity page for more information.
Allowed AI Use: None. All work must be done without any AI assistance. This level applies to tasks meant to measure individual skills and knowledge, so no AI help is permitted.
Permitted: Relying on your own knowledge, class notes, textbooks, or teacher materials. (Absolutely no AI involved.)
Not Permitted: Using Gemini, ChatGPT, or any other AI tool to get answers, write text, or even to rephrase/clarify questions. Even a quick question to an AI for a hint or explanation is not allowed, since you must work independently.
Citation/Disclosure: None. There’s nothing to cite because no AI is used. Using AI anyway at this level is considered cheating.
Allowed AI Use: Using AI as a research and brainstorming helper in the early stages. You can ask AI to brainstorm ideas, make an outline, find facts, or explain concepts. However, the final submission must be entirely in your own words. AI should not write any part of your answers.
Permitted: Using Gemini to generate project ideas or simplify a tricky concept, similar to how you might use a search engine. Having an AI tutor give you practice questions or summarize background information to help you study.
Not Permitted: Letting AI write any part of the content you submit. Do not copy-paste a paragraph from an AI into your essay, or use an AI to fully solve a math problem and turn it in. No AI-generated sentences or images should appear in your final work at this level.
Citation/Disclosure: Disclosure is required. Even if you only used AI to brainstorm or plan, you must add a brief note to your assignment (e.g., "Used Gemini to brainstorm essay topics"). You do not need a formal MLA citation if no AI text is quoted, but you must document that the tool was used in your process.
Allowed AI Use: Using AI to improve or edit your own work. You must write the initial draft or solution, and then AI tools can help refine it. This includes grammar/spell checking, suggesting clearer wording or structure, or giving feedback on your writing. The ideas and content must originally be yours; AI is just helping polish them.
Permitted: Running your essay through an AI like Grammarly or Gemini for suggestions, then revising your draft yourself. Asking an AI to rephrase sentences you wrote to make them clearer. Using an AI image editor to enhance a graphic you created.
Not Permitted: Using AI to add new paragraphs or solve problems that you couldn’t solve on your own. Don’t have the AI write your conclusion or do your math homework for you. AI should not introduce new content or answers that weren’t already in your own draft.
Citation/Disclosure: Yes – be transparent. Note in your assignment that you used an AI tool for editing (for example, add a brief footnote or comment) and keep a copy of what the AI suggested. If AI suggestions significantly changed your phrasing or you used an AI-generated image, give credit to the tool (e.g. “Image enhanced with Gemini”). Teachers might also ask to see your original draft alongside the revised version to verify your work.
Allowed AI Use: Full collaboration with AI is allowed, and some assignments might even require using AI tools. AI can generate text, images, or solutions alongside you as a co-creator. However, you must guide the process and make sure the final work reflects your own understanding and effort.
Permitted: Working with an AI chatbot to jointly write a story or essay (you and the AI take turns writing and then edit the result). Using AI to produce a first draft of code or an essay, then you debug/refine it and add your own insights. Using an AI image generator to create illustrations for your project, then adding your own commentary to those images.
Not Permitted: Turning in AI-generated work "as is," with no real input or review from you. If you let an AI write a report and submit it unchanged, that is unacceptable: you are expected to check the AI's work, correct errors, and ensure you understand everything in it. Do not use AI in a way that replaces your own understanding of or involvement in the assignment.
Citation/Disclosure: You must clearly acknowledge any content created by AI. For all Level 4 assignments, you must submit an "AI Process Log" or a link to your full AI chat transcript alongside your final work to demonstrate your "Human-in-the-Loop" oversight.
The "Human-in-the-Loop" Bias Rule:
AI models are trained on the internet, which means they can reproduce human prejudices. Whether you are in the Yellow, Green, or Blue zone, you are responsible for filtering out bias.
What to look out for: The AI assumes certain professions belong to a specific gender (e.g., referring to all engineers as "he" and all nurses as "she"), or it explains a historical event entirely from a Western perspective while completely ignoring the indigenous viewpoints.
What to do: If you encounter biased language, harmful stereotypes, or exclusionary assumptions in the AI's output, do not use that text. You are expected to report this to your teacher so the class can discuss the limitations of these tools.
The "Fact-Check" Hallucination Rule:
AI tools are designed to predict the next logical word, meaning they can confidently present false, fabricated, or outdated information—known as "hallucinations."
What to look out for: The AI might invent fake historical dates, generate completely made-up book citations, provide incorrect mathematical formulas, or summarize a novel incorrectly just to make its response sound plausible.
What to do: You must independently verify any facts, dates, statistics, or concepts generated by the AI using reliable, teacher-approved sources (such as textbooks or library databases) before using them in your work. You are the "Human-in-the-Loop"; submitting an assignment containing unverified AI hallucinations is considered a failure of academic responsibility, not a software error.