This Month's Newsletter
Chatbots will have their place in the world; however, users should be made aware that chatbots cannot feel or show empathy and that their replies are simply programmed responses.
When creating AI agents, teachers should include instructions that inform the learner they are interacting with an AI support tool; agents can also be instructed to use non-personifying language.
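As an illustration only (the exact wording will depend on the tool being used to build the agent), the instructions for such an agent might include something like: "You are an AI support tool, not a person. At the start of each conversation, remind the learner that you are a computer program. Avoid phrases such as 'I feel' or 'I understand how you feel'; describe what the tool can do rather than what it 'thinks' or 'feels'."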
Guidance from the Welsh Government on what you need to know about using Generative AI.
A link to an AI policy which can be adopted by your school.
This policy provides your school with clear guidance on how it can safely make use of AI.
Support for staff on how AI can be used in education to enhance pedagogy.
You have several AI applications available to you through the Hwb platform, which have passed the Welsh Government's digital scrutiny.
Microsoft Copilot is available for staff to use; it is Microsoft's Large Language Model (LLM) assistant, similar to ChatGPT and Google Gemini. Through Hwb, you have Enterprise Data Protection, which gives you additional security when adding information to support your prompts.
Adobe Express & Canva have several AI features which can be used to generate or manipulate images.
Microsoft Reading Progress allows teachers to assign generated reading material, which can be used within reading assessments.
Do not use any AI applications which have not been approved by your school's Data Protection Officer (DPO) with a Data Protection Impact Assessment (DPIA).
Do not use AI applications with learners unless they have been approved by your school's Data Protection Officer (DPO) with a Data Protection Impact Assessment (DPIA); most AI applications are only intended for users aged 18 and over.
It is therefore necessary to ensure that our understanding of AI develops to a point where learners can be safely introduced to the uses of AI in the world they will grow up in.
At this point, there are two main considerations for the teaching and understanding of AI in Education.
AI Ethics - the moral principles and guidelines which govern the development and use of AI applications and tools.
Prompt development - the skill of crafting and refining the input (prompt) used to guide an AI application towards producing the desired output.
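As a simple illustration of the difference this makes, "Write a story" is a vague prompt, whereas "Write a 100-word story about a friendly dragon, suitable for Year 2 learners, ending with a question the class can discuss" gives the AI the length, topic, audience and purpose it needs to produce a useful output.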
Using Gemini AI, and using the prompt:
This document outlines a staged approach to introducing and developing a robust understanding of AI ethics from Reception through to Year 11. The plan progresses from foundational concepts in younger years to more complex, societal issues in the later years, with each stage featuring a specific ethical expectation and two corresponding activities—one related to general life and one to artificial intelligence.
Ethical Expectation: Understanding what is real and what is not.
AI Activity: Robot or Real? The class sorts pictures of toys and simple robots versus pictures of animals and people. The discussion focuses on identifying what a living thing is and what a machine is, and how machines can be designed to look or act like living things.
General Life Activity: Storybook Sort. After reading a story with fantastical elements (e.g., The Gruffalo or Where the Wild Things Are), children draw or point to characters that could be "real" and those that are "pretend." The teacher facilitates a discussion on why some things are imaginary.
Ethical Expectation: The importance of giving credit and knowing who made something.
AI Activity: Who Wrote This Story? The teacher uses a simple AI text generator to create a short, silly story. The class reads it and the teacher asks, "Who wrote this?" This leads to a discussion about how the computer followed instructions but a person (the teacher) still told it what to do, highlighting that people are behind the technology.
General Life Activity: Art Show Credits. The class creates artwork. Before displaying it, they have a discussion about writing their names on their work to show who made it. They practice saying "I made this" or "This was made by [name]," connecting ownership to creation.
Ethical Expectation: Fairness and the idea that rules must be fair for everyone.
AI Activity: Robot Chore Master. The class "codes" a robot (a fellow student) to complete a simple task, like picking up blocks. They create a rule set. The teacher then introduces an "unfair" rule, like "The robot can only pick up red blocks." The class discusses how this is unfair and redesigns the rules to be fair to all blocks.
General Life Activity: Fair Play, Fair Rules. During a simple game (e.g., Simon Says), the teacher introduces an unfair rule that benefits only one person. The class discusses why it feels bad and works together to create a new set of rules that are fair for every player.
Ethical Expectation: Data is personal and should be kept safe.
AI Activity: What Does My Robot Know About Me? Using a simple, non-AI chat toy or app, students input information. The teacher then asks, "What did we just tell the robot?" leading to a discussion about not sharing private information like their full name, address, or phone number.
General Life Activity: Secret Information. Students are given a small piece of "secret" information (e.g., their favourite colour, which no one else knows). The teacher then asks the class to share a non-secret fact (e.g., their favourite fruit). They discuss the difference between public information and private information and why some things should be kept private.
Ethical Expectation: Recognising bias and that everyone is different.
AI Activity: The Sorting Machine. Students are given a set of drawings of different people. The teacher uses a "sorting machine" (a large box with two slots) to sort the pictures based on a single characteristic, such as "pictures with red shirts." The teacher then explains that an AI might do this, but humans can see more, like what a person is doing or their expression. The class discusses how focusing on just one thing isn't the full picture.
General Life Activity: Our Class is a Rainbow. The class discusses how everyone in the room has different hair colour, eye colour, height, and interests. They create a group collage, with each student adding an element that represents something unique about them. This activity reinforces the value of diversity and highlights that no two people are the same.
Ethical Expectation: AI's role is to help, not to replace, human decision-making.
AI Activity: The Smart Assistant's Advice. The class works with a hypothetical smart assistant that suggests a route to school. The AI recommends a route that is shorter but crosses a very busy road. The class discusses the "AI's" advice and decides it's not the best idea. They conclude that while the AI has good information (distance), a human needs to make the final decision based on safety.
General Life Activity: The Class Rules Committee. Students are given a problem, like "How do we get everyone to share the playground equipment fairly?" They create a list of potential solutions. They then discuss how a teacher can give advice, but the students must make the final decision and take responsibility for their actions.
Ethical Expectation: Understanding how AI "sees" the world through data.
AI Activity: Training the Image Sorter. Students are given a large stack of animal photos (e.g., dogs, cats, birds). They are tasked with "training" a new AI by labelling each picture as "dog," "cat," or "bird." The teacher then introduces a picture of a rare breed of dog that the "AI" (a student following instructions) misidentifies. The discussion highlights the importance of having diverse and accurate data to prevent errors.
General Life Activity: The Memory Game. The class plays a memory game where they must remember a sequence of objects shown on a tray. The teacher removes one item, and the class tries to identify what's missing. This demonstrates how human memory and observation work, reinforcing the idea that our brains process and learn from data constantly.
Ethical Expectation: The concepts of privacy and digital footprints.
AI Activity: Personalised Ads. Students discuss a hypothetical scenario where an AI assistant tracks their favourite movies and then recommends a film to a friend. They debate whether this is a good or a bad thing, leading to a discussion about how digital services collect data and what that means for their privacy.
General Life Activity: The Story of Me. Students write a short story about themselves, but they are given strict instructions on what to include (e.g., favourite food, sport) and what to leave out (e.g., street name, last name). The activity highlights the difference between public-facing and private information.
Ethical Expectation: Algorithmic bias and its real-world consequences.
AI Activity: The AI Job Interviewer. Students work through a case study where an AI is used to screen job applicants. The AI starts favouring candidates who went to a certain type of school, even though the school has no relevance to the job itself. Students analyse why this is happening and discuss how unconscious biases in the programmers can become explicit biases in the code.
General Life Activity: Hidden Biases. The class discusses a historical or social event where a group of people was treated unfairly, such as a school rule that favoured one group of students over another. They analyse the reasons behind the unfairness and discuss how easily bias can creep into rules and systems.
Ethical Expectation: Understanding the power of generative AI and synthetic media.
AI Activity: Spot the Deepfake. Students are shown several short videos, some of which are real news reports and some of which are AI-generated "deepfakes." They work in groups to identify the fake ones, discussing the clues and the potential dangers of synthetic media in spreading misinformation.
General Life Activity: Media Literacy. Students analyse different news articles and social media posts, looking for signs of sensationalism, bias, or manipulated facts. They discuss the importance of cross-referencing information and being critical consumers of media.
Ethical Expectation: The economic and social impact of AI on jobs and society.
AI Activity: AI and the Future of Work. Students research a profession (e.g., artist, doctor, truck driver) and create a presentation on how AI is likely to change that job in the next 20 years. They discuss the ethical implications of job displacement and the need for new skills and social safety nets.
General Life Activity: The Industrial Revolution. Students study the effects of the Industrial Revolution on society, focusing on how new technology changed the way people worked and lived. They draw parallels between the historical period and the current rise of AI, discussing the challenges and opportunities.
Ethical Expectation: The responsibility and accountability of AI creators.
AI Activity: The Moral AI. Students are given a complex ethical dilemma (e.g., a self-driving car facing an unavoidable crash). They must "program" a set of ethical guidelines for the AI to follow, debating which values to prioritise. The discussion centres on who is responsible for the outcome: the programmer, the company, or the user?
General Life Activity: The Ethical Debate. Students debate a real-world ethical issue, such as gene editing or animal testing. Each side must present a well-reasoned argument that takes into account different stakeholders and potential consequences. This activity reinforces the skills needed to engage in complex ethical reasoning, a crucial skill for addressing future AI challenges.
Using Gemini AI, and using the prompt:
This document outlines a staged approach to introducing and developing a robust understanding of prompt development from Reception through to Year 11. The plan progresses from foundational concepts in younger years to more complex, societal issues in the later years, with each stage featuring a specific prompting principle and two corresponding activities—one related to general life and one to artificial intelligence.
Prompting Principle: Clear instructions.
AI Activity: Tell the Robot What to Do. The teacher presents a simple toy robot that can be "programmed" with basic commands (e.g., "move forward," "turn left"). Students are asked to give clear, one-step instructions to get the robot from one point to another. The class discusses how the robot can't understand vague language like "go over there."
General Life Activity: Dress the Teddy. Students work in pairs. One student gives specific instructions to their partner on how to dress a teddy bear (e.g., "Put the blue hat on its head," "Tie the red ribbon around its neck"). They discuss how specific words help their partner understand exactly what to do.
Prompting Principle: Specificity vs. vagueness.
AI Activity: Make the Computer Draw a Picture. The teacher uses a simple drawing tool. Students give prompts like "Draw a house" (vague) and "Draw a red house with a green roof and a chimney" (specific) to see the different results. This activity demonstrates how a more specific prompt leads to a more detailed and accurate outcome.
General Life Activity: The Recipe Game. Students give a partner a recipe for a pretend sandwich, first with a vague instruction ("Make me a sandwich") and then with a specific one ("Take two slices of bread, add a slice of cheese and a piece of lettuce"). The class discusses how the specific recipe leads to a better sandwich.
Prompting Principle: Adding details and adjectives.
AI Activity: Describe the Monster. The teacher uses a simple image generation tool. Students describe a monster to the AI. They start with a simple prompt like "A monster" and then add details like "A big, fluffy, purple monster with three eyes and a spiky tail," observing how the image becomes more unique and complex with each addition.
General Life Activity: Describe a Toy. A student hides a toy and describes it to the class using more and more details until someone can guess what it is. This reinforces the importance of using descriptive language to convey information.
Prompting Principle: Considering the "audience" or context.
AI Activity: Write a Story for an Audience. The teacher uses a text AI. Students are asked to write a prompt to generate a story for a specific audience, like "a story for my little brother who loves dinosaurs" or "a scary story for a sleepover." They see how the AI's output changes based on the audience.
General Life Activity: Letter to a Friend vs. Letter to a Principal. Students write two short letters: one to a friend to ask them to play, and one to the principal to ask for a new sports ball. They discuss the difference in tone and language, understanding that who they are talking to changes how they should write.
Prompting Principle: Iterative refinement.
AI Activity: Improving the Image Prompt. Using a tool like Adobe Express, students generate an image with a simple prompt (e.g., "a wizard"). They then refine the prompt by adding details, style, and setting ("a kind wizard with a long beard and a tall hat, sitting in a magical forest, in the style of a cartoon"). This activity shows how small changes to a prompt can dramatically improve the final result.
General Life Activity: Build-a-Model. Students are given a set of LEGOs or building blocks. They build a simple structure, then a partner suggests ways to improve it, and they rebuild it based on that feedback, demonstrating the value of refining a project.
Prompting Principle: Prompting for different formats (lists, tables, paragraphs).
AI Activity: Get the Right Answer. Students use a text AI to get information. They ask for information about animals, first as a paragraph, then as a list, then as a table, and compare the outputs, noting how the prompt changes the structure of the information they receive.
General Life Activity: Organising Information. Given a set of facts about a historical figure, students are asked to present the information in different formats: a short story, a bulleted list of key dates, and a timeline.
Prompting Principle: Ethical considerations in prompting (e.g., avoiding harmful prompts).
AI Activity: Ethical Prompting. The teacher gives students a prompt that could be used in a positive way (e.g., "write about a school sports day") and then discusses how it could be twisted to be negative or hurtful (e.g., "write a story about a loser at a sports day"). They create guidelines for safe and respectful prompts.
General Life Activity: Words Matter. The class discusses the impact of their words on others. They role-play a scenario where a compliment is given in a nice way versus a rude way and see the difference in reaction.
Prompting Principle: Understanding the AI's limitations and biases.
AI Activity: The AI and Me. Students use a text or image AI to ask for information or generate content about a specific culture or topic. They then fact-check the output and discuss any inaccuracies or stereotypes, understanding that the AI's "knowledge" is a reflection of the data it was trained on.
General Life Activity: My Biases. Students take a quiz or complete an activity that helps them identify their own biases, leading to a discussion on how personal views can influence their understanding of the world.
Prompting Principle: The use of "constraints" in prompting.
AI Activity: The Confined Story. Students are given a text AI and a story to write. They are given constraints, like "write a story set in space, with only three characters, and it must end with a surprising twist." They see how the constraints shape the creative output.
General Life Activity: Building with Constraints. Students are given a small box of random materials (paper, string, glue, sticks) and asked to build a specific object (e.g., a car, a bridge). They learn to be creative within limits.
Prompting Principle: The concept of "zero-shot," "one-shot," and "few-shot" prompting.
AI Activity: AI's Learning Curve. The teacher explains the concept of providing examples to an AI. Students test this by asking an AI to classify things: first without any examples ("zero-shot"), then with one example ("one-shot"), and then with a few examples ("few-shot"), observing how its performance improves with context (example prompts are shown below).
General Life Activity: Teaching a New Skill. Students try to teach a partner a new skill (e.g., a simple card trick). They first try to explain it with no demonstration, then with a single example, and finally with multiple examples, seeing the difference in how quickly their partner learns.
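As a purely illustrative example (not taken from any particular AI tool), the three approaches might look like this:
Zero-shot: "Is the word 'delighted' positive or negative?"
One-shot: "'Gloomy' is negative. Is the word 'delighted' positive or negative?"
Few-shot: "'Gloomy' is negative. 'Cheerful' is positive. 'Furious' is negative. Is the word 'delighted' positive or negative?"
The more labelled examples the prompt contains, the more context the AI has about the task and the format of answer expected.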
Prompting Principle: Prompting for complex, multi-step tasks.
AI Activity: The AI Research Assistant. Students use an AI to help them with a research project. They learn to break down a complex task into smaller, sequential prompts. For example, "First, give me an overview of the causes of climate change. Second, provide a list of solutions. Third, write a conclusion that summarises the main points."
General Life Activity: Project Plan. Students are given a project to plan (e.g., a class fundraiser). They work in groups to break the project down into smaller, manageable steps and assign roles.
Prompting Principle: The ethical implications of prompt injection and manipulation.
AI Activity: The Prompt 'Hijack'. Students are given a simple text AI and a core instruction (e.g., "always be polite"). They then try to "hijack" the prompt by using clever language to make the AI say something rude or inappropriate. This leads to a discussion of cybersecurity and AI safety.
General Life Activity: Social Engineering. Students discuss case studies of social engineering scams (e.g., phishing emails) where people are manipulated into doing something they shouldn't. They learn to identify the psychological cues used in such attacks and how to protect themselves.