School leaders:
Use the themes to direct staff professional learning sessions
Engage staff in discussions by reviewing the varying perspectives from these interviews and conversations, and find out their opinions
Teachers:
Identify concepts and ideas that you want to further investigate
Connect with the people who you want to learn more from
Academics:
Identify themes for further research
Industry Professionals and Developers:
Listen to these perspectives to identify what matters to people
Find free solutions that support low-resourced schools, teachers and students in addressing the issues raised
These quotes and ideas have been shared by members of the AI in Education Community of Practice as well as collated from individual interviews.
You may want to conduct a professional learning session with your staff based on one of the themes or quotes from the meetings and interviews.
As you review the perspectives, consider these reflexive and reflective questions.
1. What are the implications of integrating AI into your school structures?
2. How can you address the impact AI will have on your school systems and teaching practices?
The following are key ideas, perspectives, and insights shared by some of the Community of Practice members in our Meetings Padlet. All green text links to that person's LinkedIn if you want to connect with or follow them.
Dan Bowen: Advocated for Project-Based Learning (PBL) as a way to address assessment integrity concerns; discussed the Swiss Cheese Model of assessment.
Anonymous 4: Highlighted the need for open conversations around AI use, its potential for programmatic assessment, and how GenAI is largely an administrative tool in primary settings.
Veneta Webster: Emphasised the need to explicitly teach the purpose of learning and when to use or avoid AI.
Mark Parry: Noted the contrasting assessment and feedback models between high school and higher education. Prompted questions about the task’s purpose.
Selena Tenakov: Raised a key point: "If there's no assessment linked to it, students won’t engage"—calling for rethinking what is valued.
Anonymous 3: Suggested students co-create assessments and rubrics to deepen engagement and ownership.
Lavanya Gopakumar: Emphasised starting with multiple perspectives before introducing AI concepts.
Anne Forbes: Stressed the importance of concrete, offline experiences before applying abstract AI ideas.
Alfina Jackson: Highlighted how AI can personalise learning via UDL and free up teachers to focus on wellbeing.
Anonymous 2: Discussed AI's role in primary schools mainly as a teacher support tool, not direct instruction.
Cassandra Colvin: Raised concerns about academic integrity being framed as punishment, and encouraged exploring GenAI as a learning presence, citing the ACAD and COI frameworks.
Anonymous 1: Called for collaborative agreements with students on ethical AI use, including transparency in student-AI contributions.
Juliana Peloche: Cautioned that effort is necessary for growth, even with AI.
Veneta Webster: Reiterated: “Growth requires human effort.”
Lee Barrett: Reflected on the changing landscape of education, asking what we should stop worrying about and what needs more attention.
Dr Jennifer Chang Wathall: Encouraged focus on learning as a journey, not just outcomes.
Trustworthy Ladybug: Asked how to promote curiosity and independent learning rather than copy-pasting from AI.
Selena Tenakov: Pointed to the need for AI education for pre-service teachers, and proposed including parents and caregivers in AI literacy efforts.
Cassandra Colvin: Framed the teacher as a curator of experiences, designing across the epistemic, social, and set domains. Suggested AI be seen as a presence in learning, not just a tool.
Dan Bowen: Discussed tensions between direct instruction vs. PBL, and how politics and measurement shape these.
Mark Parry: Emphasised Universal Design for Learning and AI’s role in scaffolding accessible learning.
Alfina Jackson: Supported using adaptive AI tools to enhance student autonomy.
Annelise Dixon: Questioned whether AI stifles or enhances higher-order thinking, asking how to use it without harming cognition.
Geoff Matheson: Warned against prioritising outputs over learning, as AI can produce polished products with shallow understanding.
Lee Barrett: Highlighted student agency and passion—shared the insight that when students have a voice, they can explain where AI is useful and where it is not.
Clare Roden: Emphasised the human connection in PBL + AI projects and AI’s potential to support 4Cs (Critical thinking, Communication, Collaboration, Creativity).
Groovy Ray: Noted that AI is great for upfront planning and research support, but suggested students solve problems without AI to deepen understanding.
Jon Adams: Shared that Minecraft and AI can provide creative, purpose-less spaces for students to imagine and build.
Alfina Jackson: Provocatively asked whether students turn to chatbots because they lack supportive human relationships.
Optimistic Duck: Warned we must be cautious yet acknowledge students already use AI tools at home.
Nassima Kennedy: Urged a focus on reflection and process over product.
Adorable Chipmunk: Argued for viewing AI as part of a broader cultural and curricular shift, not just an isolated tool.
Cassandra Colvin: Referenced Yuval Noah Harari and posthumanism, reflecting on the future of AI, learning, intent, and emotion.
The following are key insights from interviews held with a range of educators and academics. All blue text links to their LinkedIn accounts.
1. Teacher Workload, Burnout & Role Expectations
“Is this what we signed up for?”
Raised this question when reflecting on how teaching has shifted from pedagogy to paperwork and data collection. She noted that only 40% of the role is now teaching, with the rest absorbed by admin, data demands, and student wellbeing—contributing to burnout and feelings of being overwhelmed.
“I didn’t sign up to be a psychologist.”
Shared frustrations about the emotional and behavioural support now expected from teachers in addition to instructional roles. He also questioned the value of traditional reporting systems in light of AI’s potential to streamline assessment.
“60% of teachers feel burned out, and 75% have lost enthusiasm.”
Highlighted the scale of teacher disengagement and advocated for AI to support—not add to—teacher workloads. He warned against top-down AI initiatives launched without sustainability or consultation.
Sarah Howard
Raised concerns about teachers becoming “data workers,” echoing Neil Selwyn’s warnings. She stressed that the growing burden on educators stems not from resistance to change, but from the failure to communicate value clearly:
“You should do it this way” isn’t persuasive if the value isn’t demonstrated.
“Teachers tend to list what’s core to teaching… and that becomes a barrier to embracing new possibilities.”
Tim Dasey observed that deeply held professional values can unintentionally act as barriers to innovation. He noted that ethical concerns are central to teacher identity and must be addressed carefully to avoid paralysis in progress.
“If you had five more minutes in a day, would you use it to care more for a child or check a spreadsheet?”
This rhetorical question challenges current workload expectations and invites reflection on what teachers truly value.
“The three biggest imposts on teachers’ time are lesson/curriculum planning, marking work […] and face to face teaching and mentoring of students.”
McVeity sees AI as a powerful support for reducing teacher workload by taking on repetitive, administrative tasks such as curriculum planning and marking. She emphasises that relieving teachers of these burdens allows them to focus on their core mission: mentoring and emotionally connecting with students.
“Students often will not remember what they learnt in school, but they will remember how you made them feel.”
“If AI can take the heavy lifting off teachers in both these areas, then that leaves teachers free to spend more time with students as teachers and mentors – and that is their most important and lasting role.”
“AI becomes a tool to amplify the impact of effective teaching.”
Bakker stresses that AI must support—not replace—teachers. She believes in reducing administrative load so teachers can focus on building student confidence and skills, aligning AI tools with high-quality pedagogy.
2. AI’s Role in Structural Change & Educational Reform
“Do we even need reports anymore?”
Provoked reflection on long-standing systems like summative assessments. He suggested AI presents a chance to rethink what schooling looks like—from curriculum design to classroom operations.
“Do we have to take structures away, or can we streamline thanks to AI?”
Called for thoughtful reimagining, not just abandoning current structures.
Juliana Peloche
Criticised tertiary teacher preparation programs for being detached from classroom reality, saying educators are told to "just persevere" through change without meaningful support or frameworks.
Sarah Howard
Challenged the "grammar of schooling" and called for teacher education to become more horizontal and design-based:
“Some teachers will embrace change, some won’t. Leadership needs to create the space and vision.”
“AI must not be an additional thing—it must be transformative.”
Emphasised that the conversation must focus on sustainable systems change and that AI should support teachers rediscovering purpose, not act as a bolt-on initiative.
Tim Gander
Urged a move away from compliance-heavy systems to trust-led, adaptive practices.
He described AI integration as a cultural and pedagogical shift, not a technical upgrade:
“This is about practice, pedagogy, and people — not just platforms.”
Dr Sabba Quidwai
Warned against automating broken systems:
“Stop giving them traffic lights. Start helping them make real decisions.”
She advocates for redesigning educational structures around passion, purpose, and agency.
“Being a passionate, inspiring presence will matter more than being a history expert.”
Dasey suggests that the future role of teachers will shift away from content expertise toward mentorship and emotional presence, signalling a structural transformation in teaching roles.
“The conversation has to move past ChatGPT. Think about what’s next.”
He urges educational leaders to plan beyond current tools, anticipating broader systemic shifts involving agentic AI.
“We’re moving from knowledge to meta-knowledge — from tangible to abstract. That’s the skill to cultivate.”
This indicates a fundamental change in the purpose and structure of learning itself.
3. Student Voice, Equity & Inclusion
“Let’s not dictate what students will be doing; let’s have them as part of the conversation.”
Argued strongly for including students in shaping AI practice in education, especially in rural and remote contexts. Highlighted the risk of tech-first approaches without student-centred reflection.
“We need to think about what societal rules are needed. It’s not just about trust — it’s about the system around it.”
Dasey reflects on the need to design systems that govern AI use fairly, particularly in high-stakes contexts like asylum rulings. He also notes:
“Children may believe computers over teachers.”
This underscores the urgency of supporting student agency and critical discernment.
Tim Gander
Led the creation of an equity map in New Zealand to identify and address educational needs:
“We’re building an equity map for the whole country — it’s about seeing the gaps and knowing who can solve them.”
“How do we build these skills into all students regardless of background?”
Raised questions around equity and AI’s influence on agency and power. She was particularly concerned about AI amplifying existing divides if not designed with inclusion at the core.
Contrasted two perspectives:
– Sal Khan’s belief in AI as a democratiser of education
– Eric Schmidt’s concern that AI could deepen inequality
Ellefsen warned that AI must not become politicised and detached from what’s happening on the ground in schools.
Dr Sabba Quidwai
Critiqued colonial undercurrents in AI narratives and emphasised resilience in the Global South.
She highlighted the importance of student readiness and agency, particularly for older students.
4. AI Literacy and Pedagogy
“People want us to tell them what AI literacy might mean.”
Led a project (Creative HE) that involved students and educators co-defining AI literacy. She created open-access resources showcasing real uses of ChatGPT in education and promoted collective sense-making.
“The questions I get aren’t ‘How do I use ChatGPT?’ – they’re ‘How do I answer when another teacher asks if it’s ethical?’”
AI literacy, in Dasey's view, involves navigating complex ethical conversations—not just tool functionality.
He outlines six pillars for AI curriculum:
– How to use AI
– How AI is built
– Different types of tasks AI can perform
– AI in society
– Future societal shifts
– Ethical implications
He stresses that:
“Ethics isn’t a separate unit; it must be infused in every conversation about AI.”
“There’s a whole world that just won’t acknowledge AI is useful for anything. It paralyses progress.”
He views AI literacy as intertwined with myth-busting and confidence-building.
Tim Gander
Grounds AI literacy in pedagogy, not novelty:
“Regardless of the tool, can you evaluate whether it supports your pedagogy and outcomes?”
Sarah Howard
Sees prompting as deeply pedagogical and linked to metacognition and design thinking:
“The way you design the question determines the thinking process.”
She is developing a “prompt space” to support this emerging skillset.
Urged educators to move away from rigid, capital-P Programs and instead offer bite-sized provocations:
“Programs fail when they go massive. Give people something they can own.”
Dr Sabba Quidwai
Encourages scenario-based reflection and critical engagement, cautioning against blind AI adoption:
“You can use it here, but not here. Why did I decide to use it here or not here?”
“Teaching great writing is more important than ever in the age of AI.”
She emphasises critical and creative thinking in writing instruction to empower students to use AI tools responsibly, recognising that writing skill and judgment still matter.
“Once trained by a human, AI can pick up subtleties in writing.”
McVeity illustrates how AI literacy must go beyond mechanical aspects, helping students and teachers understand how to analyse nuanced textual features like tone, character relationships, and implied meaning.
“Build AI into your philosophy, don’t just use it.”
Supports a process-focused, coaching-based model using AI. Encourages students to evaluate, structure, and reflect—not just consume or correct.
5. Professional Learning and Leadership
“Strong AI leaders were not those that thought they would be—they just realised.”
Described how his AI Champions program built capacity through pod-based learning, where unexpected leaders emerged through experience and experimentation rather than status or prior confidence.
Dr Sabba Quidwai
Calls for investment in deep community ties and mentoring over quick fixes:
“Be the person they’ll come to tomorrow when they need help.”
“Choose to see the positive. Focus on what we can control.”
Encouraged proactive, bottom-up adoption of AI and warned against fear-based policies or banning AI out of discomfort.
“Take a few ideas. One thing. Make it real.”
Valued professional learning that is resonant, realistic, and practical—not panels or top-down delivery.
Tim Gander
Champions a Community of Practice model in NZ that fosters organic collaboration:
“We just got a group together — the CoP is growing organically.”
He also advocates for shared spaces to link teachers, researchers, and industry:
“Too many people are working in bubbles — we need a melting pot of opinions.”
6. Assessment, Feedback, and Reporting
“90% of current assessments could be done by a language model.”
Called for honest conversations about what AI can already do, especially in relation to tertiary assessments. She questioned the continued use of unreliable AI detectors, advocating for teaching ethical usage instead.
Matt Esterman
Challenged current reporting systems:
“Do we even need reports anymore?”
He envisioned a future where AI enables more authentic, useful feedback loops, reducing the need for traditional summative assessment formats.
“We’re collecting a lot of data—there’s real potential to use AI to spot patterns.”
Outlined a long-term vision for using de-identified student data to inform reading interventions and instruction through triangulation and pattern recognition.
Sarah Howard
Warns that simplified AI tools can eliminate the “productive struggle” and risk doing too much of the thinking for students:
“Will ease take away the productive struggle?”
She believes flawed AI responses can be turned into learning opportunities:
“If you get a generic response from the AI, it’s a learning opportunity — ask it better questions.”
“AI can do a first pass or act as a moderator when marking students’ work. It can also generate explicit feedback based on a rubric and is more consistent than a human marker.”
She details how AI can improve marking reliability and provide faster feedback, especially in humanities subjects.
“Ask it to analyse the growing relationship between the siblings and how it changes over time […] give examples of their dialogue to support this […] explain how the harsh weather adds pressure to the holiday […] the ending is optimistic, why?”
This example highlights how AI can assess sophisticated narrative elements, potentially exceeding what is possible with standardised marking rubrics such as those used for NAPLAN.
“[AI will] provide real-time, formative feedback to students during the writing process, ensuring every student can progress and achieve their potential.”
“[AI will enable teachers to] get regular, automated insights into student writing, enabling them to target specific areas for improvement and boost outcomes.”
7. Ethics, Privacy, and Data Ownership
“Once it’s in digital form, you have to retain control—otherwise others could commercialise it.”
Expressed concern about copyright and control over AI-generated content. Emphasised that any use of data and AI must remain within ethical boundaries that protect teacher and student rights.
Dr Pauldy Otermans
Designed her AI Teacher platform with privacy at the forefront, stating that:
“edAI LLM runs on the user’s device for privacy—not cloud-based.”
Also prioritised transparency and teacher control in lesson design.
“It has unethically scraped data from copyright material worldwide – including authors.”
“It can now ‘write in the voice of Somerset Maugham’, etc. […] notice how many characters are called ‘Peter and Michael’ and are outdated and sexist and ageist?”
“AI is churning up huge water and electrical resources and has negated the climate change gains in just a few years.”
McVeity expresses a broad ethical critique—focusing not just on education but also on the societal risks of unregulated AI, including bias, copyright violation, environmental costs, misinformation, and identity manipulation.
“George Clooney selling not just coffee in real life, but anything from muscle supplements to illegal drugs.”
This concern over deepfakes illustrates the challenge of trust and integrity in a rapidly evolving media landscape.
Advocated for transparency in AI tool design:
“Most educational tools don’t tell you their pedagogical rationale.”
Suggested a #pedagogicalRationale label to help educators evaluate tool assumptions.
Raises concerns about AI agency and the ethical implications of student interactions with machines:
“We haven’t taught people how to interact with machines across contexts.”
8. AI Tools, Chatbots, and Teaching Applications
“The chatbot lets me have better conversations about their drafts—not just repeating myself to every student.”
Uses chatbots and Copilot to support writing instruction, helping students brainstorm, structure, and revise work while enabling teachers to offer deeper feedback.
Dr Pauldy Otermans
Described her AI Teacher avatar that delivers lessons based on uploaded documents:
“If the answer’s not in the paper, it draws from its broader knowledge base.”
Students asked more questions than in traditional settings, showing higher engagement.
Sarah Howard
Discusses the importance of tool selection based on learning goals—citing Canva as an example of a creative platform that enhances student expression:
“AI must be used to open possibilities, not replace thinking.”
Runs “techie breakies” and online lunchtime sessions demonstrating specific tools, embedding AI support within school systems. Also facilitates the AI Champions course with weekly pods that explore new tools and reflect on classroom use.
9. Vision, Innovation, and System Change
“AI is not the focus—the system is. What are we using it for?”
Advocated for human-centred AI use, particularly in remote and agricultural settings. Called for integrated, problem-based learning rather than siloed or tech-driven solutions.
“People need to feel valued immediately—honour their work, give them visibility.”
Emphasised the importance of building community through visibility and recognition.
“What are the interesting practices schools have chosen—where they have seen real success?”
Argued for case study-driven learning and professional wisdom over hype. Called for scalable, equitable models of AI use rooted in what’s working locally.
Raised big-picture questions:
“What is the central learning experience in a world embedded with AI?”
“Is AI heading toward being an invisible infrastructure, like electricity?”
Posed philosophical provocations about the long-term implications of AI on learning and human purpose.
Dr Sabba Quidwai
Challenges quick-fix solutions and insists that true transformation must be intentional, contextual, and human-led:
“Rather than automating broken systems, redesign them with purpose.”
She emphasises that:
“Suffering builds character and resilience,”
and urges leaders to focus on community building and intrinsic motivation.
“We have to be comfortable with uncertainty. That’s how innovation happens.”
Dasey highlights that collective experimentation and a tolerance for ambiguity are essential for moving education forward.
“If everyone is learning alone, it won’t work.”
He warns against isolated exploration and advocates for shared innovation practices.
“It’s not a communication problem anymore — it’s a management problem.”
He anticipates a future where leadership involves orchestrating AI agents, not just human teams—pointing to a need for new organisational clarity and metacognitive leadership.
“The heavy reliance of universities (and thus schools) on essay writing as an assessment tool is outdated and wide open to AI taking over.”
She urges schools and universities to innovate—not just adapt—by rethinking writing tasks altogether.
“So many other ways to write […] Blogs, web copy, interviews, emails, job CVs, LinkedIn posts… and yet we teach essays in schools – hardly used in the real world.”
This highlights a call for modernising curriculum to reflect real-world communication, rather than reinforcing academic traditions susceptible to AI automation.
10. Leadership, Professional Learning & Capacity Building
“Strong AI leaders were not those that thought they would be—they just realised.”
Observed that AI leadership often emerged from unexpected staff, not always the confident or tech-savvy ones. He facilitated the AI Champions course which focused on building leadership through real classroom trialling and reflective pods.
“Use cases put a lid on what we think is possible.”
Warned against limited professional learning that narrows teacher imagination. Advocated for a shift from fear-driven narratives to empowering and scalable implementation.
“Go from one teacher to 100.”
Promotes structured, supported rollouts that build teacher capacity without overwhelm.
Maria Dolce
Provides sustained, in-context professional learning aligned with her Achievement Integrated Model, helping teachers design prompts around gifted education, metacognition, and student voice.
“The ones who know their craft [gain most from the model]. Younger teachers often lack pedagogical depth.”
11. School Structures, Models & Case Studies
Henno Kotze (Alpha School)
Described a radical AI-based school model with no traditional teachers:
“Strong motivational system keeps students focused on outcomes.”
The school uses guides (not instructors), mastery-based curriculum, spaced retrieval, and XP-based gamification to personalise learning at scale.
“Build AI into your philosophy—don’t just use it.”
Aligns his writing instruction model to District C's team-based, coach-led problem-solving structures, pushing toward sustainable school-wide integration.
“We plan to integrate AI into Teacher Hub to further support teachers with planning, assessment and personalised learning.”
“Over the past two years, we have researched EdTech partners whose mission aligns with ours, ensuring that any AI enhancements reflect our commitment to high-quality teaching and learning.”
She provides a concrete example of how AI will be embedded into school-based tools and infrastructure to support teachers.
“Visions of the future—like the teacher as Mentor—are disconnected from current teacher realities.”
Voiced concern about models that romanticise teaching roles while ignoring the lived complexity and administrative burden.
12. Student Empowerment, Inclusion & Intergenerational Learning
Dr Sabba Quidwai
Cautions against introducing AI to young students too early due to developmental limitations:
“Younger learners often lack the maturity to distinguish between human and machine.”
“They don’t have teachers available—AI helps them teach themselves.”
Highlighted the rise of intergenerational tutoring (e.g. young people helping adults) and AI as a bridge where teacher shortages persist. Advocated for upskilling and self-directed learning.
“The teacher’s role will remain sacrosanct—AI should follow teacher-designed scaffolds.”
Stressed that AI should be used to enhance—not replace—the teacher’s craft, particularly in scaffolding for Band 1–3 students.
“Let’s not dictate what students will be doing—let’s have them as part of the conversation.”
Championed including student voice in AI implementation, especially in remote or under-resourced contexts. Supports intergenerational learning models like Professor Vinesh’s reading project.
“How do we build these skills into all students regardless of background?”
Focused on equity and giving all students access to skills like agency, discernment, and critical thinking in an AI world.