Pearls of Wisdom
The notion of practical wisdom is born of bell hooks’ work, which suggests that academic development should be grounded in experience and practical in nature. Mindful of the ‘gritty irritant’ GenAI was considered by many to be, yet hopeful for the gems of innovative pedagogical practice it might afford, we came to the idea of ‘pearls of practical wisdom’: an index of innovation advice, made up of 50-word summaries from those using GenAI, produced by staff, by students, or through student-staff collaborations.
The following 28 pearls provide valuable insights into the integration of AI, and in particular GenAI, in education, emphasising inclusivity, ethical considerations, and collaborative learning. They highlight the importance of active student engagement, critical reflection, and responsible use of AI models and GenAI tools. However, they also acknowledge challenges such as the need for strong input by educators to ensure content quality, addressing biases in AI outputs, and balancing AI's benefits with ethical concerns. The pearls demonstrate innovative approaches to AI integration, such as using GenAI to generate linguistic structures and involving students in AI experimentation. While frameworks like ‘IDEAs’ offer structured guidance, the pearls underscore the ongoing need for critical evaluation and refinement of AI models and GenAI tools in education to maximise their benefits while mitigating potential risks and limitations.
The pearls are listed in three chapters: AI integration into education, foundations and prompting, and finally, specific examples of how AI is applied in education.
AI integration in education
In this chapter, the authors advocate prioritising inclusivity in AI-based assignments by promoting open-access tools and providing resources for all students. They stress the importance of ethical considerations, transparency, and fairness in leveraging AI in education, while also highlighting its role as a supplement to educators, rather than a replacement. Embedding GenAI into course design should be done thoughtfully, avoiding forced integration. Authors suggest embracing students' use of AI within guidelines and providing guidance on ethical usage. Furthermore, authors propose utilising AI to efficiently handle student queries and developing clear evaluation descriptors for AI-assisted writing. Overall, the integration of AI in education aims to prepare graduates for AI-driven workplaces while ensuring inclusive and ethical practices.
1. Kerem Öge (University of Warwick), UK
Prioritise inclusivity in AI-based assignments to bridge disparities. Utilise and promote open-access AI tools, encouraging creativity and levelling the playing field. Provide resources and tutorials for various open-access AI tools for uninitiated students. This approach promotes equity, allowing everyone to participate and excel regardless of financial constraints.
2. Nayiri Keshishi (University of Surrey), UK
When leveraging AI in higher education, prioritise ethical considerations, ensuring transparency, fairness, and data privacy. Strive for inclusive design, catering to diverse learning needs. Embrace AI as a tool to augment, not replace, educators. Regularly evaluate its impact and involve stakeholders in decision-making.
3. Rebecca Sellers (Leeds Beckett University), UK
The use of GenAI should be embedded into course design in the same way as any other technology or assessment, with sound pedagogical reasoning. Just because it is a hot topic does not mean it should be forced into a course; its appropriateness should be thought through.
4. Edward Crump (Kingston University), UK
The biggest challenge faced when proposing the use of AI is clarifying the relevance of its application in comparison to pre-existing methods. There is a critical relationship between students' perception of its usefulness to their current studies and the consistency of its adoption and use.
5. Matt Hams (Kingston University), UK
Used as part of creative exploration, prompt-writing with GenAI tools can act as a reflective activity, encouraging students to explore the relationship between their discipline's language and a generative visual outcome. This involves considering how changes in vocabulary can impact the generative output as an interpretation of their ideas.
6. Monica Ward (Dublin City University), Ireland
Accept the fact that some students will use GenAI tools anyway, and allow them to do so if they choose (within guidelines). Make sure they know how to use these tools ethically, and ask them to reflect on what they have learnt (to avoid outsourcing their learning).
7. René Moolenaar (University of Sussex), UK
Convening a module with hundreds of students is a significant undertaking, particularly when it comes to responding appropriately to their questions in a timely manner. AI can help with this. The solution is a bespoke module GPT, trained on the information we have provided it with and ‘disconnected’ from the internet.
8. Robert Liu-Preece (University of Warwick), UK
I wrote specific descriptors to evaluate students' use of AI, based on marking criteria for UG Politics students. The descriptors aim to steer student use: they make expectations clear to students, allow reverse engineering of training, and provide a mechanism for marking AI-assisted writing. Available here: https://blogs.warwick.ac.uk/wjett/entry/ai_marking_criteria/
9. Ari Seligmann (Monash University), Australia
AI challenges the predominance of providing loose instructions and focusing on assessment products by individuals. If we provide more guidance on steps, workflows and tools to be used and incorporate transparent documentation of the iterative collaborative production processes into submissions, then the learning(s) and responsible journey become visible to all.
10. Tom Gurney (University College London), UK
AI integration in education is vital to prepare graduates for an AI-driven workplace. However, human input, alongside the use of AI, remains crucial for critical thinking tasks. Educators must ensure AI integration in assessments fosters inclusive and ethical use, especially considering the accessibility challenges students face for premium paid services.
11. Isabel Fischer (University of Warwick), UK
Expertise as the main locus of academic identity can act as a barrier to the adoption of emergent technologies such as GenAI, widening the gap between educators, students, and employers. By harnessing expertise as a springboard towards facilitated exploration of ‘unknown’ topics or new technologies, educators can embrace vulnerability and positively integrate GenAI (Fischer & Dobbins, 2023: https://journals.sagepub.com/doi/10.1177/10525629231201843).
Foundations and prompting
In this chapter, authors recommend that educators should cultivate a participatory environment in their classrooms, encouraging students to actively engage with and shape AI use, fostering knowledge, innovation, and responsible AI use. Sharing case studies and providing a safe space for exploration can ease AI anxieties and empower colleagues to engage with AI responsibly. Utilising AI as a tool to stimulate intellectual development rather than provide answers is essential. It should be viewed as a co-creative collaboration, acknowledging its strengths and weaknesses, and integrating them into real-life applications. Critical reflection is necessary when integrating AI tools into essay writing processes, balancing their benefits and drawbacks.
12. Vishal Rana (Flinders University), Australia
For successful AI integration in pedagogy, educators should foster a participatory environment. Encourage students to actively engage with and shape AI applications, deepening their understanding and ethical use. This method cultivates knowledge, innovation, and responsible AI use, with teachers guiding and ensuring ethical AI practices in the curriculum.
13. Paul Astles, Eleanor Moore, James Openshaw, Katia Shulga and Mary Simper (Open University), UK
To ease AI anxieties and empower colleagues to explore responsible engagement with AI, we have shared case studies and built a safe space for exploration of tools using sound pedagogical principles. We openly acknowledge and guide colleagues about the bias in AI outputs, empowering an informed reflection on ethical use.
14. Jennifer Rose (University of Manchester), UK
AI assistance needs to be used to help you develop intellectually: to turn on, not turn off, thinking. By asking AI to ask you questions, rather than requiring answers from it, you can think in new ways, challenge your perspective and develop your thinking. It can become your personal coach!
15. Andy Winter (Coventry University), UK
When using AI, think of it as a co-creative collaboration. Bounce ideas off one another and ask for its thoughts on your original idea; can it innovate on it? Yes, it can, and it always will, BUT always check and look deeper than the confident response it will give, even when it is wrong!
16. Aimee-Leigh Youngson (University of Salford), UK
Discussing AI capabilities in sessions and acknowledging their strengths and weaknesses is not enough. They need to be embraced and integrated, flaws and all, into sessions for real-life application. We do not need to be experts already; we can learn along with our students and genuinely collaborate.
17. Evdokia Stergiopoulou (University of Greenwich), UK
Integrating ChatGPT in the process of essay writing can be a double-edged sword in the hands of university students. ChatGPT may offer invaluable assistance in tasks like note-taking and summarising, but it also raises concerns about dependency and efficiency. Therefore, critical reflection is required to balance its benefits and drawbacks.
18. Beverley Pickard-Jones, Mike Morris and Fay Short (Bangor University), UK
Students often struggle to define powerful thesis statements that provide clear direction and focus for their essays. AI chatbots can be used to offer personalised, iterative guidance in developing and improving thesis statements to ensure clarity and precision. Accessible and scalable, they accommodate diverse learners and disciplines.
19. Lisa Harris (University of Exeter), UK
Rather than ‘AI-proofing’ assessment, we asked students to pose a specific question within ChatGPT v3.5 (for equity of access), then refine the response through further enquiries both within and beyond ChatGPT. They were assessed on their critical reflection of the output’s quality and value obtained from working through this process.
Examples of how AI is applied in education
To effectively use generative AI in creating teaching materials, this chapter illustrates that rigorous editorial input and fact-checking are essential to ensure content quality and compliance with intellectual property and ethical standards. For example, generative AI was employed to produce linguistic structures for teaching, requiring careful examination but significantly reducing time spent. Students' perspectives on AI usage, gathered through thematic analysis, inform practical actions like teaching critical thinking and risk awareness. Assignments designed to combine human and AI-generated writing aim to enhance students' analytical skills and writing proficiency. Workshops and collaborative exploration within organisations facilitate understanding and integration of AI in various job roles.
20. Sarah Hack (University of Surrey), UK
Organise your AI-related thinking and resources. For example, use the ‘IDEAs framework’, which comprises three sections:
• ‘Introducing’ - activities which foster early critical thinking and engagement with GenAI tools.
• ‘Developing & Empowering’ - activities which support students in their development as learners.
• ‘Assessing’ – assessment-related activities and resources.
The link to a copy of the framework can be found here: https://aldinhe.ac.uk/product/learnhigher-resources/the-i-d-e-as-framework-a-resource-to-help-structure-thinking-about-the-use-of-genai-in-learning-teaching/
21. Madeleine Stevens (Liverpool John Moores University), UK
AI as an alternative timesaver to drive understanding: using the flipped classroom pedagogical approach, level 7 students used AI to gain an understanding of complex elements of research philosophy, as well as of different sampling techniques, and then presented, compared and discussed their findings.
22. Claire Timmins (University of Strathclyde), UK
ChatGPT was used to create examples of linguistic structures for use in teaching. These required close examination before use, but greatly reduced the time involved. An example prompt for a grammatical clause structure was ‘Can you give me some SAVA [subject - adverbial - verb - adverbial] sentences about Harry Styles?’, which resulted in ten relevant sentences.
23. Mike Richards (Open University), UK
We cannot use GenAI to create teaching materials without understanding the need for strong editorial input and fact-checking of the outputs, as well as ensuring there are no breaches of intellectual property law or ethical concerns in the resulting content.
24. Andrew Firr (University of Chester), UK
Contemporary AI-powered tools can streamline academic research by effectively organising extensive databases, analysing relevant literature, and summarising insights, thereby saving students’ time. Although many of these tools initially offer open access to their functionalities, users should note that they typically require a subscription or fee after any introductory trial period. Additionally, users should be aware of potential limitations such as inherent biases and the absence of a nuanced understanding or personal interpretation that human researchers can provide.
25. Robert Liu-Preece (University of Warwick), UK
I designed an AI writing assignment. Students choose a cultural artefact important to them, free-write on the insights the artefact provides, then generate an AI version and finally combine the two pieces of writing. The aim is to notice differences between human and AI analysis, and to practise writing with AI to improve the product. See here for further information: https://blogs.warwick.ac.uk/wjett/entry/collaborating_with_ai/
26. Eleanor Moore, Paul Astles, James Openshaw, Katia Shulga and Mary Simper (Open University), UK
Supporting academics to exploit opportunities of AI is vital. We built a team of AI experts to experiment with tools and engage with the student voice. We iteratively created and tested generic resources to adapt for different subject areas. This way, AI approaches and experimentation can be embedded in HE modules.
27. James Openshaw, Paul Astles, Eleanor Moore, Katia Shulga and Mary Simper (Open University), UK
Student voice on GenAI is valuable. For example, we invited our students to share their thoughts about using GenAI for their studies. Thematic analysis of the responses provided practical actions such as teaching students how to be critical of GenAI and aware of the risks of using such technology.
28. Katia Shulga, Paul Astles, Eleanor Moore, James Openshaw and Mary Simper (Open University), UK
Responding to rapid developments in AI, we created a team to explore its potential in our different job roles. Short workshops allowed us to discuss challenges and benefits with teams across the organisation, avoiding duplication of time and resources and feeding into a developing community of practice.
Dr Isabel Fischer and Professor Letizia Gramaglia, Pearls of Wisdom leads