Guidance on Using Generative AI in Teaching and Learning
When used with clarity and intention, generative AI technology can support the kind of teaching innovation that UPEI encourages.
UPEI encourages teaching innovation and supports instructors in trying new approaches to teaching and learning and different educational technologies. As an instructor, you have the academic freedom to determine how, when, and whether to use generative AI in your classes. It is important that you provide clear, explicit direction to students about whether they may use artificial intelligence tools like ChatGPT to complete learning activities, assignments, tests, or exams, and if so, the extent to which such use is allowed. It is equally important to share with students why you are making these decisions. This will help them understand the rationale for the particular conditions you place on the use of AI in your course.
What is UPEI's position on using generative AI?
To support instructors in preparing for the fall semester, here are some recommendations and considerations to bear in mind. Developments in this area continue to advance rapidly; therefore, further updates to guidance and policies around academic integrity in the context of generative artificial intelligence are anticipated.
Learning about Artificial Intelligence at UPEI
Artificial intelligence is used across all academic subjects in various ways and has become, or is becoming, integral to our everyday lives as educators in higher education. The influence of artificial intelligence tools on teaching and learning will continue to be an area of focus for the Teaching and Learning Centre, the Generative Artificial Intelligence Taskforce (GAIT), the Academic Integrity Working Group and the Senate Committee for the Enhancement of Teaching. These groups will continue to stay abreast of developments to advise and support academic units and instructors around the implications and opportunities artificial intelligence technologies present in our courses. This includes professional development opportunities for instructors to learn more about using AI in instructional settings.
Products and services that use Generative AI
Image credit: Sonya Huang, Sequoia Capital
Generative Artificial Intelligence (genAI)
When we refer to generative AI (genAI), ChatGPT often springs to mind. However, genAI is a specific type of AI that generates new content, including text, code, music, and images. Generative artificial intelligence uses models that learn the patterns and structure of their input training data and then generate new outputs (text, images, or other media) with similar characteristics.
What is ChatGPT and how can it be used in education?
Predictive language models such as ChatGPT use large statistical models to generate natural-sounding text. ChatGPT, developed by OpenAI, is powerful enough to generate text-based responses such as letters, recipes, essays, and songs. Essentially, it does a very good job of predicting what a human would write next; however, it does not understand the content it generates or determine whether the information is misleading (Weidinger et al., 2022). ChatGPT, one of several generative AIs, was trained on an unprecedented amount of text data from the internet as it existed until 2021 (Brock University, 2023).
ChatGPT can be used in education in various ways by providing:
Personalized learning by assessing knowledge, providing feedback, and recommending study materials based on individual needs.
Learning assistance by answering students' questions to help them understand difficult concepts.
Research assistance by suggesting relevant papers and articles, assisting with literature reviews, and answering research questions.
Language learning by conversing with students in the language they are learning. Additionally, it can provide feedback on grammar and vocabulary usage.
Writing assistance by offering tips to improve sentence structure, grammar, and word choice. It can also provide feedback on plagiarism and how to prevent it.
With thanks to the CTL at Queen's University for the section on using ChatGPT for education.
What recommendations does the Teaching and Learning Centre have regarding AI use?
Decide whether to use Generative AI in your classes and how
AI tools may not be relevant or appropriate for all university classes. In addition, instructors may wish to discourage the use of AI tools for specific pedagogical reasons or in light of concerns related to academic misconduct. For pointers and ideas for designing learning activities and assessments without AI tools, the webpage Teaching Without the Use of AI from Carleton University offers several helpful considerations.
Assessment Considerations when Using Gen AI
Generative AI technology has created a need to revise assessment practices, and at the same time it offers some new possibilities for in-class learning. Assessments can be redesigned so that students are not submitting work that AI apps can easily produce. Drawing on research from the Centre for Research in Assessment and Digital Learning (2023) at Deakin University, here are some considerations to keep in mind as you make decisions about assessment in the context of AI in your classes.
Enact principles of good assessment design
Deploy resources to assure assessment when it matters most.
Design feedback sequences to support learning.
Develop student capability to identify ‘what good looks like’ through assessment.
Devise multiple submission formats to make assessment more equitable.
Focus on evidencing that outcomes are met.
Adapt current assessment practices to account for genAI
Have open conversations about genAI with students.
Review rubrics and other forms of assessment criteria.
Specify assessment situations where it is appropriate or inappropriate to use genAI.
Design tasks to promote students’ portrayal of their unique achievements.
Develop and assess critical digital literacies.
Ideas for Redesigning Assessments
Expand/Replace Assessments
Consider expanding or replacing written assessments to:
focus more on the process of the writing assignment rather than on the final product
have students emphasize evidence of original thought and critical thinking as AI tools have been shown to be weak at demonstrating these higher-order skills
ask students to use current sources (post-September 2021)
ask students to apply personal experience or personal knowledge to course topics
create assessment questions that are based on the context of classroom discussion
or, replace a written assessment with a multimodal one
You may also want to update your grading criteria or rubrics to emphasize assessment of deeper discipline-specific skills such as argumentation, use of evidence, or interpretive analysis, rather than the mechanics of writing and essay organization. This can help re-weight your assessments in favour of student learning and away from skills easily performed by AI tools.
If you typically assign reading responses or expository essays, you'll find ideas for redesigning them in Derek Bruff's blog posts listed in the references below.
Integrate Generative AI Technology
Many educators are currently experimenting with integrating generative AI technology into their assessment design. Keeping in mind the limitations of AI, if you decide to incorporate such tools into assessments, some ways that students can use technology like ChatGPT to apply higher-order skills are to:
generate a ChatGPT response to a particular question, and then write an analysis of the strengths and weaknesses of the ChatGPT response
fact-check the responses that ChatGPT provides to identify incorrect information
generate a paper from ChatGPT and evaluate its logic, consistency, accuracy and bias, including any stereotypes it may reinforce
use ChatGPT to create an outline that students can then use to develop an essay
Note: If using the above strategies, be mindful of potential accessibility and equity issues that may arise when shifting assessment modalities.
Using Gen AI for In-Class Activities
Beyond formal assessments, generative AI tools can also be used in ungraded or low-stakes learning activities during class time. Bringing this tech into lectures or discussions can help students understand how and when to use AI technology effectively and ethically, and in ways that align with the norms and standards of your disciplinary context. Some learning activities you might consider include:
using AI generated text as the starting point for class discussion on a particular topic. What does it get right? What is it missing? How would it need to be revised to meet the scholarly standards of your field?
having small teams of students experiment in using AI to create text about a given subject, and then comparing the results (what grade would they assign its response using a course rubric?) and/or the process (what prompts and tweaks were needed to generate the text?)
engaging in a class debate against generative AI tech: use the tool to generate counterarguments that can help students explore perspectives and strengthen their own arguments
asking your students! Gather anonymous feedback about whether they are using the tool, what value it provides them, and how they think it should be used in your disciplinary or teaching context
References:
Bearman, M., Ajjawi, R., Boud, D., Tai, J. & Dawson, P. (2023). CRADLE Suggests… assessment and genAI. Centre for Research in Assessment and Digital Learning, Deakin University, Melbourne, Australia. doi:10.6084/m9.figshare.22494178
Bruff, D. (2023). Assignment Makeovers in the AI Age: Reading Response Edition. https://derekbruff.org/?p=4083
Bruff, D. (2023). Assignment Makeovers in the AI Age: Essay Edition. https://derekbruff.org/?p=4105
Carleton University (2023, August 21), Teaching Without the Use of AI. https://carleton.ca/tls/teachingresources/generative-artificial-intelligence/teaching-without-the-use-of-ai/
York University, (n.d.). AI Technology and Academic Integrity for Instructors. https://www.yorku.ca/unit/vpacad/academic-integrity/ai-technology-and-academic-integrity/
The Teaching and Learning Centre gratefully acknowledges the contributions of many colleagues at UPEI and at other Teaching and Learning Centres throughout Canada in developing this section.
How can I build my digital literacy around AI?
Being AI Literate
Here's a primer on genAI prepared by Jason Hogan in the TLC. GenAI Basics
Being “AI literate” means having the ability to understand, use, and reflect critically on AI applications. Here is an excellent guide covering:
Considerations when using AI tools
Frameworks to optimize prompts
Terms Related to Artificial Intelligence
AI Limitations
Indigenous Knowledge and Artificial Intelligence
AI Literacy Further Reading
Where can I find ideas for my classes?
Here's a webpage with ideas developed by Thompson Rivers University. Classroom Ideas
Tool for Determining Allowable Uses of AI with Writing Assignments developed by the University of Guelph
Reach out to the TLC; we'd love to host an AI Ideas Exchange as a Lunch and Learn session.
How might I share my expectations with students?
Based on your course needs, your syllabus and/or assignment instructions should make it clear if, when, and how genAI tools may be used by students as they complete learning activities and assessments in your course. See Appendix A in the Provisional Guidelines on the Use of Generative Artificial Intelligence (AI) in Teaching and Learning for Instructors at the University of Prince Edward Island – December 2023 for examples of statements you can use in your syllabus or assignment instructions.
Here are a few other approaches you might consider:
In this Dec. 2023 article in Educause Review, Michelle Kassorla shared how she communicates her expectations around Gen AI (GAI) and academic integrity to students in her classes.
Best Practices for Using GAI in My Class
GAI definitely has a place in academia, but you should use GAI tools very carefully. The following are "Best Practices" for using text-based GAI:
BE AWARE OF THE UNIVERSITY'S HONOR CODE. The honor code is no joke. You can fail a class, be suspended, or expelled for academic dishonesty. Make sure you are using GAI only to create outlines and drafts but NEVER the final copies of an essay. In addition, NEVER use ChatGPT or other GAI models to complete exams or discussion posts within the LMS program.
BE AWARE OF GAI "HALLUCINATIONS." If a large language model (LLM) like ChatGPT can't find an answer, it will make one up. This is called a "hallucination." GAI will hallucinate not only "facts" in your paper but also references. If you don't know what you are writing about, please do not use GAI.
START WITH A STRONG CLAIM. For any essay assignment, you must begin with a claim. The claim should be specific and arguable. If you need help drafting a strong claim, ask your friendly neighborhood chatbot to help you make it stronger. Remember, when you write this paper, you are making an argument for the way you see the material. To make a great outline or draft for your essay using GAI, you must begin with a strong, clear, and arguable claim.
USE WELL-CRAFTED PROMPTS. "Prompt engineering" refers to the way in which you enter the prompt into a GAI model. Prompts must clearly identify the task you want the GAI to perform, how you want the GAI to perform the task, and what output you want. Here is a link to some useful instructions about how to correctly prompt ChatGPT.
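As a purely hypothetical illustration (the wording below is ours, not taken from the linked instructions), a prompt that names the task, how to perform it, and the desired output might look like this:

```text
Task: Summarize the attached 1,000-word article on coastal erosion.
Method: Write for first-year undergraduates and define any technical terms you use.
Output: A 150-word summary followed by three bullet-point takeaways.
```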
CAREFULLY READ THE DRAFT. You may have to ask the GAI model to revise the draft several times before you get anything you might consider plausible or usable. You can also ask ChatGPT how to improve your draft.
REVISE, REVISE, REVISE! Do NOT leave the draft as it is. You need to carefully revise it, often adding levels of detail and clarity to what the GAI model has given you. You will also need to add in-text citations and a reference page in MLA 9. You must add these separately, since most GAI models are incapable of adding real citations in the proper format.
USE ZOTERO. Zotero is an awesome tool that should help you throughout your college years and beyond. Please spend the 20 minutes needed to learn it. Don't use ChatGPT for any in-text citations, works cited, or reference pages.
FORMAT YOUR WORK. First, remove GAI formatting from your work. Remove the box and highlighting. Then make sure your work is formatted according to the required style manual.
CHECK YOUR WORK. Grammarly has become expensive, but there are some great GAI alternatives. Try Quillbot or Wordtune to check your work, especially in Google Docs.
USE GAI AS A TUTOR. Here is a prompt you might consider using to improve your writing:
I want you to act as a GAI writing tutor. I will provide you with an essay that I need help improving, and your task is to use artificial intelligence tools, such as natural language processing, to give me feedback on how I can improve the composition. You should also use your rhetorical knowledge and experience of effective writing techniques to suggest ways that I can better express my thoughts and ideas in written form. Write "Please paste your essay in the text box" and wait for me to paste my essay into the text box. When you have completed your analysis of the essay, end with the prompt, "Please paste your essay in the text box" so that I can paste my next essay.
What if I am looking for language that's neither "Absolutely no Gen AI" nor "Do whatever you want"?
Here's an interesting policy related to academic integrity, gen AI, paper mills, etc., in Jim Pryor's syllabus for an introductory honours philosophy class at UNC Chapel Hill.
Course Requirements and Expectations
In this course, you are also allowed to build on work that was started by others — whether this be papers you found online, or got from a friend, or were generated by ChatGPT, LLaMA, or similar resources. In principle, it can be OK to begin with such sources, but there are rules and limits for how you have to proceed. And as I’ll explain, it will generally work out worse for you to do this, both educationally and in terms of your class performance.
When turning in essays for this class, every student — whether they used other people’s work as a starting point or not — will have to turn in not only their final product, but also earlier notes, drafts, and a log of their work process. If you built on other people’s work, or work generated by an AI, those original sources have to be provided, and you need to document where it was found (and/or what prompt was used to generate it). There won’t be any automatic penalty for using such resources. But your grade for the assignment will be based on what contributions you made to the final product. So if you, say, started with a mediocre paper found on a paper mill or output by ChatGPT, but then you transformed and refined it into something much better, using it as a springboard for your own original thinking, and you’re completely forthcoming about having done this, that’s okay, and you can get a decent grade from doing that.
However, in many cases, the distance between what you start with and your finished product may not be very substantial, and your grade for the assignment will certainly reflect that. This is one reason why it will be hard to get a good grade from these methods. Another is that often the sources you start with will contain ideas or argumentative moves that you’re not fully on top of, and that will be evident in the finished product. This will also be reflected in your grades.
So if you want to explore the freedom our course allows you to honestly incorporate/build on work started by others, and develop your skills of finding such work and adding to it to make it your own, you are free to do so. But the nature of what you’re doing will in most cases make it harder for you to get the best grades.
The important thing is for you to be explicit and straightforward about how you produced your assignment, and what resources you made use of. To the extent you do that, I will be charitable and fair in grading what value your own efforts added to the final result.
Misrepresenting any of this is a bad idea. Use of online papers or AI-generated work may be caught by the courseware. It may stand out when I’m reading your essay, and reviewing your notes/drafts/work log. Or your submission may be manifestly similar to work turned in by other students in the class, who relied on the same kinds of resources you did. In some such cases, I may invite students to orally explain and defend the reasoning of their paper. But it’s part of your assignments that you provide evidence that it’s your own work with the original submission. So there won’t be opportunities to suddenly come up with earlier drafts, after doubts have been raised.
If there are significant parts of a submission that a student hasn’t explained in terms they fully understand, that’s presumptive evidence that they relied on outside sources. If they did document all such sources, and how they made use of them, then although it will be a demerit in the paper that they’re using material they haven’t understood and made their own, this need not be an Honor Code Violation. But if there’s reason to believe there are further undocumented sources, or that they’ve misrepresented the ways and extent to which they’re making use of them, that will be presumptive evidence of a violation, and the case will be given to the Honors Court for investigation. This starts a formal process where ultimately, a committee of other students will decide whether wrongdoing occurred, and if so, what the penalty should be.
All assignments should be your own original work. Because of this, I disallow the use of generative AI tools in this course. Not all AI tools are generative, however, and I know folks rely on some of those tools to do their work. So here are some guiding examples that you should consider when deciding how to use various AI-based tools. If you have a question about a specific tool or use, please reach out to me.
Examples of AI tools that are ok to use in the course:
* Grammar and spelling checkers (e.g. Grammarly)
* Transcription or translation tools (e.g. OtterAI)
* Audio, image, or video generators, where the output is meant to be funny (and appropriate for class) and not representative of research/data/outcomes
Examples of AI tools that are not ok to use in the course:
* Text generators (e.g. ChatGPT, Bard, Ernie Bot, LLaMA, Bing chat)
* Audio, image, or video generators, where the output is meant to represent research or data
* Data analyzers (e.g. Chartify, Rows.ai)
* Code generators (e.g. Copilot)
I care deeply about protecting your privacy as much as I can, and so I promise to not grade you or assess you using artificial intelligence of any kind.
As a data librarian, I am especially vigilant about the ways in which a lot of bad actors in this field have taken advantage of peoples’ data, and I encourage you all to critically examine the companies and people behind the tools you choose to use.
Bonnie Stewart from the University of Windsor has shared this policy, borrowed and adapted with permission from David Joyner at Georgia Tech:
“If you are unsure where the line is between collaborating with AI and copying from AI, we recommend the following heuristics:
Never hit “Copy” within your conversation with an AI assistant. You can copy your own work into your conversation, but do not copy anything from the conversation back into your assignment. Instead, use your interaction with the AI assistant as a learning experience, then let your assignment reflect your improved understanding.
Do not have your assignment and the AI agent itself open on your device at the same time. Similar to above, use your conversation with the AI as a learning experience, then close the interaction down, open your assignment, and let your assignment reflect your revised knowledge. This heuristic includes avoiding using AI assistants that are directly integrated into your composition environment: just as you should not let a classmate write content or code directly into your submission, so also you should avoid using tools that directly add content to your submission.
Deviating from these heuristics does not automatically qualify as academic misconduct; however, following these heuristics essentially guarantees your collaboration will not cross the line into misconduct.”
For more examples of course policies, you might want to examine Classroom Policies for AI Generative Tools
Are there ethical and privacy issues to consider beyond academic integrity when using AI in classes?
AI text generators such as ChatGPT are essentially very powerful word and phrase predictors: they quickly predict what word or phrase could come next given what has come before, based on associations they have learned from their training data. Some concerns identified include:
Copyright and intellectual property: the materials used to create the data sets are largely taken without permission or informed consent, and it has not yet been legally determined who owns its outputs. Consider how using this tool might complicate our understanding of academic integrity and what it means to “do your own work.”
Labour issues: like many technological tools we rely on, ChatGPT is made usable because of underpaid and traumatic labour in the Global South. Consider how using this tool might trouble our collective values relating to EDI and decolonial principles.
Discrimination: because AI data sets come from our real world, with all its inherent racism, ableism, sexism, and so on, AI tools can also generate discriminatory outcomes. Consider how using this tool might trouble our understanding of equitable inclusion.
Climate change: the race to develop increasingly sophisticated Generative AI is not carbon neutral. Consider how using this tool might trouble our sustainability values.
When incorporating generative AI tools as part of course design, instructors should consider privacy and ethical issues related to:
Data privacy, ownership, authorship, copyrights
Unpaid labour and the commercialization of student text
Inequitable access
Inherent bias and discrimination
Lack of regulation
A team of students and staff at University College Cork have developed a Toolkit for the Ethical Use of GenAI in Learning and Teaching
Are there resources available for students?
Various universities and academic libraries have developed guides for students on issues surrounding generative artificial intelligence and academic integrity. Here are a few guides with Creative Commons licenses that you might want to share as is or adapt.
Artificial Intelligence: A Guide for Students - from Thompson Rivers University
Artificial Intelligence - from University of Calgary
Artificial Intelligence tools and your learning - From Edinburgh Napier University
How do I cite an AI generator?
Any material you use but don’t create yourself is material that requires a citation. Monash University has a resource that addresses how to disclose, acknowledge, and cite the use of generative AI. You should also consider how you are disclosing the use of generative AI in your own teaching practice.
If students are permitted to use AI in coursework, APA and MLA have some guidance on citing:
APA - How to cite ChatGPT - https://apastyle.apa.org/blog/how-to-cite-chatgpt
MLA - How do I cite generative AI in MLA style? - https://style.mla.org/citing-generative-ai/
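As an illustration, at the time of writing the APA blog linked above suggests a reference entry along these lines (check the current guidance, as the recommended format may change):

```text
OpenAI. (2023). ChatGPT (Mar 14 version) [Large language model]. https://chat.openai.com/chat
```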
References
Bearman, M., Ajjawi, R., Boud, D., Tai, J. & Dawson, P. (2023). CRADLE Suggests… assessment and genAI. Centre for Research in Assessment and Digital Learning, Deakin University, Melbourne, Australia. doi:10.6084/m9.figshare.22494178
Bender, E. M., Gebru, T., McMillan-Major, A., & Shmitchell, S. (2021). On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? FAccT ’21: Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency, March 2021, pp. 610–623.
Bruff, D. (2023). Assignment Makeovers in the AI Age: Reading Response Edition. https://derekbruff.org/?p=4083
Bruff, D. (2023). Assignment Makeovers in the AI Age: Essay Edition. https://derekbruff.org/?p=4105
Carleton University Teaching and Learning Services. (2023). Generative Artificial Intelligence Website.
Kirchenbauer, J., Geiping, J., Wen, Y., Katz, J., Miers, I., & Goldstein, T. (2023). A Watermark for Large Language Models. arXiv:2301.10226.
Kirchner, J., Ahmad, L., Aaronson, S., & Leike, J. (2023, January 31). New AI classifier for indicating AI-written text. https://openai.com/blog/new-ai-classifier-for-indicating-ai-written-text
Kovanovic, V. (2022). The dawn of AI has come, and its implications for education couldn’t be more significant. The Conversation, December 14, 2022.
Lawton, G. (2023). What is generative AI? Everything you need to know. TechTarget.
Mitchell, E., Lee, Y., Khazatsky, A., Manning, C., & Finn, C. (2023). DetectGPT: Zero-Shot Machine-Generated Text Detection using Probability Curvature. arXiv:2301.11305.
Mollick, E., & Mollick, L. (2023). Why All Our Classes Suddenly Became AI Classes: Strategies for Teaching and Learning in a ChatGPT World. Harvard Business Publishing – Education.
Monash University (2023). Generative artificial intelligence technologies and teaching and learning.
Taylor Institute (2023). A First Response to Assessment and ChatGPT in your Courses. Website.
Trust, T. (2023). ChatGPT & Education, a Google document that describes what ChatGPT can do, what you need to know about ChatGPT, and how educators can use it. https://docs.google.com/presentation/d/1Vo9w4ftPx-rizdWyaYoB-pQ3DzK1n325OgDgXsnt0X0/edit#slide=id.p
York University, (n.d.). AI Technology and Academic Integrity for Instructors. https://www.yorku.ca/unit/vpacad/academic-integrity/ai-technology-and-academic-integrity/
The Q & A sections of the page were inspired by the University of Toronto’s ChatGPT and Generative AI in the classroom (2023).