The notion of Artificial Intelligence and its capability of mimicking human thought and writing have long been the stuff of movies - Bicentennial Man (1999), Ex Machina (2014), I, Robot (2004), Her (2013), Jurassic World: Fallen Kingdom (2018), The Creator (2023)...the list goes on and stretches much further back (the original Westworld came out in 1973). And the potential for AI to mimic human expression - whether verbal, written, or otherwise - seemed the stuff of movies for generations...until the advent of ChatGPT.
Now, we know that companies like Google, Meta, Apple, etc. have long gathered data on our online behavior to maximize our spending, predict the information streams we consume, anticipate our interests, and so on. But the ability to mimic human thought and expressive activity...that continued to seem like the stuff of movies. It still does. But it cannot be denied - AI has entered the classroom in ways we did not anticipate. And we have to address it.
Truth be told, this is something that I have not wanted to confront. But the clear uptick in the use of AI and in blatant plagiarism in my courses over the last few years has made me pause. First, there was denial and incredulity. Then there was a self-righteous anger informed by my own thoughts on academic integrity and honor. Then there was a mad dash to figure out how to "get in front of it" and identify work that was "clearly" written by AI. Today, I am in a place of simultaneous frustration and curiosity, though now I want to collaborate with my students (and colleagues) to cultivate a better learning experience that uses AI as a tool rather than seeing it as the final demise of the education system.
The Student Code of Conduct (click on the tab that says "Read the Student Code of Conduct") says the following:
(Section 8) Academic or intellectual dishonesty such as cheating, plagiarism, or the use of generative tools (including but not limited to GPT-4, ChatGPT, Claude, Cohere), without the permission of the instructor to produce responses to school tasks or activities. Cheating is defined as taking an examination or performing an assigned, evaluated task in a dishonest way. Plagiarism is defined as the unauthorized use of the written language and thought of another author/tool without proper quoting or citing and representing the author's/tool’s work as the student's own. A student may not use generative tools to produce content that the student submits as the student’s own thoughts and/or language.
The question we wrestle with, in all honesty, is how we detect the use of AI. We can run work through numerous "AI detector" sites, but that can produce contradictory reports. It is also extremely time-consuming - I used to do this. That is not to say I will never use these detector sites again, but I will be more judicious about it.
The bottom line is that I am more interested in YOUR thoughts on what we're learning. I want to know how you understand the material. If you are having trouble with the material that is presented, I would much rather have a conversation about it than have you stress over it. What you have to say does not have to be in "perfect standard English." I want to know how you are making connections between the materials, the life you live, and the knowledge you already have in your head.
The reality is that WE - as educators - can tell, most of the time. Or at least those of us who have been doing this for quite some time get a sense - this is our "spidey tingle." And it is frustrating - mostly because we feel that you are expending more energy trying to get around learning the material we would like you to engage with than actually doing the work. If that is the case - what is the point? Why are you even taking this course? Or any courses at all? If the point is to get through an educational process without learning fundamental skills, knowledge, and expertise in a field or discipline, you have chosen to become a detriment to your own career path. Can you imagine someone who moved through - say - a medical program using AI to complete all their assignments? And then they are hired in a clinic or hospital to perform medical procedures and assess your medical condition - but they cannot, because they do not actually have the background knowledge to complete those procedures NOR the ability to assess your situation with confidence. We could joke and say that they can just look up how to do something on YouTube - but would you want that? How about a lawyer? A teacher? A politician?
Now...I'm a historian. And I sense that there are way too many people today who feel understanding history isn't life-critical. That makes me sad in a lot of ways, because it IS critical.
The work we do in Ethnic Studies, the social sciences, the humanities, and the arts cultivates and exercises your ability to be a critical thinker. It helps you make connections. The work we do often helps you answer the question WHY. Why did Hamas attack people at that music festival? Why has Israel responded with such force? Why did the United States choose to vote "no" on the ceasefire resolution brought before the UN? Why are there still school shootings? Why do people offer "thoughts and prayers" rather than actionable policies? Why does there continue to be inequality - informed by race, gender, immigrant status, sexual orientation, language - in the "land of the free, home of the brave"? Why is there a class specifically about the immigrant experience in the US - isn't that ALL of US history?
Why is that important? So that you can discern falsehoods, resist fake news, dig deeper into a story and ask questions, and wonder about the impact of policies, actions, and events. But most important to me is that you recognize how we - all of humanity, this world we live in - are connected, and that the individual choices we make have an impact, big and small, on the world around us. In addition, I worry and wonder about whether we are capable of having conversations about things that are hard, things that are divisive, and things that we need to debate together.
My greatest fear about the direction we COULD go as a species, as caretakers of this world we live in - that we no longer choose to be thinkers and dreamers and critics, and instead leave the "difficult" things to machines - is summed up in the movie Idiocracy (2006). If you haven't seen it...I'm sure it's streaming somewhere.
I know that AI is something we can't ignore or avoid. Nor can we "ban" our students from using it. It is inevitable, it is ubiquitous, and it is being encouraged in so many of the fields that we are sending our students into. SO...we'll be experimenting with AI in our work this semester. How can we use AI in a generative and thoughtful way, one where it becomes a useful tool rather than a substitute for genuine thought and work? We'll discuss this further in our first few classes. I hope you're down to think and work through this with me.