We couldn't complete a class on the digital humanities without doing a project on generative AI. This project aimed to explore what is human in the act of writing. After analyzing a text, we plugged a prompt into our brains and wrote an analog essay, and then we plugged the same prompt into ChatGPT and the algorithm churned out an essay of its own. I analyzed that essay too, and after reading what the binary code 'thinks' about the human experience, I wrote an open letter to my fellow Mercer students.
For my project, I chose to analyze Italo Calvino's short story, "The Dinosaurs." This story encapsulates what storytelling is and how the truth gets lost between generations. It also explores the idea that when it comes to storytelling, the 'truth' can be relative to one's experiences.
The prompt we received for our analog essay was to analyze what aspects of the story can be used to answer the question, "Where is the 'human' in human communication?"
For my analog essay I wanted to focus on the way that the New Ones in Calvino's story are representative of generative AI. I accomplished this by focusing on three key aspects of the story:
The way that the New Ones represent generative AI, as they continuously change their stories about the dinosaurs based on Qfwfq's actions
Fern-Flower is representative of a victim of AI because she believes the assumptions that are made about dinosaurs within the New Ones' stories, which shows how she has lost her ability to think for herself
Despite being someone who experienced the era of dinosaurs, Qfwfq is still susceptible to the lies that the New Ones tell within their stories, and he loses himself in the same way that people are beginning to lose themselves in AI.
If you couldn't already tell, I am far from generative AI's biggest fan, and this essay only solidified my negative opinion. Within the mess of my annotations, you can see how the essay actually proved my point. The bot 'said' that the human experience is what makes the communication human, but it cannot have that experience because it is just a combination of ones and zeros (I think?). There is no soul behind the machine, which automatically makes it incapable of replicating the thing that makes our interactions as humans so special.
Dear Mercer University students,
If you use AI in general, you are robbing yourself. If you use it to write for you, then you are robbing yourself of the emotions that come with being a human. Of course, I am biased as an English major. I love writing, but I know that not everyone does. It can be challenging and time-consuming, but that is all a part of the human experience. As an English student at Mercer, I have the privilege to study what makes human communication human and how that differs from what AI platforms can generate. I wrote an essay, gave ChatGPT the exact same prompt, and its essay proved my point. Both ChatGPT and I analyzed how Italo Calvino’s “The Dinosaurs” demonstrates the human aspects of storytelling. If I could have AI generate an essay on the same topic, why would I bother to do my own work in the first place?
I connected with the short story emotionally, something that AI is incapable of as it is not a living being with emotions. Calvino’s “The Dinosaurs” is about a dinosaur, Qfwfq, who has survived the mass extinction, and it follows him as he experiences living with a new species, the New Ones. The New Ones tell fictional stories about dinosaurs that they believe are true despite not being able to recognize the dinosaur that is living among them. At the end of the story, Qfwfq’s own son is born oblivious to the fact that he is a dinosaur. As a human reader, I was able to understand and relate to Qfwfq as he watched the loss of his culture unfold right in front of his eyes. I connected emotionally, something a robot or algorithm could never do. The emotional connection is what keeps people reading even if the emotions are negative. It is the reason you are reading this right now: you have some kind of emotion invested in my letter.
Not only can AI not emotionally connect with literature, but it also cannot generate anything with emotion behind it. Like I said before, the essay that ChatGPT generated proved my point. When I started reading the program’s essay, I honestly had to laugh. It generated an essay talking about human communication and used language such as “we” and “our” even though it is not human. I did not prompt it to write as if it were human, nor do I use the program enough for it to have learned any preferences that I may have. It kept talking about “our” needs as humans, and then it slipped up and said, “how humans use storytelling to explore their place.” (I thought this essay was about “our” experience as humans.) Then it slipped up again and gave the exact reason as to why AI could never replicate human communication. In its essay ChatGPT said that storytelling is a way of saying “I was there. I felt this. Let me explain.” Now I do not think I need to tell university students this, but generative AI is incapable of doing this. It cannot be there. It cannot feel. It can try to explain, but it will never be the same because there is no emotional connection.
I do not expect my short letter to completely change the way that you think about human communication and AI, but I know that you have some kind of emotional investment in this, and I want you to sit with that. Regardless of whether you agree or disagree, whether this made you angry, or whether this made you feel validated in your own feelings toward generative AI, I want you to sit with the emotion. These emotions make us human, and that is what makes our communication so special: we are able to share our feelings and experiences with others in a meaningful way. If you strip your communication of any emotional connection, why communicate at all?
Humanely,
Mia Rosario