ENS 301.01: AI Usage in the Context of Environmental Morality
By: Mary Scher
Artificial intelligence (AI) is a rapidly developing tool that has become intertwined with many facets of modern life, and one of its fastest-growing uses is in academia (Nagelhout, 2023). In upper-level education, such as high school and college, students may experience a heavier workload and turn to AI for assistance. However, students may not be aware of the severe environmental impacts of training, running, and maintaining AI systems. One ChatGPT model consumed 185,000 gallons of water during its training phase alone (George et al., 2023). The combined water and electricity consumption of AI gives it a poor environmental reputation, and could be enough to deter an environmentally conscious student from using it. The goal of this study is to gauge GVSU students' knowledge of AI's environmental impact, how that knowledge influences their perception of its purpose, and ultimately whether they decide to use AI.
In addition to the water consumption described above, training a single large AI model such as ChatGPT can use thousands of megawatt-hours of electricity, releasing roughly as much carbon as hundreds of American households emit in one year (Ren & Wierman, 2024).
Every query entered into an AI system adds to this impact, as more electricity and water are needed to power and cool the servers.
Renewable energy sources are a possible solution for these electricity needs; however, the rare metals needed to construct green energy infrastructure such as solar panels are often difficult to obtain and expensive (Wu et al., 2022).
In 2022, Google ran its data centers in Finland on 97% carbon-free energy and its data centers in Denmark on 90% (Google, 2023). In comparison, its data centers in Taiwan ran on 18% carbon-free energy, and those in Singapore on only 4% (Google, 2023).
Google can maintain these renewable energy gaps because expectations differ between countries as a result of systemic inequalities (Vaz et al., 2017).
AI systems can be trained on pollutant measurements from point sources in a given area and use those data to predict pollution exposure (Krupnova et al., 2022). These complex, expensive systems are often employed only in privileged areas that can afford to build and maintain them (Krupnova et al., 2022).
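To make the idea concrete, below is a minimal, hypothetical sketch of the kind of system described above: a regression model trained on readings from monitoring stations near point sources, then used to predict exposure at an unmonitored location. The features, synthetic data, and model choice are illustrative assumptions for this essay, not the actual system built by Krupnova et al. (2022).

```python
# Illustrative sketch only: learn pollution exposure from point-source
# monitoring data. All features and values here are assumed for demonstration.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical training data: each row is one monitoring-station reading.
# Features: distance to nearest point source (km), wind speed (m/s),
# traffic density (vehicles/hour), industrial emissions index.
n = 500
X = np.column_stack([
    rng.uniform(0.1, 10, n),   # distance_to_source_km
    rng.uniform(0, 15, n),     # wind_speed_ms
    rng.uniform(0, 2000, n),   # traffic_density
    rng.uniform(0, 100, n),    # emissions_index
])
# Synthetic PM2.5 exposure: higher near sources and traffic, diluted by wind.
y = (80 / (1 + X[:, 0]) + 0.01 * X[:, 2] + 0.3 * X[:, 3]
     - 1.5 * X[:, 1] + rng.normal(0, 2, n))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Predict exposure at an unmonitored location under assumed conditions.
print("Predicted PM2.5:", model.predict([[2.5, 5.0, 800, 40]])[0])
print("Test R^2:", model.score(X_test, y_test))
```

Even in this toy version, the model depends on a dense network of monitoring stations supplying training data, which is precisely the infrastructure that less privileged areas often lack.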
AI models can also inherit bias if they are trained on biased data (McGovern et al., 2022), which can lead them to reproduce false statistics and misinform students or even the general public.
Students with a technical background have the highest AI literacy scores (Hornberger et al., 2023), but that study's definition of AI literacy does not include knowledge of environmental impacts.
When 2,167 students across different health care disciplines were surveyed about AI, over half did not know what it was or provided misinformation in their responses (Teng et al., 2022).
Because AI is so new and increasingly complex, AI literacy cannot be considered common knowledge.
AI is already being incorporated into higher-education curricula, specifically at colleges and universities (Zawacki-Richter et al., 2019).
Pro-environmental behaviors (PEBs) are behaviors that reduce harm to, or actively benefit, the environment, such as recycling, riding a bike instead of driving, or avoiding single-use plastics (Ghazali et al., 2019). PEB theory is an amalgamation of Value-Belief-Norm (VBN) theory and the Norm Activation Model (Sawitri et al., 2015), with additional theoretical constructs that provide more context for why one might engage in PEBs. These additional constructs include personal inputs such as age, gender, race/ethnicity, health, political views, and religious beliefs.
A person's values and openness to change shape their beliefs and personal norms, which in turn may lead to pro-environmental behaviors. Anticipated pride and guilt can also affect how a person behaves; like the beliefs and norms mentioned above, these emotions are influenced by personal values and beliefs (Onwezen et al., 2013). If a student values the environment, they may engage in pro-environmental behaviors, which could include disengaging from AI.
Given how new AI is and how rapidly its uses are expanding, it is imperative that research examine how well people understand AI and its purpose, and what factors may deter them from using it. This is especially true for students, as many newer AI models are targeted at and advertised for educational purposes. If students are aware of the environmental consequences of AI, we must determine whether that awareness is enough to outweigh AI's perceived benefits and trigger a moral response of avoiding AI.
References
George, A. S., George, A. S. H., & Martin, A. S. G. (2023). The environmental impact of AI: A case study of water consumption by Chat GPT. Partners Universal International Innovation Journal, 1(2), 97–104. https://doi.org/10.5281/zenodo.7855594
Ghazali, E. M., Nguyen, B., Mutum, D. S., & Yap, S.-F. (2019). Pro-Environmental Behaviours and Value-Belief-Norm Theory: Assessing unobserved heterogeneity of two ethnic groups. Sustainability, 11(12), 3237. https://doi.org/10.3390/su11123237
Google. (2023). Environmental report 2023. Google. https://www.gstatic.com/gumdrop/sustainability/google-2023-environmental-report.pdf
Hornberger, M., Bewersdorff, A., & Nerdel, C. (2023). What do university students know about Artificial Intelligence? Development and validation of an AI literacy test. Computers and Education: Artificial Intelligence, 5, 100165. https://doi.org/10.1016/j.caeai.2023.100165
Krupnova, T. G., Rakova, O. V., Bondarenko, K. A., & Tretyakova, V. D. (2022). Environmental justice and the use of Artificial Intelligence in urban air pollution monitoring. Big Data and Cognitive Computing, 6(3), 75. https://doi.org/10.3390/bdcc6030075
McGovern, A., Ebert-Uphoff, I., Gagne, D. J., & Bostrom, A. (2022). Why we need to focus on developing ethical, responsible, and trustworthy artificial intelligence approaches for environmental science. Environmental Data Science, 1. https://doi.org/10.1017/eds.2022.5
Nagelhout, R. (2023). Academic resilience in a world of Artificial Intelligence. Harvard Graduate School of Education. https://www.gse.harvard.edu/ideas/usable-knowledge/23/08/academic-resilience-world-artificial-intelligence
Onwezen, M. C., Antonides, G., & Bartels, J. (2013). The Norm Activation Model: An exploration of the functions of anticipated pride and guilt in pro-environmental behaviour. Journal of Economic Psychology, 39, 141–153. https://doi.org/10.1016/j.joep.2013.07.005
Ren, S., & Wierman, A. (2024). The uneven distribution of AI’s environmental impacts. Harvard Business Review. https://hbr.org/2024/07/the-uneven-distribution-of-ais-environmental-impacts
Teng, M., Singla, R., Yau, O., Lamoureux, D., Gupta, A., Hu, Z., Hu, R., Aissiou, A., Eaton, S., Hamm, C., Hu, S., Kelly, D., MacMillan, K., Malik, S., Mazzoli, V., Teng, Y., Laricheva, M., Jarus, T., & Field, T. (2022). Health care students’ perspectives on Artificial Intelligence: Countrywide survey in Canada. JMIR Medical Education, 8(1), e33390. https://mededu.jmir.org/2022/1/e33390
Vaz, E., Anthony, A., & McHenry, M. (2017). The geography of environmental injustice. Habitat International, 59, 118–125. https://doi.org/10.1016/j.habitatint.2016.12.001
Wu, C.-J., Raghavendra, R., Gupta, U., Acun, B., Ardalani, N., Maeng, K., Chang, G., Behram, F., Huang, J., Bai, C., Gschwind, M., Gupta, A., Ott, M., Melnikov, A., Candido, S., Brooks, D., Chauhan, G., Lee, B., Lee, H.-H., & Akyildiz, B. (2022). Sustainable AI: Environmental implications, challenges and opportunities. https://proceedings.mlsys.org/paper_files/paper/2022/file/462211f67c7d858f663355eff93b745e-Paper.pdf
Zawacki-Richter, O., Marín, V. I., Bond, M., et al. (2019). Systematic review of research on Artificial Intelligence applications in higher education – where are the educators? International Journal of Educational Technology in Higher Education, 16, 39. https://doi.org/10.1186/s41239-019-0171-0