Environmental impact is a major concern. AI is not great for the environment, and unless you're informed, you may not realize how bad it is. But once you are informed, it also might not be as bad as you think.
I started showing a group of students how to use AI to write an APA-style bibliography citation (something I use it for regularly... yes, I'm that lazy) and was interrupted by a student who asked, "Shouldn't we discourage use of AI entirely due to the environmental impact?" After I responded with what I considered a good rebuttal, another student shouted something along the lines of "AI is way worse! By 2030, global water consumption for AI data centers will be over 1.5 trillion liters annually!" - this wasn't an exact quote, just my vague recollection of what they said... still, I was impressed that the student cared enough to have memorized a statistic they could shout out off the cuff.
Yes, we should be concerned about the impact on the environment. We, as users of this technology, need to be mindful of how we use it and encourage governmental regulations that push companies to improve their technologies and decrease their environmental impact. The major environmental concerns are clean water usage, energy consumption, and carbon emissions.
Do I think that the environmental impact means we should ban its use altogether? No. I'm a computer science teacher and this is the new age; this is the future of computer science. This technology will move forward whether we want it to or not and society as a whole will adopt it whether we individually want it to or not. Every technology before us has had an environmental impact and we've justified its usage because the pros outweigh the cons and we push for improvements that decrease that impact over time. There are arguments to be made about how much worse AI's impact is, but just as other technologies have decreased their impact over time, I do believe AI will follow suit, and it is already significantly less impactful in some areas than it was 3 years ago.
Here are some numbers to make you think:
[Under Construction]
Is it still unnerving how much of an impact it has and what its current trajectory is? Absolutely. We should discuss these concerns. Individuals can choose to avoid AI, just as they can avoid driving a car (also bad for the environment), smoking cigarettes (bad for your personal health), and politics (bad for social gatherings), but avoiding those individually doesn't change the fact that society has adopted them and other individuals will use them. It also doesn't change the fact that many companies and higher-education institutions now expect potential hires and students to have AI experience.
I personally believe the future of AI is as avoidable as the future of vehicular transportation (which is to say, not avoidable), and like transportation the best thing we can do is set personal limits on how we choose to (or not to) use it and push for regulations to help encourage improvements/change.
Cooling.
Computation takes energy.
Energy produces heat.
Heat is an enemy to computers - it causes component failure, data corruption, and system instability. CPUs, the brain of a computer, have to throttle their performance when they get too hot, dropping to lower operating speeds and power consumption in order to cool off.
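To make the throttling idea concrete, here's a tiny toy model (all the numbers in it - the threshold, heat rates, and clock steps - are made up for illustration, not pulled from any real chip): when the simulated temperature crosses a threshold, the clock drops, heat output falls, and the chip recovers speed once it cools.

```python
# Illustrative only: a toy model of thermal throttling, not real firmware.
# All numbers (threshold, heat rates, clock steps) are made up for demonstration.

MAX_CLOCK_GHZ = 4.0
MIN_CLOCK_GHZ = 1.0
THROTTLE_TEMP_C = 90.0    # assumed throttle threshold
AMBIENT_C = 30.0

def step(temp_c, clock_ghz):
    """Advance the toy model one time step."""
    heat_in = clock_ghz * 12.0             # faster clock -> more heat (arbitrary scale)
    cooling = (temp_c - AMBIENT_C) * 0.4   # passive cooling toward ambient
    temp_c += (heat_in - cooling) * 0.1

    if temp_c > THROTTLE_TEMP_C:
        clock_ghz = max(MIN_CLOCK_GHZ, clock_ghz - 0.5)  # throttle down
    elif temp_c < THROTTLE_TEMP_C - 10:
        clock_ghz = min(MAX_CLOCK_GHZ, clock_ghz + 0.1)  # recover speed
    return temp_c, clock_ghz

temp, clock = 50.0, MAX_CLOCK_GHZ
for t in range(60):
    temp, clock = step(temp, clock)
    if t % 10 == 0:
        print(f"t={t:2d}  temp={temp:5.1f}C  clock={clock:.1f}GHz")
```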
An effective strategy against this is water cooling. The process, according to the BBC World Service's "How AI uses our drinking water," is as follows:
A coolant (not water) is piped over processing chips, absorbing the heat and moving it away from the electrical components.
The coolant makes its way towards a heat exchange unit where water absorbs the heat.
The water is then piped away to a cooling tower where the water cools or evaporates.
The remaining cooled water returns to the heat exchange where it repeats its journey several times before being discharged back into the water source.
Up to 80% of the water evaporates.
Evaporated water is no longer available in the water source for consumption, irrigation, or other necessary uses. 100% of evaporated water is clean and drinkable once condensed, but not all of that water will find its way back to aquifers. Shallow aquifers can be recharged within weeks or months, but deep aquifers in arid zones can have virtually no recharge, meaning in some places water is a non-renewable resource. Once it's used, it's never coming back.
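To get a feel for how that loss adds up, here's a rough sketch (the per-pass evaporation fraction and the number of passes are assumptions for illustration, not figures from the BBC piece): if a fixed fraction of the circulating water evaporates at the cooling tower on each pass, the losses compound toward that "up to 80%" figure.

```python
# Rough sketch of cumulative evaporation in a recirculating cooling loop.
# The per-pass evaporation fraction and pass count are assumptions for illustration,
# not measurements from the BBC World Service piece.

def water_lost(liters_in, evap_per_pass, passes):
    """Each pass through the cooling tower evaporates a fixed share of the water."""
    remaining = liters_in * (1 - evap_per_pass) ** passes
    return liters_in - remaining, remaining

withdrawn = 1000.0  # liters drawn from the water source (example amount)
lost, returned = water_lost(withdrawn, evap_per_pass=0.33, passes=4)
print(f"Evaporated: {lost:.0f} L ({lost / withdrawn:.0%})")
print(f"Returned to source: {returned:.0f} L")
# With these assumed values, roughly 80% of the withdrawn water evaporates.
```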
Tech companies need to keep water consumption in mind when picking data center locations, favoring areas with a high aquifer recharge rate so they impact the environment the least.
According to the BBC World Service's "How AI uses our drinking water," cooling systems used in AI data centers have to use clean, drinkable water "to prevent bacteria growing or clogs and corrosion in the system."
Articles:
The New York Times: Microsoft Pledges to Save Water. In the A.I. Era, It Expects Water Use to Soar
The following information was generated by Google Gemini and included several sources.
Prompt: "Explain the environmental impacts of AI. Only use academic sources. Also provide information about its impact compared to other major environmental impacting technologies, such as vehicles, power plants, and factories."
Based on academic research and technical reports, the environmental impacts of Artificial Intelligence (AI) are primarily driven by the massive computational infrastructure required to train and operate large models. These impacts are categorized below, followed by a comparative analysis with other major industries.
1. Energy Consumption
Training: Developing a single large language model (LLM) requires immense energy. For example, training OpenAI’s GPT-3 is estimated to have consumed 1,287 megawatt-hours (MWh) of electricity, which is roughly equivalent to the annual energy consumption of 120 U.S. homes.
Inference (Usage): The "inference" phase—where the model answers user queries—consumes significantly more energy over the model's lifespan than training. Research indicates that a single AI-generated query (e.g., ChatGPT) can consume 10 times more electricity than a standard Google search.
Global Scale: Data centers, which house the hardware for AI, currently account for approximately 1–1.5% of global electricity use. The International Energy Agency (IEA) projects that electricity consumption from data centers, AI, and cryptocurrency could double by 2026.
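As a quick sanity check on the training figure above, here's the back-of-the-envelope math (the per-home consumption number is a commonly cited approximation I'm supplying, not one from the sources above):

```python
# Back-of-the-envelope check of the training-energy comparison above.
# Assumed value: an average U.S. home uses roughly 10,700 kWh of electricity per year
# (a commonly cited approximation, not taken from the article).

GPT3_TRAINING_MWH = 1287
HOME_KWH_PER_YEAR = 10_700

homes_per_year = GPT3_TRAINING_MWH * 1000 / HOME_KWH_PER_YEAR
print(f"GPT-3 training energy ~= {homes_per_year:.0f} U.S. homes for a year")
# -> roughly 120 homes, matching the figure quoted above
```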
2. Carbon Emissions
Carbon Footprint: The carbon footprint of AI includes both "operational carbon" (electricity burned during use) and "embodied carbon" (emissions from manufacturing chips and servers).
Specific Examples: The training of GPT-3 was estimated to emit 552 metric tons of CO2 equivalent.
Hardware Lifecycle: Academic studies emphasize that while operational emissions (electricity) currently dominate, the manufacturing of specialized AI hardware (GPUs) is carbon-intensive. For some hardware, manufacturing accounts for 10–30% of its total lifecycle emissions.
3. Water Consumption
Direct Cooling: Data centers generate massive heat and often use water-based cooling towers. Training GPT-3 alone is estimated to have directly consumed 700,000 liters of clean freshwater.
Usage Rate: It is estimated that a conversation with an AI chatbot (roughly 20–50 questions) consumes approximately 500 milliliters (one standard water bottle) of water.
Indirect Usage: Enormous amounts of water are also used to generate the electricity that powers these facilities, particularly in thermoelectric power plants.
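Dividing that bottle of water across a conversation gives a rough per-question figure; this short sketch uses only the numbers quoted above:

```python
# Per-question water use implied by the estimate above:
# ~500 mL per conversation of roughly 20-50 questions.
CONVERSATION_ML = 500
for questions in (20, 50):
    print(f"{questions} questions -> ~{CONVERSATION_ML / questions:.0f} mL per question")
# -> roughly 10-25 mL per question, i.e. a few sips of water
```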
The environmental impact of AI is often abstract, but academic literature provides specific comparisons to vehicles, power plants, and factories to contextualize its scale.
Vehicles
• Training vs. Driving: The carbon emissions from training a single large AI model (approx. 552 tons CO2e) are equivalent to the lifetime emissions of 5 average American cars (including fuel and manufacturing) or roughly 17 diesel vehicles.
• Global Impact: The entire Information and Communications Technology (ICT) ecosystem, including AI, accounts for 2–4% of global greenhouse gas emissions, which is on par with the aviation industry (often cited as a major polluter).
Power Plants
• Energy Intensity: A single hyperscale data center can consume as much electricity as 50,000 to 100,000 homes. This localized demand is so high that it can prolong the operation of fossil-fuel power plants that would otherwise be retired.
• Water Usage: Like thermal power plants, AI data centers are massive consumers of water. In some regions, data centers compete directly with local communities and agriculture for water resources, withdrawing billions of gallons annually.
Factories
• "Information Factories": AI data centers are effectively industrial factories that process data instead of raw materials. The IEA notes that a large AI data center can draw as much continuous power as an aluminum smelter, which is one of the most energy-intensive types of heavy manufacturing.
• Construction: The construction of data centers is highly carbon-intensive due to the use of concrete and steel. Academic analysis suggests the carbon peak during data center construction can significantly exceed that of residential or standard commercial building projects.
While AI has a significant footprint, academic sources also highlight its potential to reduce emissions in other sectors. AI optimization is currently used in factories and power grids to improve efficiency, potentially offsetting some of its own environmental costs. For instance, AI is used to optimize cooling in data centers themselves, reducing energy use by up to 40% in specific Google facilities.
Follow Up Prompt: "We often view AI as one giant beast wreaking havoc on our environment, but we forget that it's a shared resource and millions, possibly billions are using it. When we divide AI's impact per person using it, then compare that to the impact of an individual's environmental impact driving a car, what conclusions would we make?"
When we shift our perspective from the aggregate "beast" to the individual user, the environmental narrative changes dramatically. The data reveals that for a typical person using AI for text-based tasks (like writing emails or coding), the environmental impact is negligible compared to physical activities like driving.
However, this changes significantly if you move from text AI to video AI.
To understand the scale, we must compare the carbon emissions ($CO_2e$) of a single unit of activity.
The Car (1 Mile): Driving an average gasoline passenger vehicle emits approximately 400 grams of $CO_2$ per mile.
The Chatbot (1 Text Query): Estimates vary by model efficiency and energy grids, but recent academic and industry reports place a standard text query between 0.03 grams (highly efficient, e.g., Google Gemini) and 4.32 grams (less efficient/older estimates) of $CO_2$.
The Comparison: The "Grocery Store" Commute
If you drive 5 miles (8 km) round-trip to the grocery store, you emit roughly 2,000 grams of $CO_2$. To match the carbon footprint of that single short drive, you would need to:
Perform between 460 and 66,000 text queries.
Interpretation: Even using the most pessimistic estimates, you would have to send hundreds of AI prompts to equal the impact of driving just a few miles. For an efficient model (0.03g/query), a user could send 100 queries every day for nearly two years to equal the emissions of that one 5-mile drive.
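Here's the arithmetic behind that comparison spelled out, using only the per-mile and per-query figures already quoted:

```python
# The grocery-store comparison, using the figures quoted above.
CAR_G_PER_MILE = 400                      # grams CO2 per mile, average gasoline car
TRIP_MILES = 5                            # round-trip to the store
QUERY_G_LOW, QUERY_G_HIGH = 0.03, 4.32    # grams CO2e per text query (range above)

trip_g = CAR_G_PER_MILE * TRIP_MILES      # ~2,000 g CO2
print(f"One {TRIP_MILES}-mile drive: {trip_g:,} g CO2")
print(f"Equivalent queries (pessimistic estimate): {trip_g / QUERY_G_HIGH:,.0f}")  # ~463
print(f"Equivalent queries (efficient estimate):   {trip_g / QUERY_G_LOW:,.0f}")   # ~66,667

# At 100 efficient queries per day, how long to match that single drive?
days = trip_g / (100 * QUERY_G_LOW)
print(f"At 100 queries/day (efficient): {days:.0f} days (~{days / 365:.1f} years)")
```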
The conclusion above applies strictly to text-based AI. If you use AI to generate media (images and video), the energy cost skyrockets.
Text Query: ~0.005 kWh (Low impact)
Image Generation: ~0.25 kWh (Moderate impact—equivalent to charging a smartphone)
Video Generation: High impact.
Generating just 5 seconds of AI video can emit as much carbon as driving 1 mile.
Estimated carbon emissions (CO2e) and equivalent driving distance, per activity:
• 1 Text AI Query: 0.03g – 4.3g CO2e (equivalent to driving 1 to 50 feet / 0.3 – 15 meters)
• 1 AI Image Gen: ~100g – 200g CO2e (equivalent to driving 0.25 – 0.5 miles / 0.4 – 0.8 km)
• 5 sec AI Video: ~400g CO2e (equivalent to driving 1 mile / 1.6 km)
• 1 Gallon of Gas: 8,887g CO2e (equivalent to driving 22 miles at average MPG)
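As a usage note, those rough estimates can be turned into a small "driving equivalent" calculator. The per-activity numbers below are just mid-range values from the figures above, and the daily usage mix is hypothetical:

```python
# Convert a day's AI usage into an approximate driving-distance equivalent,
# using mid-range values from the rough estimates above (grams CO2e per activity).
G_PER_MILE = 400  # average gasoline car

EMISSIONS_G = {
    "text_query": 2.0,    # mid-range of 0.03g - 4.3g per query
    "image_gen": 150.0,   # mid-range of ~100g - 200g per image
    "video_sec": 80.0,    # ~400g per 5 seconds of video -> ~80g per second
}

usage = {"text_query": 50, "image_gen": 4, "video_sec": 30}  # hypothetical day of use

total_g = sum(EMISSIONS_G[activity] * count for activity, count in usage.items())
print(f"Total: ~{total_g:,.0f} g CO2e ~= driving {total_g / G_PER_MILE:.1f} miles")
```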
When viewed per person:
Text AI is environmentally "cheap": The guilt associated with asking a chatbot a question is mathematically disproportionate to its impact. A single skipped car trip saves more carbon than a lifetime of restraint in using text-based AI.
Generative Media is the real driver: The "beast" emerges when millions of users shift from text to generating images and video. Generating a few minutes of AI video is environmentally comparable to a physical commute.
The "Shared Resource" Effect: The environmental threat of AI isn't the individual query, but the aggregate volume. Billions of queries add up to gigawatts of power, requiring new power plants. However, for the individual trying to reduce their footprint, driving less remains a far more effective action than using AI less.