Vol. 11 | 3.21.25
In this week’s edition, we revisit an often-overlooked issue: AI’s water consumption, particularly in the context of cooling massive data centers.
As artificial intelligence continues to expand its role in industries ranging from healthcare to finance, the infrastructure supporting these advancements is also growing—along with its water footprint. The rapid deployment of AI-powered tools, including large language models, autonomous systems, and data-driven analytics, requires increasingly powerful data centers. These facilities, which house and operate AI servers, depend on sophisticated cooling mechanisms to prevent overheating. As a result, AI’s water consumption is rising in tandem with its adoption.
Moreover, the increasing complexity of AI models requires larger computational clusters, leading to an even greater demand for cooling resources. The rise of edge computing—where AI processing is performed closer to the user rather than centralized in massive cloud data centers—also adds to the water footprint as more localized infrastructure comes online. Additionally, as companies strive for real-time AI processing, the need for high-performance GPUs and specialized AI chips exacerbates heat generation, requiring more aggressive cooling strategies. This, in turn, translates into a greater water draw per computational task than ever before.
Major tech companies are racing to build larger and more efficient AI data centers, but the scale of their operations often translates into significant water usage. For example, recent reports indicate that data centers operated by Microsoft, Google, and OpenAI collectively consumed billions of liters of water in a single year to support AI model training and inference.
As generative AI becomes more prevalent in everyday applications, from personalized search to automated customer service, the number of AI-driven data centers is projected to increase, further exacerbating the demand for water.
Additionally, AI companies are competing to develop more advanced machine learning models, which require intensive computational power and, by extension, significant cooling resources. As a result, not only are existing data centers expanding, but new facilities are being constructed at an unprecedented rate. Some of these centers are located in regions where water access is already a contentious issue, raising concerns about long-term sustainability. Furthermore, as AI applications grow in sophistication, the need for real-time, high-speed processing is driving investments in high-density computing clusters, which consume even greater amounts of both energy and water. If left unchecked, the growing water demand of AI infrastructure could outpace technological advancements in sustainable cooling, creating a long-term environmental challenge.
One notable example is our own Finger Lakes region, where plans for AI data centers have raised concerns among environmentalists and local communities. The area, known for its pristine water bodies and reliance on sustainable water management, has seen increasing scrutiny over how new data center developments could impact water resources. As companies explore expanding into regions with abundant fresh water, such as Upstate New York, they face mounting pressure to implement stricter water conservation measures and greater transparency regarding their environmental impact.
The Finger Lakes region of New York has become a focal point for debates over the expansion of data centers, particularly those tied to cryptocurrency mining and artificial intelligence (AI) operations. Local communities, environmental advocates, and Indigenous groups have voiced significant concerns about the potential environmental and cultural impacts of these developments.
One prominent case involves the Greenidge Generation power plant, a former coal-fired facility that transitioned to natural gas and began cryptocurrency mining operations. Environmental groups, including the Sierra Club and Seneca Lake Guardian, have raised alarms about the plant's water usage and thermal discharges into Seneca Lake. They argue that the increased water temperatures could harm aquatic life and exacerbate harmful algal blooms, thereby threatening the lake's ecosystem and the livelihoods dependent on it.
As you might imagine, the impact of AI-related water consumption is not uniform across regions. Some of the largest data center hubs, such as those in Arizona, Nevada, and Singapore, are located in areas facing severe water stress.
The paradox is stark: AI-driven solutions are often touted as tools for addressing climate and sustainability challenges, yet the very infrastructure supporting these advancements can contribute to local water shortages. This tension underscores the need for more sustainable cooling technologies and greater transparency from companies regarding their water usage.
While everyday AI interactions—such as using a chatbot or running a search query—consume measurable amounts of water, the most significant water costs arise during the training of large AI models. A study by the University of California, Riverside, found that training GPT-3 alone required hundreds of thousands of liters of water. As newer and more complex models emerge, their training demands could outpace the industry’s current ability to mitigate water consumption effectively.
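To see how estimates of this kind are derived, the sketch below converts an assumed training energy budget into liters of cooling water using Water Usage Effectiveness (WUE), an industry metric expressed in liters per kilowatt-hour. Both the energy figure and the WUE value here are placeholder assumptions chosen for illustration, not measurements from any specific model or facility.

```python
# Back-of-envelope estimate of on-site cooling-water use for an AI training run.
# All inputs below are illustrative assumptions, not measured figures.

def training_water_liters(energy_kwh: float, wue_l_per_kwh: float) -> float:
    """On-site water use = IT energy consumed x Water Usage Effectiveness.

    WUE (liters per kWh) captures water evaporated by the cooling system
    for each kilowatt-hour of IT energy the data center consumes.
    """
    return energy_kwh * wue_l_per_kwh

# Hypothetical inputs: 1,000 MWh of training energy at a WUE of 0.5 L/kWh.
energy_kwh = 1_000_000
wue = 0.5
liters = training_water_liters(energy_kwh, wue)
print(f"Estimated on-site water use: {liters:,.0f} L")  # 500,000 L
```

Even with these conservative placeholder numbers, the result lands in the hundreds of thousands of liters, the same order of magnitude the Riverside study reported for GPT-3-class training.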
In response to growing concerns, AI companies and data center operators are exploring alternative cooling methods. Some are turning to direct-to-chip liquid cooling, which circulates small amounts of liquid coolant directly over server components, significantly reducing water waste. Others are investing in air-cooled systems and leveraging renewable energy sources to minimize overall environmental impact. Last year alone, the U.S. Department of Energy allocated $40 million to support innovative data center cooling technologies, funding projects that aim to push the boundaries of energy efficiency and sustainability in data centers.
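The trade-off between these cooling strategies can be made concrete with a toy comparison: applying different assumed WUE values to the same constant IT load. The WUE figures below are illustrative assumptions, not vendor specifications, and real values vary widely with climate and facility design.

```python
# Toy comparison of annual on-site water draw under different cooling strategies.
# WUE values (liters per kWh) are illustrative assumptions only.
ASSUMED_WUE = {
    "evaporative": 1.8,     # open cooling towers evaporate water continuously
    "direct_to_chip": 0.2,  # closed-loop liquid cooling, minimal evaporation
    "air_cooled": 0.0,      # no on-site water, at the cost of more energy
}

def annual_water_m3(it_load_kw: float, wue: float, hours: float = 8760) -> float:
    """Annual on-site water use in cubic meters for a constant IT load."""
    return it_load_kw * hours * wue / 1000  # kWh x L/kWh, converted to m^3

# Hypothetical 5 MW facility running year-round.
for method, wue in ASSUMED_WUE.items():
    print(f"{method:>14}: {annual_water_m3(5000, wue):,.0f} m^3/year")
```

Under these assumptions, the evaporative scenario draws roughly an order of magnitude more water per year than direct-to-chip cooling, which is why the shift toward liquid cooling features so prominently in the conservation efforts described above.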
The growing demand for data processing and storage has led to concerns about the environmental impact of these facilities. Traditional data centers consume a considerable amount of energy and resources, often relying on fossil fuels and contributing to greenhouse gas emissions. In response to these environmental challenges, the concept of green data centers has emerged as a sustainable solution to reduce the environmental footprint of data storage and processing.
A green data center is an environmentally responsible facility designed to minimize energy consumption, reduce carbon emissions, and optimize the use of natural resources while maintaining high levels of operational efficiency. These centers are equipped with energy-efficient technologies, sustainable building materials, and renewable energy sources to ensure a minimal environmental impact. Green data centers aim to strike a balance between technological advancement and ecological responsibility, contributing to global efforts to reduce the carbon footprint of information technology infrastructure.
These centers may also serve as key enablers of advanced technologies, particularly artificial intelligence, by providing the necessary infrastructure to support the computational demands of AI applications.
As AI algorithms become more complex and require larger datasets and faster processing speeds, data centers must evolve to meet these demands while adhering to sustainability goals.
However, the challenge remains: AI’s water footprint is intertwined with its broader energy and resource consumption. Addressing these issues requires a concerted effort from policymakers, tech companies, and environmental researchers to develop regulations, incentives, and innovations that ensure AI’s growth does not come at the expense of global water security.