Introduction: AI Growth and the Hidden Water Cost
The rapid acceleration of artificial intelligence (AI) adoption has driven an unprecedented expansion of global digital infrastructure. Large language models, generative AI applications, real-time analytics, and cloud-based machine learning services rely on hyperscale data centers capable of processing vast computational workloads continuously. While attention has focused primarily on the energy consumption and carbon footprint of this expansion, water, a critical enabling resource, has remained largely invisible in policy and public discourse. Data centers depend on consistent access to water for cooling systems, electricity generation, and operational redundancy, making them deeply embedded within regional hydrological systems. Unlike energy, which can be generated remotely and transmitted across long distances, water availability is highly localized. At peak operation, large data centers may require up to five million gallons of water per day, placing their water demand on par with that of small municipalities serving between 10,000 and 50,000 people.[1] This scale brings regulation into focus, exposes the tension between sustainability goals and the water and carbon intensity of continued AI growth, and underscores the need for robust planning and business models to guide decisions about where AI hubs expand. This insight examines the ways in which AI infrastructure consumes water, both directly and indirectly, explores regional variations in water stress, and evaluates policy and corporate strategies to manage these impacts sustainably.
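To put this figure in context, the comparison can be reproduced with a simple calculation; the per-capita demand range used below is an illustrative assumption about total municipal water use, not a figure from the cited source:

$$
\frac{5{,}000{,}000\ \text{gal/day}}{500\ \text{gal/person/day}} = 10{,}000\ \text{people}
\qquad
\frac{5{,}000{,}000\ \text{gal/day}}{100\ \text{gal/person/day}} = 50{,}000\ \text{people}
$$

In other words, whether a peak-demand facility looks like a town of 10,000 or of 50,000 depends largely on the assumed per-person municipal demand, which varies widely by region and climate.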
How AI infrastructure uses water
AI infrastructure, particularly hyperscale data centers, relies on water as a critical operational input, yet the pathways of consumption are complex and often overlooked. Google reported that its data centers consumed approximately 5.6 billion gallons of water in 2023, representing a 24 percent increase compared to the previous year.[2] Water is used directly in cooling systems to manage the intense heat generated by high-performance computing, indirectly through the electricity that powers servers, and in the production of key hardware components such as processor chips. Understanding these different channels of water use is essential for assessing the environmental footprint of AI, identifying regional vulnerabilities, and informing both corporate and regulatory strategies for sustainable growth.
1. Data center cooling systems
The most visible direct use of water in AI infrastructure occurs through data center cooling. In practice, the most widely used approach in hyperscale facilities relies on cooling towers, where heat is dissipated through evaporation; particularly in hot or arid climates, this process can consume millions of liters of water annually per facility. More advanced systems increasingly deploy liquid cooling, in which water or dielectric fluids are brought closer to heat-generating components, improving thermal efficiency but still requiring significant water inputs at the system level. Water Usage Effectiveness (WUE™) has emerged as a key metric for data center water efficiency, typically expressed as liters of water used per kilowatt-hour of IT energy. While leading operators report declining average WUE values, aggregate water demand continues to rise due to the scale and concentration of new facilities.
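As a point of reference, The Green Grid defines WUE as annual site water usage divided by IT equipment energy; the worked figure that follows is an illustrative calculation using assumed values rather than data from any specific facility:

$$
\mathrm{WUE} = \frac{\text{annual site water usage (L)}}{\text{IT equipment energy (kWh)}}
$$

Under this definition, a hypothetical facility with 10 MW of IT load operating year-round (roughly 87.6 million kWh) at a WUE of 1.8 L/kWh would consume on the order of 158 million liters, or about 42 million gallons, of water per year.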
2. Indirect water use through energy generation
Beyond direct cooling, AI infrastructure drives substantial indirect water consumption through electricity generation. Many power generation technologies remain water-intensive, particularly thermoelectric plants fueled by coal, natural gas, or nuclear energy, as well as concentrating solar-thermal plants, all of which rely on water for steam cycles and cooling. Hydropower, while classified as renewable, still depends on intensive water management for electricity generation, especially in regions where reservoir operations compete with agricultural or municipal demands.
The manufacturing of AI hardware, especially advanced processor chips, also introduces a further and often overlooked water footprint. Semiconductor fabrication is among the most water-intensive industrial processes, requiring large volumes of ultra-pure water for wafer cleaning, etching, and rinsing at multiple stages of production.
Water demand associated with AI infrastructure is also not constant. Peak usage often coincides with heatwaves, when cooling requirements intensify and electricity grids are under stress. During such periods, data centers may draw additional water to maintain uptime, while backup generators and emergency cooling systems further increase consumption.
Geographic concentration of AI infrastructure and water stress
The global expansion of AI infrastructure is not evenly distributed. Instead, data center capacity is highly concentrated in specific regions where connectivity, energy access, land availability, and regulatory environments favor large-scale deployment.
The United States hosts the world’s largest concentration of data center capacity, with major hubs located in Northern Virginia, Texas, Arizona, and California. Northern Virginia is often referred to as “Data Center Alley”. While the region is not among the most water-scarce in the country, rapid infrastructure expansion has raised concerns about cumulative water withdrawals from local utilities and aquifers, particularly during periods of drought. Newer growth markets such as Texas and Arizona present more acute water risks: Texas benefits from a business-friendly regulatory environment and abundant land but faces increasing groundwater depletion, especially in its central and western regions, while Arizona’s fast-growing data center clusters around Phoenix depend on groundwater and Colorado River allocations that are already under long-term strain. California presents a different risk profile: despite mature water management frameworks and strong environmental regulations, chronic drought conditions and climate-driven variability pose challenges for reliable data center operation.
In Europe, Northern European countries such as Ireland, the Netherlands, and the Nordic states have attracted significant data center investment due to cooler climates, relatively abundant water resources, and access to renewable energy. Ireland has become a major hub for hyperscale facilities, though the concentration of data centers has begun to strain local water and electricity systems, prompting regulatory scrutiny and temporary moratoria: data centers have come to account for roughly 20-25 percent of electricity demand in these markets, compared with approximately 1-2 percent of total electricity consumption globally.[3], [4] Although these moratoria were driven primarily by electricity constraints, they highlight how concentrated digital infrastructure growth can outpace local utility capacity and regulatory planning. Another example of the vulnerability of such infrastructure to extreme weather occurred in the United Kingdom in 2022, when a heatwave caused cooling failures and temporary shutdowns at Google and Oracle data centers in the London region.[5]
An often-overlooked dimension of U.S. President Donald Trump’s interest in acquiring Greenland from Denmark, beyond oil, gas, and critical minerals, lies in the island’s unique natural cooling potential. Covered by an ice sheet roughly 2,400 km long, up to 1,100 km wide, and as much as 3 km thick, Greenland represents the world’s largest inhabited natural cooling reservoir and freshwater source. These characteristics position the island as a potentially attractive location for energy-intensive, hyperscale AI data centers and their supporting power infrastructure, where access to low ambient temperatures and abundant water could significantly reduce cooling-related resource demands. This is one of the reasons why leading technology figures such as Bill Gates, Sam Altman, and Jeff Bezos have reportedly explored Greenland for long-term investments in recent years.[6]
The Middle East represents one of the most striking cases of AI infrastructure expansion in water-scarce environments. Gulf countries have announced ambitious investments in cloud computing, AI research, and digital services as part of broader economic diversification strategies. These developments are occurring in regions with minimal freshwater resources and extreme climatic conditions, where cooling demands are high year-round. Data centers in the Middle East rely heavily on desalinated water, an energy-intensive and costly process that links digital growth directly to energy consumption and emissions. While desalination provides a reliable water supply, it introduces trade-offs related to brine disposal and long-term sustainability.
China has pursued a dual strategy that combines coastal data center development, where water and energy access are relatively reliable, with inland expansion aimed at balancing regional development and leveraging renewable energy resources. However, inland regions often face water scarcity and greater climate variability. India’s emerging AI hubs are concentrated around major metropolitan areas such as Mumbai, Bengaluru, and Hyderabad, many of which already experience chronic water stress. Urban water shortages, aging infrastructure, and competing demands from residential and industrial users heighten the risk that data center expansion could exacerbate local water insecurity.
These regional patterns illustrate that the rapid growth of AI infrastructure intersects closely with local water availability, climate conditions, and utility capacity. Careful planning, investment in efficient technologies, and coordinated policy frameworks are essential to ensure that digital expansion does not compromise long-term water security.
Policy and regulatory responses
The rapid expansion of AI-driven digital infrastructure has outpaced the development of regulatory frameworks designed to manage its environmental impacts, particularly with respect to water use. While energy consumption and carbon emissions have become central elements of data center oversight, water governance remains fragmented, inconsistently applied, and often secondary within permitting and sustainability regimes. This gap highlights the need for standardized metrics and tools that can help regulators, utilities, and industry better assess and manage water demand.
WUE™ is one such standardized metric, developed by The Green Grid (TGG), a globally recognized consortium whose data center efficiency metrics are widely used by governments and industry. Building on the broad adoption of Power Usage Effectiveness (PUE™), WUE provides a trusted framework for assessing water efficiency alongside energy and carbon performance in data centers. The average WUE is estimated at around 1.9 L/kWh, a level increasingly used as a baseline target for new data center developments and commonly cited as a benchmark for facilities to outperform.[7]
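The benchmark translates directly into annual water volumes. The short sketch below, with assumed facility parameters (the IT load, utilization, and candidate WUE values are hypothetical), shows how the 1.9 L/kWh average scales with facility size and efficiency:

```python
# Illustrative only: estimate annual water demand for a hypothetical facility
# and compare it against the ~1.9 L/kWh industry-average WUE cited above.
AVERAGE_WUE_L_PER_KWH = 1.9  # industry-average benchmark cited in this section

def annual_water_use_liters(it_load_mw: float, wue_l_per_kwh: float,
                            utilization: float = 1.0) -> float:
    """Annual water use = IT energy (kWh) x WUE (L/kWh)."""
    it_energy_kwh = it_load_mw * 1_000 * 8_760 * utilization
    return it_energy_kwh * wue_l_per_kwh

# Hypothetical 30 MW hyperscale facility operating at full load.
for wue in (0.4, 1.0, AVERAGE_WUE_L_PER_KWH):
    liters = annual_water_use_liters(30, wue)
    gallons = liters / 3.785
    print(f"WUE {wue:>4.1f} L/kWh -> {liters / 1e6:7.1f} million L/yr "
          f"(~{gallons / 1e6:5.1f} million gal/yr)")
```

At the cited 1.9 L/kWh average, such a hypothetical 30 MW facility corresponds to roughly 500 million liters per year, illustrating why even modest improvements in WUE translate into large absolute water savings at hyperscale.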
Regulators in the UK, including the Water Services Regulation Authority (Ofwat) and the Environment Agency, have highlighted the cumulative impact of large-scale digital infrastructure on local water supplies and encouraged developers to demonstrate water efficiency, alternative cooling solutions, and alignment with regional water resource management plans.
In the United States, oversight of water supply and allocation varies by state, but regulatory mechanisms such as Arizona’s Assured Water Supply Program require large developments to demonstrate long-term availability of water before significant groundwater extraction or new supply commitments can be approved.[8] In addition, some municipalities are proactively extending regulatory controls for large users: in Tucson, Arizona, a new ordinance now requires any large user expected to consume millions of gallons annually, including data centers, to submit detailed water conservation plans and obtain explicit city approval before receiving access to potable supplies.
Singapore has taken a more direct regulatory approach to industrial water consumption, including for facilities with high cooling demand. Under the Public Utilities Board’s (PUB) water efficiency management practices, commercial and industrial users above specified thresholds must monitor, report, and manage water use, submit Water Efficiency Management Plans, and install metering systems to track usage.[9] National planning in Singapore also includes explicit goals for data center water efficiency within broader environmental policies: for example, the Green Data Centre Roadmap outlines strategies to reduce water intensity over the next decade as part of wider sustainability objectives.[10] This proactive and integrated approach demonstrates how policy, planning, and monitoring can work together to ensure that digital infrastructure growth is aligned with long-term water sustainability goals.
In the Middle East, governments across the region have long treated water as a strategic resource, leading to early and substantial investment in desalination, wastewater reuse, and non-potable water networks that support industrial and urban growth. While explicit requirements for water-use disclosure or standardized efficiency metrics such as WUE are not yet widely mandated, the policy environment is evolving through infrastructure planning guidelines, utility coordination, and sustainability frameworks linked to digital transformation initiatives.
Collectively, these examples demonstrate that region-specific policy frameworks, proactive utility planning, and emerging efficiency standards are essential for guiding sustainable AI infrastructure growth while safeguarding critical water resources.
Corporate responses and technological mitigation strategies
As regulatory frameworks lag the rapid expansion of AI infrastructure, major technology companies have increasingly turned to technological innovation and voluntary stewardship initiatives to mitigate water-related risks. Liquid cooling and immersion cooling technologies, now increasingly deployed for high-density AI workloads, transfer heat more efficiently than air-based systems and substantially reduce water demand by limiting evaporation. LiquidStack deployed a 40 MW hyperscale immersion‑cooled data center in the Republic of Georgia, achieving significant energy savings and substantially lower cooling water demand compared with traditional designs.[11] Several hyperscale operators, including Google and Microsoft, have publicly stated their intention to transition new facilities toward low- or zero-water cooling designs where climate conditions permit.
In parallel, many operators are reducing reliance on potable water by using alternative sources. These include treated wastewater, reclaimed greywater (lightly used wastewater from sinks, showers, and other non-toilet sources that can be treated and reused), seawater in coastal locations, and condensate captured from cooling systems. For example, data centers in Singapore and parts of the Middle East increasingly rely on non-potable water networks integrated into urban infrastructure planning. AI itself is also being applied to optimize cooling performance, with machine-learning systems dynamically adjusting airflow, liquid circulation, and temperature set points to reduce both energy and water intensity during peak demand periods.
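As a rough illustration of the control logic described above, the following sketch shows a simplified, rule-based version of such a system; the thresholds, modes, and setpoints are hypothetical assumptions and do not represent any operator's actual machine-learning controller:

```python
# Hypothetical sketch of closed-loop cooling control that trades off water use
# against energy use based on outdoor conditions. Thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class CoolingDecision:
    mode: str                 # "dry" (water-free) or "evaporative"
    supply_setpoint_c: float  # supply air/liquid temperature target

def choose_cooling(outdoor_wet_bulb_c: float, it_load_fraction: float) -> CoolingDecision:
    """Prefer dry, water-free cooling when outdoor wet-bulb temperature is low;
    fall back to evaporative cooling, with tighter setpoints, under heat and high load."""
    if outdoor_wet_bulb_c < 18 and it_load_fraction < 0.9:
        return CoolingDecision(mode="dry", supply_setpoint_c=27.0)
    if outdoor_wet_bulb_c < 24:
        return CoolingDecision(mode="evaporative", supply_setpoint_c=25.0)
    # Heatwave conditions: evaporative cooling with an aggressive setpoint to protect uptime.
    return CoolingDecision(mode="evaporative", supply_setpoint_c=22.0)

if __name__ == "__main__":
    for wet_bulb, load in [(12.0, 0.6), (21.0, 0.8), (27.0, 0.95)]:
        print(wet_bulb, load, choose_cooling(wet_bulb, load))
```

A production system would replace these fixed rules with learned models and many more inputs, but the underlying trade-off, spending more water or energy only when conditions require it, is the same.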
Beyond efficiency measures, large technology firms have expanded corporate water stewardship programs that aim to offset operational water use through replenishment initiatives. Google, Microsoft, and Amazon have each announced commitments to become “water positive” by funding watershed restoration, rainwater harvesting, and community water access projects in regions where they operate. While companies promote water offsets and replenishment programs as solutions, their ability to fully address the local water impacts of data center operations remains disputed, as replenishment projects may not address localized supply constraints faced by municipalities during droughts or heatwaves.
In addition, corporate reporting on water use is far less standardized than energy or carbon disclosures, with limited third-party verification and inconsistent breakdowns by facility or region. As AI-driven compute demand accelerates, these transparency gaps raise questions about whether voluntary corporate strategies alone are sufficient to manage cumulative water impacts at scale.
Policy solutions for managing water use in AI infrastructure
A foundational policy solution for managing water use in AI infrastructure relies on the introduction of consistent and mandatory water-use disclosure. Policymakers should integrate water-use reporting into existing environmental and sustainability disclosure frameworks, using standardized metrics such as WUE to ensure comparability across facilities and regions. Facility-level water disclosure would enable regulators, public utilities, and local authorities to assess cumulative demand, identify high-risk locations, and support evidence-based planning decisions. Over time, improved transparency can also encourage efficiency improvements by enabling benchmarking and peer comparison across operators.
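To illustrate how standardized disclosure would support this kind of benchmarking, the sketch below computes facility-level WUE from hypothetical reported data and flags facilities for review; the reporting fields, facility records, and review criteria are all assumptions for illustration:

```python
# Illustrative sketch: standardized facility-level disclosures (assumed fields for
# annual water use and IT energy) let a regulator or utility compute WUE and flag
# outliers against a benchmark. Records and threshold are hypothetical.
from typing import NamedTuple

class FacilityReport(NamedTuple):
    name: str
    region: str
    annual_water_liters: float
    it_energy_kwh: float

BENCHMARK_WUE = 1.9  # L/kWh, the industry-average figure cited earlier

reports = [
    FacilityReport("facility-a", "water-stressed", 9.0e8, 4.0e8),
    FacilityReport("facility-b", "temperate",      1.5e8, 3.0e8),
]

for r in reports:
    wue = r.annual_water_liters / r.it_energy_kwh
    flag = "REVIEW" if wue > BENCHMARK_WUE or r.region == "water-stressed" else "ok"
    print(f"{r.name}: WUE = {wue:.2f} L/kWh [{flag}]")
```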
Improved coordination between national economic development authorities and local permitting bodies can help balance strategic digital infrastructure objectives with environmental constraints. Effective governance of data center water use will require combining region-specific regulation with shared global standards. Given wide variations in climate, water availability, and infrastructure, local regulatory flexibility is essential. Empowering utilities to set differentiated tariffs for large users, requiring efficiency commitments, and integrating data centers into long-term water planning can help ensure that AI infrastructure growth supports broader public water security goals.
As noted earlier, Tucson, Arizona has enacted a municipal ordinance requiring high-demand water users, including data centers expected to consume millions of gallons annually, to submit detailed water conservation plans and obtain city approval before accessing potable water supplies, creating clearer accountability and better alignment between infrastructure development and local water availability. Similar approaches are emerging in other regions, reflecting a broader shift toward integrating digital infrastructure into long-term water resource planning rather than treating it as an isolated industrial activity. At the regional level, the American Water Works Association (AWWA) has published guidance to help water utilities anticipate and plan for data center impacts on supply infrastructure, fostering proactive coordination between utilities, regulators, and digital infrastructure developers to avoid supply stress and optimize resource allocation. In the UK, the Environment Agency and Ofwat have similarly signaled that cumulative water demand from large users, including digital infrastructure, should be incorporated into regional Water Resource Management Plans. Singapore’s Public Utilities Board requires industrial users above specified thresholds to monitor, report, and manage water consumption through Water Efficiency Management Plans and metering systems, encouraging the adoption of advanced cooling and non-potable water use to reduce pressure on potable supplies.
From a corporate perspective, while Google and AWS are implementing voluntary water stewardship programs, including the use of reclaimed or non-potable water for cooling and investments in watershed replenishment projects in water-stressed basins, these efforts remain largely operational in scope. Beyond such measures, these firms should increasingly position themselves to support longer-term solutions by contributing to regulatory frameworks, sharing best practices with public authorities, and investing in research and development for water-efficient cooling systems and alternative technologies. Doing so would help align private sector innovation with public water security goals over the long term.
Conclusion
The rapid growth of AI-driven digital infrastructure has created new and often underappreciated pressures on global water resources. Data centers rely on water for direct cooling, electricity generation, and the manufacturing of critical hardware, with demand concentrated in specific regions that may already face water stress. Geographic patterns, from the United States and Europe to the Middle East and Asia, demonstrate that local water availability, climate variability, and utility capacity are central determinants of the sustainability of AI expansion.
Addressing these challenges requires a multi-pronged approach. Policy frameworks that mandate water-use disclosure, integrate data centers into regional water resource planning, and incentivize efficiency can guide sustainable growth. Corporate actors play a complementary role through technological innovation, operational efficiency, alternative water sourcing, and stewardship programs, while also supporting longer-term solutions through R&D and collaboration with regulators.
Ultimately, aligning AI infrastructure growth with water security demands both public and private engagement, combining regulatory foresight, technological innovation, and transparent reporting. By adopting these strategies proactively, governments and industry can ensure that the benefits of AI are realized without compromising the sustainability of the water resources on which these systems depend.
[1] Miguel Yañez-Barnuevo, “Data Centers and Water Consumption,” Environmental and Energy Study Institute, June 25, 2025, https://www.eesi.org/articles/view/data-centers-and-water-consumption.
[2] Yankai Jiang, Rohan Basu Roy, Raghavendra Kanakagiri, and Devesh Tiwari, “WaterWise: Co-optimizing Carbon- and Water-Footprint Toward Environmentally Sustainable Cloud Computing,” ACM Digital Library, 2025, https://dl.acm.org/doi/pdf/10.1145/3710848.3710891.
[3] “AI Demand to Drive $600B From the Big Five for GPU and Data Center Boom by 2026,” Carbon Credits, January 23, 2026, https://carboncredits.com/ai-demand-to-drive-600b-from-the-big-five-for-gpu-and-data-center-boom-by-2026/.
[4] “Ireland’s data centre appeal ‘fading’ due to pressure on electricity grid, report says,” The Irish Times, September 17, 2025, https://www.irishtimes.com/business/2025/09/17/irelands-data-centre-appeal-fading-due-to-pressure-on-electricity-grid-barclays-report-says/.
[5] “Google’s London data center outage during heatwave caused by ‘simultaneous failure of multiple, redundant cooling systems’,” Data Center Dynamics, August 2, 2022, https://www.datacenterdynamics.com/en/news/googles-london-data-center-outage-during-heatwave-caused-by-simultaneous-failure-of-multiple-redundant-cooling-systems/.
[6] “Greenland’s Cold Climate Fuels Global Race for AI Data Centres,” TRT World video, 2026, 3:07, https://www.youtube.com/watch?v=tT6mKP6lrCI.
[7] Yañez-Barnuevo, “Data Centers and Water Consumption.”
[8] “About the Programs,” AZwater, 2025, https://www.azwater.gov/aaws/aaws-overview.
[9] “Singapore: The quest for green data centres (Part 2),” Bird&Bird, February 27, 2025, https://www.twobirds.com/en/insights/2025/singapore/singapore-the-quest-for-green-data-centres.
[10] “Data Center Futures: Water,” ARUP, September 2025, https://www.arup.com/globalassets/downloads/insights/t/the-future-of-data-centres-how-is-the-industry-changing-in-the-ai-era/data-centre-futures-water.pdf.
[11] Nino Lazariia, “Revolutionizing Data Center Cooling: Immersion Technologies at the Forefront,” Cleantech, August 15, 2023, https://cleantech.com/revolutionizing-data-center-cooling/.