This is the ninth in a series of AI CORNER articles. It explains why modern data centers generate significant heat, how cooling systems work, and why water use has become a central issue, while highlighting the technologies now reducing water demand across the industry.
As data centers become the physical backbone of cloud computing and artificial intelligence, questions about their environmental footprint – especially water use – are increasingly common. To understand the issue clearly, it helps to start with a simple fact: data centers generate enormous amounts of heat, and managing that heat safely and efficiently is critical to keeping digital systems running.
Why Cooling Is Necessary
Inside a modern data center, thousands of servers process and store data around the clock. These servers consume large amounts of electricity, often 30 to 60 megawatts (MW) per building, with some advanced AI facilities exceeding that range. Nearly all of that electrical energy ultimately becomes heat. Without effective cooling, equipment would fail in minutes.
Cooling systems exist to remove that heat and maintain stable operating temperatures. Historically, many data centers relied on evaporative cooling towers, which use water to absorb heat and release it into the atmosphere. This approach is effective but can be water intensive.
Typical Water and Power Use
Industry studies suggest that a traditional 100 MW data center using evaporative cooling may consume 300,000 to 600,000 gallons of water per day, depending on climate, design, and operating conditions. That statistic understandably raises concern, particularly in regions sensitive to water availability.
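To put that range in context, a quick back-of-the-envelope calculation (a hypothetical sketch using only the figures above, not data from any specific facility) normalizes the reported range per megawatt of capacity:

```python
# Reported range for a traditional 100 MW evaporative-cooled data center.
low_gal_per_day = 300_000   # gallons of water per day (low end)
high_gal_per_day = 600_000  # gallons of water per day (high end)
capacity_mw = 100           # facility power capacity in megawatts

# Normalize to gallons per day per megawatt of capacity.
low_per_mw = low_gal_per_day / capacity_mw
high_per_mw = high_gal_per_day / capacity_mw

print(f"{low_per_mw:,.0f} to {high_per_mw:,.0f} gallons/day per MW")
# → 3,000 to 6,000 gallons/day per MW
```

Actual consumption varies with climate, design, and operating conditions, so this per-megawatt figure is only a rough yardstick for comparing facilities of different sizes.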
To measure efficiency, operators track Water Usage Effectiveness (WUE), which reflects how many liters of water are used per kilowatt-hour of IT energy. Lower WUE means better water efficiency.
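The WUE calculation itself is simple division. The sketch below shows the ratio described above, with illustrative (hypothetical) daily figures chosen only to demonstrate the arithmetic:

```python
def wue(water_liters: float, it_energy_kwh: float) -> float:
    """Water Usage Effectiveness: liters of water consumed
    per kilowatt-hour of IT equipment energy. Lower is better."""
    return water_liters / it_energy_kwh

# Hypothetical example: a facility that consumes 1.8 million liters of
# water on a day its IT equipment uses 1.0 million kWh of energy.
print(wue(1_800_000, 1_000_000))  # → 1.8 (L/kWh)
```

A closed-loop or air-cooled facility that draws almost no water would score near zero on this metric, which is why WUE is a common way to compare the cooling approaches discussed below.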
Growing Concerns
Public concern around water use tends to focus on three questions:
- Where does the water come from?
- How much is used compared to other industries or households?
- Are there alternatives that reduce or eliminate freshwater demand?
These questions are especially relevant as AI workloads increase power density and thermal output.
Modern Alternatives and Improvements
The data center industry has evolved rapidly. Today, many new facilities rely on:
- Air-cooled systems that use outside air and heat exchangers
- Closed-loop chilled water systems, where water circulates repeatedly rather than being consumed
- Direct liquid cooling and liquid-to-chip cooling, which remove heat more efficiently
- Immersion cooling, where servers are cooled in specialized fluids with little or no water use
These approaches significantly reduce – or in some cases nearly eliminate – ongoing water consumption.
Setting the Stage for a Local Conversation
Understanding why data centers use water and how cooling technology has evolved is essential before evaluating any specific project. In the next article, we will look closely at what is actually being built and used in Frederick County and, more specifically, how cooling and water usage are being handled at the Quantum Frederick data center campus.
Become a MacRo Insider
With more than 50 years advising regional landowners, investors, and institutions, Rocky Mackintosh, Broker of MacRo, LTD, has firsthand experience supporting nationally recognized hyperscalers with site search and selection services throughout the Mid-Atlantic. Our team has worked at the interface of land planning, infrastructure analysis, and high-value redevelopment – experience that uniquely informs our understanding of projects like Quantum Frederick.

