Google just revealed a $9 billion plan to build a new data center campus in Loudoun County, Virginia, known as “Data Center Alley.” It is part of the latest wave of Big Tech investment in AI.
Northern Virginia is not the only place where this kind of development is underway. An industrial boom in data centers is reshaping landscapes, local economies, and the environment across the country.
To map and assess the effects of these developments nationwide, The Cool Down asked leading experts whether this expansion will endanger nearby towns or offer a chance to boost employment, growth, and cleaner energy.
Rapidly expanding AI data centers: “power-hungry facilities”
The United States now has more than 5,400 data centers. The map illustrates the extensive, nationwide growth of major AI-linked facilities that are planned, under construction, or recently built.
“The rapid growth of AI data centers is concentrating large, flexible loads in specific regions, which puts pressure on local grids, water resources, and permitting,” says Marcus Jecklin, co-founder of Ai4, North America’s largest AI industry event.
He told The Cool Down that done badly, the buildout brings price increases, water stress, and opposition from surrounding communities. Done well, data centers can advance sustainable energy by recovering waste heat, investing in storage and transmission, and signing long-term contracts for new wind, solar, geothermal, or small hydro projects, according to Jecklin.
With many regions already coping with overloaded grids or drought, scaling up AI will demand staggering quantities of water and electricity. Even when optimized, a single hyperscale facility can use millions of gallons of water and as much electricity as a mid-sized city each year.
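The “mid-sized city” comparison holds up to a rough back-of-envelope check. The sketch below uses assumed figures (a 100 MW campus running around the clock and roughly 10,700 kWh of electricity per U.S. home per year, near the EIA average), not numbers from any specific facility:

```python
# Rough sanity check of the "mid-sized city" electricity comparison.
# Assumed figures (not from the article): a 100 MW hyperscale campus
# running 24/7, and ~10,700 kWh/year per U.S. home (near the EIA average).
campus_mw = 100
annual_gwh = campus_mw * 24 * 365 / 1000            # MW * hours -> GWh/year
homes_equivalent = annual_gwh * 1_000_000 / 10_700  # GWh -> kWh, / per-home use

print(f"{annual_gwh:.0f} GWh/yr, roughly {homes_equivalent:,.0f} homes")
```

On those assumptions the campus draws about 876 GWh a year, on the order of 80,000 homes, which is consistent with a city of modest size.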
“Data centers are extremely power-hungry facilities, and a large portion of the power they use still comes from fossil fuels,” explained Professor Romany Webb, deputy director of Columbia University’s Sabin Center for Climate Change Law. That makes them a major contributor to climate change, but depending on how they are built and powered, they could also become part of the solution.
Beyond powering the computing itself, energy is needed to keep hardware cool and prevent overheating. According to Microsoft, its water use for AI cooling nearly doubled between 2021 and 2022. In a single year, a Google data center in The Dalles, Oregon, used more than 274 million gallons of water.
To accommodate the growing demand, the city signed a $28.5 million deal with Google, giving the company more access to the Dog River watershed. In return for infrastructure funding to support its growth, the corporation transferred its industrial groundwater rights to the city.
If withdrawals continue, environmental experts warn, the Dog River may run dry for up to eight months a year, endangering fish habitat and aquatic life, including salmon, steelhead, and the smaller invertebrates that underpin the food chain.
In a similar vein, the Alliance for the Great Lakes recently released a report warning about the effects of AI data centers on the Great Lakes, which hold almost 20% of the freshwater on Earth. According to Inside Climate News, Illinois is home to more than 220 data centers and is projected to see rapid growth.
According to the Alliance for the Great Lakes report, a hyperscale data center in the United States can use up to 365 million gallons of water annually, roughly the yearly use of 12,000 Americans. Over the next five years, U.S. data centers are expected to use up to 150.4 billion gallons of water, equivalent to the use of 4.6 million American households.
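The report’s equivalences can be sanity-checked with a quick calculation. The daily-use rates below are assumed averages in the ballpark of published EPA/USGS figures, not values taken from the Alliance report:

```python
# Sanity check of the water-use equivalences above.
# Assumed average rates (not from the report):
GAL_PER_PERSON_PER_DAY = 83       # assumed U.S. per-person daily water use
GAL_PER_HOUSEHOLD_PER_DAY = 90    # assumed per-household daily water use

facility_annual_gal = 365e6       # up to 365 million gallons/year per facility
people = facility_annual_gal / (GAL_PER_PERSON_PER_DAY * 365)
print(f"~{people:,.0f} people")   # lands near the report's 12,000 figure

five_year_gal = 150.4e9           # projected five-year national total
households = five_year_gal / (GAL_PER_HOUSEHOLD_PER_DAY * 365)
print(f"~{households:,.0f} household-years")
```

With those assumed rates, both figures land close to the report’s equivalences: about 12,000 people for one facility, and about 4.6 million households’ worth of annual use for the five-year total.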
In arid places like New Mexico or Arizona, where water scarcity is already acute, farmers, wildlife, and communities may eventually compete directly with AI infrastructure for dwindling groundwater.
Sanjay Puri, founder of the Chief AI Officer Network and RegulatingAI, told The Cool Down: “There will be challenges with every transformative technology. We have to maintain our boundaries: deciding who uses these systems, how we spend our resources, and whether or not they are equitable.”
Puri works with leaders in industry, politics, and civil society to shape governance frameworks globally, and he advocates for creative AI regulation.
Demand, he said, will only keep growing, and the appetite for AI appears endless.
How to meet AI’s energy needs
While businesses brag about their clean energy commitments, many projects are still tied to the expansion of fossil fuels.
According to Webb, we simply cannot build large amounts of new fossil fuel infrastructure if we wish to meet global climate targets. Yet arguments are being made to build new gas plants and extend the lives of existing coal plants solely to power data centers.
Microsoft is spending $3.3 billion to build a huge campus in Mount Pleasant, Wisconsin. To meet demand, state officials authorized two new natural gas facilities that We Energies is expected to operate for the next 30 years. And Meta’s $10 billion project in Richland Parish, Louisiana, will depend on three newly authorized natural gas plants to meet its enormous power needs.
According to the Union of Concerned Scientists, pollution from natural gas, such as methane from leaks and nitrogen oxides from combustion, can harm water quality, small animals, insects, and birds, in addition to endangering human health.
Many data centers still rely on diesel backup generators, but newer facilities are increasingly turning to nuclear power, considered a cleaner energy source than fossil fuels. There are also signs that fully renewable energy can work.
Amazon Web Services recently paid $650 million for the Cumulus Data Center Campus in Pennsylvania, dubbed “the world’s first 24×7 carbon-free, co-located data center,” which draws power from an adjacent nuclear plant. Beyond nuclear waste and wastewater treatment concerns, however, the plant uses water from the Susquehanna River to cool its reactors, returning it warmer and potentially disrupting aquatic ecosystems.
Meanwhile, Silicon Valley company Eco Computing Labs (ECL) is building off-grid, hydrogen-powered data centers to provide zero-emission AI infrastructure. Its MV1 facility in Mountain View, California (shown on the map), is the world’s first fully hydrogen fuel cell-powered data center.
Unlike megacampuses, ECL builds smaller units capable of handling massive AI workloads. MV1 demonstrates that an AI data center does not require grid power or carbon-based fuels; it can run cleanly, achieve a negative water footprint, and still deliver high performance.
Geothermal power is another option. According to new research by Future Ventures and Project InnerSpace, a single geothermal site in the U.S. could supply electricity and cooling for many AI data centers without relying on the grid. Geothermal is a good fit for the United States, the report adds, because the skills of oil and gas workers transfer readily to the geothermal sector.
As part of its plans to power data centers carbon-free, Google said earlier this year that it had reached a $3 billion agreement to acquire hydro facilities in Pennsylvania.
The Gap in Regulation
At the UN’s AI for Good summit in Geneva in July, the worldwide push for meaningful AI regulation took center stage. In the United States, however, the government’s response prioritizes speed over regulatory safeguards.
The AI Action Plan, released the same month as the UN meeting, deems AI infrastructure “critical.” It also calls for fewer environmental studies, quicker approvals, and fewer regulatory obstacles.
Webb told The Cool Down that facilities face heavy pressure to come online as quickly as possible, sometimes without a thorough understanding of their environmental and climate impacts. “That urgency highlights the need for stronger regulation.”
Take the AI startup Crusoe, sometimes called “The AI Factory Company.” In partnership with OpenAI, Oracle, JPMorgan, SoftBank, and others, Crusoe is developing a $15 billion data center complex near Abilene, Texas, dubbed Stargate. The joint venture includes property tax abatements of up to 85%, eight new buildings, and the creation of nearly 350 jobs.
Stargate is currently under construction and projected to be finished next year. Permit applications have also been filed for a 360-megawatt onsite natural gas plant to supply power; President Donald Trump has publicly identified the project as part of a larger national AI infrastructure push.
Corporate sustainability pledges frequently rest on carbon offsets, renewable energy credits, or commitments to operate emissions-free by 2030. Lofty as these goals may sound, they do little to address the present: the AI revolution is already straining ecosystems, water tables, and aging electrical grids. The International Energy Agency projects that data centers will account for nearly half of the growth in U.S. electricity demand by 2030.
According to Jecklin, the cultural shift we need is “compute-aware development”: building with efficiency goals from the start rather than as an afterthought. When engineering and procurement share the same carbon-and-cost dashboard, efficiency becomes a competitive advantage rather than a nice-to-have.
“Policy is no longer optional,” he said. Businesses need clarity on infrastructure, privacy, IP, and safety. Standardized reporting, procurement rules that reward additionality, streamlined permitting for clean siting, and grid-services markets that value flexibility are all ways to make enforcement practical through measurement and incentives.
For many communities, job creation is the main selling point. But the reality is sobering: once construction ends, data centers need very few permanent employees, Webb notes, which can leave towns with higher utility costs, environmental stress, and far less economic value than anticipated.
In comparison to the costs to the environment and public health, the long-term employment advantages for communities are negligible, according to Webb.
Data centers do provide excellent jobs, Puri said, just not many of them.
The Way Ahead
There is no denying the wonders of artificial intelligence and quantum computing, and their infrastructure is here to stay. An estimated 800 million people use OpenAI’s ChatGPT alone each week, with 120–190 million signing in daily.
The need for data centers will only increase as more businesses incorporate AI into their products. The question is whether planning and regulation can catch up before the harm is irreparable.
Puri concurs: “Companies are kind of learning and [as they say] flying the plane and building it at the same time.”
Shifting to renewable, sustainable energy is no longer optional; it is now a necessity for AI-powered firms and the communities they serve, from Silicon Valley to Data Center Alley, to thrive.
How soon, though, can such change occur? And who makes the rules—state commissions, federal regulators, or the local communities left to shoulder the costs?
In discussions with tech company executives, Puri said, creating green AI regulations is increasingly seen as important. Securing talent remains their top priority, but the need for renewable power solutions is gaining recognition.
“The good news is that it’s getting discussed,” Puri added. “Innovation is advancing quickly, and it should lead to more sustainable options, whether in power consumption reduction or AI cooling [methods].”
Community opposition is growing as development continues. Legal and environmental challenges, spearheaded by groups like the Southern Environmental Law Center, helped homeowners and environmental groups in Bessemer, Alabama, convince officials to halt Project Marvel, a 4.5 million-square-foot data center.
The takeaway is unambiguous: public scrutiny counts.
“Make infrastructure a first-class citizen throughout the event, bringing utilities, cities, and cloud providers on stage with companies. Bring more community voices, labor, local governments, and environmental groups into siting talks,” Jecklin said.
According to Data Center Watch, U.S. data center projects worth $64 billion have been postponed or rejected. Its 2025 study found at least 142 activist groups in 24 states mobilizing against construction and expansion.
Local resistance will keep growing. Governments frequently chase incentives while communities want more responsible development, and according to Webb, that gap will only widen.
Webb noted that community benefits agreements can be effective tools for negotiating solutions. They have been used in stadium construction and renewable energy projects to ensure locals genuinely benefit; the same standards should apply to data centers.
The Alabama case underscores the critical need for genuine policy engagement, stronger local advocacy, and transparent data. By 2030, experts estimate, the annual public health cost of data center pollution could quadruple that of the coal-steel sector, reaching $20 billion. Yet few systems track or report these costs.
Communities frequently lack precise information on a project’s long-term impacts or how much water and electricity it will need.
One of the main obstacles is the absence of unbiased data. Policymakers rely on industry figures for siting, size, and impacts, Webb said, which makes wise judgments difficult.
The Bessemer case demonstrates the power of advocacy. Requiring public reporting, tying fiscal incentives to sustainability, and demanding transparency on energy and water consumption could hold AI infrastructure accountable. Without trustworthy data, neither individuals nor policymakers have the means to respond.
According to Jecklin, alignment is a social compact between models and the biosphere, not merely a safety requirement. If we plan for sufficiency, transparency, and stewardship, we may achieve something closer to abundance with ethics.
From the music we stream to our latest financial transactions, data centers have long powered the digital world. As we race toward an AI-driven future, it’s tempting to think of AI as a weightless, distant technology. In reality, it has a physical footprint, touching ecosystems, public health, water supplies, electrical grids, and household budgets.
Connectivity comes at a great cost. Prices will continue to climb unless regulations and data-driven policies begin to direct expansion.
Large language models like ChatGPT and Gemini require constant training and operation, driving a loop of more servers, more heat, more cooling, more electricity. Repeat.
Professor Webb underlined that these facilities may enable critical climate research and modeling, but only if they are designed to do as little harm as possible to people and ecosystems. The technology has value; the question is whether we will create it properly.






