Artificial intelligence in 2025 emits roughly as much carbon dioxide as New York City: between 32.6 and 79.7 million tons annually, compared with about 50 million tons for the metropolis. A new study estimates that global AI power demand has reached 23 GW, surpassing bitcoin mining's figures for all of 2024.
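
For scale, a sustained power draw can be translated into annual energy. The rough sketch below, assuming the 23 GW estimate holds as a continuous average load, converts it into terawatt-hours per year, which makes it easier to compare with the consumption figures cited later in this article.

```python
# Rough conversion of a continuous power draw into annual energy.
# Assumption: the 23 GW estimate is treated as an average year-round load.
avg_power_gw = 23           # estimated global AI power demand, GW
hours_per_year = 24 * 365   # 8,760 hours

annual_energy_twh = avg_power_gw * hours_per_year / 1000  # GWh -> TWh
print(f"~{annual_energy_twh:.0f} TWh per year")  # ~201 TWh
```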

AI's water appetite exceeds the global bottled water market

Alongside its rising energy use, artificial intelligence consumes between 312.5 and 764.6 billion liters of water annually, an amount comparable to the yearly consumption of the entire global bottled water market. Alex de Vries-Gao, a PhD candidate at the VU Amsterdam Institute for Environmental Studies, published a paper in the journal Patterns, stating: "It is impossible to get an absolutely accurate figure, but it will be really big in any case. In the end, everyone pays for this."

A large data center drawing 100 MW of power needs about 2 million liters of water a day to cool its servers, as much water as roughly 6,500 households use. Up to 80% of that water evaporates during cooling and cannot be recovered.
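
The household comparison can be sanity-checked with simple division. The sketch below uses the figures from the paragraph above plus an assumed typical household water use of about 300 liters per day, a figure chosen for illustration rather than taken from the article.

```python
# Sanity check of the water-use comparison.
# Assumptions: 2 million liters/day for a 100 MW data center (from the article),
# ~300 liters/day per household (illustrative typical value, not from the article).
daily_water_liters = 2_000_000
liters_per_household_per_day = 300

households_equivalent = daily_water_liters / liters_per_household_per_day
print(f"~{households_equivalent:,.0f} households")  # ~6,667, close to the cited 6,500

# Share lost to evaporation at the upper bound cited in the article.
evaporated = daily_water_liters * 0.80
print(f"~{evaporated:,.0f} liters evaporated per day")  # ~1,600,000 liters
```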

Energy consumption has increased fivefold

The International Energy Agency (IEA), in its special report "Energy and AI", projects that by 2030 worldwide data center electricity consumption will more than double, to 945 TWh, more than the total electricity consumption of Japan.

The gap in energy consumption between old and new data centers is staggering. Ten years ago the average data center used about 20 MWh of energy per month; modern AI campuses consume more than 100 MWh, five times as much.

Lawrence Berkeley National Laboratory estimated that by 2028 more than half of the electricity consumed by data centers will go to serving AI. By then, artificial intelligence will use as much electricity as 22% of all US households.

Wholesale electricity prices are up 267%

The impact of the AI boom on electricity rates is becoming critical. US data centers consumed 183 TWh of electricity in 2024 — over 4% of the country's total consumption, equivalent to the annual demand of Pakistan.

Bloomberg News found that in regions near data centers, wholesale electricity prices have risen by 267% over five years. More than 70% of the pricing nodes where prices rose are located within 50 miles of major data centers.
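
Spread over five years, a 267% rise corresponds to roughly 30% compound growth per year. A minimal sketch of that calculation, treating 267% as the total increase reported above:

```python
# Annualized growth implied by a 267% price increase over five years.
total_increase = 2.67          # +267% means prices are 3.67x the starting level
years = 5

growth_factor = 1 + total_increase
annual_rate = growth_factor ** (1 / years) - 1
print(f"~{annual_rate:.1%} per year")  # ~29.7%
```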

A study by Carnegie Mellon University shows that data centers and mining could raise the average US electricity bill by 8% by 2030, with increases reaching 25% in the highest-demand markets in Virginia.

Tech giants acknowledge the problem

In its 2025 environmental report, Google acknowledged: "While we remain committed to climate goals, it has become clear that achieving them has become more complex at all levels." The company says that reaching its goal of eliminating all emissions by 2030 has become "very challenging."

Microsoft plans to convert all data centers to 100% renewable sources by 2030, while Google has already achieved 90% renewable energy usage. However, companies often have to resort to natural gas to meet current needs.

Water stress in arid regions

Particular concern arises from the placement of data centers in water-scarce regions. Bloomberg found that about two-thirds of American data centers built since 2022 are located in areas with high water stress.

In Texas, data centers are expected to use 49 billion gallons of water in 2025 and up to 399 billion gallons by 2030, enough to lower Lake Mead, the largest US reservoir, by 16 feet in a year.
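
Growing from 49 to 399 billion gallons between 2025 and 2030 implies roughly 50% growth per year. The quick sketch below works out that rate and converts the 2030 figure to liters for comparison with the global numbers above:

```python
# Implied annual growth of Texas data-center water use, 2025 -> 2030.
start_gallons_bn = 49    # 2025, billions of gallons (from the article)
end_gallons_bn = 399     # 2030 projection (from the article)
years = 5

annual_growth = (end_gallons_bn / start_gallons_bn) ** (1 / years) - 1
print(f"~{annual_growth:.0%} per year")  # ~52%

# Rough conversion to liters (1 US gallon ~= 3.785 liters).
end_liters_bn = end_gallons_bn * 3.785
print(f"~{end_liters_bn:,.0f} billion liters by 2030")  # ~1,510 billion liters
```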

The issue of data transparency

The main difficulty in assessing AI's real impact is the reluctance of technology companies to disclose detailed information about their resource consumption. Alexandra Luccioni of Hugging Face urges: "We need to stop trying to reconstruct figures based on rumors and put more pressure on companies to share real data."

Research shows that energy consumption in the IT sector will grow by 30-40% annually over the next 3-5 years.
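
Compounded over that horizon, 30-40% annual growth multiplies consumption several times over; the short sketch below shows the range implied by those figures.

```python
# Cumulative effect of 30-40% annual growth over 3-5 years.
for annual_rate in (0.30, 0.40):
    for years in (3, 5):
        multiple = (1 + annual_rate) ** years
        print(f"{annual_rate:.0%}/yr over {years} years -> ~{multiple:.1f}x")
# 30%/yr: ~2.2x in 3 years, ~3.7x in 5 years
# 40%/yr: ~2.7x in 3 years, ~5.4x in 5 years
```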

Musk's space solution

The figures are striking: from water volumes large enough to lower Lake Mead, the largest US reservoir, by 16 feet, to wholesale electricity prices up 267% in regions with data centers. Meanwhile, technology companies continue to withhold the real resource consumption of their AI systems, making an accurate impact assessment virtually impossible.

Elon Musk proposes a radical solution: moving data centers into space. "Starship will be able to send about 300, or even 500 GW of solar AI satellites into orbit each year," said the head of SpaceX. He predicts that within 4-5 years the most cost-effective way to run AI systems will be solar-powered satellites, which face no power-supply limits and can shed waste heat through thermal radiation.
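
Set against the 23 GW of current AI power demand cited at the start of the article, the quoted launch capacity would be an order of magnitude larger. A rough comparison, taking both figures as stated and ignoring solar duty cycle and conversion losses:

```python
# Comparing the claimed annual orbital solar capacity with current AI demand.
# Both figures are taken as stated in the article; no efficiency adjustments.
current_ai_demand_gw = 23
orbital_capacity_low_gw, orbital_capacity_high_gw = 300, 500

print(f"{orbital_capacity_low_gw / current_ai_demand_gw:.0f}x "
      f"to {orbital_capacity_high_gw / current_ai_demand_gw:.0f}x current AI demand per year")
# ~13x to ~22x
```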

AI Opinion

Historical patterns of technological revolutions show similar resource crises. Electrification in the 1920s raised comparable fears of overloading power systems, but back then the alternative was going without electricity altogether. Today AI competes with existing electricity demand, creating shortages where there were none.

The technical side of the problem goes beyond raw energy consumption. Modern AI chips require stable voltage with minimal fluctuations, forcing data center operators to reserve redundant capacity; as a result, actual grid draw exceeds active usage by 20-30%. Musk's space solution looks technologically appealing but sidesteps the issues of transmitting energy back to Earth and of debris from thousands of satellites.
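
In practice, a 20-30% overhead means grid connections are sized well above the expected compute load. A minimal sketch of that sizing rule, using a hypothetical 100 MW of active IT load as an example:

```python
# Grid-side draw implied by a 20-30% overhead above active IT load.
# The 100 MW active load is a hypothetical example, not a figure from the article.
active_it_load_mw = 100

for overhead in (0.20, 0.30):
    grid_draw_mw = active_it_load_mw * (1 + overhead)
    print(f"{overhead:.0%} overhead -> ~{grid_draw_mw:.0f} MW at the grid connection")
# 20% -> 120 MW;  30% -> 130 MW
```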

#AI #AImodel #Write2Earn

$BTC
