Everyone talks about AI… but here’s what it really consumes

From phone assistants to office software, artificial intelligence looks almost magical on screen. Behind the scenes, though, a rapidly expanding industrial infrastructure is drawing as much power as entire countries, and researchers are only starting to map the real environmental price of this boom.

How AI jumped from clever software to heavy industry

For years, AI felt like a niche research topic. That changed once large language models and image generators went mainstream. Tech giants began racing to plug “generative AI” into search engines, office tools, cars, and home gadgets.

To keep all this running, companies build and expand data centres – vast halls packed with servers and specialised chips. These buildings may look anonymous from the outside. On the grid, they behave more like power-hungry factories.

AI is not just a smart algorithm on your laptop. It is an industrial system of chips, servers, cables, cooling towers and power lines.

Researchers at MIT have started to treat AI not as a simple software upgrade, but as a new class of infrastructure with its own footprint. Their work highlights how training and operating AI models reshapes energy, water and material use worldwide.

Data centres now rival entire nations in electricity use

The surge in AI demand is sharply visible in data centre energy numbers. In North America, electrical capacity used by these facilities roughly doubled in a single year, jumping from about 2,700 megawatts at the end of 2022 to more than 5,300 megawatts by the end of 2023.

Looking globally, data centres consumed an estimated 460 terawatt-hours of electricity in 2022. That is roughly on par with the annual power consumption of France. Projections suggest this could pass 1,000 terawatt-hours by 2026 if current trends continue.

By the middle of this decade, data centres could draw as much electricity as several mid-sized industrialised countries combined.

This energy use is not steady. Training a cutting-edge AI model means running tens of thousands of graphics processors flat out for weeks or months. That produces sharp peaks in demand that grid operators must cover, often by firing up fossil-fuel plants or diesel generators.

Why your AI chat uses more power than a web search

Once a model is trained, it moves into what engineers call “inference”: answering questions, summarising texts, generating images or code in real time. This phase lasts for the entire life of the model.

An average web search requires relatively modest computing effort. A conversation with a large language model is far heavier. Early estimates suggest a single prompt to a popular chatbot may use around five times more electricity than a classic search query.

  • Basic web search: lightweight computation on a standard server
  • AI chatbot request: large neural network activated across powerful chips
  • Image generation: even more demanding, often running multiple passes

As companies create larger and more capable models, and as more people use them every day, these per-request costs add up quickly.
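To make that "around five times" estimate concrete, here is a back-of-envelope sketch. The 0.3 watt-hours per classic search and the fivefold chatbot multiplier are illustrative assumptions drawn from the rough early estimates mentioned above, not measured figures:

```python
# Illustrative back-of-envelope comparison, not measured data.
SEARCH_WH = 0.3          # assumed energy per classic web search, in watt-hours
CHAT_MULTIPLIER = 5      # chatbot prompt ~ five times a search (early estimate)

def daily_energy_kwh(queries_per_day: int, wh_per_query: float) -> float:
    """Total daily energy in kilowatt-hours for a given query volume."""
    return queries_per_day * wh_per_query / 1000

# One billion prompts a day at chatbot rates vs search rates:
search_kwh = daily_energy_kwh(1_000_000_000, SEARCH_WH)
chat_kwh = daily_energy_kwh(1_000_000_000, SEARCH_WH * CHAT_MULTIPLIER)
print(f"search: {search_kwh:,.0f} kWh/day, chat: {chat_kwh:,.0f} kWh/day")
```

Under these toy numbers, a billion daily chatbot prompts would draw about 1.5 gigawatt-hours per day more than the same volume of classic searches, which is why per-request costs matter at scale.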

The hidden water bill of artificial intelligence

Electricity is only part of the story. Servers produce heat, and that heat has to go somewhere. Many data centres rely on water-based cooling systems, where huge volumes of water pass through chillers and cooling towers to keep chips within safe temperatures.

Industry figures often work with a simple ratio: roughly two litres of water used for every kilowatt-hour of electricity consumed. Scaled over hundreds of terawatt-hours, the numbers become striking, especially in regions already dealing with droughts or competing agricultural use.
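Scaling that ratio is simple arithmetic. A minimal sketch, assuming the rough two-litres-per-kilowatt-hour figure holds, applied to the 460 terawatt-hour global estimate above:

```python
# Sketch of the simple industry ratio quoted above: roughly two litres
# of cooling water per kilowatt-hour. Real figures vary widely with
# climate and cooling technology; this is illustrative only.
LITRES_PER_KWH = 2.0

def cooling_water_litres(energy_twh: float) -> float:
    """Water use implied by the 2 L/kWh ratio for a given energy in TWh."""
    kwh = energy_twh * 1e9   # 1 TWh = 1e9 kWh
    return kwh * LITRES_PER_KWH

# Applied to the 2022 global data-centre estimate of 460 TWh:
print(f"{cooling_water_litres(460):.2e} litres")  # on the order of 9.2e11 litres
```

Nine hundred-odd billion litres a year is the kind of headline figure the ratio produces; the real total depends heavily on where and how each facility is cooled.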

Each AI request can carry an invisible water cost, particularly when the data centre sits in a hot, dry region fighting for every litre.

Some firms are moving towards air cooling, direct-chip liquid cooling or locating data centres in cooler climates. Others plan to match their consumption with water restoration projects. For now, though, most AI runs on infrastructure that depends heavily on local water resources.

From rare metals to toxic waste: the material footprint

Beyond operations, there is the hardware itself. AI data centres increasingly rely on graphics processing units, or GPUs, instead of standard CPUs. GPUs handle thousands of calculations at once, which makes them ideal for neural networks.

Producing these chips is far from clean. It involves intensive mining of copper, cobalt and other materials, extensive chemical processing and highly energy-intensive semiconductor fabrication plants.

MIT researchers estimate that chip makers such as Nvidia, AMD and Intel sold about 3.85 million GPUs to data centres in 2023, up from 2.67 million in 2022. That pace likely climbed again in 2024.

AI’s brainpower comes from hardware that is harder, dirtier and more complex to manufacture than traditional processors.

Each chip leaves behind a trail of emissions and waste, from the mine to the smelter to the factory. When servers are replaced, often after only a few years, disposal and recycling add further environmental pressure.

Why measuring AI’s true impact remains tricky

Researchers freely admit that current numbers are rough. Data centre operators do not always publish detailed statistics. AI companies treat their models and infrastructure as trade secrets. Supply chains stretch across dozens of countries and suppliers.

MIT engineers argue that a more “contextual” method is needed. That means accounting for where data centres are built, how local grids generate electricity, which materials are used, and how communities near mines and factories are affected.

Technological change is also moving faster than the research. By the time academics measure one generation of hardware or models, companies may have deployed the next, with different efficiency levels and different impacts.

Can AI go green, or at least greener?

There are genuine efficiency gains on the horizon. Chip designers are working on specialised AI processors that do more work per watt. Data centre operators are fine-tuning cooling systems and placing facilities near renewable power sources.

Some promising directions include:

  • Training models in regions with abundant wind, solar or hydroelectric power
  • Scheduling energy-intensive training runs for times when the grid is rich in renewables
  • Using smaller, task-specific models instead of one giant model for everything
  • Recycling more heat from data centres into district heating networks
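The second idea, timing energy-intensive runs to renewable-rich hours, can be sketched as a toy "carbon-aware" scheduler. The hourly carbon-intensity forecast below is invented for illustration; real systems would pull such forecasts from grid operators:

```python
# A minimal carbon-aware scheduling sketch: given a forecast of the grid's
# carbon intensity (gCO2 per kWh) for each hour, pick the start hour whose
# window has the lowest average intensity for a training run.
def best_window(intensity_by_hour: list[float], run_hours: int) -> int:
    """Return the start hour of the lowest-mean-intensity window."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(intensity_by_hour) - run_hours + 1):
        avg = sum(intensity_by_hour[start:start + run_hours]) / run_hours
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start

# A made-up 24-hour forecast with a renewable-rich midday dip:
forecast = [400, 390, 380, 370, 360, 340, 300, 250,
            180, 120, 90, 70, 60, 65, 95, 150,
            220, 300, 350, 380, 400, 410, 420, 430]
print(best_window(forecast, 4))  # → 10 (roughly 10:00-14:00)
```

The same logic generalises to choosing between regions rather than hours, which is how training in wind-, solar- or hydro-rich locations pays off.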

Policies may also push change. Regulators in Europe and some US states are starting to demand greater transparency from data centres on energy and water usage. Investors are asking for lifecycle assessments of AI projects. Large customers, from banks to governments, increasingly factor environmental criteria into their contracts.

What this means for regular users and companies

For individuals, the idea that a question to a chatbot might use several times the energy of a normal search feels abstract. One way to think about it is to imagine AI as similar to streaming video: one or two sessions won’t break the planet, but heavy daily use by billions of people changes infrastructure planning.

Companies face more direct choices. They can decide whether they really need giant models for every task, or if smaller, more efficient systems suffice. They can select cloud providers based on energy mix and cooling methods. They can push vendors for transparency on emissions and water use.

AI does not have to be abandoned to cut its footprint, but it does have to be used more like a scarce resource than free magic.

Key terms that shape the debate

A few technical words now sit at the heart of debates about “green AI”. Understanding them helps make sense of claims and marketing slogans:

  • Training: the one-off but highly intensive phase where a model learns from vast datasets. Huge energy peaks.
  • Inference: the day-to-day use of a model. Lower cost per request, but continuous and driven by user demand.
  • PUE (power usage effectiveness): a measure of data centre efficiency. A PUE of 1.2 means 20% extra power goes to cooling and overheads.
  • Water usage: often stated per kilowatt-hour, but highly dependent on climate, cooling technology and local policy.
  • Lifecycle emissions: all greenhouse gases linked to a product, from mining and chipmaking to operation and disposal.
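The PUE definition lends itself to a tiny worked example. The megawatt figures here are hypothetical:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_load_kw

def overhead_fraction(pue_value: float) -> float:
    """Share of power spent on cooling and overheads rather than computing."""
    return pue_value - 1.0

# A hall drawing 12 MW in total to run 10 MW of servers:
print(pue(12_000, 10_000))  # → 1.2, i.e. 20% overhead
```

A perfectly efficient facility would score 1.0; every point above that is power spent keeping the chips cool rather than answering prompts.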

Scenarios for the next decade

If AI adoption keeps racing ahead and efficiency lags, grids will need new power plants, more high-voltage lines and larger cooling systems. That path could lock in additional fossil fuel use for years, especially in regions where coal and gas remain cheap.

A different scenario pairs AI growth with aggressive efficiency gains and a rapid shift to renewables. In that case, AI still adds pressure, but acts more like an accelerator for clean infrastructure: justifying new wind farms, solar parks and storage systems that also serve homes and businesses.

Users will not see these choices in their chat windows or photo apps. Yet every new “smart” feature sits on a spectrum that runs from resource-hungry to resource-aware. The conversation about AI’s future is not only about what it can say or create, but about what it quietly consumes to be there at all.
