Google has revealed how efficient its Gemini AI is: each prompt uses just a tiny bit of energy – like watching TV for 9 seconds – and even less water and carbon than many everyday actions. These small numbers are possible because Google has improved its chips and software and uses a larger share of clean energy at its data centers. The report also counts everything, from cooling servers to manufacturing chips, which makes it one of the most complete studies so far. But when millions of people use Gemini, all those small costs add up, and Google’s total emissions have still grown as it builds more data centers. Experts say we still need better ways to track the total impact of AI.
How much energy, carbon, and water does one Google Gemini AI prompt use?
A single Google Gemini AI prompt uses just 0.24 Wh of energy (equal to 9 seconds of TV), 0.03 g CO₂ emissions (less than 1/30 of a breath), and 0.26 ml fresh water (about 5 drops), thanks to major efficiency improvements.
Google has just lifted the curtain on the energy appetite of a single Gemini AI prompt, and the numbers might surprise you.
How much does one prompt really cost?
| Resource consumed | Amount per prompt | Everyday equivalent |
|---|---|---|
| Energy | 0.24 Wh | 9 seconds of TV watching or 1-second microwave zap |
| CO₂ emissions | 0.03 g | Less than 1/30 of the CO₂ exhaled by a human breath |
| Fresh water | 0.26 ml (≈ 5 drops) | Less than a sip from a teaspoon |
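For readers who want to check the equivalences, the arithmetic is straightforward. The short sketch below reproduces the TV and water-drop comparisons under assumed reference values (a roughly 100 W television and about 0.05 ml per drop); those reference values are assumptions made here, not figures from Google’s report.

```python
# Back-of-the-envelope check of the everyday equivalents above.
# The reference values are assumptions for illustration, not from Google's report.
TV_POWER_W = 100        # a typical LED TV draws on the order of 100 W
DROP_VOLUME_ML = 0.05   # a common approximation for one drop of water

energy_wh = 0.24        # energy per Gemini text prompt
water_ml = 0.26         # fresh water per prompt

tv_seconds = energy_wh * 3600 / TV_POWER_W   # Wh -> joules (W·s), then divide by TV power
water_drops = water_ml / DROP_VOLUME_ML

print(f"{energy_wh} Wh ≈ {tv_seconds:.1f} s of TV viewing")   # ≈ 8.6 s, i.e. about 9 seconds
print(f"{water_ml} ml ≈ {water_drops:.0f} drops of water")    # ≈ 5 drops
```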
Behind these tiny figures is a 33-fold drop in median energy use and a 44-fold cut in carbon footprint over the past year, driven by better chips, smarter software and a bigger share of clean electricity flowing through Google’s data centers (source).
What Google counted (and why it matters)
Previous public estimates usually looked only at the moment a GPU crunches numbers. Google’s disclosure folds in idle hardware, data-center cooling, chip manufacturing and even the water needed to keep servers from overheating – a scope wide enough to make this one of the most complete AI-impact studies released by any hyperscaler so far (source).
The catch: scale still adds up
Even super-efficient prompts add up when issued millions of times. Google’s absolute emissions rose 11 % last year, mainly from building new server farms and fabricating the specialized processors that power them. Without standardized industry metrics, the total load remains hard to pin down – a gap regulators and researchers are now pushing to close (source).
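To see how quickly the per-prompt figures compound, the sketch below scales them to a hypothetical volume of one billion prompts per day; that volume is an assumption chosen for illustration, not a number Google has disclosed.

```python
# Hypothetical daily totals at scale; the prompt volume is an assumed figure.
PROMPTS_PER_DAY = 1_000_000_000   # illustration only, not a disclosed number

ENERGY_WH_PER_PROMPT = 0.24
CO2_G_PER_PROMPT = 0.03
WATER_ML_PER_PROMPT = 0.26

daily_energy_mwh = PROMPTS_PER_DAY * ENERGY_WH_PER_PROMPT / 1e6   # Wh -> MWh
daily_co2_tonnes = PROMPTS_PER_DAY * CO2_G_PER_PROMPT / 1e6       # g  -> tonnes
daily_water_m3 = PROMPTS_PER_DAY * WATER_ML_PER_PROMPT / 1e6      # ml -> cubic metres

print(f"{daily_energy_mwh:,.0f} MWh, {daily_co2_tonnes:,.0f} t CO₂, "
      f"{daily_water_m3:,.0f} m³ water per day")
# -> 240 MWh, 30 t CO₂ and 260 m³ of water per day at one billion prompts
```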
Google just gave the world its first official peek at how much energy one Gemini request really uses – and the numbers are smaller (and stranger) than most researchers expected.
How much energy does one Gemini prompt burn?
Google pegs the median text prompt at:
- 0.24 watt-hours – about the same as a 9-second TV viewing session
- 0.03 grams of CO₂ – lighter than one-fifth of a standard business card
- Five drops of water – roughly 0.26 milliliters
That is a 33-fold reduction in energy compared with the same type of query just one year ago, and the carbon footprint has fallen 44-fold over the same period, thanks to newer TPUs and a higher share of carbon-free electricity.
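Inverting those two reductions gives a rough picture of where the same prompt stood a year earlier; the multiplication below simply undoes the stated 33× and 44× improvements.

```python
# Implied year-ago figures, obtained by undoing the stated reductions.
ENERGY_NOW_WH = 0.24
CO2_NOW_G = 0.03

energy_last_year_wh = ENERGY_NOW_WH * 33   # ≈ 7.9 Wh per prompt a year ago
co2_last_year_g = CO2_NOW_G * 44           # ≈ 1.3 g CO₂ per prompt a year ago

print(f"~{energy_last_year_wh:.1f} Wh and ~{co2_last_year_g:.1f} g CO₂ per prompt one year earlier")
```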
How did Google arrive at these figures?
Unlike past estimates that only looked at the chip doing the work, Google’s team counted every upstream watt (a simplified accounting sketch follows the list below):
- Active TPU/GPU power
- Idle machines kept online for low latency
- Full data-center cooling and facility overhead, captured by the power usage effectiveness (PUE) ratio
- Embodied emissions from building the hardware itself
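Google has not published the exact formula behind its figure, but a common way to combine these line items is to multiply the chip-side energy (active plus idle) by the facility’s PUE and then add an amortized share of manufacturing emissions. The sketch below shows that structure only; every input value in it is made up for illustration.

```python
# Simplified per-prompt footprint accounting. This illustrates the general
# structure (active + idle energy, PUE overhead, embodied carbon); the input
# values are invented and are NOT taken from Google's methodology paper.
def prompt_footprint(active_wh, idle_share, pue, grid_gco2_per_wh, embodied_gco2):
    """Return (energy in Wh, carbon in g CO₂) for a single prompt."""
    it_energy_wh = active_wh * (1 + idle_share)   # active chips plus idle capacity
    facility_energy_wh = it_energy_wh * pue       # add cooling/facility overhead
    operational_co2_g = facility_energy_wh * grid_gco2_per_wh
    return facility_energy_wh, operational_co2_g + embodied_gco2

energy, co2 = prompt_footprint(
    active_wh=0.15,         # assumed TPU/GPU energy for one prompt
    idle_share=0.3,         # assumed 30 % extra for machines kept warm
    pue=1.1,                # assumed facility overhead ratio
    grid_gco2_per_wh=0.1,   # assumed grid carbon intensity (g CO₂ per Wh)
    embodied_gco2=0.005,    # assumed amortized chip-manufacturing share
)
print(f"{energy:.2f} Wh and {co2:.3f} g CO₂ per prompt under these assumptions")
```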
So far, no independent lab has formally audited the figures, but the company has published a 17-page methodology paper and invited outside reviewers.
Why are total emissions still going up?
Per-query efficiency is soaring, yet Google’s absolute emissions rose 11 % year-over-year. The culprit is scale:
- More chips built and shipped
- Larger training clusters for newer models
- Global data-center expansion, especially water-intensive cooling sites
The pattern matches International Energy Agency projections that AI data-center energy demand could double by 2030, reaching Japan’s entire yearly electricity use.
Do other AI tasks cost more?
Yes. Google has not released figures for image or video generation, but outside studies show:
- An image prompt can use 4-10× the energy of a text prompt
- A 30-frame high-definition video prompt can top 200× the footprint of text
For comparison, a single ChatGPT text prompt is estimated at 2.9 Wh – more than ten times the 0.24 Wh used by Gemini.
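Those ratios can be sanity-checked directly from the numbers quoted above; note that applying the image and video multipliers to the Gemini text figure is this article’s assumption about the baseline, since the outside studies may use different reference prompts.

```python
# Relative footprints of different task types, using the figures cited above.
GEMINI_TEXT_WH = 0.24
CHATGPT_TEXT_WH = 2.9   # independent estimate for a ChatGPT text prompt

print(f"ChatGPT vs Gemini text: {CHATGPT_TEXT_WH / GEMINI_TEXT_WH:.0f}× the energy")  # ≈ 12×

# Applying the cited multipliers to the Gemini baseline (an assumption made here):
image_low_wh, image_high_wh = 4 * GEMINI_TEXT_WH, 10 * GEMINI_TEXT_WH
video_wh = 200 * GEMINI_TEXT_WH

print(f"Image prompt: roughly {image_low_wh:.1f}-{image_high_wh:.1f} Wh")
print(f"30-frame HD video prompt: roughly {video_wh:.0f} Wh or more")
```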
What happens next?
Google’s disclosure is already sparking calls for standardized reporting. Watch for:
- EU CSRD rules forcing Scope 3 disclosures from AI firms starting in 2026
- A possible ISO standard for sustainable AI (expected late 2025)
- Pressure on rivals to publish model-by-model footprints
Bottom line: your next Gemini question is lighter than a microwave ping, but the industry’s cumulative appetite is still on track to rival the power needs of entire nations.