
AI Power Consumption and Sustainability


As artificial intelligence becomes part of our daily lives, its environmental impact is growing just as fast. At Innovation Summit 2025, HIKE2’s Kavitha Killough and Rachel Brozinick illuminated the hidden energy costs behind every AI query—and offered practical strategies to make AI use more sustainable. Their insights reveal why smarter prompting, infrastructure innovation, and conscious consumption must become part of the conversation as AI continues to expand. Explore the full session below and start rethinking the energy footprint of your digital habits with these key takeaways:

Key Takeaways:

  1. Most AI Energy Consumption Happens During Everyday Use, Not Just Training
    Contrary to common assumptions, it’s not the training of AI models that uses the most energy—it’s the billions of everyday queries. Every ChatGPT prompt adds up, contributing to an increasingly significant global energy demand.
  2. Smarter Prompts = Lower Environmental Impact
    Crafting more precise, goal-oriented prompts can dramatically reduce the number of interactions needed—and, by extension, energy consumption. Learning to “speak AI” isn’t just about getting better answers; it’s about making AI use more efficient and sustainable.
  3. AI Searches Use Far More Energy Than Traditional Web Queries
    A simple Google search four years ago consumed the equivalent of two minutes of a 10-watt LED bulb’s runtime; today, an AI-powered query can burn an hour’s worth. With over a billion prompts a day, small differences in search behavior now have massive cumulative effects.
  4. Cloud Providers Are Innovating to Improve Sustainability
    Companies like Amazon are investing in solutions like ambient air cooling and smart water management to reduce data center resource consumption. These behind-the-scenes advancements are critical to making large-scale AI infrastructure more environmentally friendly.

Hello, everyone. My name is Rachel, and I’m joined by my colleague Kavitha. We’ll be discussing “Powering the Future: Making AI Work for Us and the Planet.” Wow!

All right—so we have a beautiful graphic of the United States. The darker-blue states indicate a higher concentration of data centers. If we make a query on ChatGPT from Pittsburgh, for example, that query is most likely handled by a data center in one of these higher-concentration states—California, Texas, or Florida. As a result, we can expect those states to have a higher demand for energy.

All right, I’m going to discuss the hardware energy-consumption breakdown at these data centers. As you’d expect, most of the energy is used by the actual IT equipment itself. Think of your server rooms—how many of you have ever been near a server or inside a server room? Show of hands? You all know those rooms get really, really hot. We need to cool them down, because the hardware is crunching through huge numbers. Cooling is essential for these servers to function and survive.

On the other side, we have the software energy-consumption breakdown. Perhaps unexpectedly, development and training of the AI don’t consume most of the energy; it’s the use of the AI. All those times I open ChatGPT to find the best s’mores-espresso-martini recipe probably aren’t helping our energy footprint!

Let’s try to visualize this with a basic 10-watt LED bulb—courtesy of Courtyard Marriott.
A basic Google search four years ago used about two minutes of that bulb.

Each prompt you type into ChatGPT now uses about 17 minutes of the bulb.

Moving to larger language models (like today’s AI-powered Google Search) pushes usage to nearly an hour of that bulb for every search.

According to EPRI’s data on ChatGPT, the platform receives about one billion prompts every day, consuming roughly 2.9 million kWh per day.
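Those figures are internally consistent, and easy to check. Here’s a minimal sanity-check sketch (the constants are the session’s numbers, taken at face value rather than independently measured):

```python
BULB_WATTS = 10          # the 10-watt LED bulb used as a yardstick above
PROMPTS_PER_DAY = 1e9    # ~1 billion ChatGPT prompts per day (EPRI figure)
DAILY_KWH = 2.9e6        # ~2.9 million kWh per day (EPRI figure)

# Energy per prompt implied by the daily totals
wh_per_prompt = DAILY_KWH * 1_000 / PROMPTS_PER_DAY  # 2.9 Wh

# How long that much energy runs the 10 W bulb
bulb_minutes = wh_per_prompt / BULB_WATTS * 60       # ~17.4 minutes

print(f"{wh_per_prompt:.1f} Wh per prompt ≈ {bulb_minutes:.0f} bulb-minutes")
```

Run it and you get roughly 17 bulb-minutes per prompt, so the per-prompt comparison and the 2.9-million-kWh daily total are the same claim expressed two ways.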

So, what can you do? How can we be more efficient when using AI tools? Think back to sixth grade, when we first learned how to write effective search-engine queries. We must learn to communicate with these AIs just as purposefully, and doing so can drastically lower energy use.

Here’s a prompt-writing framework (adapted from AI Pro):

Role – Tell the AI who it is: “Assume you are a …”
Task – State what you want it to do and in what format.
Goal – Explain the outcome you’re trying to achieve.
Audience – Specify who the content is for.
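Putting the four parts together, a single well-formed prompt might read something like this (an invented example for illustration, not one from the session):

“Assume you are a sustainability analyst (Role). Summarize this 20-page data-center energy report as five bullet points (Task) so we can identify our biggest savings opportunities (Goal), written for a non-technical executive audience (Audience).”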

If you can cut your interactions from five poorly formed prompts down to two well-crafted ones using that formula, you could save nearly an hour of LED-bulb time—per ChatGPT session.
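Checking that with the figure above (a back-of-the-envelope sketch that assumes every prompt costs roughly the same):

```python
MINUTES_PER_PROMPT = 17  # bulb-minutes per ChatGPT prompt, from the comparison above

before = 5 * MINUTES_PER_PROMPT  # five poorly formed prompts: 85 bulb-minutes
after = 2 * MINUTES_PER_PROMPT   # two well-crafted prompts:   34 bulb-minutes

print(f"Saved: {before - after} bulb-minutes per session")  # 51 minutes, nearly an hour
```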

Finally, an anecdote from Amazon’s 2024 Sustainability Report shows how hyperscalers are reducing environmental impact. One highlight is increased water efficiency for data-center cooling:
They rely on ambient (outside) air for cooling instead of water whenever possible.

When ambient air alone is too hot, they introduce water (as in an evaporative “swamp” cooler), and alarms alert staff to any leaks.

They treat the water before discharging it to the watershed so downstream users can safely reuse it.
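As a rough illustration of that air-first, water-when-needed logic, here is a hypothetical sketch (the threshold and function are invented for illustration; this is not Amazon’s actual control code):

```python
def choose_cooling_mode(outside_temp_c: float, air_cooling_limit_c: float = 29.0) -> str:
    """Prefer ambient air; fall back to evaporative (water) cooling on hot days.

    The 29 °C limit is a placeholder, not a figure from the report.
    """
    if outside_temp_c <= air_cooling_limit_c:
        return "ambient-air"   # free cooling: no water consumed
    return "evaporative"       # introduce water; leak alarms and treatment apply
```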

Thank you all.