
AI’s monstrous energy use is becoming a serious PR problem

The growing energy demand of AI is leading to pushback from knowledge workers. Here’s how AI-forward companies can address it.


If you’re in charge of an editorial team, you’re used to objections from the rank and file about using AI: “It gets things wrong.” “I don’t know what it’s doing with my data.” “Chatbots only say what you want to hear.”

Those are all valid concerns, and I bring them up often in my introduction to AI classes. Each one opens a discussion about what you can do in response, and it turns out to be quite a bit. AI hallucinations require careful thought about where to apply fact-checking and human-in-the-loop review. Enterprise tools, APIs, and privacy settings can go a long way toward protecting your data. And you can prompt the default sycophancy out of AI by telling it to give you critical feedback.

There’s another objection to AI that’s been growing, however, and you can’t just prompt your way out of this one: a rising reluctance among some knowledge workers to use AI because of how much energy it consumes and the resulting environmental impact. It’s no secret that, as the number of people using AI grows, the colossal energy footprint of the AI industry increases. It’s true that the chips powering AI continually become more efficient, but tools like deep research, thinking models, and agents ensure the demand for energy rises, too. It didn’t help that Sam Altman once said that saying “please” and “thank you” to ChatGPT was needlessly burning millions of extra dollars. Data center construction alone has soared by 40% year over year, raising concerns about not just energy needs but also water consumption.

When guilt over AI use turns into pushback

For those concerned about the environment, these stories and statistics can weigh heavily. Using ChatGPT starts to feel like a betrayal, with every query producing both intelligence and a commensurate amount of guilt. If they feel their employer is pushing them to use these tools anyway, that guilt can bubble up into anger, and even resistance.

We’re already starting to see serious objections. Civil servants in the U.K. voiced reluctance to use AI tools because of net zero emissions concerns, The Telegraph reported. Various officials charged with implementing AI-driven initiatives balked, fearing that doing so would conflict with Britain’s climate commitments. A similar dynamic is playing out at the municipal level in the U.S. Some city IT staff and policymakers in places like California have begun scrutinizing AI projects through a sustainability lens.

Many media professionals are concerned too. A couple of weeks ago, while attending the Online News Association conference in New Orleans, I saw at least three journalists bring up the concern at separate events. And in a recent training I did with a large corporate comms team, I polled the audience on their chief concern about using AI, giving them five choices: hallucinations, bias, sycophancy, privacy, or energy use. A full 37% picked energy use.

All the evidence points to AI’s energy use developing into a massive PR problem—not just for the industry, but for any business. It’s hard to be “AI forward” if your workers think using it is a huge step backward for climate change.

To be clear, this isn’t to say the environmental concerns aren’t valid—they’re simply not my area of expertise. But AI and managing teams are, and it’s clear this issue will be a growing challenge for AI leaders across industries—especially media, since journalists are on the front lines of reporting AI’s environmental impact.

Dos and don’ts for managing employee concerns

So what can company leaders do to address this problem before it gets out of control? That will depend on a number of things: your AI policy, the tools you’re using, and the demographics of your workers. But here is some guidance, divided between dos and don’ts:

  • Do listen carefully to their concerns. Are they objecting because of broad climate implications, or are their concerns more specific? Does it have to do with a specific tool? A local impact? The more detail you have on the issue, the more you will know what you can do about it.
  • Don’t dismiss their concerns, or try to deflect them by pointing to other industries. Yes, cars spew carbon, and there are microplastics in the ocean. But there are also diesel engines and recycling programs. It’s fair to ask what the equivalent is for AI.
  • Do research the problem. In August of this year, Google became the first major lab to produce a detailed technical report on the energy, carbon, and water footprint of its AI services. The report also gave the company an opportunity to tout its progress: it reduced the energy consumed per prompt by a factor of 33 between May 2024 and May 2025. This could be useful information for your team.
  • Don’t encourage mitigating individual use. This might be controversial, but the worst thing an AI-forward worker can do is neglect to use AI to help solve a problem that it can really help with. And that goes for thinking, deep research, and GPT-5 Pro, too. Rather than mitigating individual use of tools, instead . . .
  • Do transition workflows into dedicated tools. If a particular tool or workflow proves useful enough, develop it so that it uses the most efficient model possible, which saves on compute costs and reduces environmental impact. Paying for your own compute is the ultimate incentive to throttle unnecessary use.
  • Finally, don’t stop talking about the problem. When you give updates to your team, talk about what you’re doing, as an organization, to address the issue. Ambitious companies might even create an internally visible energy counter—something that would measure not just how much energy you’re using, but also how much compute you’re getting from it, showing how you’re improving efficiency over time.
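One way to make that energy counter concrete is a simple tracker that logs estimated energy per request alongside the compute it delivers. This is a minimal sketch, not a measurement tool: the model names and watt-hour figures below are illustrative placeholders, since real per-prompt energy varies widely by model and provider and would have to come from your vendor or from disclosures like Google’s report.

```python
from dataclasses import dataclass

# Illustrative placeholders only: real per-token energy figures vary
# widely by model and provider and are not published for most tools.
WATT_HOURS_PER_1K_TOKENS = {
    "small-model": 0.3,
    "large-model": 3.0,
}

@dataclass
class EnergyCounter:
    """Tracks estimated energy use alongside the compute (tokens) it buys."""
    total_wh: float = 0.0
    total_tokens: int = 0

    def log_request(self, model: str, tokens: int) -> None:
        """Record one AI request: estimated watt-hours plus tokens produced."""
        self.total_wh += WATT_HOURS_PER_1K_TOKENS[model] * tokens / 1000
        self.total_tokens += tokens

    def efficiency(self) -> float:
        """Tokens delivered per watt-hour consumed (higher is better)."""
        return self.total_tokens / self.total_wh if self.total_wh else 0.0

counter = EnergyCounter()
counter.log_request("large-model", 2000)  # est. 6.0 Wh
counter.log_request("small-model", 5000)  # est. 1.5 Wh
print(f"{counter.total_wh:.1f} Wh for {counter.total_tokens} tokens")
print(f"{counter.efficiency():.0f} tokens per Wh")
```

Tracked over time, the efficiency number is what an “internally visible energy counter” would surface: not just raw consumption, but how much work the organization gets per unit of energy, and whether that ratio is improving.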

The risk when workers lose faith

As AI advances, governed by mammoth trillion-dollar companies and world governments, it’s understandable that individuals may feel they have no agency in how it impacts society, and that includes the planet. It’s important for leaders to recognize that feeling of impotence and flip it into a quest for efficiency and open communication. Organizations that don’t might find that the workers using AI in unauthorized ways aren’t nearly as bad as the ones who refuse to use it at all.
