The cost of powering AI

ChannelBytes

Mention AI to anyone in the world of B2B tech and it’s all about efficiency – being able to process much more data significantly faster. Most companies believe that AI, if strategically deployed, can help them gain a competitive advantage and power the economy of the future. With this as the prevailing mindset, few people stop to think about broader impacts.

The challenge is that AI has a sneaky secret it’s keeping very quiet about – power. More precisely, how it is powered. The emergence of AI as a mainstay of B2B tech is placing huge demands on data centers, both in terms of their capacity and the costs to operate them. Specifically, energy costs.

To put things in perspective: a single Google search consumes approximately 0.0003 kWh. The same query run through ChatGPT uses 0.0029 kWh. That’s almost ten times the electricity.

Now consider that ChatGPT currently has over 300 million active users, and that most of them use it for much more than simple search. That adds up to a significant demand for electricity: an estimated 1,058,500,000 kWh a year. Where will the resources come from to power the demands of AI, especially as more B2B tech companies integrate AI into their product offerings and the load on data centers grows exponentially?
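The figures above are easy to sanity-check. A minimal sketch of the arithmetic follows; the per-query energy numbers come from this article, while the daily query volume is a hypothetical assumption chosen because it reproduces the annual estimate quoted above:

```python
# Back-of-envelope check of the energy figures quoted in the article.
GOOGLE_SEARCH_KWH = 0.0003       # per Google search (article figure)
CHATGPT_QUERY_KWH = 0.0029       # per ChatGPT query (article figure)
QUERIES_PER_DAY = 1_000_000_000  # assumed daily volume, for illustration only

# How much more energy a ChatGPT query uses than a Google search.
ratio = CHATGPT_QUERY_KWH / GOOGLE_SEARCH_KWH

# Annual demand if that assumed query volume held every day.
annual_kwh = CHATGPT_QUERY_KWH * QUERIES_PER_DAY * 365

print(f"One ChatGPT query ~ {ratio:.1f}x one Google search")
print(f"Annual demand at that volume: {annual_kwh:,.0f} kWh")
```

At the assumed volume, the multiplication lands on roughly 1,058,500,000 kWh a year, matching the estimate quoted above; change the assumed query volume and the annual figure scales linearly.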

From a tech industry perspective, most people see this growth as a positive for the economy. The accompanying surge in electricity demand, however, is not – especially once you explore the environmental consequences of how that power is generated. We have all heard the calls to transition to renewables and cut back on fossil fuels. Most of us even understand why, but the reality is that it simply isn’t happening fast enough.

While some countries now generate the majority of their power from renewables, the USA is not one of them. So arguing that AI data centers could be powered by solar or wind is a moot point: the infrastructure doesn’t currently exist to support it, and at the rate AI is being developed, it’s anyone’s guess whether supply from any source will be able to keep up with demand.

Do a quick search on whether AI data centers are powered by renewables and you won’t be able to find much. Instead, you’ll find PR posturing about how powering AI data centers is very complex. It isn’t really when you think about it.

Current economic reliance on fossil fuels is causing a myriad of ecological disasters: resource loss, biodiversity loss, water, soil and air pollution, and rising global temperatures, to name a few. Among the remedies suggested to mitigate these risks while transitioning to renewables are reducing energy demand and improving energy efficiency. AI data centers are effectively doing the opposite.

On its own, ChatGPT uses as much electricity in a day as roughly 100,000 average households. At that rate, there’s no hope of achieving lower demand or better energy efficiency. So as you reach for your keyboard to type your next prompt, consider whether the efficiency you gain is the right kind, and whether it’s really worth it.
