Electricity bills have been climbing across the US, but the AI boom is making it worse. A lot worse. New data centers are popping up everywhere to handle ChatGPT, Gemini, and countless other AI tools. And here’s the kicker: you’re helping pay for them whether you use AI or not.
In Columbus, Ohio, residential customers saw their bills jump by $27 a month this summer. Philadelphia went up $17. Washington D.C., $21. You may not be using more power, but utilities are building massive infrastructure for data centers and passing those costs onto you.123456
How much is this AI blitz really costing us? And what can we actually do about it?
Yes, it really is that bad
Whether you’re a fan of generative AI or not, its rapid growth has had a powerful effect on grid planning. And the way utilities plan the grid directly affects your electric bills.
Here’s the thing. Figuring out how much AI is really costing us is tough for two reasons. First, the math is complicated. We’re not talking back-of-the-napkin calculations here. And second, we’re working with incomplete data. Big Tech companies aren’t exactly transparent about how much energy their AI models actually use.27
The lack of transparency on Big Tech’s end means that for most gen-AI models, estimates of their resource usage simply aren’t available. The best we can do is speculate, but the problem is that speculation is constantly changing as the technology does. In the meantime, many of us are left wondering why our bills are jumping up when our own energy use isn’t.8910
That’s not to say that we’ve got absolutely nothing. In August, Google released a technical report outlining the resource consumption of its Gemini model.11 Altogether, Google claims that your typical Gemini text prompt uses 0.24 Wh of energy (or about nine seconds of TV time).11 However, in Google’s own technical paper, the authors acknowledge that AI users are already racking up billions of prompts every day.9 Those prompts are weighing on our wallets, directly or not.
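To get a feel for the scale, here’s a quick back-of-the-envelope calculation using Google’s reported 0.24 Wh per prompt. The daily prompt volume and the household-consumption figure are my own assumptions (Google’s paper only says “billions” of prompts), so treat the output as a rough illustration, not a measurement.

```python
# Back-of-the-envelope: scaling Google's reported 0.24 Wh per Gemini
# text prompt up to fleet level. The daily prompt volume is a
# hypothetical round number -- Google's paper only says "billions."
WH_PER_PROMPT = 0.24
PROMPTS_PER_DAY = 1_000_000_000  # assumed: 1 billion prompts/day

daily_mwh = WH_PER_PROMPT * PROMPTS_PER_DAY / 1_000_000  # Wh -> MWh
annual_gwh = daily_mwh * 365 / 1_000                     # MWh -> GWh

# A typical US household uses roughly 10,500 kWh/year (approximate
# EIA average -- an assumption for comparison purposes).
households_equivalent = annual_gwh * 1_000_000 / 10_500  # GWh -> kWh

print(f"Daily energy: {daily_mwh:.0f} MWh")
print(f"Annual energy: {annual_gwh:.1f} GWh")
print(f"US-household equivalents: {households_equivalent:,.0f}")
```

Even under these assumed volumes, inference alone lands in the ballpark of several thousand households’ annual electricity use — and that’s before counting training.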
It’s also worth reading the fine print here. These numbers haven’t been verified by a third party, and as stated in a press release footnote:
“These findings do not represent the specific environmental impact for all Gemini App text-generation prompts nor are they indicative of future performance.”11
That brings us right back to the first problem with determining the scope of AI’s impacts: it’s complicated. For starters, the amount of energy used for training a model is not the same as the amount of energy used for maintaining and providing AI-based services to consumers.9 In this figure from a recent study by the U.S. Lawrence Berkeley National Laboratory, you can see the huge difference in approximate operational time between AI training and inferencing.12

Inferencing is basically the response you get from an AI when you ask it a question. And when it comes to inferencing, not all tasks are comparable. Different types of queries require different amounts of energy, and models can vary wildly.
Machine learning company Hugging Face is attempting to standardize AI’s energy use with its AI Energy Score Leaderboard, which is separated into categories. It shows how tough it is to make one-to-one comparisons. Image classification is the orange to text generation’s apple…and gen-AI is one hefty fruit basket.1013 The median energy used for inference is also skewed by larger customers and heavy users.8 The bell curve is pretty warped.
The bad news is, residential ratepayers are the most vulnerable to these extremes because of the way that utility regulation in the US works.
Why is this happening?
Rising electric bills are the most obvious impact. There’s a direct link between the surge in data centers and higher costs for residential customers.12 So why does this happen? Simple: data centers get special treatment. And why do they get special treatment? Because it’s profitable for utilities.
When data centers started ramping up for the AI boom, utilities had to boost their capacity to handle that sudden demand for power. For most utilities, the preferred, age-old strategy to boost capacity is to design the whole system for the highest peaks of power consumption.1
In other words, utilities plan for the worst day of the year, when the grid is straining to meet demand — that one egg-frying-on-the-windshield summer afternoon when everyone’s cranking up the AC. The way utilities like to do this is by building more: more transmission, more power generation, more infrastructure.2 So when utilities need to handle power-hungry data centers, they build more infrastructure. More transmission lines, more power plants, more everything.
But all that building costs money. And where does the money come from? Us. Your electric bills pay for these projects.2 Now, you might think that’s fair. After all, everyone benefits from a bigger grid. If demand is going up, costs should go up too, right? Well, it’s not that simple.
For starters, utilities aren’t typical businesses. They’re government-backed monopolies. And the way they calculate rates? It’s complicated and not very transparent. Even utility regulators don’t have access to all the information they need to figure out if a utility’s proposed rates are actually fair.2
Because here’s the thing: utilities profit by spending. When these companies go on a shopping spree, they do so knowing that they can internally justify the splurge by having residential ratepayers pick up the tab. And they can externally justify an addition to that tab by claiming that the spending is necessary to meet climbing demand.2
With that in mind, the incentive to prioritize data centers is clear. We already know that Big Tech’s rush to spit out generative AI products is sucking up massive amounts of energy unlike anything we’ve ever seen…especially because we can’t physically see inside the black box. Everywhere that data centers are discussed, the same word comes up over and over again: “unprecedented.”
It’s no surprise, then, that utilities are fighting for attention from the likes of Amazon, Google, Meta, and Microsoft. By snagging these elite clients with generous discounts, utilities can meet the outsized needs of data centers with more and more infrastructure, which then passes some of that cost onto residential customers.21
Here’s the thing. We’re not even using most of the grid most of the time. Duke University researchers explained it this way:
“A system utilization rate below 100% is expected for most large-scale infrastructure designed to withstand occasional surges in demand. Nevertheless, when the gap between average demand and peak demand is consistently large, it implies that substantial portions of the electric power system — generation assets, transmission infrastructure, and distribution networks — remain idle for much of the year. These assets are expensive to build and maintain, and ratepayers ultimately bear the cost.”14
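The utilization gap Duke describes is just average demand divided by peak demand. A tiny sketch with made-up load numbers shows how much paid-for capacity can sit idle:

```python
# Illustrating the Duke point: system utilization is average demand
# divided by peak demand. Load values here are hypothetical.
def utilization(avg_load_gw: float, peak_load_gw: float) -> float:
    """Fraction of system capacity in use on average."""
    return avg_load_gw / peak_load_gw

# A grid sized for a 150 GW peak that averages 90 GW leaves 40% of
# its capacity idle on average -- yet ratepayers fund all of it.
rate = utilization(90, 150)
print(f"Average utilization: {rate:.0%}")  # -> Average utilization: 60%
```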

Are there other ways to meet demand? Absolutely. But those methods don’t make money…or do they? And here’s the thing: the fact that we’re only using a fraction of our grid capacity is actually good news. It means we have room to solve this without building everything from scratch. I’ll show you how in a bit. And in the extended cut of this video, I’ll dive into how grid users of all sizes can reduce their consumption…and cash in on it, too.
A major alternative to infrastructure that’s mostly idle (e.g. peaker plants) is energy storage.15 When energy storage is smartly integrated into the grid, you can not only operate more cleanly and efficiently, but also save…and make…a shocking amount of money while doing it.16 In my home state of Massachusetts, the town of Danvers pulled this off in an incredible way. By the summer of 2024, Danvers had partnered with Caterpillar to provide a Caterpillar-owned and -operated battery system.
Every two seconds, New England’s Independent System Operator (ISO) pings that battery to effectively ask it to help keep the grid in check. What I’m talking about is known as frequency regulation: the battery either charges or discharges to maintain the delicate balance of the grid’s preferred frequency.17 Then, because the battery leads to a more stable grid overall, the local utility pays the town of Danvers $50,000 a month.
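To make the frequency-regulation idea concrete, here’s a minimal sketch of the control logic: discharge when frequency sags below the 60 Hz target (demand outpacing supply), charge when it runs high. The deadband, droop gain, and power rating are hypothetical values chosen for illustration — not ISO New England’s actual parameters or Caterpillar’s implementation.

```python
# Minimal sketch of battery frequency regulation: when grid frequency
# sags below the 60 Hz target, the battery discharges; when it runs
# high, the battery charges. All parameters are illustrative.
NOMINAL_HZ = 60.0
DEADBAND_HZ = 0.02    # ignore tiny deviations (assumed value)
MAX_POWER_MW = 5.0    # battery power rating (assumed)

def regulation_setpoint(measured_hz: float) -> float:
    """Return battery power in MW: positive = discharge to the grid,
    negative = charge from the grid, zero inside the deadband."""
    error = NOMINAL_HZ - measured_hz
    if abs(error) <= DEADBAND_HZ:
        return 0.0
    # Proportional response, clipped to the battery's rating.
    response = error * 50.0  # assumed droop gain: MW per Hz
    return max(-MAX_POWER_MW, min(MAX_POWER_MW, response))

print(regulation_setpoint(59.95))  # low frequency -> discharge ~2.5 MW
print(regulation_setpoint(60.10))  # high frequency -> charge 5 MW (clipped)
```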
It didn’t stop there. On June 25th, Caterpillar’s Distributed Energy Resource Management System (or DERM) kicked in…and saved Danvers $250,000. How? By predicting record-breaking demand, based on the day’s weather and market conditions. When New England’s demand hit 26 GW at around 6 p.m., the battery automatically started to discharge, reducing Danvers’ demand by about 5 MW. This drop in demand in turn reduced what Danvers would pay for the next year — by a quarter of a million dollars — because New England’s commercial customers are billed based on their consumption during the grid’s most trying hour.
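The Danvers math follows from coincident-peak billing: the annual charge scales with your demand during the grid’s single peak hour. The $/kW-year rate below is backed out of the article’s own numbers (a 5 MW reduction saving about $250,000), and the 20 MW baseline load is hypothetical:

```python
# How the Danvers savings pencil out under coincident-peak billing:
# the annual charge scales with demand during the grid's peak hour.
# The rate is implied by the figures above ($250,000 / 5,000 kW),
# not a published tariff; the baseline load is assumed.
RATE_PER_KW_YEAR = 50.0  # implied: $250,000 / 5,000 kW

def annual_peak_charge(coincident_peak_kw: float) -> float:
    return coincident_peak_kw * RATE_PER_KW_YEAR

baseline = annual_peak_charge(20_000)      # 20 MW at the peak hour (assumed)
with_battery = annual_peak_charge(15_000)  # battery shaved 5 MW
print(f"Savings: ${baseline - with_battery:,.0f}")  # -> Savings: $250,000
```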
However, these kinds of cost savings aren’t limited to the commercial scale. Consumer Time of Use (TOU) rates can grant some of the gamification that corporations, data centers, and yes, entire towns, are tapping into. The idea is that instead of locking in your electricity rate at a set per kWh fee for the day, month, or year, the rate fluctuates along with the market value over the course of a day. Rates are the highest when the demand is highest (i.e. from around 5 p.m. to 9 p.m.), and rates are lowest when the demand is lowest (i.e. in the middle of the night). As a residential customer, you might set your EV to only charge overnight when the rates are cheapest, set your dishwasher on a timer to only run late at night, and so on.
This is where home batteries come into the picture, just like they do for the data center trying to maximize its energy savings. With a home battery, you can usually turn on a built-in feature that syncs up with your local TOU rates. That means your home will pull energy from the battery when prices are highest and charge it back up when they’re cheapest. This simple dance can save you a significant amount of money every year.
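Here’s a rough sketch of the arbitrage a TOU-aware home battery performs, with hypothetical rates and a typical round-trip efficiency — your actual tariff and battery will differ:

```python
# Sketch of the TOU arbitrage a home battery automates: charge when
# rates are low (overnight), discharge during the evening peak.
# The per-kWh rates and the shifted energy amount are hypothetical.
OFF_PEAK_RATE = 0.12          # $/kWh, overnight (assumed)
ON_PEAK_RATE = 0.45           # $/kWh, 5-9 p.m. (assumed)
ROUND_TRIP_EFFICIENCY = 0.90  # typical lithium-battery losses

def daily_savings(shifted_kwh: float) -> float:
    """Savings from serving on-peak load with energy bought off-peak."""
    energy_bought = shifted_kwh / ROUND_TRIP_EFFICIENCY  # buy extra to cover losses
    return shifted_kwh * ON_PEAK_RATE - energy_bought * OFF_PEAK_RATE

savings = daily_savings(10)  # shift 10 kWh/day through the battery
print(f"Daily: ${savings:.2f}, yearly: ${savings * 365:.0f}")
```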
In California, TOU is very common. I personally don’t have the option to use TOU where I live right now, but it appears that my utility in Massachusetts is laying the groundwork to get there. They’re about to switch my electric meter out for a smart meter. Roughly 14% of U.S. utilities offer TOU rates, so it’s worth looking into what’s available in your area.18
Virtual Power Plants (VPPs) are another way residential customers can participate in stabilizing the grid and earn money from it. Bottom line: this isn’t something where only corporations can earn a profit.
What about improvements in efficiency as AI becomes more common? Well…I’m optimistic that the future of gen-AI will be less energy intensive. At the same time, that might not fix the problem. I don’t have to tell you how much consumer AI products are booming. There’s probably evidence of that right below this video. Increasingly efficient bots, agents, and customer service lines that put you right in the shoes of Rick Deckard could be really cool and convenient and not at all insanity-inducing…but they could also drive demand up even further, minimizing or negating those efficiency gains.2 This idea of increasing efficiency inadvertently leading to increased consumption is what’s known as Jevons Paradox, which has haunted us since the 19th century, when English economist William Stanley Jevons first theorized that it would happen…with coal.19
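Jevons’ logic is simple enough to write down: total consumption is per-query energy times query volume, so a 2x efficiency gain loses to a 3x jump in usage. All numbers here are hypothetical:

```python
# Jevons Paradox in miniature: total consumption is per-query energy
# times query volume. If efficiency improves 2x but cheaper queries
# drive usage up 3x, total energy still grows. Numbers are hypothetical.
def total_energy_wh(wh_per_query: float, queries: float) -> float:
    return wh_per_query * queries

before = total_energy_wh(0.24, 1_000_000_000)  # 1B queries/day
after = total_energy_wh(0.12, 3_000_000_000)   # 2x efficient, 3x volume
print(f"Before: {before / 1e6:.0f} MWh/day, after: {after / 1e6:.0f} MWh/day")
```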
Here’s another thing people get wrong about data centers. Yes, they’re getting bigger and using more power. And yes, AI data centers use even more. But these buildings aren’t running at full blast 24/7.220 AI chips have limits, just like any hardware.
Tyler Norris, who led that Duke University study I mentioned earlier, points out that even the most sophisticated data center operators can’t predict exactly how much capacity they’ll need as the market shifts. So when things are uncertain, companies play it safe. They build extra capacity to handle potential spikes or changes in demand.20
The problem, of course, is that this exact overbuilding is what raises prices. But before I show you what this is costing people, I want you to know there are ways out of this. States, utilities, and citizens are already fighting back, and some of the solutions are surprisingly simple. More on that in a bit.
The Real Costs
So what does this actually cost people? You probably already see it on your own electric bill. But let’s look at the bigger picture. A Deloitte report from June shows that eight of the nine largest data center markets saw power prices jump above the national average between December 2023 and December 2024.21

Now, the graph might make it look like Ohio got off easy. But that’s not the case. Ohio is still one of the hardest-hit states. This past summer, the “generation service” portion of monthly electricity bills for Columbus residents rose by a whopping $27. According to utility American Electric Power (AEP) Ohio, “Limited generation supply combined with increasing demand for electricity is driving bills up for customers.”12223
It gets worse. Because the power grid operates regionally, this problem spreads beyond individual cities. The regional transmission organization called PJM oversees electricity for 13 states. So it’s not just Columbus feeling the heat.24 This summer, residential customers in Trenton, New Jersey saw their bills jump by $26 a month. Philadelphia went up $17. Pittsburgh, $10. And in Washington, D.C., Pepco customers are paying an average of $21 more per month.25 All these areas are part of PJM’s grid. And according to PJM’s Independent Market Monitor, data centers are responsible for about 75% of these rate increases.261 Here’s what they said in a June analysis:
“It is misleading to assert that the capacity market results are simply just a reflection of supply and demand…
The current conditions in the capacity market are almost entirely the result of large load additions from data centers, both actual historical and forecast.”27
And there’s another long-term problem. When utilities keep building new infrastructure instead of maintaining what we already have, the existing grid suffers. Old equipment, lack of upgrades, and above-ground power lines in rural areas create a perfect storm. We’re seeing more frequent and more severe wildfires as a result.282930
What can we do?
It isn’t all bad news. While the strain brought on by the AI boom might seem insurmountable, what we’re lacking isn’t solutions, but will. In fact, new data centers could potentially make things more affordable and efficient if (and that’s a huge if) they collaborate with regulators and utilities. Here are a few ways how.
What Regulators Are Doing
Some states are starting to fight back. In June, Oregon passed House Bill 3546.31 It creates a new class of “large energy use facility” specifically designed to prevent cost shifting.32
For Ohioans who suffered this summer, relief might be around the corner. In July, the Public Utilities Commission of Ohio voted to approve a settlement put forth by AEP Ohio that ensures that new data centers pay for at least 85% of the energy they project they’ll use each month, even if they end up using less than expected.33 As of October, AEP Ohio has claimed that this tariff — which effectively forces accuracy in estimating data centers’ energy needs — allowed its backed-up interconnection queue to drop from about 30 GW worth of demand to 13 GW.34
Notably, the utility went on the record to say that a lot of these projects might have been “duplicative or speculative” once the Ohio Manufacturers Association accused AEP Ohio of inflating the original numbers. Food for thought.34
What Utilities Are Doing
There’s a better way to manage the grid, and it’s called demand response. The basic idea? Instead of building more infrastructure, you shift when people use energy.14 I touched on this earlier, and honestly, there are so many ways to do demand response that it could be its own video. But here’s one interesting idea from that Duke study. They argue that the key isn’t how much energy we need. It’s when we need it. Scaling back usage for a few hours can have powerful results, and this has big implications for AI data centers.3514
Take PJM as an example. According to Duke’s research, PJM could add 13 GW of new load without building any new infrastructure. The trick? Curtailment. Now, this isn’t about cutting off renewable energy when there’s too much. This is about reducing power demand. Big customers could switch to backup generators or battery storage systems. Basically, shift when they use power.14
Duke’s research found that curtailment would only be needed 0.25% of the time.3514 Put another way, the entire U.S. grid could handle 100 GW of new data center load by cutting back for just two hours on average.36 And that’s just one demand response approach.16
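That 0.25% figure is easy to sanity-check against the 8,760 hours in a year:

```python
# Sanity-checking the Duke curtailment figure: 0.25% of the 8,760
# hours in a year works out to about 22 hours of curtailment annually.
HOURS_PER_YEAR = 8_760
curtailment_rate = 0.0025  # 0.25% of hours, per the Duke study

hours_curtailed = HOURS_PER_YEAR * curtailment_rate
print(f"Hours curtailed per year: {hours_curtailed:.1f}")  # -> 21.9
```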

What Companies Are Doing
So what are these data center companies actually doing about all this new energy demand? I recently talked to Microsoft about how they’re handling it, and they used a term I hadn’t heard before: “additionality.” The idea is pretty straightforward, but important. Whenever Microsoft adds a new data center and needs to pull more power from the grid, they also add an equal amount of new carbon-free generation to that same regional grid.
For example, their new data center in Mount Pleasant, Wisconsin, is matched with a 250 MW solar project in Portage County.37 That’s not just about going “green” on paper. That additionality means adding new clean energy to the system, so the overall grid gets cleaner as they grow. They also work with utilities on battery storage and specialized rate structures to help keep the grid stable and avoid driving up costs for everyone else.
In other words, they’re trying to expand grid capacity in step with their own energy demand and to also help grow the clean energy pie at the same time.
Then there’s the technological approach. In 2020, Google announced that it had begun using an algorithm to shuffle around its computing tasks to align with the times when renewable energy is abundant.38 In 2023, Google adapted this routine for demand response, allowing the company to limit or re-assign less urgent requests when the grid is stressed.39 Pretty clever.
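The core idea behind that kind of carbon-aware scheduling can be sketched in a few lines: given an hourly forecast of grid carbon intensity, assign deferrable batch jobs to the cleanest hours. This is a toy illustration in the spirit of Google’s approach, not their actual system, and the forecast numbers are made up:

```python
# Toy version of carbon-aware load shifting: deferrable compute jobs
# are assigned to the hours with the cleanest forecast grid. The
# carbon-intensity numbers below are made up for illustration.
def schedule_jobs(n_jobs, carbon_by_hour):
    """Pick the n lowest-carbon hours for deferrable batch jobs."""
    cleanest = sorted(carbon_by_hour, key=carbon_by_hour.get)
    return sorted(cleanest[:n_jobs])

# Hypothetical gCO2/kWh forecast for a handful of hours of the day:
forecast = {0: 320, 6: 280, 12: 150, 14: 140, 18: 410, 22: 300}
print(schedule_jobs(3, forecast))  # -> [6, 12, 14]
```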
Here’s what I’ve learned. We can’t manage the grid the same way anymore. AI data centers are different from typical industrial customers, and we need to treat them differently.40 The good news? We can upgrade the grid for everyone. The solutions exist. I couldn’t cover all of them in one video, but they’re out there. This isn’t a problem with one fix that solves everything. We need utilities, regulators, and the data center companies themselves all working together to make it happen.
1. The AI explosion means millions are paying more for electricity
2. Extracting Profits from the Public: How Utility Ratepayers Are Paying for Big Tech’s Power
3. ‘How come I can’t breathe?’: Musk’s data company draws a backlash in Memphis
4. I Live 400 Yards From Mark Zuckerberg’s Massive Data Center
5. The Electricity Affordability Crisis Is Coming
6. AI is changing the grid. Could it help more than it harms?
7. As data centers upend electric grids, the largest operator in the US is facing down a revolt from state officials
8. We did the math on AI’s energy footprint. Here’s the story you haven’t heard.
9. Measuring the environmental impact of delivering AI at Google Scale
10. AI Energy Score Frequently Asked Questions
11. How much energy does Google’s AI use? We did the math
12. 2024 United States Data Center Energy Usage Report
13. AI Energy Score Leaderboard
14. Rethinking Load Growth: Assessing the Potential for Integration of Large Flexible Loads in US Power Systems
15. Energy Storage To Replace Peaker Plants
16. Battery storage applications have shifted as more batteries are added to the U.S. grid
17. Frequency Response
18. A Survey of Residential Time-Of-Use (TOU) Rates
19. Why the AI world is suddenly obsessed with a 160-year-old economics paradox
20. The Puzzle of Low Data Center Utilization Rates
21. Can US infrastructure keep up with the AI economy?
22. Upcoming Adjustments to Your Electric Bill
23. AEP Ohio: Who We Are at a Glance
24. Maryland shouldn’t be on the hook for out-of-state data centers | GUEST COMMENTARY
25. Projected data center growth spurs PJM capacity prices by factor of 10
26. PJM
27. Analysis of the 2025/2026 RPM Base Residual Auction Part G
28. What sparks US wildfires: Power lines, burning trash and lightning
29. Fire Ready?: White paper finds many U.S. power utilities unprepared for wildfire risk
30. Preventing Wildfires Sparked by Power Lines
31. 2025 Regular Session HB 3546 Enrolled
32. Enrolled House Bill 3546
33. Ohio regulators approve settlement requiring data centers to pay at least 85% of power costs
34. AEP Ohio slashes data center pipeline by more than half – report
35. Data Centers Could Make or Break Electricity Affordability
36. Three Key Takeaways: Rethinking Load Growth in U.S. Power Systems
37. Made in Wisconsin: The world’s most powerful AI datacenter
38. Our data centers now work harder when the sun shines and wind blows
39. Supporting power grids with demand response at Google data centers
40. From Bottlenecks to Breakthroughs: Rewriting the Grid Planning Playbook in the Southwest Power Pool and Texas