AI data centers don’t just use a lot of power—they use it at the worst possible time. In the 6–9 p.m. “scarcity window,” new compute load can spike wholesale prices and push costs onto families through three channels:
• Capacity charges (“insurance” for the grid) — in PJM, prices jumped from $28.92 to $329.17 per MW-day; a 100-MW campus that doesn’t self-cover can face ~$12M/yr.
• Fuel/purchased-power riders — pass-throughs that can lift bills ~3%–5% (about $5.55–$9.25 on a $185 summer bill).
• Network upgrade riders — new substations/lines recovered from customers over decades (single 230-kV projects often $13–$27M).
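For listeners who want to check the capacity-charge math themselves, here is a back-of-the-envelope version of the first bullet. The prices are from the episode; the assumption that the auction price applies across a full 365-day delivery year is ours, for illustration only.

```python
# Rough sketch of the capacity-charge figure above.
# Assumes the cleared price applies every day of a delivery year (illustrative).
old_price = 28.92    # $/MW-day, earlier PJM capacity price
new_price = 329.17   # $/MW-day, recent PJM capacity price
campus_mw = 100      # hypothetical campus that does not self-cover

annual_cost = new_price * campus_mw * 365
print(f"~${annual_cost / 1e6:.1f}M per year")  # ≈ $12.0M
```

The same one-line multiplication shows why the jump matters: at the old price the identical campus would owe only about $1.1M per year.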
The squeeze lands hardest where energy burdens are already high—many low-income households spend 15%–26% of income on energy. And it’s not just electricity: cooling water demand is huge (e.g., one campus used 355M gallons in a year), with the capex to serve it often rate-based too.
We lay out a practical fix: a finance-grade Self-Powered AI Standard with four signals that keep neighbors whole:
Attributable Additional Clean Supply (prove you added new clean MWh).
Hourly Clean Coverage + Clean Matching Shortfall (match consumption hour-by-hour, locally, and report gaps).
Firm Self-Supply Availability (ride through peak alerts using your own dispatchable resources).
Scarcity-Adjusted Import Exposure (near-zero imports in the dirtiest/top-price hours).
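To make the hourly-matching idea concrete, here is a minimal sketch of how Hourly Clean Coverage and the Clean Matching Shortfall could be computed from metered data. The variable names and the four-hour toy series are hypothetical; the preprint's full metric definitions are in the linked write-up.

```python
# Toy illustration of hour-by-hour matching (hypothetical data).
load = [90, 110, 140, 120]    # MWh consumed in each hour
clean = [100, 100, 100, 100]  # MWh of attributable new clean supply each hour

# Clean MWh that actually line up with consumption, hour by hour
matched = sum(min(l, c) for l, c in zip(load, clean))
# Hours where consumption exceeded clean supply: the reported gap
shortfall = sum(max(l - c, 0) for l, c in zip(load, clean))

coverage = matched / sum(load)  # Hourly Clean Coverage, as a fraction
print(f"coverage={coverage:.0%}, shortfall={shortfall} MWh")
```

Note that annual totals would call this campus "over-supplied" (400 clean vs. 460 consumed looks close), yet the hourly view still reports a 70 MWh shortfall concentrated in the hours that matter.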
Who should listen: regulators, local officials, utilities, data-center operators, and anyone who pays an electric bill.
Takeaways: understand how costs flow to bills, why timing beats totals, and how to scale AI without shifting risks to households or stressed water basins.
Links & extras: full write-up on ThePricer.org and the preprint with methods, metrics, and example ledgers.
— ThePricer Podcast • Breaking Down the Cost of Everything