Last month, Lawrence Berkeley National Laboratory, under contract with the US Department of Energy, published this report estimating total data center energy usage across the country and forecasting demand out to 2028.
As you can see, in 2018 total electricity consumption by US data centers was estimated at approximately 76 TWh, or 1.9% of the US total. By 2023, consumption had more than doubled to 176 TWh, or 4.4% of the US total. And by 2028, it is expected to jump further, to somewhere between 6.7% and 12% of the US total.
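As a quick sanity check on those figures, here's a minimal sketch. The TWh values and percentages are the ones quoted above; the implied US totals are my own back-of-the-envelope derivation, not a number the report states:

```python
# Back-of-the-envelope check of the reported figures.
# TWh values and shares come from the report as quoted above;
# the implied US totals are derived here, not quoted from it.

data = {
    2018: (76, 0.019),   # (data center TWh, share of US total)
    2023: (176, 0.044),
}

for year, (twh, share) in data.items():
    implied_us_total = twh / share
    print(f"{year}: {twh} TWh at {share:.1%} implies a US total of ~{implied_us_total:,.0f} TWh")

# 176 / 76 ≈ 2.3x, i.e. "more than doubled"
print(f"2018 -> 2023 growth: {176 / 76:.1f}x")
```

Both years imply a US total of roughly 4,000 TWh, which is about right for actual US electricity consumption. In other words, the doubling happened against an essentially flat denominator; data centers are the part that's growing.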
Here's some commentary from the report:
With significant changes observed in the data center sector in recent years, owing to the rapid emergence of AI hardware, total data center energy use after 2023 is presented as a range to reflect various scenarios. These scenarios capture ranges of future equipment shipments and operational practices, as well as variations in cooling energy use. The equipment variations are based on the assumed number of GPUs shipped each year, which depends on future GPU demand and the ability of manufacturers to meet that demand. Average operational practices for GPU-accelerated servers represent how much computational power, and how often, AI hardware in the installed base is used to meet AI workload demand. Cooling energy use variations are based on scenarios in cooling system selection type and efficiency of those cooling systems, such as shifting to liquid-based cooling or moving away from evaporative cooling. Together, the scenario variations provide a range of total data center energy estimates, with the low and high end of roughly 325 and 580 TWh in 2028, as shown in Figure ES-1.
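To see how that TWh range maps onto the percentage range mentioned earlier, here's a small sketch. The 325 and 580 TWh endpoints are from the quote; the ~4,850 TWh 2028 US total is my inference from the stated percentages, not a figure the report gives here:

```python
# Map the report's 2028 scenario range (TWh) onto the quoted
# percentage range. The 2028 US total is inferred from the stated
# percentages, not quoted from the report.

low_twh, high_twh = 325, 580          # 2028 scenario endpoints from the quote
assumed_us_total_2028 = 325 / 0.067   # ~4,850 TWh, implied by the low end

for twh in (low_twh, high_twh):
    share = twh / assumed_us_total_2028
    print(f"{twh} TWh / {assumed_us_total_2028:,.0f} TWh = {share:.1%}")
# -> roughly 6.7% and 12.0%, matching the range above
```

The two framings are consistent: the 325-580 TWh scenario band and the 6.7-12% band both assume US consumption grows to somewhere near 4,850 TWh by 2028.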
This strikes me as an important macro trend and a big deal. All signs point to more data centers being needed, and before we know it they're going to represent a meaningful chunk of total US electricity usage.
Note: TWh = terawatt-hour = one trillion watt-hours