Across the U.S. and worldwide, energy demand is soaring as data centers expand to support the rapidly growing use of artificial intelligence. These large facilities are filled with powerful computers, called servers, that run complex algorithms so AI systems can learn from vast amounts of data.
This process requires tremendous computing power, which consumes huge quantities of electricity. A single data center can draw as much power as a small town. This heavy demand is stressing local power grids and forcing utilities to scramble to provide enough energy to reliably serve both data centers and the communities around them.
My work at the intersection of computing and electric power engineering includes research on operating and controlling power systems and making the grid more resilient. Here are some ways in which the spread of AI data centers is challenging utilities and grid managers, and how the power industry is responding.