Data Center Gold Rush Squeezes Electricity Grid

Residential Ratepayers Forced to Foot Larger Share of System Improvement Costs

Data center expansion has been fueled by the need to process and store more data. Artificial intelligence is turning the expansion into a data center gold rush. Residential ratepayers are footing more than their share of the cost of meeting higher electricity demands.

Data centers are large physical facilities that house servers, storage systems, networking equipment and other infrastructure to process and manage vast amounts of digital information. Smaller data centers can cost $10 million to build. Larger, more complex data centers can cost $2 billion.

There are already 237 data centers in the Pacific Northwest, operated by 74 providers serving companies like Amazon, Google and Microsoft. Oregon has 125 data centers. Intel has its own data center in Hillsboro. Amazon has one in The Dalles and Google operates one in Boardman.

As reported by The Oregonian’s Mike Rogoway, “Meta has your Facebook and Instagram posts stored in digital warehouses in the high desert above Prineville. And Oracle, LinkedIn, X and many others have large data centers in Hillsboro, some right next to farmhouses and suburban cul-de-sacs.”

Data Centers Require Lots of Electricity
Data centers consume a lot of electricity to run “high-octane computers and the systems that keep them cool,” Rogoway reports. “They already account for 11 percent of Oregon’s total electricity consumption, which is twice as much as homes in Portland consume.” Energy forecasters predict data center electricity demand could double or triple by 2030, requiring as much as 4,000 megawatts of power.

The Pacific Northwest is attractive to data center developers because of the region’s abundant and relatively cheap hydroelectricity. The 39 Columbia River dams generate an average of 7,500 megawatts; at full capacity, they could generate 22,000 megawatts. By comparison, California – home to many high-tech firms but with higher energy costs – has just 288 data centers.

Rapid growth of electricity demand for data centers has impacted the Pacific Northwest power grid – and the pocketbooks of ratepayers in Oregon and Washington. Pacific Power and PGE have sought and won sizable consumer rate hikes over recent years to pay for power line extensions and transmission enhancements to accommodate load growth and maximize use of alternative energy sources.

Rogoway reports, “There is broad agreement – among regulators, certain lawmakers and even electric utilities – that Oregon should insulate residential electricity customers from the costs of supplying the state’s proliferating data centers with energy. But it’s not clear how to ensure the tech industry covers its own costs.”

“Our regulatory structures weren’t designed for this scale of large customer growth,” explains Nolan Moser, acting director of the Oregon Public Utility Commission, which regulates electric utilities. PGE says its overall electricity demand grew 10 percent from 2019 to 2023. In that same period, industrial energy use grew by 34.7 percent.

Residential Consumers Bear Rate Hikes
“Like utilities nationwide, PGE is experiencing a surge in requests for new, substantial amounts of electricity load, including from advanced manufacturing, data centers and AI-related companies,” PGE CEO Maria Pope told KGW-TV. “This comes at a time when we are investing in a system to withstand increasingly extreme weather, support increasing electrification and enhance access to the lowest cost renewable energy available.”

Pope says load growth can help affordability by spreading out the cost of grid operation to more users. At the same time, Pope acknowledges utility regulations and price-setting don’t fully reflect “where demand is growing the most, or where the bulk of PGE’s infrastructure investments are going.”

“Existing regulatory frameworks will need to evolve to appropriately reflect how investments serve different customers and how costs are allocated given the changes in new large load demands,” Pope said. “Collaboration with regulators, policymakers and stakeholders is essential to help address these new realities and to keep the price of electricity as low as possible for residential and other business customers.”

Data Centers and Tax Breaks
Data center site selection usually depends on tax incentives. In Oregon, those come in the form of property tax breaks. Washington offers sales and use tax exemptions. Communities, especially rural communities, benefit from construction jobs and a boosted tax base. It’s common for a data center to require 1,000 construction workers. Once operational, however, data centers employ only a dozen or so workers for monitoring, security and maintenance.

Washington enacted its first data center tax break in 2006, offering a sales tax reduction in return for tech companies hiring union construction workers. According to a New York Times report, four of the largest technology companies spent more than $200 billion to build data centers in the last year. Site selection was often based on the sweetness of tax breaks, as well as on schools preparing students to enter the trades.

Not surprisingly, many small communities have experienced a housing crunch caused by the influx of itinerant data center construction workers, even though many who chase data center work live in campers.

Disruptive Technology At It Again
Advancing technology is inherently disruptive. But the disruptions caused by large-scale data processing are different from the impact of artificial intelligence on white-collar workplaces. Data centers require large tracts of land, access to cheap electricity and an influx of mobile skilled workers.

The data center gold rush has created a vortex affecting more than just construction sites. Among other things, data center expansion has exposed the need to modernize transmission grids, maximize use of every available power source and fairly allocate costs.

If there is a bright spot, it’s that AI developers are pursuing models that consume less energy without sacrificing performance. In the meantime, most AI models prioritize accuracy above all else, with little regard for energy efficiency.

According to one industry report, neural networks used for language translation, face recognition, object detection and medical image analysis can require as much as 263,000 kWh of energy for training purposes. The report says that’s as much energy as the average Danish citizen consumes over four decades.
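For readers who want to check that comparison, a quick back-of-the-envelope calculation (the 263,000 kWh figure and the four-decade claim are from the report; the per-year division below is ours) shows what annual consumption the comparison implies:

```python
# Sanity check of the training-energy comparison cited in the report.
# Report figures: 263,000 kWh to train a large neural network,
# described as four decades of an average Danish citizen's energy use.
training_kwh = 263_000
years = 4 * 10  # four decades

implied_annual_kwh = training_kwh / years
print(f"Implied per-person consumption: {implied_annual_kwh:,.0f} kWh/year")
# Works out to 6,575 kWh per person per year, a plausible figure for
# total (not just residential) per-capita electricity use in Denmark.
```

The division lands at roughly 6,575 kWh per person per year, which is consistent with the report’s framing.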