Supply, demand, and the future of data centers
Researchers across the university are working to solve the mounting challenges of AI infrastructure.
Note to readers: This series of articles focuses on researchers whose work improves efficiency, addresses concerns, or offers alternative solutions to some of the pressing issues created by data centers.
As Kirk Cameron sees it, there’s an innovation gap in the rapidly expanding business of data centers.
On one side is the demand for raw computing power, which will need to be built out to serve the ever-expanding needs of artificial intelligence (AI). On the other is the supply of energy and resources required to make those data centers run. The companies involved, understandably, tend to focus only on their own side of the industry.
“In general, the two work in isolation, siloed,” said Cameron, managing director of the Virginia Tech Institute for Advanced Computing in Alexandria.
That dynamic may be changing, though. As consumer energy prices rise under the massive demands that data centers place on the grid, and as some communities rally to keep companies from building the facilities nearby, there is more pressure than ever to find better solutions on all fronts to the challenges data centers can create. Virginia Tech’s position as a land-grant university in the commonwealth, its guiding principles, and its broad collection of researchers already tackling these issues make it the ideal place to convene these conversations about what comes next.
“What you’d really like to do is bring everybody together on the supply and the demand side to start talking,” said Cameron.
That’s exactly what will happen when the university convenes leaders from industry, academia, and government, including both expert and emerging voices, at the Data Center Summit in Alexandria later this spring. The invitation-only event will feature candid conversations about the opportunities, trade-offs, and decisions defining the future of data center infrastructure, grounded in real-world experience and cross-sector collaboration.
“These are billion-dollar industries that have been very successful focusing on what they do well. There’s not a ton of incentive for them to sort of cross the aisle, in a sense, to really think about co-designing these things together,” said Cameron. “And yet, there’s probably a lot of efficiencies to be gained.”
Cameron is also particularly well-suited to help facilitate these conversations. He began his career working on supercomputers, precursors to today’s data centers. In California at the time, particle accelerators drew so much power that they could often run only at night during the summer, when they weren’t competing with residential and commercial air conditioning for capacity on the grid. Data centers now compete with cities for power, water, and real estate, making the work Cameron and his team did then look prescient.
“We were mostly focused on squeezing every ounce of performance” out of power sources, he said.
Cameron’s early work centered on comparing systems like large-scale supercomputers and tracking their energy efficiency over time. That led to creations like the SPECpower benchmark and, more familiar to the general public, Energy Star appliance standards. Once better measurements were in place, the second phase of data center evolution prioritized designing for efficiency throughout the 2010s, delivering a 200-fold improvement in efficiency over the course of the decade. Then came AI.
“Every time we build these big systems, we think we’re satisfying a current need,” said Cameron. “The reality is, the moment we satisfy the current need, the need goes up.”
The recent Ratepayer Protection Pledge signed by AI companies will require that they build, bring, or buy the energy resources needed to power their data centers. But experts believe that will likely require some combination of new natural gas, nuclear, or even fusion power plants to come online, each of which comes with its own obstacles and secondary impacts. Cameron believes that viable solutions will need to address both the power supply issue and efficiency on the demand side of the equation.
“They’re working in isolation — they always have,” he said. “It might be that the solution we’re looking for is in the collaborative space.”
Other articles in this series
Thinking small: How small language models could lessen the AI energy burden
There's a power/water trade-off in data center resource allocation