
AI and data centers: Pinch-points, promises and power

Amid a lot of gen AI hype, managing AI’s energy needs will in part come down to not using it for everything

Artificial intelligence poses a conundrum when it comes to sustainability. AI workloads are particularly energy-intensive, but AI also offers real potential to assist the energy transition to more renewable sources and sustainable practices, and to improve the efficiency with which businesses operate. Will that be enough to justify its use?

At the recent Telco Sustainability Forum, the discussions around AI were multifaceted: Operator representatives highlighted practical applications of AI in which they have realized energy savings and increased operational efficiency, while also acknowledging that the massive growth in demand for data center compute, partially driven by AI, is a major problem facing the telecom industry today.

Jérôme Goulard, chief sustainability officer at Orange Business, gave a number of examples of how Orange is leveraging AI. At Orange Poland, he said, AI was used to inform the optimization of 5G antennas as the new technology was deployed. As a result, Orange saw a 20%-25% reduction in energy consumption based on the AI optimization of the network infrastructure.

He noted that in addition to implementing AI in its infrastructure, Orange has AI initiatives focused on operations, sales and customer service, which, while not directly related to network infrastructure, are part of operating a modern telecom business efficiently. The operator has also begun to introduce AI-specific services for its customers, including Orange-hosted secure AI computing resources and AI applications, in order to increase enterprise access to AI.

However, Goulard also pointed out that a generative AI query produces 100% more CO2 emissions than a Google search, so managing the impact of AI on energy use and emissions means only using it when the benefits are clear. “Our view … is that we have to be very careful about how we use AI solutions,” he said. “What is really the need, and what is the best solution to answer the need? Probably gen AI will not be the solution for all requests and all needs.”


Dr. Jukka-Pekka Salmenkaita, VP of AI and special projects at Elisa, reflected that despite the “LLMs for everything” hype of the last couple of years, gen AI is not particularly well-suited to areas that involve detailed, multimodal data and require deep domain expertise for precise analysis: telecom network optimization models, for example, or precision process data from semiconductor manufacturing. Gen AI offers valuable new tools for tasks such as document-related data processing, he said, but not all AI use should be gen AI.

Data centers, energy demand and the impact of AI

In a separate discussion around power use in data centers related to AI, John Coster, senior manager of innovation, strategy and planning at T-Mobile US, who is responsible for the company’s roughly 100 data centers, talked about balancing three factors in data center operations: managing cost and capacity, controlling emissions and managing operational risk. That’s not an easy proposition: 5G deployments already added to the power density required at T-Mobile US sites, and AI adds even more power demand.

“We don’t see it going away, and I think right now, everyone in the data center industry is facing this,” Coster said. “It’s either, add more capacity, or find ways to do more with the same amount of energy that we’re using, because energy is now the pinch point everywhere, pretty much.”

He continued: “We can’t meet the need of the demand without … more supply, and we’re constrained by the grid, and we’re constrained by the sources.” In the U.S. in particular, he said, “There’s just no more power, period, and all the power is coal-fired.”

He does see potential for increased development of natural gas as a power source in the United States, and acknowledged that there have been recent moves by companies such as Microsoft to start buying power from the Three Mile Island nuclear plant.

But data centers are seeing increasing constraints on their development due to the intensity of their power needs. In Ireland, for example, data centers were embraced initially, but last year they consumed more than 20% of all the country’s power. The Associated Press has reported that fear of rolling blackouts has led Ireland’s grid operator, EirGrid, to put a moratorium on new data centers near Dublin until 2028. Even in countries with large percentages of renewable energy available, the grid infrastructure isn’t necessarily available to transport that power to where it could be used to feed data centers.

There is also “still a proximity discussion to be had,” particularly around AI workloads, Coster said. How close do GPUs for AI processing need to be located, and how does that change during the model training phase versus the inference phase? What does that mean for data center locations? And, he added, profitable AI use cases for telecom to support are not yet clear, given the expense and power demand—so T-Mobile US is being cautious and limited in where and how it is deploying AI infrastructure in its data centers.

“I think there’s a lot of learning, there’s a lot of testing right now. … We’re not deploying widely, quickly, because we don’t see the business use of it yet,” he said. While T-Mo is positioning itself so that it can pivot to AI as needed, he added, “right now … we’re not betting the farm on putting out giant deployments with no business case yet.”

Patrick Smith, VP and field CTO for EMEA at Pure Storage, said that things appear to be “settling down a bit” in terms of exploring gen AI versus more traditional analytics and AI, with the majority of generative AI projects being piloted and then not reaching full implementation.

Both Coster and Smith said that actually using the available tools for maximizing computational performance and orchestrating workloads is key to managing the power needs of data centers overall, and of AI in particular. Coster said that changes to the system set-up of some of T-Mobile’s servers yielded 30% better energy efficiency.

As T-Mo sharpened its focus on energy utilization and began tracking energy usage closely, it found that as much as 40-50% of CPU energy was being burned while no work was being done. With smarter workload orchestration, the company found it could get the same amount of computational work done with far less energy. Pushing such capabilities even further, including leveraging AI for management at the server level, offers a lot of potential for operating data centers more efficiently, they both agreed.
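The kind of inefficiency Coster describes can be illustrated with a back-of-the-envelope model. The figures below are hypothetical (the article gives only the 40-50% idle-waste range, not server power specifications), but they sketch why consolidating the same workload onto fewer, busier servers saves energy when an idle machine still draws a large share of its peak power:

```python
# Illustrative sketch, not T-Mobile's actual model: total cluster power under a
# simple linear model where an idle server draws idle_fraction of its peak power
# and the remainder scales with utilization.

def cluster_energy_kw(servers: int, utilization: float,
                      peak_kw: float = 0.5, idle_fraction: float = 0.6) -> float:
    """Total draw for a cluster: idle baseline plus a utilization-proportional part."""
    idle_kw = peak_kw * idle_fraction
    return servers * (idle_kw + (peak_kw - idle_kw) * utilization)

# Same total work (100 servers x 20% = 25 servers x 80%), spread thin vs. packed:
spread = cluster_energy_kw(servers=100, utilization=0.2)  # lightly loaded fleet
packed = cluster_energy_kw(servers=25, utilization=0.8)   # consolidated fleet

savings = 1 - packed / spread  # roughly two-thirds less energy for the same work
```

Under these assumed numbers the consolidated cluster does identical work for a fraction of the energy, which is the intuition behind the orchestration gains described above.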

Smith also pointed out that the compute and storage technology that goes into a data center matters for efficiency as well, not just the buildings themselves or the cooling methods that have gained so much attention as demand for data center capacity ratchets up. Much as in telecom networks overall, the newer the equipment, the more energy-efficient it tends to be, so modernization has to be part of the strategy on energy management.

Storage accounts for about 20% of the power consumed in a data center, Smith said, so if a company like Pure Storage can take significant energy out of the storage infrastructure, it makes a noticeable dent in the data center’s overall energy consumption. Meanwhile, longer-lived, reliable storage also reduces the need for frequent hardware refreshes and cuts electronic waste, which ties back to related forum discussions about applying circular-economy concepts to the telecom sector.
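Smith’s 20% figure implies a simple rule of thumb, sketched here with assumed numbers rather than any Pure Storage data: whatever fraction of storage power is eliminated, the facility-wide saving is that fraction scaled by storage’s share of total draw.

```python
# Back-of-the-envelope check of the storage claim. The storage_power_cut value
# below is hypothetical; only the ~20% storage share comes from the article.

def facility_savings(storage_share: float, storage_power_cut: float) -> float:
    """Fraction of total facility power saved by reducing storage power draw."""
    return storage_share * storage_power_cut

# e.g. halving storage power at a 20% storage share saves 10% of facility power:
saved = facility_savings(storage_share=0.20, storage_power_cut=0.50)
```

So even a vendor touching only one slice of the facility can move the overall number, which is Smith’s point about storage efficiency making “a noticeable dent.”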

Watch the full session and additional content from Telco Sustainability Forum on-demand here.

ABOUT AUTHOR

Kelly Hill
Kelly reports on network test and measurement, as well as the use of big data and analytics. She first covered the wireless industry for RCR Wireless News in 2005, focusing on carriers and mobile virtual network operators, then took a few years’ hiatus and returned to RCR Wireless News to write about heterogeneous networks and network infrastructure. Kelly is an Ohio native with a master’s degree in journalism from the University of California, Berkeley, where she focused on science writing and multimedia. She has written for the San Francisco Chronicle, The Oregonian and The Canton Repository. Follow her on Twitter: @khillrcr