
AI’s impact on data center networking: Five key points

A new report from Ciena shows some of the demands AI is expected to place on data center networking

With artificial intelligence expected to have a significant impact on network traffic, one of the emerging potential bottlenecks is in data center interconnect (DCI) networking and capacity.

A recent Ciena survey asked more than 1,300 data center experts across a dozen markets around the world what they are anticipating in terms of needs for AI infrastructure and networking as related to data centers and DCI. The data was collected in January of this year, so it’s a very current window into anticipated DCI needs and potential pain points.

“AI workloads are reshaping the entire data center landscape, from infrastructure builds to bandwidth demand,” said Jürgen Hatheier, international CTO for Ciena. “Historically, network traffic has grown at a rate of 20-30% per year. AI is set to accelerate this growth significantly, meaning operators are rethinking their architectures and planning for how they can meet this demand sustainably.”

Network data traffic growth is expected to be huge, and mostly driven by AI. The Ciena report said it is likely that as generative AI “goes multi-modal” and sees wider adoption of AI-driven automation, AI agents and inferencing, “network traffic will grow in often unpredictable ways.” While the industry’s focus thus far has been on the compute power needed to support AI, the report added, “The first network segment to see substantial traffic spikes is exactly the one supporting that AI infrastructure – inside the data center.”

More than half of the survey respondents said that they expected that AI workloads will “overtake traditional cloud and big data applications” in the next two to three years.

Data centers will increasingly be designed and connected to support AI workloads. On a global basis, the survey respondents said that 43% of the new data centers their companies are planning to build will be dedicated to handling AI workloads. There were regional differences within that overall figure, interestingly: Survey respondents from India and Indonesia expected that more than half of new data centers in their region would be dedicated to AI, while Norwegian and Swedish respondents expected it to be closer to a third of new data centers.

Interconnection pressures are coming. The increase in AI workloads will mean a corresponding increase in pressure on data center interconnects, the report said, because “the AI ecosystem of tomorrow will be a network of interconnected data centers.” This translates to a “far greater demand on DCI than we’ve seen to date, and meeting these needs will require a significant investment in—and expansion of—data center estates and infrastructure,” the report found.

Survey respondents anticipate a minimum of a sixfold increase in DCI network bandwidth over the next five years, or 40-60% compound annual growth. That’s at least double the typical growth norm.
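The report’s figures are internally consistent, which a quick back-of-the-envelope check confirms (this calculation is illustrative and not part of the Ciena report): a sixfold increase over five years implies roughly a 43% compound annual growth rate, squarely inside the 40-60% range, while the historical 20-30% yearly growth compounds to only about 2.5x-3.7x over the same period.

```python
# Illustrative sanity check of the growth figures cited above.

def cagr(multiple: float, years: int) -> float:
    """Compound annual growth rate implied by an overall multiple."""
    return multiple ** (1 / years) - 1

def multiple(rate: float, years: int) -> float:
    """Overall growth multiple after compounding a yearly rate."""
    return (1 + rate) ** years

# A sixfold increase over five years implies roughly 43% CAGR,
# consistent with the report's 40-60% range.
print(f"6x over 5 years  -> {cagr(6, 5):.0%} CAGR")

# Historical 20-30% annual growth compounds to about 2.5x-3.7x
# over five years -- well under half the anticipated sixfold jump.
print(f"20%/yr for 5 yrs -> {multiple(0.20, 5):.1f}x")
print(f"30%/yr for 5 yrs -> {multiple(0.30, 5):.1f}x")
```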

DCI will have to balance functionality and connectivity of edge vs. central data centers. Most survey respondents expect that Large Language Model (LLM) training will take place in some type of distributed data center facility: major hubs that can support the compute and power needs of AI model training. Edge data centers, meanwhile, will “handle latency-sensitive inferencing applications but also offer strategic locations for various customers.”

DCI performance will have to keep up, and investments are being made now to prepare. Networks are expected to need 800 Gbps or higher per wavelength, such as 1.6 Tbps, across both new and existing routes, and operators are already investing in such high-capacity infrastructure, the Ciena report noted. (See a list of recent 1.6 Tbps trials here, and a discussion with fiber infrastructure provider Zayo on its build-out plans to support AI here.) More than half of the survey respondents said that 1.6 Tbps would be a “necessary requirement.”

“The AI revolution is not just about compute—it’s about connectivity,” said Hatheier. “Without the right network foundation, AI’s full potential can’t be realized. Operators must ensure their DCI infrastructure is ready for a future where AI-driven traffic dominates.”

ABOUT AUTHOR

Kelly Hill
Kelly reports on network test and measurement, as well as the use of big data and analytics. She first covered the wireless industry for RCR Wireless News in 2005, focusing on carriers and mobile virtual network operators, then took a few years’ hiatus and returned to RCR Wireless News to write about heterogeneous networks and network infrastructure. Kelly is an Ohio native with a master’s degree in journalism from the University of California, Berkeley, where she focused on science writing and multimedia. She has written for the San Francisco Chronicle, The Oregonian and The Canton Repository. Follow her on Twitter: @khillrcr