Dell saw AI as a ‘strong tailwind’ in Q2 earnings

The Dell PowerEdge XE9680 GPU-enabled AI server is the “fastest ramping” new solution in the company’s history, COO Jeff Clarke said

After laying out a robust artificial intelligence (AI) strategy covering infrastructure and services across its businesses during Dell Technologies World in April, company COO Jeff Clarke said this week on Dell's Q2 fiscal year 2024 earnings call that AI "is a strong tailwind for all things data and compute." "AI is expanding the TAM for total technology spending and is projected to grow at a 19% CAGR for the next couple of years to approximately $90 billion, including hardware and services." 

For its second quarter, Dell reported revenue of $22.9 billion, up 10% sequentially and down 13% year-over-year. Operating income was $1.2 billion, down 8% sequentially and up 1% year-over-year.

To the AI point, Clarke said that in Q2 Dell saw strong demand for its PowerEdge XE9680 GPU-enabled server, which he called a "key element" of the company's generative AI solutions. He said AI servers increased to 20% of server order revenue in the first half of the fiscal year. He also called out $2 billion in backlogged orders for the XE9680 "and our sales pipeline is substantially higher." 

“Gen AI represents an inflection point driving fundamental change in the pace of innovation while improving the customer expectation and enabling significant productivity gains and new ways to work,” Clarke said. “As the number one infrastructure provider, we are clearly positioned to serve the market in a unique and differentiated way. And we have the world’s broadest Gen AI infrastructure portfolio that spans from the cloud to the client. … I like our hand.” 

Highlighting the AI use cases customers seem focused on, Clarke mentioned customer operations, content creation and management, software development and sales.

Clarke reiterated Dell’s strong outlook for AI given that relevant workloads need to run across PCs, data centers and edge clouds, meaning Dell can support the proliferation of AI across its portfolio. He also hit on another key point: generative AI will rely on large language models that are trained using proprietary, domain-specific data and optimized in service of very specific business processes. 

More on the AI opportunity: “We think it’s one size does not fit all. We think there’s a whole slew of AI solutions, again, from PC to workstations to what happens in the data center—and the data center could be a single server running inference at the edge…We believe it is incremental.” 

In response to a question that was in part about the types of companies buying Dell’s AI solutions, Clarke tied it to the prevalence of multi-cloud architectures and said, essentially, AI workloads will follow the data. “It’s highly unlikely you’re going to have a smart factory or a smart hospital or a set of robots that are going to continuously look to be trained or run inference a long way away. Latency will matter. We think security will matter. We think performance will matter. And we ultimately think cost will matter. When you put that equation together, we think it’s going to be a hybrid world…We think it’s going to be very, very heterogeneous in the way that this will be done with classic compute as well as accelerated compute. In a nutshell, that’s what we think of the opportunity.” 
