Qualcomm CEO discusses the “era of AI inference” and what it means for the company’s diversification strategy
Qualcomm this week reported Q1 2025 financials, with the company's QCT business (handsets, automotive and IoT) bringing in record revenue of $10.084 billion, up 20% from the year-ago quarter. Compared to Q1 2024, the company saw 13% growth in handsets, 61% in automotive, and 36% in its IoT segment, which includes PCs.
“We’re off to a great start in fiscal ‘25,” CEO Cristiano Amon said. “Our mobile roadmap is the strongest in our history with exceptional traction for Snapdragon in premium-tier handsets, and we’re delivering growth across our diversification initiatives.” To that last point, Qualcomm is targeting $22 billion in non-handset revenues by fiscal year 2029.
On the earnings call and in an interview with CNBC’s Jon Fortt, Amon detailed the company’s position in the dynamic AI landscape. “We started talking about AI on the edge, or on devices, before it was popular,” he told Fortt. And now, with the launch of DeepSeek’s R1 model, which is already available on Snapdragon-powered devices, Amon talked through the larger opportunity for edge AI.
“Our advanced connectivity, computing and edge AI technologies and product portfolio continue to be highly differentiated and increasingly relevant to a broad range of industries,” he said on the earnings call. “We also remain very optimistic about the growing edge AI opportunity across our business, particularly as we see the next cycle of AI innovation and scale.”
He called that next cycle “the era of AI inference.” The big idea here is the emerging test-time AI scaling law, where techniques applied at inference improve overall model performance without the time-consuming and costly need to re-train the underlying foundation model. This aligns with the broader trend of shrinking model sizes, which allows models to run locally on devices like handsets and PCs.
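For a rough sense of what test-time scaling means in practice, here is a minimal, hypothetical Python sketch of one such technique, best-of-N sampling, where extra compute is spent at inference to pick a better answer rather than retraining the model. The `generate_response` and `score_response` functions are placeholders for an on-device model runtime and a verifier, not any real Qualcomm or DeepSeek API.

```python
# Minimal sketch of one test-time scaling technique: best-of-N sampling.
# Extra compute is spent at inference time (more samples) to improve output
# quality without retraining the underlying model.
# `generate_response` and `score_response` are hypothetical placeholders.

import random

def generate_response(prompt: str, temperature: float = 0.8) -> str:
    # Placeholder for a call to an on-device model runtime.
    return f"candidate answer (t={temperature:.2f}, seed={random.random():.3f})"

def score_response(prompt: str, response: str) -> float:
    # Placeholder for a verifier / reward model that rates a candidate.
    return random.random()

def best_of_n(prompt: str, n: int = 8) -> str:
    # Sample N candidates and keep the highest-scoring one.
    candidates = [generate_response(prompt) for _ in range(n)]
    return max(candidates, key=lambda c: score_response(prompt, c))

if __name__ == "__main__":
    # Increasing `n` spends more inference-time compute for (usually) better answers.
    print(best_of_n("Explain edge AI in one sentence.", n=8))
```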
“We expect that while training will continue in the cloud, inference will run increasingly on devices, making AI more accessible, customizable and efficient,” Amon said. “This will encourage the development of more targeted, purpose-oriented models and applications, which we anticipate will drive increased adoption, and in turn, demand for Qualcomm platforms across a range of devices.”
Bottom line, he said, “We’re well-positioned to drive this transition and benefit from this upcoming inflection point.” Expanding on that point in conversation with Fortt, Amon said the shifting AI landscape is “a great tailwind for business and kind of materializes what we’ve been preparing for, which is designing chips that can run those models at the edge.”
Here’s more on the emerging test-time AI scaling law, and here’s some commentary from Qualcomm’s Durga Malladi talking edge AI during CES last month.