Qualcomm sees distributed spatial computing as key to enabling new consumer and enterprise XR use cases
While the timeline remains uncertain, connected smart glasses are widely expected to replace smartphones as the primary mobile computing platform. As that future takes shape, Qualcomm sees distributed spatial computing as key to unlocking new capabilities in extended reality (XR) devices.
In an interview during Mobile World Congress, Qualcomm VP of Engineering Hemanth Sampath said distributed spatial compute is crucial to “ensure long battery life and all-day usage…Whether it’s perception or rendering or gen AI workloads or ML workloads, you cannot run everything on the device. So you will need to take it and split it across a compute platform whether it’s a puck, a smartphone, or even an edge server. By distributing the compute, and by trading off compute and communications, you can ensure a seamless user experience.” In short, compute workloads must be intelligently shared between the glasses and nearby devices to deliver high performance without compromising battery life.
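The split Sampath describes can be pictured as a simple placement decision: each workload has a latency budget and a compute cost, and the scheduler offloads to the farthest tier (phone, puck, or edge server) that still meets the budget, saving on-device battery. The following is a minimal illustrative sketch, not Qualcomm's actual scheduler; the tier names, costs, and latency figures are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    compute_cost: float       # arbitrary units of work (hypothetical)
    latency_budget_ms: float  # max tolerable round-trip latency

@dataclass
class Tier:
    name: str
    capacity: float           # compute headroom at this tier
    link_latency_ms: float    # round-trip latency to reach it

def place(workload: Workload, tiers: list[Tier]) -> str:
    """Pick the farthest tier that still meets the latency budget,
    trading local battery drain for a network round trip."""
    candidates = [t for t in tiers
                  if t.link_latency_ms <= workload.latency_budget_ms
                  and t.capacity >= workload.compute_cost]
    # Tiers are listed nearest-first, so the last viable one is farthest.
    return candidates[-1].name if candidates else "glasses"

tiers = [
    Tier("glasses", capacity=1.0, link_latency_ms=0.0),
    Tier("phone", capacity=10.0, link_latency_ms=5.0),
    Tier("edge_server", capacity=100.0, link_latency_ms=20.0),
]

# Perception must stay close to the sensors; a gen-AI query can
# tolerate the extra network hop to an edge server.
print(place(Workload("hand_tracking", 0.5, 10.0), tiers))   # phone
print(place(Workload("gen_ai_query", 50.0, 200.0), tiers))  # edge_server
```

The point of the sketch is the trade-off Sampath names: tight-latency perception work stays on or near the glasses, while heavy, latency-tolerant workloads move outward.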
Qualcomm is also working with Ericsson and T-Mobile US to test infrastructure and network readiness for new types of XR experiences. In the trial, Sampath explained, Snapdragon-powered glasses built on the AR2 Gen 1 Platform were tethered to a smartphone with a Snapdragon mobile platform and connected to T-Mobile's 5G network. The trial exercised 5G-Advanced features, including Low Latency, Low Loss, Scalable Throughput (L4S).
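For readers unfamiliar with L4S (standardized in RFC 9330): rather than dropping packets when queues build up, the network sets an explicit congestion-experienced (CE) mark, and a "scalable" sender trims its rate in proportion to the fraction of marked packets, keeping queuing delay low. The toy update rule below illustrates that proportional back-off in the DCTCP style; it is a conceptual sketch, not the trial's implementation.

```python
def scalable_rate_update(rate_mbps: float, marked: int, acked: int) -> float:
    """Reduce the sending rate in proportion to the CE-marking fraction,
    rather than halving it on a single loss as classic TCP would."""
    if acked == 0:
        return rate_mbps
    marking_fraction = marked / acked
    return rate_mbps * (1 - marking_fraction / 2)

# 10% of packets CE-marked in the last round trip: trim the rate by 5%,
# a far gentler reaction than a classic multiplicative halving.
print(scalable_rate_update(100.0, 10, 100))  # 95.0
```

That gentle, continuous reaction is what lets L4S hold latency low and steady, which matters for tethered XR rendering far more than peak throughput does.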
T-Mobile EVP and CTO John Saw called it a “pioneering trial” focused on “unlocking the full potential of 5G Advanced. With our network’s proven readiness and strength, we are excited to power a new wave of immersive experiences, delivering unparalleled extended reality experiences to our customers.”
Another important part of ongoing XR development is the combination of XR and AI. At a high level, AI needs context and data in order to understand prompts and take action; because XR devices like smart glasses see what the wearer sees, they are a natural fit for contextual awareness. "When you wear these glasses here, essentially you enable multi-modal AI," Sampath said, referencing Meta's Ray-Ban smart glasses, which use Qualcomm XR technologies.
This fusion of XR and AI enables context-aware computing across both consumer and enterprise applications. Sampath gave the example of a consumer using a voice command to prompt smart glasses to record video, rather than pulling out a smartphone and disrupting the very moment being captured. On the enterprise side, he pointed to hands-free worker assistance and automatic capture and upload of parcel deliveries as just two examples of how XR and AI can streamline workflows.
As XR devices evolve beyond novelty toward utility, Qualcomm's approach of combining distributed spatial compute, advanced 5G networks, and on-device AI is laying the foundation for immersive experiences that are efficient, intuitive, and scalable. Whether enhancing everyday interactions or transforming the way we work, the future of spatial computing looks increasingly intelligent, and increasingly real.