Enabling the metaverse with a device-edge-cloud continuum

No single company, network or technology can build the metaverse. Instead, the industry will have to tap into every available foundational technology across the ecosystem, from the edge of the network all the way to the devices themselves. 5G and Wi-Fi, high-powered cameras and sensors, advanced hardware and software, and graphics and audio will all have a role to play in creating the metaverse experience.

Starting with the network, Qualcomm Technologies, Inc. Vice President of Engineering Hemanth Sampath told RCR Wireless News during Mobile World Congress 2023 that edge compute is absolutely necessary because metaverse applications require “end-to-end latencies [that] are as low as possible.”

The first iteration of the metaverse will likely consist of a 5G-enabled phone connected to a pair of XR glasses with Wi-Fi capability. “Now you can put together an end-to-end system where you have the edge to phone to glass[es] [and] you can begin to then render content on the glass[es],” Sampath explained. “The connection between the glass[es] and the phone can be over Wi-Fi and the connection between the phone and edge cloud can be… 5G.”
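
To make that chain concrete, here is a minimal Python sketch of the motion-to-photon latency budget across the two links Sampath describes. The hop names, per-hop latencies and render time are illustrative assumptions, not measurements from the demo.

```python
from dataclasses import dataclass

@dataclass
class Hop:
    name: str
    latency_ms: float  # assumed one-way latency over this link

# Glasses <-Wi-Fi-> phone <-5G-> edge server, per Sampath's description.
# The figures below are placeholders, not measured values.
chain = [
    Hop("glasses<->phone (Wi-Fi)", 2.0),
    Hop("phone<->edge cloud (5G)", 8.0),
]

def round_trip_ms(hops, edge_render_ms=5.0):
    """Motion-to-photon style budget: pose data goes out over every hop,
    the edge renders a frame, and the frame comes back the same way."""
    one_way = sum(h.latency_ms for h in hops)
    return 2 * one_way + edge_render_ms

print(f"End-to-end budget: {round_trip_ms(chain):.1f} ms")  # 25.0 ms
```

The budget makes the point behind Sampath's comment: every hop is traversed twice per frame, so shaving latency at the edge pays off double.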

Perception-assisted wireless for enhanced XR

For this device-edge-cloud continuum to work, intelligence and awareness need to be added to the equation. To that end, Qualcomm showcased perception-assisted 5G for enhanced XR, in which an XR headset (in this case the AR Smart Viewer Reference Design, powered by the Snapdragon XR2 Gen 1 Platform) uses sensors such as cameras and inertial measurement units (IMUs) to determine the device’s precise movements.

“Let’s say you are in a 5G mmWave network where the beams change pretty rapidly when there is an obstacle or when the user is moving,” Sampath said. “In that case, we use perception information such as camera[s] and 6DoF to enable predicting the mmWave beam as quickly as possible. That really allows for a seamless user experience.”
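
As a rough illustration of how pose data can feed beam selection, the sketch below extrapolates a headset’s azimuth a few milliseconds ahead using an IMU angular rate, then picks the nearest beam from a small codebook. The codebook, prediction horizon and function names are assumptions for illustration, not Qualcomm’s implementation.

```python
import math

# Hypothetical codebook of azimuth-steered mmWave beams, -60deg..+60deg.
BEAM_AZIMUTHS = [math.radians(a) for a in range(-60, 61, 15)]  # 9 beams

def predict_beam(azimuth_rad, angular_rate_rad_s, horizon_s=0.020):
    """Extrapolate the device's azimuth `horizon_s` ahead using the IMU's
    angular rate, then return the index of the closest codebook beam."""
    predicted = azimuth_rad + angular_rate_rad_s * horizon_s
    return min(range(len(BEAM_AZIMUTHS)),
               key=lambda i: abs(BEAM_AZIMUTHS[i] - predicted))

# Head turning right at 2 rad/s from boresight: the radio can pre-steer
# toward where the head will be, rather than chase channel measurements.
print(predict_beam(0.0, 2.0))  # -> 4, the beam nearest 0.04 rad
```

The design choice here mirrors the quote: prediction from perception data replaces (or front-runs) reactive beam sweeping, which is what keeps the experience seamless during rapid motion.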

Qualcomm further showed how dynamic distributed compute, which toggles between local and remote compute modes depending on 5G radio quality, can deliver an overall better user experience. For this demonstration, Qualcomm again tapped the device-edge-cloud continuum, emulating Boundless XR traffic over a 5G sub-7 GHz test network and Wi-Fi 6. In good 5G radio conditions, the system operates in remote compute mode: user data flows from the glasses to the phone to the server, where it is processed, and an encoded graphic is sent back through the phone to the glasses over a high-band 5G link. In poor 5G radio conditions, the system operates in local compute mode: the 5G link carries only low-bandwidth aggregated user data, and the phone renders the graphics that are then displayed on the glasses.
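
One simple way to picture the toggle is a controller with hysteresis, so the system does not flap between modes when radio quality hovers near a single threshold. The quality metric and thresholds below are illustrative assumptions, not values from the demonstration.

```python
REMOTE_ENTER_DB = 15.0   # switch to remote (edge) rendering above this
REMOTE_EXIT_DB = 10.0    # fall back to local (phone) rendering below this

class ComputeModeController:
    """Two thresholds create a hysteresis band, so brief fluctuations in
    link quality do not cause rapid mode switching."""
    def __init__(self):
        self.mode = "local"

    def update(self, link_quality_db: float) -> str:
        if self.mode == "local" and link_quality_db >= REMOTE_ENTER_DB:
            self.mode = "remote"   # good 5G: render on the edge server
        elif self.mode == "remote" and link_quality_db <= REMOTE_EXIT_DB:
            self.mode = "local"    # poor 5G: render on the phone
        return self.mode

ctrl = ComputeModeController()
for q in [18, 12, 9, 14, 16]:
    print(q, "->", ctrl.update(q))
# 18 -> remote, 12 -> remote (inside band), 9 -> local,
# 14 -> local (inside band), 16 -> remote
```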

XR in motion

Qualcomm’s third demo addressed one of the biggest challenges in delivering an immersive, mobile metaverse experience: radio conditions can degrade quickly when the user is moving around, and data packets can be lost or delayed during handover between connectivity nodes. Using cloud gaming, an application expected to be popular in the metaverse, as an example, Qualcomm demonstrated that its 5G Application Programming Interfaces (APIs) allow the gaming application to adapt dynamically in a mobile scenario, letting the server quickly adjust the video bit rate for smoother video and better image quality.
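
In spirit, the server-side loop looks something like the sketch below, which keeps the encoded bitrate under a safety margin of the throughput the network reports. The feedback signal and the back-off/probe policy are assumptions for illustration, not Qualcomm’s actual 5G API.

```python
def adapt_bitrate(current_kbps: int,
                  predicted_throughput_kbps: int,
                  min_kbps: int = 1_000,
                  max_kbps: int = 50_000) -> int:
    """Keep the encoded video rate safely below what the radio link can
    carry: back off sharply when throughput drops (e.g., around a
    handover), then ramp up gradually once conditions recover."""
    headroom = int(predicted_throughput_kbps * 0.8)  # 20% safety margin
    if current_kbps > headroom:
        target = headroom                  # back off immediately
    else:
        target = int(current_kbps * 1.05)  # probe upward slowly
    return max(min_kbps, min(max_kbps, target))

rate = 20_000
for throughput in [25_000, 8_000, 8_000, 30_000]:  # dip during a handover
    rate = adapt_bitrate(rate, throughput)
    print(throughput, "->", rate)
# 25000 -> 21000, 8000 -> 6400, 8000 -> 6720, 30000 -> 7056
```

The asymmetry (fast cut, slow recovery) is the same intuition behind the demo: reacting quickly to network feedback trades peak bitrate for smoother video and fewer stalls.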

The metaverse is still taking shape, but it will without question require high throughput, low latency, perception-assisted wireless and 5G APIs to support the advanced, immersive applications expected to reach across consumer, enterprise and industrial markets.

