Snapdragon 8 Gen 3 gets major upgrade to NPU and other subsystems; features a 5G-Advanced ready modem-RF system
At its Snapdragon Summit on Maui last week, Qualcomm’s product launches and messaging focused on the company’s role in delivering AI-powered experiences across its portfolio. While the spotlight was on the new custom Oryon CPU and the Snapdragon X Elite SoC for PCs, the company also announced significant upgrades to the AI capabilities of its newest mobile platform, the Snapdragon 8 Gen 3.
Specific to the AI piece, Qualcomm’s messaging focuses on the benefits on-device AI can bring to user experiences, particularly around camera features, personal assistants and sensing-based contextualization. The new premium-tier SoC is already available in commercial devices, the Xiaomi 14 series, with many more to come, including flagships from ASUS, Honor, iQOO, MEIZU, NIO, Nubia, OnePlus, OPPO, realme, Redmi, RedMagic, Sony, vivo and ZTE.
In his portion of the day one keynote at Snapdragon Summit, Qualcomm SVP and GM of Mobile, Compute and Infrastructure Alex Katouzian called out a 30% faster CPU, 25% faster GPU and 98% faster neural processing unit (NPU) as compared to the Snapdragon 8 Gen 2. “And our connectivity is leading the market again,” he said, noting a 5G-Advanced ready modem-RF system, Wi-Fi 7 support and Dual Bluetooth capabilities. “These improvements will have a massive effect across every major smartphone experience,” he said.
The Qualcomm AI Engine built into the Snapdragon 8 Gen 3 includes the Hexagon NPU, which uses Micro Tile Inferencing, a low-power AI technique that slices a neural network into smaller portions to speed up inferencing. According to Qualcomm, its latest mobile SoC can run 10-billion-parameter generative AI models entirely on the device at up to 20 tokens per second.
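Qualcomm has not published implementation details for Micro Tile Inferencing, but the general idea of tiled execution can be sketched in a few lines. The toy Python below is purely illustrative (hypothetical function names, a simple fully connected network rather than a real generative model): slicing the input into tiles lets each slice flow through the network with only a small working set live at any moment, which on dedicated hardware translates into lower memory traffic and power.

```python
# Conceptual sketch of tile-based inference. This is NOT Qualcomm's Micro Tile
# Inferencing implementation (which is not public); it only illustrates the
# general idea of processing small tiles end to end to shrink the working set.

import numpy as np


def layer(x: np.ndarray, w: np.ndarray) -> np.ndarray:
    """A toy fully connected layer with a ReLU activation."""
    return np.maximum(x @ w, 0.0)


def run_monolithic(x: np.ndarray, weights: list[np.ndarray]) -> np.ndarray:
    """Run the whole network on the full input at once."""
    for w in weights:
        x = layer(x, w)
    return x


def run_tiled(x: np.ndarray, weights: list[np.ndarray], tile_rows: int) -> np.ndarray:
    """Process the input in small row tiles, one tile at a time.

    Each tile flows through every layer before the next tile starts, so only
    a tile-sized slab of activations is live at any moment.
    """
    outputs = []
    for start in range(0, x.shape[0], tile_rows):
        tile = x[start:start + tile_rows]
        for w in weights:
            tile = layer(tile, w)
        outputs.append(tile)
    return np.concatenate(outputs, axis=0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal((64, 128))
    weights = [rng.standard_normal((128, 128)) * 0.05 for _ in range(4)]

    full = run_monolithic(x, weights)
    tiled = run_tiled(x, weights, tile_rows=8)
    # The tiled schedule produces the same numerical result as the monolithic
    # one, just with a much smaller peak working set.
    print(np.allclose(full, tiled))  # True
```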
“With these massively powerful models running on device,” Katouzian said, “you no longer need to rely entirely on the cloud.” In addition to adding responsiveness and efficiency, this approach ensures interactions with AI-based tools are “private and secure because they run on device.” One proof point is the performance of the text-to-image generative AI tool Stable Diffusion. At Mobile World Congress in Barcelona in February, Qualcomm demonstrated the app running entirely on-device, with no network connection of any kind, at around 15 seconds from prompt to output. With the Snapdragon 8 Gen 3, that metric is now less than one second.
Chris Patrick, Qualcomm’s SVP and GM of the mobile handset business, called out the heterogeneous compute delivered through the combination of SoC subsystems. “We know this new configuration strikes the perfect balance between performance and power efficiency in a flagship device,” he said. The Snapdragon 8 Gen 3, Patrick added, is “the titan of on-device AI.”