To complement the Qualcomm Cloud AI 100 Ultra accelerator, the company has developed a software suite for AI inference workloads
From an enterprise perspective, AI is all about putting data to work in a way that improves process and workflow efficiency, and creates new revenue...
As AI shifts from centralized clouds to distributed edge environments, the challenge is no longer just model training—it’s scaling inference efficiently. Test-time inference scaling is emerging as a critical enabler of real-time AI execution, allowing AI models to dynamically adjust compute resources at inference...
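As a rough illustration of what test-time inference scaling means in practice, the sketch below shows one common pattern, best-of-N sampling, in which the number of candidate generations is chosen per request from a latency budget. It is a minimal, hypothetical example: the generate and score functions are placeholders for a real model and verifier, and the per-sample cost figure is an assumption, not a measured number.

```python
# Minimal sketch of test-time inference scaling via best-of-N sampling.
# `generate` and `score` are hypothetical stand-ins for a real LLM call
# and a verifier/reward model; they are not from any specific library.
import random

def generate(prompt: str) -> str:
    # Placeholder: a real system would call an LLM here.
    return f"candidate answer to: {prompt} ({random.random():.3f})"

def score(candidate: str) -> float:
    # Placeholder: a real system would score with a verifier or reward model.
    return random.random()

def answer(prompt: str, latency_budget_ms: float, cost_per_sample_ms: float = 50.0) -> str:
    # Scale compute at inference time: spend more samples when the budget allows.
    n_samples = max(1, int(latency_budget_ms // cost_per_sample_ms))
    candidates = [generate(prompt) for _ in range(n_samples)]
    return max(candidates, key=score)

if __name__ == "__main__":
    # A tight budget gets a single sample; a generous budget gets many.
    print(answer("route this packet flow", latency_budget_ms=50))
    print(answer("plan this factory schedule", latency_budget_ms=1000))
```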
NXP Semiconductors has agreed a deal to buy US edge AI chipmaker Kinara for $307 million. Kinara specialises in energy-efficient neural processing units (NPUs). NXP said it will more formally integrate the firm’s edge NPUs and AI software into its own industrial and IoT...
Industry discourse around the convergence of test-time inference scaling and edge AI, where people actually experience AI, continues to ramp up, and an important focus area is memory and storage. Put simply, modern leading AI chips can process data faster than memory systems can deliver...
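A rough back-of-envelope calculation shows why this mismatch matters for LLM inference: each decoded token has to stream the model’s weights out of memory, so memory bandwidth, not raw compute, sets the throughput ceiling. The figures below (parameter count, quantization, bandwidth) are illustrative assumptions rather than vendor specifications.

```python
# Illustrative estimate of the memory-bandwidth ceiling on LLM decoding.
# All numbers are assumptions for the sake of the example, not vendor specs.
model_params = 70e9          # assumed 70B-parameter model
bytes_per_param = 1          # assumed 8-bit (1-byte) quantized weights
mem_bandwidth_gbps = 3300    # assumed ~3.3 TB/s of memory bandwidth

bytes_per_token = model_params * bytes_per_param
tokens_per_second = (mem_bandwidth_gbps * 1e9) / bytes_per_token

print(f"Memory-bound ceiling: ~{tokens_per_second:.0f} tokens/s per stream")
# However much raw compute is available, single-stream decode throughput
# cannot exceed this ceiling unless batching, caching or in-memory compute
# raises the effective bandwidth.
```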
Qualcomm CEO discusses the “era of AI inference” and what it means for the company's diversification strategy
Qualcomm this week reported Q1 2025 financials, with the company’s QCT business, spanning handsets, automotive and IoT, bringing in record revenues of $10.084 billion, up 20% from...
Dell Technologies talks the device-edge-datacenter-cloud AI continuum, the rise of agentic AI and the outlook for federated learning
For years, AI discourse has been dominated by massive cloud-based models, trained on enormous datasets and running in centralized data centers. But as AI deployment scales, the...
The new AI Connect suite from Verizon comes as McKinsey estimates 60% to 70% of AI workloads will be for inference by 2030
It’s been a busy two weeks in the world of AI. OpenAI and partners announced a planned $500 billion investment in AI...
Fractile is focused on AI hardware that runs LLM inference in memory to reduce compute overhead and drive scale
In December last year, Intel’s then-CEO Pat Gelsinger abruptly retired as the company’s turnaround strategy, largely marked by a separation of the semiconductor design and...
BT talks monetizing edge AI, creating a global connectivity fabric and the future of conversational networks
Back in June at a telecoms industry show, BT CTO Colin Bannon referred to the edge as the “Goldilocks location for AI.” He expanded on this topic, and more,...
Chinese OEMs Oppo and Vivo are expected to launch devices featuring the Dimensity 9400 this month
MediaTek this week launched the 4th generation of its Dimensity flagship smartphone chipset family that it said is optimized for edge-AI applications. The Dimensity 9400 is built on TSMC’s...
NTT Data, the global system integrator division of Japanese group NTT, has introduced a new edge AI platform that "integrates and synthesises" data from sensors, devices and systems in enterprise venues, feeding it into task-specific industrial AI models. It is being offered as a...