Intel made more than $1 billion in revenue from artificial intelligence (AI) processor chips sold to data centres in 2017. It also said its target market for connectivity, storage and computing products, swelled by the burgeoning internet of things (IoT) space, will widen to $200 billion by 2022, a 25 per cent jump on its previous forecast of $160 billion by 2021.
“This is the biggest opportunity in the history of the company,” commented Navin Shenoy, executive vice president and general manager of the company’s data centre group, in a blog post timed to coincide with the company’s Data-Centric Innovation Summit in Santa Clara, California.
“We’ve entered a new era of data-centric computing. The proliferation of the cloud beyond hyperscale and into the network and out to the edge, the impending transition to 5G, and the growth of AI and analytics have driven a profound shift in the market, creating massive amounts of largely untapped data.
“When you add the growth in processing power, breakthroughs in connectivity, storage, memory and algorithms, we end up with a completely new way of thinking about infrastructure. I’m excited about the huge and fast data-centric opportunity that we see ahead.”
The $1 billion break-out figure for AI chip revenue last year came from customers running AI on Intel Xeon processors in data centres, he said, adding that the company’s AI training and inference performance has improved 200-fold. Intel’s total revenue for 2017 was $62.8 billion.
Ninety per cent of the world’s data has been generated in the past two years, according to Intel, which also cited research forecasting that data volumes will grow 10-fold by 2025, reaching 163 zettabytes.
Shenoy said: “We have a long way to go in harnessing the power of this data. A safe guess is that only about one per cent of it is utilised, processed and acted upon. Imagine what could happen if we were able to effectively leverage more of this data at scale.”
He cited the “intersection of data and transportation” as an example.
“The life-saving potential of autonomous driving is profound – many lives globally could be saved as a result of fewer accidents. Achieving this, however, requires a combination of technologies working in concert – everything including computer vision, edge computing, mapping, the cloud and artificial intelligence (AI),” he said.
But Shenoy urged the industry to shift its perception of computing. “We need to look at data holistically, including how we move data faster, store more of it and process everything from the cloud to the edge.”
Intel revealed its Xeon roadmap at the Data-Centric Innovation Summit, including:
CASCADE LAKE
A future Intel Xeon Scalable processor based on 14nm technology that will introduce Intel Optane DC persistent memory and a set of new AI features called Intel DL Boost, an embedded AI accelerator that will speed deep learning inference workloads, with image recognition running 11 times faster than on current Xeon Scalable processors. Cascade Lake will start shipping late this year.
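Intel did not publish sample code alongside the announcement, but the arithmetic this kind of inference accelerator speeds up is easy to sketch: multiply 8-bit integers and accumulate the products in 32 bits instead of working in full 32-bit floating point. The short NumPy example below is only an illustration under that assumption; the quantize_int8 helper and the toy vectors are invented for the sketch and are not part of any Intel API.

    import numpy as np

    def quantize_int8(x):
        # Symmetric per-tensor quantisation: map float32 values onto [-127, 127].
        scale = np.abs(x).max() / 127.0
        q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
        return q, scale

    # Toy activation and weight vectors in float32.
    a = np.random.randn(64).astype(np.float32)
    w = np.random.randn(64).astype(np.float32)

    qa, sa = quantize_int8(a)
    qw, sw = quantize_int8(w)

    # int8 multiplies accumulated in int32, then rescaled to float -- the
    # multiply-accumulate pattern such instructions are built to speed up.
    acc = np.dot(qa.astype(np.int32), qw.astype(np.int32))
    print(float(np.dot(a, w)), float(acc * sa * sw))

Run on random data, the two printed values land close to each other: 8-bit inference trades a small amount of precision for a large gain in throughput.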
COOPER LAKE
A future Intel Xeon Scalable processor, also based on 14nm technology. Cooper Lake will introduce a new generation platform with significant performance improvements, new I/O features, new Intel DL Boost capabilities (Bfloat16) that improve AI/deep learning training performance, and additional Intel Optane DC persistent memory innovations. Starts shipping in 2019.
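Bfloat16 is a 16-bit floating-point format that keeps float32’s sign bit and full 8-bit exponent but truncates the mantissa to 7 bits, so it covers the same numeric range at lower precision. As a rough illustration only (the to_bfloat16 helper below is invented for this sketch and uses simple truncation, whereas hardware typically rounds to nearest), the conversion can be mimicked in NumPy by zeroing the low 16 bits of each float32 value:

    import numpy as np

    def to_bfloat16(x):
        # Keep only the top 16 bits of each float32: sign, 8-bit exponent
        # and 7 mantissa bits -- the bfloat16 layout.
        bits = np.asarray(x, dtype=np.float32).view(np.uint32)
        return (bits & np.uint32(0xFFFF0000)).view(np.float32)

    x = np.array([3.14159265, 0.001, 65504.0, 1e-8], dtype=np.float32)
    print(to_bfloat16(x))
    # Range is preserved (same exponent as float32), but precision falls to
    # roughly two to three decimal digits -- typically enough for training.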
ICE LAKE
A future Intel Xeon Scalable processor based on 10nm technology that shares a common platform with Cooper Lake. Starts shipping in 2020.