
Intel's new Lake Crest CPU has AI tech, 32GB of HBM2

Intel is set to shake the enterprise and deep learning markets with their next-gen Lake Crest architecture

Anthony Garreffa | Feb 5, 2017 at 2:22 am CST (1 min, 38 secs time to read)

AMD is going to completely own the CPU game with its Ryzen processors, offering some nice technology features that even Intel doesn't have on its flagship processors - but Intel is preparing a new CPU architecture aimed at deep neural networks, known as Lake Crest.


Lake Crest has been designed with DNN (Deep Neural Network) workloads in mind, so it will compete better against the GPU-based offerings from AMD and NVIDIA.


Intel acquired deep learning startup Nervana for $350 million last year, and the new Lake Crest architecture is the first major result of that acquisition. Intel's VP of the Data Center Group and GM for AI Solutions, Naveen Rao, explains: "We have developed the Nervana hardware especially with regard to deep learning workloads. In this area, two operations are often used: matrix multiplication and convolution".
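For context on why those two operations matter, here's a minimal NumPy sketch (illustrative only, not Nervana code) of the matrix multiplication behind fully connected layers and a naive 2D convolution behind convolutional layers:

```python
import numpy as np

# Matrix multiplication: the core of fully connected layers.
x = np.random.rand(4, 8)   # batch of 4 inputs, 8 features each
w = np.random.rand(8, 3)   # weights mapping 8 features to 3 outputs
fc_out = x @ w             # shape (4, 3)

# Naive 2D convolution: the core of convolutional layers.
def conv2d(image, kernel):
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

conv_out = conv2d(np.random.rand(6, 6), np.random.rand(3, 3))  # shape (4, 4)
```

Both boil down to huge numbers of multiply-accumulate operations, which is exactly what dedicated DNN silicon is built to churn through.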

The next-generation Lake Crest chip will operate as a Xeon co-processor, but is designed to accelerate AI workloads in a big way thanks to Intel's new "Flexpoint" architecture, which will be used inside the arithmetic nodes of Lake Crest. Intel says Flexpoint can increase arithmetic operation throughput on Lake Crest by up to 10x, and the chip also uses an MCM (Multi-Chip Module) design.
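Intel hasn't published Flexpoint's full details here, but the general idea Nervana has described is fixed-point integer mantissas sharing a single exponent per tensor, which makes the per-element hardware cheaper than full floating point. A rough sketch of that idea (function names and the 16-bit mantissa width are assumptions for illustration, not Intel's specification):

```python
import numpy as np

def to_flex(tensor, mantissa_bits=16):
    """Quantize a float tensor to shared-exponent fixed point (illustrative)."""
    # Pick one exponent for the whole tensor so the largest value still fits.
    max_val = np.max(np.abs(tensor))
    exp = int(np.ceil(np.log2(max_val))) if max_val > 0 else 0
    scale = 2.0 ** (exp - (mantissa_bits - 1))
    mantissas = np.round(tensor / scale).astype(np.int32)
    return mantissas, scale

def from_flex(mantissas, scale):
    """Recover approximate float values from the shared-exponent encoding."""
    return mantissas.astype(np.float64) * scale

vals = np.array([0.5, -1.25, 3.0])
m, s = to_flex(vals)
roundtrip = from_flex(m, s)
```

Because the exponent is shared, the arithmetic units only need integer multiply-accumulate hardware, which is where a density gain like the claimed 10x could come from.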


Better yet, it will have 32GB of HBM2 available, with a huge 8Tbps of memory bandwidth across the entire CPU.
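To put that figure in perspective, a quick back-of-the-envelope conversion (assuming the 8Tbps is terabits, and assuming 32GB arrives as four 8GB HBM2 stacks - Intel hasn't confirmed the stack count):

```python
# 8 terabits per second across the whole package.
total_tbps = 8
total_gbytes_per_s = total_tbps * 1000 / 8        # bits -> bytes: 1000 GB/s
stacks = 4                                        # 4 x 8GB = 32GB (assumption)
per_stack_gbytes = total_gbytes_per_s / stacks    # 250 GB/s per stack
```

That works out to roughly 1TB/s of aggregate bandwidth, several times what contemporary GDDR5-based cards can manage.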


Intel will be using "proprietary inter-chip links" which the company says are "up to 20x faster than PCIe".
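Intel doesn't say which PCIe generation that comparison is against; assuming PCIe 3.0 x16 at roughly 15.75GB/s per direction (a reasonable baseline for 2017 hardware), 20x would land at:

```python
pcie3_x16_gbytes = 15.75              # PCIe 3.0 x16, one direction, approx.
link_gbytes = 20 * pcie3_x16_gbytes   # Intel's claimed 20x multiplier
```

That's around 315GB/s per link, which would matter most when lashing multiple Lake Crest modules together for large training jobs.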

Diane Bryant, Executive Vice President and GM of the Data Center Group at Intel, explains: "We expect the Intel Nervana platform to produce breakthrough performance and dramatic reductions in the time to train complex neural networks. Before the end of the decade, Intel will deliver a 100-fold increase in performance that will turbocharge the pace of innovation in the emerging deep learning space".

Last updated: Apr 6, 2020 at 04:47 pm CDT

NEWS SOURCE: wccftech.com

ABOUT THE AUTHOR - Anthony Garreffa

Anthony is a long-time PC enthusiast with a passionate hatred for games built around consoles. An FPS gamer since the pre-Quake days, when you were insulted if you used a mouse to aim, he has been addicted to gaming and hardware ever since. Working in IT retail for 10 years gave him great experience with custom-built PCs. His addiction to GPU tech is unwavering.
