NVIDIA preparing 800,000 units of SOCAMM memory for AI PCs, ready for its N1X chip in 2026

NVIDIA is reportedly preparing 800,000+ units of LPDDR-based SOCAMM memory for AI PCs, demand expected to explode with next-gen SOCAMM 2 memory.

TL;DR: NVIDIA is ramping up production of LPDDR-based SOCAMM memory, targeting 600,000 to 800,000 units in 2025 for AI PC and server products. SOCAMM offers superior power efficiency, modular upgrades, and higher bandwidth than traditional memory, positioning it as the future standard for low-power AI devices.

NVIDIA is reportedly acquiring hundreds of thousands of units of LPDDR-based SOCAMM memory for use in future AI PC products, with demand for next-gen SOCAMM 2 memory expected to boom in the years ahead.

At its GTC (GPU Technology Conference) event earlier this year, NVIDIA showcased SOCAMM memory, highlighting its superior performance and lower power consumption for AI products; the company's new GB300 AI platform uses SOCAMM memory developed by Micron. SOCAMM is very different from the HBM and LPDDR5X memory used in current AI products, including servers and mobile platforms.

SOCAMM memory is based on LPDDR DRAM, which is traditionally used inside mobile and low-power devices, but unlike HBM and LPDDR5X, SOCAMM is upgradable: the module is not soldered onto the PCB and can be secured with just three screws.

In a new report, Korean media outlet ETNews says NVIDIA is set to produce between 600,000 and 800,000 units of its new SOCAMM memory this year, ramping production of the new memory standard as it is deployed across NVIDIA's family of AI products.

NVIDIA's new GB300 AI server platform is one of the first uses of SOCAMM memory, and the company is set to adopt the new standard across its future AI products. Even at up to 800,000 units, SOCAMM volumes remain well below the HBM its memory partners will supply for NVIDIA's products in 2025, with plans to scale up significantly with next-gen SOCAMM 2 memory in 2026 and beyond.

SOCAMM uses a custom form factor that is not only compact and modular, but also considerably more power efficient than RDIMM memory. We should expect SOCAMM to deliver improved power efficiency while also providing more bandwidth than RDIMM, LPDDR5X, and LPCAMM memory.

SOCAMM offers around 150GB/sec to 250GB/sec of memory bandwidth, and the modules are swappable, making SOCAMM a great option for AI PCs and AI servers, as they can be upgraded with ease. SOCAMM memory is expected to become the new standard for low-power AI devices. Micron is the current SOCAMM manufacturer for NVIDIA, while the likes of Samsung and SK hynix are reportedly in discussions with NVIDIA to make the new SOCAMM modules for the company, too.
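To put those bandwidth figures in rough perspective, here is a quick back-of-envelope sketch of how long it would take to stream a set of model weights once at SOCAMM-class bandwidth. The 150-250GB/sec range comes from the article; the model size is a hypothetical example, not an NVIDIA figure, and real systems see overheads this ideal calculation ignores.

```python
def stream_time_seconds(model_size_gb: float, bandwidth_gb_s: float) -> float:
    """Ideal time to read model_size_gb of weights once at bandwidth_gb_s.

    Ignores latency, caching, and compute overlap - a ceiling estimate only.
    """
    return model_size_gb / bandwidth_gb_s

# Hypothetical 14 GB weight set (e.g. a 7B-parameter model at 16-bit weights):
slow = stream_time_seconds(14, 150)  # ~0.093 s per full pass at 150 GB/s
fast = stream_time_seconds(14, 250)  # 0.056 s per full pass at 250 GB/s
print(f"{slow:.3f} s at 150 GB/s, {fast:.3f} s at 250 GB/s")
```

Since one full read of the weights bounds each token generated by a memory-bound LLM, bandwidth in this range is what makes local AI inference on low-power devices plausible.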

News Source: wccftech.com

Gaming Editor

Anthony joined TweakTown in 2010 and has since reviewed hundreds of tech products. Anthony is a long-time PC enthusiast with a passion of hate for games built around consoles. An FPS gamer since the pre-Quake days, when you were insulted if you used a mouse to aim, he has been addicted to gaming and hardware ever since. Working in IT retail for 10 years gave him great experience with custom-built PCs. His addiction to GPU tech is unwavering and has recently taken a keen interest in artificial intelligence (AI) hardware.
