NVIDIA is secretly negotiating with Samsung, SK hynix, and Micron to use SOCAMM memory modules

NVIDIA 'secretly negotiating' with Samsung, SK hynix, and Micron to use new 'SOCAMM' modules: new RAM overcomes data bottlenecks for AI workloads.

Gaming Editor
TL;DR: NVIDIA is negotiating with Samsung, SK hynix, and Micron to use SOCAMM memory modules in its Project DIGITS follow-up. SOCAMM offers better energy efficiency and more I/O channels than current DRAM standards, enhancing AI workloads. Mass production may start this year, marking a strategic shift in AI PC development.

NVIDIA is reportedly "secretly negotiating" with Samsung, SK hynix, and Micron to use new "SOCAMM" memory modules in its follow-up to Project DIGITS.

In a new report from SEDaily, an industry insider said that "NVIDIA and memory companies are currently exchanging SOCAMM prototypes to conduct performance tests" and that "mass production could be possible as early as later this year".

The new SOCAMM standard is more cost-effective than regular DRAM modules for small PCs and laptops: traditional PCs use DRAM in SO-DIMM form (with DDR4 and DDR5 modules), whereas SOCAMM places low-power LPDDR5X DRAM directly onto the board with far better energy efficiency.

SOCAMM memory modules offer multiple advantages over LPCAMM, the low-power module being pushed as the next-generation DRAM standard for laptops. SOCAMM modules feature a higher number of I/O channels, the pathways that transfer signals between the DRAM and the rest of the system.

Regular PC DRAM modules (SO-DIMMs) have 260 I/O channels, LPCAMM has 644, and the new SOCAMM modules have 694. Those extra channels would let SOCAMM-equipped systems overcome data bottlenecks, paving the way for even faster RAM for AI workloads compared to other memory standards.
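To put those figures in perspective, here is a quick back-of-the-envelope comparison using only the channel counts quoted in the report. Note this is a rough sketch: real-world bandwidth also depends on per-channel signaling speed and width, which the report does not specify.

```python
# I/O channel counts per memory module standard, as quoted by SEDaily
channels = {
    "SO-DIMM (DDR4/DDR5)": 260,
    "LPCAMM": 644,
    "SOCAMM": 694,
}

baseline = channels["SO-DIMM (DDR4/DDR5)"]
for name, count in channels.items():
    gain = (count - baseline) / baseline * 100
    print(f"{name}: {count} channels ({gain:+.1f}% vs SO-DIMM)")
# SOCAMM works out to roughly a 167% increase over SO-DIMM,
# but only about 8% more channels than LPCAMM.
```

The takeaway from the raw numbers is that the big jump comes from moving off SO-DIMM at all; SOCAMM's edge over LPCAMM is a narrower 50-channel gap.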

At CES 2025 earlier this year, NVIDIA CEO Jensen Huang unveiled Project DIGITS, an AI PC that embodies his vision of democratizing AI. "In the future, engineers, artists, and everyone using computers as tools will need a personal AI supercomputer," Jensen said at the show, underscoring the role memory like SOCAMM could play in achieving that goal.

NVIDIA's push for SOCAMM is a very strategic move, one that would give the company its own memory standard. Its push into the AI PC world is shaping up to be a wrecking ball, with some very big changes in store for 2025 and 2026.

Anthony joined TweakTown in 2010 and has since reviewed hundreds of tech products. A long-time PC enthusiast with a passionate hatred for games built around consoles, he has been addicted to gaming and hardware since the pre-Quake days of FPS gaming, when you were insulted if you used a mouse to aim. Working in IT retail for 10 years gave him great experience with custom-built PCs. His addiction to GPU tech is unwavering, and he has recently taken a keen interest in artificial intelligence (AI) hardware.
