NVIDIA is reportedly "secretly negotiating" with Samsung, SK hynix, and Micron to use new "SOCAMM" memory modules in its follow-up to Project DIGITS.

In a new report from SEDaily, an industry insider said that "NVIDIA and memory companies are currently exchanging SOCAMM prototypes to conduct performance tests" and that "mass production could be possible as early as later this year".
The new SOCAMM memory module is a more cost-effective standard than the regular DRAM modules used in small PCs and laptops, where traditional systems rely on SO-DIMM form factor DDR4 and DDR5 modules. SOCAMM, however, places low-power LPDDR5X DRAM directly onto the board, delivering far better energy efficiency.
SOCAMM memory modules also offer multiple advantages over LPCAMM, the low-power module being pushed as the next-generation DRAM standard for laptops. SOCAMM features a higher number of I/O channels, the pathways that transfer signals between the DRAM and the rest of the system.
- Read more: NVIDIA Project DIGITS is the World's Smallest AI Supercomputer, 1 Petaflop of AI performance
Regular PC DRAM memory modules have 260 I/O channels and LPCAMM has 644, but the new SOCAMM modules have 694. This means SOCAMM memory modules in future systems should overcome data bottlenecks, paving the way for even faster RAM for AI workloads compared to other memory standards.
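For a rough sense of scale, here is a minimal Python sketch comparing those quoted channel counts; note that the raw channel ratio alone says nothing definitive about real-world bandwidth, which also depends on per-channel signaling speed and width that the report does not specify.

```python
# Back-of-the-envelope comparison of the I/O channel counts quoted in the report.
# These are channel counts only; actual bandwidth depends on per-channel speed/width.
channel_counts = {
    "SO-DIMM DRAM": 260,
    "LPCAMM": 644,
    "SOCAMM": 694,
}

baseline = channel_counts["SO-DIMM DRAM"]
for name, channels in channel_counts.items():
    print(f"{name}: {channels} channels ({channels / baseline:.2f}x the SO-DIMM count)")
```

Run against those figures, SOCAMM works out to roughly 2.7x the channel count of a standard SO-DIMM and around 8% more than LPCAMM.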
At CES 2025 earlier this year, NVIDIA CEO Jensen Huang unveiled the AI PC "Digits," which embodies his vision of democratizing AI. Jensen said at the show: "In the future, engineers, artists, and everyone using computers as tools will need a personal AI supercomputer," while underscoring the transformative potential of memory like SOCAMM in achieving this goal.
NVIDIA's push for SOCAMM is a very strategic move, giving the company its own memory standard. Its push into the AI PC world is going to hit like a wrecking ball, with some very big changes in store for 2025 and 2026.