Micron's new SOCAMM, a modular LPDDR5X memory solution, was developed in collaboration with NVIDIA to support the GB300 Grace Blackwell Superchip announced at GTC 2025 this week.

The modular SOCAMM memory modules provide over 2.5x the bandwidth of RDIMMs at the same capacity while occupying just one-third of the industry-standard RDIMM footprint. Thanks to LPDDR5X, the new SOCAMM modules use one-third the power of regular DDR5 DIMMs, and SOCAMM placements of 16-die stacks of LPDDR5X memory enable a 128GB memory module: the highest-capacity LPDDR5X memory solution, well suited to training large AI models and serving more concurrent users on inference workloads.
Raj Narasimhan, senior vice president and general manager of Micron's Compute and Networking Business Unit, explained: "AI is driving a paradigm shift in computing, and memory is at the heart of this evolution. Micron's contributions to the NVIDIA Grace Blackwell platform yield significant performance and power-saving benefits for AI training and inference applications. HBM and LP memory solutions help unlock improved computational capabilities for GPUs."
SOCAMM: a new standard for AI memory performance and efficiency

Micron's SOCAMM solution is now in volume production. The modular SOCAMM solution enables accelerated data processing, superior performance, unmatched power efficiency and enhanced serviceability to provide high-capacity memory for increasing AI workload requirements.
Micron SOCAMM is the world's fastest, smallest, lowest-power and highest-capacity modular memory solution, designed to meet the demands of AI servers and data-intensive applications. The new SOCAMM solution enables data centers to get the same compute capacity with better bandwidth, improved power consumption and scaling capabilities, providing infrastructure flexibility.
- Fastest: SOCAMMs provide over 2.5 times higher bandwidth at the same capacity when compared to RDIMMs, allowing faster access to larger training datasets and more complex models, as well as increasing throughput for inference workloads.
- Smallest: At 14x90mm, the innovative SOCAMM form factor occupies one-third of the size of the industry-standard RDIMM form factor, enabling compact, efficient server design.
- Lowest power: Leveraging LPDDR5X memory, SOCAMM products consume one-third the power of standard DDR5 RDIMMs, inflecting the power-performance curve in AI architectures.
- Highest capacity: SOCAMM solutions use four placements of 16-die stacks of LPDDR5X memory to enable a 128GB memory module, offering the highest-capacity LPDDR5X memory solution, which is essential for faster AI model training and more concurrent users on inference workloads.
- Optimized scalability and serviceability: SOCAMM's modular design and innovative stacking technology improve serviceability and aid the design of liquid-cooled servers. The enhanced error-correction feature in Micron's LPDDR5X, combined with data center-focused test flows, provides an optimized memory solution designed for the data center.
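The 128GB figure in the capacity bullet follows directly from the stack arithmetic. As a minimal sanity check, assuming 16Gb (2GB) per LPDDR5X die (a per-die density the announcement does not state):

```python
# Back-of-the-envelope check of the SOCAMM capacity claim.
# Assumption: each LPDDR5X die is 16 Gb (2 GB); the per-die density
# is not given in Micron's announcement.
PLACEMENTS = 4        # LPDDR5X placements per SOCAMM module
DIES_PER_STACK = 16   # 16-die stacks, per the announcement
GBIT_PER_DIE = 16     # assumed die density in gigabits

total_gbit = PLACEMENTS * DIES_PER_STACK * GBIT_PER_DIE
capacity_gb = total_gbit // 8   # 8 bits per byte
print(capacity_gb)  # 128
```

With those assumed 16Gb dies, the four 16-die stacks come out to exactly the 128GB module described above.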