Micron has announced that it is sampling its new 192GB SOCAMM2 memory module to customers, paving the way for broader adoption of low-power memory in AI data centers.

The new 192GB SOCAMM2 memory module is built on Micron's most advanced 1-gamma DRAM process technology, which delivers more than a 20% improvement in power efficiency; the company says this enables further power-design optimization across large data center clusters. These savings become significant in full-rack AI installations, which can contain more than 40TB of CPU-attached low-power DRAM main memory.
The modular design of SOCAMM2 improves serviceability and lays the groundwork for future capacity expansion. The module builds on a five-year collaboration with NVIDIA, through which Micron pioneered the use of low-power server memory in data centers. SOCAMM2 brings the inherent advantages of LPDDR5X memory, namely exceptionally low power consumption and high bandwidth, to the main memory of AI systems.
SOCAMM2 memory has been designed to meet the ever-growing needs of massive-context AI platforms, providing the high data throughput AI workloads require while delivering new levels of energy efficiency, setting a new standard for AI training and inference systems.
Raj Narasimhan, senior vice president and general manager of Micron's Cloud Memory Business Unit, said: "As AI workloads become more complex and demanding, data center servers must achieve increased efficiency, delivering more tokens for every watt of power. Micron's proven leadership in low-power DRAM ensures our SOCAMM2 modules provide the data throughput, energy efficiency, capacity and data center-class quality essential to powering the next generation of AI data center servers".
