SK hynix has announced that it has begun mass production of 192GB SOCAMM2 memory modules, the company's next-gen standard built on its 1c nm process, its sixth-generation 10nm-class technology. Based on LPDDR5X DRAM, a low-power technology typically associated with mobile devices, SOCAMM2 is poised to become a primary memory solution for next-gen AI servers.

The reason for the move to SOCAMM2 modules is clear when you look at the benefits: more than double the bandwidth and a 75% improvement in power efficiency compared to conventional RDIMM memory modules. SK hynix notes that this delivers an "optimized solution for high-performance AI operations."
And when it comes to cutting-edge AI operations, SK hynix confirms that these 192GB SOCAMM2 memory modules and other SOCAMM2 products are designed for NVIDIA's upcoming Vera Rubin platform, where they will help mitigate memory bottlenecks during training and inference. SK hynix has been collaborating closely with NVIDIA on the development of its next-gen SOCAMM2 memory modules for this very purpose.
The compact form factor also brings benefits to space utilization in large AI systems and servers, along with improved thermal performance. According to the announcement, the company was able to stabilize mass production ahead of schedule, and will start supplying modules to NVIDIA at the end of the month.
"By supplying the 192GB SOCAMM2, SK hynix has established a new standard for AI memory performance," said Justin Kim, President & Head of AI Infra (CMO) at SK hynix. "We will solidify our position as the most trusted AI memory solution provider, through close collaboration with our global AI customers."