Micron has confirmed it has begun shipping its fastest-ever DDR5 registered dual in-line memory modules (RDIMMs) to customers, featuring 256GB capacities and speeds of up to 9,200 MT/s. At more than 40% faster than DDR5 RDIMMs currently in volume production, these modules represent a meaningful jump in what server memory can deliver.
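Those headline numbers can be sanity-checked with back-of-the-envelope arithmetic. The sketch below assumes the "currently in volume production" baseline is a 6,400 MT/s DDR5 RDIMM (Micron does not name the baseline) and uses the standard 64-bit DDR5 DIMM data bus:

```python
# Back-of-the-envelope check of the headline figures.
# Assumption (not stated by Micron): the volume-production
# baseline is a 6,400 MT/s DDR5 RDIMM.

NEW_RATE_MTS = 9_200        # new module, megatransfers per second
BASELINE_MTS = 6_400        # assumed baseline rate
BYTES_PER_TRANSFER = 8      # DDR5 DIMM: 64-bit data bus = 8 bytes

speedup = NEW_RATE_MTS / BASELINE_MTS - 1
peak_bw_gbs = NEW_RATE_MTS * 1e6 * BYTES_PER_TRANSFER / 1e9

print(f"speedup over baseline: {speedup:.1%}")        # 43.8% -- "more than 40%"
print(f"peak bandwidth per DIMM: {peak_bw_gbs:.1f} GB/s")  # 73.6 GB/s
```

Against that assumed baseline, the roughly 44% uplift is consistent with Micron's "more than 40% faster" claim, and peak theoretical bandwidth per module works out to about 73.6 GB/s.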
The new modules are built on Micron's leading-edge 1-gamma process technology and use advanced 3D stacking alongside through-silicon-via packaging to meet those capacity and speed targets. Micron also claims that a single 256GB module delivers 40% power savings compared to running two 128GB modules together to achieve the same capacity, providing a significant efficiency gain for data center operators running thousands of servers.
With its new 256GB DDR5 RDIMM modules, Micron aims to deliver higher bandwidth and greater DRAM density by maximizing memory capacity per CPU socket. This helps provide the performance and power efficiency required for high-end AI servers.

The ongoing AI boom has created strong demand for larger enterprise memory capacities, higher bandwidth, and better power efficiency, and these modules are designed to address all three. That allows server architects, hyperscale operators, and platform partners to maximize memory capacity per socket while staying within the thermal and power limits of modern data center infrastructure.
Micron is also working with key ecosystem partners to validate these modules across current and next-generation server platforms, ensuring broad compatibility before volume production ramps up later this year. Samples are already with server ecosystem enablers, so the groundwork is being laid now.
The timing makes sense. Last week, Micron's CEO said AI memory demand is still in its "first innings," and with inference workloads and token demand continuing to rise, the appetite for higher-capacity, higher-performance memory is only going in one direction. Accordingly, we expect Micron to begin producing these memory modules in high volumes later this year.
