Samsung's new HBM3e 'Shinebolt' memory: 50% perf boost, perfect for NVIDIA Blackwell GPU

Samsung's new 5th-generation HBM3e memory codenamed 'Shinebolt' teased: 24 Gb chips on 12-Hi packages with up to 36GB HBM3e capacities.


Samsung's upcoming 5th generation HBM3e memory product has been codenamed "Shinebolt," with the marketing and development of Samsung's new HBM3e memory catching up to the great work SK hynix has been doing with its own next-gen memory.


According to sources at Business Korea, Samsung Electronics is currently shipping HBM3e prototypes to some of its clients for quality approval (QA) testing. Samsung is reportedly sampling 24 gigabit (Gb) chips in 8-layer (8-Hi) stacks, with a 36GB HBM3e product using 12 layers (12-Hi) to follow soon.
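The capacity figures above follow directly from die density and stack height. A minimal sketch of that arithmetic (the function name is my own; the 24Gb die size and 8-Hi/12-Hi stack heights come from the article):

```python
# Deriving HBM stack capacity from die density and stack height.
# Figures (24 Gb dies, 8-Hi and 12-Hi stacks) come from the article.

def stack_capacity_gb(die_gbit: int, layers: int) -> float:
    """Capacity in gigabytes of an HBM stack built from `layers` dies
    of `die_gbit` gigabits each (8 bits per byte)."""
    return die_gbit * layers / 8

print(stack_capacity_gb(24, 8))   # 8-Hi stack of 24Gb dies -> 24.0 GB
print(stack_capacity_gb(24, 12))  # 12-Hi stack of 24Gb dies -> 36.0 GB
```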

Samsung's new HBM3e "Shinebolt" memory will have a 50% performance boost over HBM3, with a huge 1.228TB/sec (12280MB/sec) memory bandwidth. The future is AI; we all know it, with NVIDIA's crazy-fast AI GPUs needing more and much faster VRAM. This is where Samsung's new HBM3e "Shinebolt" memory shines -- pun not intended, but it just works so well here -- capacity and bandwidth.

In terms of manufacturing, the bonding process business continues to grow and is a key requirement of new HBM memory technologies. Samsung has used the thermal compression non-conductive film (TC-NCF) method since the earliest days of its HBM production. Its competitor -- SK hynix -- uses the advanced mass reflow-molded underfill (MR-MUF) process, which it only adopted when its HBM3 memory went into production.

Samsung has recently put more energy into its HBM business, re-strategizing to better compete with SK hynix. According to sources, Samsung is ramping up development of a "mixed connection" bonding process for HBM, which it says would be "changing the rules of the game".

Uhhh, bring that on, Samsung. I want to see it. The world wants to see the rules of the HBM business being changed, especially in a world so focused on AI that NVIDIA is sold out of its AI GPUs well into 2024. At this rate, a 2024 AI GPU will hit hands in 2026, and a 2025 AI GPU will be delivered in 2028. That's a lot of high-speed VRAM required, and Samsung knows it needs to get serious to meet NVIDIA's strict requirements for its high-end, very expensive AI GPUs.

Lee Jung-bae, president of Samsung Electronics' memory business, said in a recent article titled "Unleashing the Infinite Possibilities of Samsung Memory" posted on the company's newsroom: "We are currently in production of HBM3 and are smoothly developing the next-generation product, HBM3E. We will further expand to produce custom-made HBM for our clients".


Anthony joined the TweakTown team in 2010 and has since reviewed hundreds of graphics cards. Anthony is a long-time PC enthusiast with a passionate disdain for games built around consoles. An FPS gamer since the pre-Quake days, when you were insulted for using a mouse to aim, he has been addicted to gaming and hardware ever since. Working in IT retail for 10 years gave him great experience with custom-built PCs. His addiction to GPU tech is unwavering.

