This is what 2.5D aka CoWoS advanced packaging looks like: GPU logic die, HBM, interposer

This is what 2.5D advanced packaging (CoWoS) looks like, with ASE showing off an incredible model in Taiwan of how the components are bound together.

The silicon chip is one of the most amazing things human civilization has created, and semiconductor technology is no longer just about the 'CPU' or the 'GPU'... advanced packaging is now absolutely bleeding-edge, and there's an awesome new way to visualize just how it all fits together. Check this out:

ASE recently showed off a rather awesome model in Taiwan that demonstrates the various components of advanced packaging and how they are bound together through CoWoS (Chip-on-Wafer-on-Substrate). The centerpiece is the XPU or GPU logic die (which does the calculations), while surrounding that chip are multiple stacks of HBM (High Bandwidth Memory), made by SK hynix, Samsung, and Micron.

All of this delicious semiconductor tech is joined together with microbumps onto the copper-colored RDL, while the silver-colored component underneath is the silicon interposer. After that, everything is placed onto the substrate itself. Meanwhile, the likes of TSMC and Samsung are working towards a "radically new" semiconductor packaging technology called panel-level packaging, which you can read more about in the links below.
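To make the stack order concrete, here is a minimal sketch of the layers described above as a simple Python list. The layer names follow the article; everything else is illustrative, not a real product spec.

```python
# Hypothetical sketch of the 2.5D (CoWoS) stack described above, listed
# bottom-up. Layer names follow the article; this is illustrative only.
cowos_stack = [
    "package substrate",               # the base everything is placed onto
    "silicon interposer",              # the silver-colored routing layer
    "RDL (redistribution layer)",      # the copper-colored wiring layer
    "microbumps",                      # tiny joints bonding the dies to the RDL
    "XPU/GPU logic die + HBM stacks",  # compute die flanked by HBM from SK hynix, Samsung, Micron
]

for level, layer in enumerate(cowos_stack):
    print(f"{level}: {layer}")
```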

In a breakdown of the 2.5D aka CoWoS advanced packaging technique, Clark Tang explains that this complex process is used to create a combined chip that can access its different capabilities at a very fast rate. Most people think it's just the GPU doing the work -- but the HBM is just as important, if not more so, than an ultra-fast AI chip -- just like super-fast GDDR6, GDDR6X, and the upcoming GDDR7 are to consumer graphics cards. The faster the memory and the wider the bus, the better for high-res, high-FPS gaming.
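As a rough illustration of that compute-versus-memory balance, here is a minimal roofline-style sketch in Python. The peak compute and bandwidth figures are assumptions for illustration, not specs for any particular GPU.

```python
# A minimal roofline-style sketch of why memory bandwidth matters as much as
# raw compute. The peak numbers below are illustrative assumptions only.
peak_flops = 1.0e15          # assumed peak compute: 1 PFLOP/s
peak_bandwidth = 3.0e12      # assumed peak HBM bandwidth: 3 TB/s

def attainable_flops(arithmetic_intensity: float) -> float:
    """Attainable throughput given FLOPs performed per byte moved to/from HBM."""
    return min(peak_flops, peak_bandwidth * arithmetic_intensity)

# Low-intensity kernels are limited by HBM, not by the logic die:
for intensity in (1, 10, 100, 1000):   # FLOPs per byte
    print(f"{intensity:5d} FLOP/byte -> {attainable_flops(intensity) / 1e12:8.1f} TFLOP/s")
```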

HBM3 is the leading memory for AI GPUs right now, with HBM3E debuting inside NVIDIA's beefed-up Hopper H200 AI GPU and its new Blackwell AI GPUs. HBM4 and HBM4E will debut in 2025 and 2026, respectively, starting with NVIDIA's next-gen Rubin R100 AI GPU.

AI workloads froth over high memory bandwidth, with the system constrained by how fast the XPU can read and write its calculations to memory.
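A quick back-of-the-envelope sketch shows how that read/write limit dominates; the model size and bandwidth figures below are assumed for illustration, not tied to any specific product.

```python
# Back-of-the-envelope sketch of the read/write constraint described above:
# how long it takes just to stream a set of weights through HBM once.
weights_bytes = 140e9        # assume ~140 GB of model weights resident in HBM
hbm_bandwidth = 4.8e12       # assume ~4.8 TB/s aggregate HBM bandwidth

seconds_per_pass = weights_bytes / hbm_bandwidth
print(f"One full pass over the weights takes about {seconds_per_pass * 1e3:.1f} ms,")
print("so extra compute on the logic die doesn't help once HBM is saturated.")
```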
