Google has teased photos of NVIDIA's new Blackwell GB200 NVL AI server racks being installed for its AI cloud platform, complete with liquid-cooled GB200 AI GPUs. Check it out, because it's utterly gorgeous:

The official Google Cloud account shared the photo on X, with the US-based search giant showing off its first GB200 NVL-based server, deployed to power its AI cloud platform. Each GB200 superchip features 1 x Grace CPU and 2 x B200 AI GPUs, good for up to 90 TFLOPs of FP64 compute performance per superchip.
Google is using custom GB200 NVL racks here, so we don't know the exact configuration, but for reference, the standard GB200 NVL72 packs 36 x Grace CPUs and 72 x B200 AI GPUs into a single 72-GPU NVLink domain.
- Read more: NVIDIA GB200 NVL72 AI server: 'highest-power-consuming server in HISTORY'
- Read more: NVIDIA 'halting development' of GB200 NVL36x2 AI servers
- Read more: NVIDIA, Foxconn to build Taiwan's fastest supercomputer: with GB200 NVL72
- Read more: Supermicro confirms NVIDIA B200 AI GPU delay: offers liquid-cooled H200 AI GPUs instead
NVIDIA's new Blackwell GB200 NVL72 AI server rack features up to 130TB/sec of NVLink bandwidth, with its 36 Grace CPUs and 72 B200 AI GPUs offering a mind-boggling 3240 TFLOPs of FP64 compute performance and an oh-my-gosh 13.5TB of HBM3E memory.
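If you want a quick back-of-the-envelope check of how those rack-level numbers fall out of the per-superchip figures, here's a minimal Python sketch; the constants simply restate the figures quoted above, not an official NVIDIA spec sheet:

```python
# Back-of-the-envelope check of the GB200 NVL72 rack-level figures quoted above.
# All constants are the article's numbers, assumed here for illustration only.

B200_GPUS_PER_SUPERCHIP = 2        # each GB200 superchip: 1 x Grace CPU + 2 x B200 GPUs
FP64_TFLOPS_PER_SUPERCHIP = 90     # per-superchip FP64 figure quoted above
SUPERCHIPS_PER_NVL72 = 36          # 36 Grace CPUs per NVL72 rack
HBM3E_TB_PER_RACK = 13.5           # rack-level HBM3E figure quoted above

total_gpus = SUPERCHIPS_PER_NVL72 * B200_GPUS_PER_SUPERCHIP            # 72 B200 GPUs
total_fp64_tflops = SUPERCHIPS_PER_NVL72 * FP64_TFLOPS_PER_SUPERCHIP   # 3240 TFLOPs FP64
hbm_per_gpu_gb = HBM3E_TB_PER_RACK * 1000 / total_gpus                 # ~188 GB per GPU

print(f"{total_gpus} GPUs, {total_fp64_tflops} TFLOPs FP64, "
      f"~{hbm_per_gpu_gb:.0f} GB of HBM3E per GPU")
```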
Google isn't the first to get its hands on Blackwell GB200 AI servers, with Foxconn already deploying GB200 NVL72 racks alongside NVIDIA to build Taiwan's fastest supercomputer. Now Google is having some fun with Blackwell, too.