Amazon secures $38B deal to provide OpenAI access to NVIDIA GB200, GB300 AI servers

Amazon Web Services (AWS) and OpenAI ink new $38 billion deal to host OpenAI's new NVIDIA GB200 and GB300 AI servers to be deployed in 2026.

TL;DR: Amazon Web Services (AWS) and OpenAI have formed a multi-year, $38 billion partnership, providing OpenAI access to NVIDIA GB200 and GB300 AI servers hosted on AWS EC2 UltraServers. This collaboration supports OpenAI's goal to expand AI infrastructure, enhancing large-scale model training and advanced AI deployment by 2026.

Amazon and OpenAI have just announced a new partnership, where Amazon Web Services (AWS) will be one of the primary compute providers to the AI startup, hosting NVIDIA's GB200 and GB300 AI servers.

The multi-year partnership, worth around $38 billion and spanning seven years, will give OpenAI access to a pool of Amazon's NVIDIA GB200 and GB300 AI servers. Crucially, all of the planned capacity is expected to be deployed and in use by the end of 2026.

This means OpenAI will have even more access to the best NVIDIA AI GPU hardware by next year, helping it continue to expand its AI operations and services. OpenAI has been signing new deals left, right, and center with US tech giants including NVIDIA, AMD, Microsoft, Broadcom, and Oracle, and Amazon now joins that growing list.

The new AI servers OpenAI will have access to are deployed across Amazon EC2 UltraServers, linked together by a fast, high-capacity network built to handle both inference and large-scale model training. AWS currently hosts massive clusters with over 500,000 chips, one of the key factors in OpenAI's decision.


OpenAI CEO Sam Altman has said that the ChatGPT creator plans to invest $1.4 trillion -- yes, with a "T" -- to build 30 gigawatts of computing capacity, roughly the same power consumption as around 25 million US households. OpenAI has so far inked deals with NVIDIA for 10 GW, AMD for 6 GW, and Oracle for 4.5 GW of compute capacity as it steps toward its goal of a 30 GW AI infrastructure network.

OpenAI co-founder and CEO Sam Altman said: "Scaling frontier AI requires massive, reliable compute. Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone."

Matt Garman, CEO of AWS, added: "As OpenAI continues to push the boundaries of what's possible, AWS's best-in-class infrastructure will serve as a backbone for their AI ambitions. The breadth and immediate availability of optimized compute demonstrates why AWS is uniquely positioned to support OpenAI's vast AI workloads."

News Source: aboutamazon.com


Anthony joined TweakTown in 2010 and has since reviewed hundreds of tech products. Anthony is a long-time PC enthusiast with a passionate dislike for games built around consoles. An FPS gamer since the pre-Quake days, when you were insulted for using a mouse to aim, he has been addicted to gaming and hardware ever since. Working in IT retail for 10 years gave him great experience with custom-built PCs. His addiction to GPU tech is unwavering, and he has recently taken a keen interest in artificial intelligence (AI) hardware.
