With big tech companies investing billions in AI data centers, research, and generative AI models and tools, many of them are looking to build their own hardware as an alternative to NVIDIA's chips - while competing with AMD, Intel, and new AI-chip players like Microsoft.
Google is entering the race with its own Arm-based processor designed for the AI market. Like Google's tensor processing units (TPUs), which developers can access only via Google Cloud, the Arm-based CPU, called Axion, will apparently deliver "superior performance to x86 chips."
How much extra performance? According to Google, Axion offers 30% better performance than "general-purpose Arm chips" and 50% better performance than "current-generation x86 chips" as produced by Intel and AMD.
Google plans to offer the new Axion chip to Google Cloud customers, and the Arm-based chip is set to power popular Google services like YouTube Ads "soon." Alongside Axion, Google also announced that its latest TPU v5p accelerator runs in pods of 8,960 chips to smash the performance of the previous generation.
"We're making it easy for customers to bring their existing workloads to Arm," said Mark Lohmeyer, Google Cloud's vice president and general manager of compute and machine learning infrastructure. "Axion is built on open foundations but customers using Arm anywhere can easily adopt Axion without re-architecting or re-writing their apps."
With chip scarcity, energy concerns, and the rising cost of delivering and servicing AI, it's no wonder Google is looking for its own hardware-based solution for offering AI cloud services. It's worth noting that even though NVIDIA controlled 83% of the data center AI chip market in 2023, Google's TPUs held the majority of the remaining market share.