Meta releases 'world's largest' AI model trained on $400 million worth of GPUs

Meta has released what it's calling the 'world's largest' open-source AI model, trained on more than 15 trillion tokens using 16,000 NVIDIA GPUs.

Tech and Science Editor


Meta has announced its release of Llama 3.1 405B, which is what the company is describing as the "world's largest" open-source large language model.


Meta explains via a new blog post that Llama 3.1 405B is "in a class of its own" with "state-of-the-art capabilities" that rival the leading AI models on the market in general knowledge, steerability, math, multilingual translation, and tool use. Meta directly compares Llama 3.1 405B against competing models, such as OpenAI's various GPT models, noting that the newly released model was trained on more than 15 trillion tokens. A token is a small chunk of text, such as a word or word fragment, that the model reads and generates.

To train this 405-billion-parameter model, Meta used 16,000 NVIDIA H100 GPUs, which cost roughly $25,000 each. That puts the hardware behind the model at around $400 million worth of NVIDIA GPUs; training consumed 30.84 million GPU hours and produced approximately 11,390 tons of CO2.
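The back-of-the-envelope arithmetic behind the headline figure can be sketched as follows. These are the approximate numbers reported in the article, not Meta's exact accounting:

```python
# Approximate figures from the article (not Meta's official cost breakdown).
gpus = 16_000            # NVIDIA H100 GPUs used for training
price_per_gpu = 25_000   # rough unit price in USD
gpu_hours = 30.84e6      # total GPU hours reported

hardware_cost = gpus * price_per_gpu  # total value of the GPUs in USD
hours_per_gpu = gpu_hours / gpus      # average training time per GPU

print(f"Hardware cost: ${hardware_cost:,}")            # $400,000,000
print(f"Per-GPU time:  {hours_per_gpu:,.1f} hours "
      f"(~{hours_per_gpu / 24:.0f} days)")
```

Dividing the reported GPU hours by the GPU count suggests each card ran for roughly 80 days, which lines up with the multi-month training runs typical of frontier-scale models.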

For comparison, Meta's latest AI model has 405 billion parameters, while OpenAI's GPT-4 is reportedly built from models of around 220 billion parameters each. However, reports indicate GPT-4 may essentially be eight AI models hiding under one trench coat, for a combined total of roughly 1.7 trillion parameters. These figures are rumored and have never been confirmed by OpenAI.
