Chinese AI lab DeepSeek reportedly has access to tens of thousands of NVIDIA H100 AI GPUs for training, according to Scale AI CEO Alexandr Wang.

DeepSeek's R1 is one of the most advanced AI models on the planet, competing with the likes of OpenAI's new o1 and Meta's Llama models. In a new interview with CNBC, Scale AI founder and CEO Alexandr Wang said DeepSeek R1 matches or beats the top-performing AI models in his firm's most challenging AI test.
CNBC's Andrew Ross Sorkin interviewed Wang about Scale AI's new benchmark, "Humanity's Last Exam", which is built from the "hardest questions" submitted by "math, physics, biology, chemistry professors" and tied to the latest research. After running all of the latest AI models through the test, Wang's team found that DeepSeek's new model was "actually the top performing, or roughly on par with the best American models, which are o1".
Asked about the AI competition between the US and China, Wang said: "it has been true for a long time that the United States has been ahead". He did note, however, that DeepSeek's new models challenge that lead, saying it "is symbolic that the Chinese lab releases, you know, an Earth-shattering model on Christmas Day when you know the rest of us are sort of celebrating a holiday".
DeepSeek trains its AI models on NVIDIA's Hopper architecture, using H100 and H200 AI GPUs, despite restrictions put in place by the Biden administration to stop the most powerful AI GPUs from reaching China. It doesn't seem all that hard to get advanced AI chips into the country, though, with Wang telling CNBC: "the reality is yes and no. You know the Chinese labs, they have more H100s than, than people think".
Wang said his "understanding is that DeepSeek has about fifty thousand H100s", which "they can't talk about obviously because it is against the export controls that United States has put in place", adding that "they have more chips than other people expect".
On China's future access to advanced AI chips, Wang said: "But also on a go-forward basis they are going to be limited by the chip controls and the export controls that we have in place".
Marina Zhang, an associate professor at the University of Technology Sydney who studies Chinese innovation, said: "Unlike many Chinese AI firms that rely heavily on access to advanced hardware, DeepSeek has focused on maximizing software-driven resource optimization. DeepSeek has embraced open source methods, pooling collective expertise and fostering collaborative innovation. This approach not only mitigates resource constraints but also accelerates the development of cutting-edge technologies, setting DeepSeek apart from more insular competitors".