A new report from Reuters suggests that OpenAI is considering developing its own in-house chips to power future AI-tool queries.

The creator of the immensely popular ChatGPT is reportedly weighing whether to acquire a chip company or design its own GPUs in-house. The report states that OpenAI CEO Sam Altman has prioritized acquiring more AI chips, which would improve the speed and reliability of its API. Securing a new chip source could also dramatically reduce running costs: an analysis by Stacy Rasgon of Bernstein Research suggests that each ChatGPT query costs OpenAI roughly 4 cents.
Furthermore, ChatGPT reached more than 100 million monthly users within two months of launch, which equates to millions of queries per day. If OpenAI's chatbot were to handle a tenth of Google's daily query volume, the analysis suggests it would require roughly $48.1 billion worth of GPUs up front and about $16 billion worth of chips each year thereafter. NVIDIA currently dominates the AI chip market, and OpenAI uses approximately 10,000 NVIDIA GPUs to power its AI tools.
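
To give a rough sense of how those figures compound, here is a hedged back-of-envelope sketch in Python using the reported 4-cent-per-query estimate; the Google query volume used below is an assumed ballpark for illustration only, not a number from the Reuters report.

```python
# Back-of-envelope estimate of ChatGPT serving costs, based on the reported
# 4-cents-per-query figure. The Google query volume is an assumption for
# illustration, not a figure from the Reuters report.

COST_PER_QUERY_USD = 0.04                # reported Bernstein estimate
ASSUMED_GOOGLE_QUERIES_PER_DAY = 8.5e9   # assumption: rough public ballpark

# Scenario from the report: ChatGPT handling a tenth of Google's daily queries
chatgpt_queries_per_day = ASSUMED_GOOGLE_QUERIES_PER_DAY / 10

daily_cost = chatgpt_queries_per_day * COST_PER_QUERY_USD
annual_cost = daily_cost * 365

print(f"Daily cost:  ${daily_cost / 1e6:,.0f} million")
print(f"Annual cost: ${annual_cost / 1e9:,.1f} billion")
```

Under those assumptions, the per-query cost alone works out to roughly $12 billion a year, which is broadly in line with the scale of the $16 billion annual chip figure cited in the report.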
However, OpenAI may instead opt for Microsoft's own in-house chip, codenamed Athena, which has been in development since 2019. OpenAI has reportedly been testing the chip, but it may still be some years before we see it rolled out.