OpenAI Burns $700,000 a Day to Run ChatGPT

The AI company OpenAI reportedly spends an estimated $700,000 a day to keep ChatGPT running. With the newer GPT-4 model, that figure could climb far higher.

As The Information reports, simply keeping the ChatGPT AI tool in operation consumes vast sums of money. According to calculations by the market research firm Semi-Analysis, OpenAI has had to spend up to $700,000 a day to run the necessary high-performance infrastructure.
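To put that daily figure in perspective, here is a quick back-of-envelope calculation. Only the $700,000-per-day estimate comes from the report; the query volume used below is a purely hypothetical assumption chosen to illustrate the scale of per-query cost.

```python
# Back-of-envelope arithmetic only. The daily cost comes from the
# Semi-Analysis estimate cited above; the traffic figure is hypothetical.
DAILY_COST_USD = 700_000

annual_cost = DAILY_COST_USD * 365
print(f"Annualized: ~${annual_cost / 1e6:.0f} million")  # prints ~$256 million per year

# Hypothetical query volume, purely to show the order of magnitude per query:
assumed_queries_per_day = 10_000_000
print(f"Cost per query at that volume: ~${DAILY_COST_USD / assumed_queries_per_day:.3f}")
```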

Semi-Analysis attributes “most of these costs” to the “expensive servers they require.” Chief analyst Dylan Patel even told Business Insider that he assumes the true costs are higher still.

That is because his company’s calculations were based on GPT-3, the earlier model that powers the free version of ChatGPT. OpenAI‘s latest language model, GPT-4, would cost even more to run, according to Patel.

The problem is typical of AI applications of all kinds, because they run on expensive specialized chips that consume enormous amounts of power. That is the main reason why AI companies such as Google, Amazon, and Microsoft, OpenAI’s major backer, are investing heavily in developing their own chips.

While Google and Amazon already use their own chips in production, Microsoft has been working on its chip, known internally as “Athena,” since 2019, and development is still ongoing. According to The Information, it is already said to be available to a small group of Microsoft and OpenAI employees.

“Athena, if competitive, could reduce the cost per chip by a third compared to Nvidia’s offerings,” Patel says confidently. At the same time, Microsoft does not intend to entirely replace the Nvidia chips it has relied on so far. After all, the two companies recently agreed on an extensive, multi-year AI partnership.

In general, the performance of AI models has been shown to correlate with their size, and that size in turn translates directly into operating costs: the more parameters a model has, the more compute every query requires, as the rough sketch below illustrates.
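The sketch uses the widely cited rule of thumb that a dense transformer needs roughly two FLOPs per parameter for each generated token. The compute-per-dollar figure is a purely illustrative assumption; the point is only that, under this rule of thumb, the cost per token grows roughly linearly with parameter count.

```python
# Rough sketch of why operating cost tracks model size.
# Rule of thumb: dense transformer inference costs ~2 * N FLOPs per generated
# token (N = parameter count). The FLOPs-per-dollar value is an assumption.

def tokens_per_dollar(params: float, flops_per_dollar: float = 1e15) -> float:
    """Tokens one dollar of compute buys, ignoring memory and latency overheads."""
    flops_per_token = 2 * params  # rule-of-thumb inference cost per token
    return flops_per_dollar / flops_per_token

for name, n_params in [("GPT-3-sized model (175B)", 175e9), ("~1T-parameter model", 1e12)]:
    print(f"{name}: ~{tokens_per_dollar(n_params):,.0f} tokens per dollar")
```

Under these illustrative assumptions, a trillion-parameter model serves only a fraction of the tokens per dollar that a 175-billion-parameter model does, which is the scaling relationship the paragraph above describes.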

OpenAI CEO Sam Altman believes that “we are at the end of the era” of “giant AI models.” Large language models such as ChatGPT, he said, have reached a point of diminishing returns due to their enormous size.

With a reported size of more than a trillion parameters, OpenAI‘s latest GPT-4 model may already have reached the limit of practical scalability. That, at any rate, is the conclusion OpenAI has drawn in its own analysis.

That the high operating costs could push OpenAI into financial difficulties, however, can largely be ruled out given the massive success of ChatGPT.
