Microsoft invested hundreds of millions of dollars in the supercomputer behind ChatGPT


Microsoft has invested hundreds of millions of dollars in a supercomputer to train ChatGPT, OpenAI’s chatbot. For this, the company used tens of thousands of Nvidia A100 GPUs.

It also had to find a new way of placing servers in racks to prevent failures, Bloomberg reports. Building the Azure supercomputer was part of the 2019 deal between Microsoft and OpenAI, in which Microsoft invested a billion dollars in the company. The machine, with its tens of thousands of Nvidia A100 GPUs, cost hundreds of millions of dollars.

The system started as a custom computer for OpenAI, but it is now a general-purpose machine. Now that the ChatGPT model has been trained, it runs on other servers around the world to generate responses. Microsoft also used the supercomputer for the new Bing, and other companies can now use it too. In addition, Microsoft is building the next version of the supercomputer based on Nvidia H100 GPUs.

Microsoft supercomputer for training AI
