Rumor: Twitter is working on its own LLM with ten thousand GPUs


Twitter is reportedly working on its own large language model (LLM), an AI model similar to OpenAI's GPT and Google's LaMDA. The company is said to have purchased ten thousand GPUs for the purpose, as training such a model requires a great deal of computing power.

The project is still in an early phase, Business Insider reports, citing its own sources. What the model will ultimately be used for is not yet known; possibilities include improving advertising or the social network's search function.

The move is striking, because Twitter chief Elon Musk was one of the initiators of an open letter a few weeks ago in which the signatories called for a six-month pause on major AI development to allow time for regulation to be drawn up. Musk was also previously involved with OpenAI, the company behind the GPT models and ChatGPT.

The GPUs that Nvidia sells for AI training cost roughly $10,000 apiece, which would put a purchase of ten thousand of them at tens of millions of dollars or more. Twitter has not responded to the rumor and has not confirmed buying the GPUs.
