Elon Musk recently held a call with major xAI investors in an attempt to raise tens of billions of dollars, according to CNBC's David Faber, a veteran financial journalist and market news analyst. Musk reportedly framed the raise as a way to place a proper value on the company, although Faber contends that the money could also be spent on xAI's Colossus 2 supercomputer, which is planned to feature one million GPUs.
"Musk is quoted as having said, we are going to 'put a proper value on the company' in reference to xAI and people took that to mean and again, this is speculation, that they will have a large raise, the last raise," said Faber. "Remember that I reported on $6 billion. This one would be far in excess of that. Perhaps you get a raise of something like $25 billion for a value that could purport to be between $150 and $200 billion. That's speculation. But that is kind of the conversation that is going on after this call."
xAI's biggest expenditures are on supercomputer clusters used to train ever more advanced AI models, which the company can then monetize. Currently, xAI operates its Colossus supercomputer with 200,000 Nvidia Hopper H100 and H200 GPUs, but Musk is gearing up to build Colossus 2 with a million GPUs, according to Faber. This is apparently why the company needs money. However, the project likely requires considerably more than Faber speculates.
One million Nvidia Blackwell B100 or B200 GPUs would cost from $50 billion to $62.5 billion, depending on the deal Elon Musk manages to strike with Nvidia and its partners. The remaining infrastructure (buildings, servers, networking gear, cooling, etc.) would cost roughly the same amount, so we are looking at a total of $100 billion to $125 billion. Whether Musk can raise that much in a reasonable amount of time remains to be seen.
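To make the arithmetic behind those figures explicit, here is a minimal back-of-envelope sketch. It assumes a per-GPU price of $50,000 to $62,500 and an infrastructure bill roughly equal to the GPU bill; these are estimates for illustration, not pricing confirmed by xAI or Nvidia.

```python
# Back-of-envelope estimate of the Colossus 2 buildout cost described above.
# Per-GPU prices and the 1:1 GPU-to-infrastructure ratio are assumptions.

GPU_COUNT = 1_000_000                 # planned number of Blackwell GPUs
PRICE_PER_GPU_LOW = 50_000            # assumed low-end price per GPU (USD)
PRICE_PER_GPU_HIGH = 62_500           # assumed high-end price per GPU (USD)

gpu_cost_low = GPU_COUNT * PRICE_PER_GPU_LOW      # $50 billion
gpu_cost_high = GPU_COUNT * PRICE_PER_GPU_HIGH    # $62.5 billion

# Assume the rest of the data center (buildings, servers, networking,
# cooling, power) costs roughly as much as the GPUs themselves.
total_low = gpu_cost_low * 2          # $100 billion
total_high = gpu_cost_high * 2        # $125 billion

print(f"GPUs alone:    ${gpu_cost_low / 1e9:.1f}B - ${gpu_cost_high / 1e9:.1f}B")
print(f"Full buildout: ${total_low / 1e9:.0f}B - ${total_high / 1e9:.0f}B")
```

Under those assumptions, even the speculated $25 billion raise would cover only a fraction of a one-million-GPU buildout.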
But Musk's xAI is certainly not alone in seeking massive funding for next-generation AI data centers. For example, the chief executive of Broadcom, which develops bespoke AI processors for major cloud service providers (CSPs) such as Google and Meta, stated late last year that he expected next-generation AI data centers to house around a million AI processors by 2027. Although some expected the need for compute performance in AI to decrease as companies adopt more efficient ways to train models and run inference, it appears that Musk is not one of those people.