❤️ In March this year, Musk's xAI released Grok 1.5, its latest model. Since then, rumors have circulated that Grok 2 is about to launch, but there has been no official word. Is insufficient computing power to blame? Quite possibly: even the billionaire cannot buy enough chips. In April, he said that a shortage of advanced chips was delaying the training and release of Grok 2. He recently stated publicly that training Grok 2 requires about 20,000 Nvidia H100 GPUs (based on the Hopper architecture), and that Grok 3 and later models will require 100,000 H100s. Moreover, xAI plans to link all of these chips together into one enormous machine, which Musk calls a "supercomputing factory."
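For a rough sense of scale, the chip bill can be sketched with simple arithmetic. The sketch below uses the figures quoted in this post (20,000 and 100,000 H100s, roughly $30,000 per chip); these are public estimates, not official xAI numbers, and the helper name is just for illustration.

```python
# Back-of-envelope estimate of xAI's reported GPU spending.
# All figures are rough public estimates quoted in the post,
# not official xAI numbers.

H100_UNIT_PRICE = 30_000  # approximate market price per H100, in USD


def chip_cost(num_gpus: int, unit_price: int = H100_UNIT_PRICE) -> int:
    """Total chip cost only: excludes servers, networking, power,
    and datacenter construction."""
    return num_gpus * unit_price


grok2_cost = chip_cost(20_000)    # Grok 2: ~20,000 H100s
grok3_cost = chip_cost(100_000)   # Grok 3 and later: ~100,000 H100s

print(f"Grok 2 chips:  ${grok2_cost:,}")   # $600,000,000
print(f"Grok 3+ chips: ${grok3_cost:,}")   # $3,000,000,000
```

At $30,000 per chip, 100,000 H100s come to about $3 billion; the post's $2.8 billion figure corresponds to a slightly lower assumed price of around $28,000 per chip.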
Each H100 currently costs about $30,000. Not counting construction costs or other server equipment, the chips alone are estimated at around $2.8 billion. Musk told investors this month that he hopes to have the supercomputer running by the fall of 2025, and that he will be "personally responsible for delivering the supercomputer on time," because it is crucial to the development of LLMs. The supercomputer may be built jointly by xAI and Oracle. In recent years, xAI has rented servers containing about 16,000 H100 chips from Oracle, making it the largest source of orders for those chips. If xAI does not build its own computing capacity, it could end up spending $10 billion on cloud servers over the next few years; in the end, building a "supercomputing factory" is the more cost-effective option. #ArtificialIntelligence $BTC $ETH $BNB