India, Feb. 18 -- Elon Musk's "scary smart" Grok 3 was developed on an accelerated schedule thanks to xAI's Colossus supercomputer, which the company says was built in eight months. The Tesla boss called the model the "smartest AI on Earth."

Grok 3 was trained on 100,000 Nvidia H100 GPUs, consuming 200 million GPU-hours, ten times the compute used for Grok 2. That additional computational capacity allows the model to work through larger datasets in less time while delivering better accuracy.
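
Taken at face value, those figures are easy to sanity-check. The short sketch below is an illustration only, not anything published by xAI; it simply assumes the reported 100,000 GPUs and 200 million GPU-hours and works out what they imply per GPU and for Grok 2's budget:

    # Back-of-the-envelope check of the reported Grok 3 training figures (illustrative only)
    gpu_count = 100_000              # Nvidia H100 GPUs in Colossus, per the article
    total_gpu_hours = 200_000_000    # reported Grok 3 training compute

    hours_per_gpu = total_gpu_hours / gpu_count   # 2,000 hours per GPU
    days_per_gpu = hours_per_gpu / 24             # roughly 83 days of continuous training
    grok2_gpu_hours = total_gpu_hours / 10        # article: Grok 3 used ten times Grok 2's compute

    print(f"Per GPU: {hours_per_gpu:,.0f} hours (~{days_per_gpu:.0f} days)")
    print(f"Implied Grok 2 budget: {grok2_gpu_hours:,.0f} GPU-hours")

In other words, the reported total corresponds to roughly 83 days of every GPU in the cluster running around the clock, and it implies a Grok 2 budget of about 20 million GPU-hours.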

Beyond the hardware gains, xAI improved Grok 3's capabilities by reworking its training process. The updated model uses synthetic datasets, self-correction, and reinforcement learning to ...