xAI Open-Sources Code for 314-Billion Parameter Grok AI Model But Omits Training Details
-
xAI has open-sourced the base model weights and network architecture of its Grok AI model on GitHub, but without any of the training code. The model, Grok-1, has 314 billion parameters.
-
Grok-1 was trained on a custom training stack, though xAI did not disclose details. The release is licensed under Apache 2.0, which permits commercial use and modification.
-
Last week, Elon Musk said xAI planned to open-source Grok this week. The chatbot version of Grok was previously available only to X Premium+ subscribers.
-
Other companies, including Meta, Google, and AI2, have also open-sourced some of their AI models.
-
Some AI tool makers, such as Perplexity, plan to fine-tune Grok for conversational search and other applications.