Entrepreneurs, programmers, and companies can now use xAI's Grok, though with a caveat. Elon Musk's artificial intelligence startup xAI has released its AI model, Grok, to the open-source community, but with a small tweak.
In a blog post on Sunday, xAI announced that Grok is now available for anyone to use in their applications, including commercial ones. “We are releasing the base model weights and network architecture of Grok-1, our large language model,” the company stated, adding, “Grok-1 is a 314 billion parameter Mixture-of-Experts model trained from scratch by xAI.”
According to the blog post, Grok was open-sourced under the Apache License 2.0, which allows commercial use, modification, and distribution. However, the license grants no trademark rights, and the software is provided without warranty or liability protection. The code for Grok can be downloaded from its GitHub page or via a torrent link.
Grok was originally released in November 2023 as a proprietary model, accessible only through Musk’s social network X, formerly Twitter. The open-source release, however, does not include the full corpus of its training data or any connection to the real-time information available on X.
The blog post stated, “The base model trained on a large amount of text data, not fine-tuned for any particular task.”
What Are the Key Announcements Made in the Grok-1 Open Release?
We are releasing the weights and architecture of our 314 billion parameter Mixture-of-Experts model, Grok-1.
This is the raw base model checkpoint from the Grok-1 pre-training phase, which concluded in October 2023. This means that the model is not fine-tuned for any specific application, such as dialogue.
We are releasing the weights and the architecture under the Apache 2.0 license.
To get started with using the model, follow the instructions at github.com/xai-org/grok
Model Details:
- The base model is trained on a large amount of text data, not fine-tuned for any particular task.
- 314B parameter Mixture-of-Experts model with 25% of the weights active on a given token.
- Trained from scratch by xAI using a custom training stack on top of JAX and Rust in October 2023.
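The “25% of weights active per token” figure comes from Mixture-of-Experts routing: each token is sent to only a few of the model’s expert sub-networks rather than all of them. The toy sketch below illustrates the idea with top-2 routing over 8 experts (2/8 = 25%); the sizes, names, and ReLU experts here are illustrative assumptions, not xAI’s actual implementation, which is released at the GitHub link above.

```python
import numpy as np

# Illustrative Mixture-of-Experts layer with top-k routing.
# Assumes 8 experts with 2 active per token, which yields the
# "25% of weights active" figure; all dimensions are toy-sized.
rng = np.random.default_rng(0)

NUM_EXPERTS = 8
TOP_K = 2
D_MODEL = 16   # toy hidden size
D_FF = 32      # toy expert feed-forward size

# Each expert is a tiny feed-forward network: W_in then W_out.
experts = [
    (rng.standard_normal((D_MODEL, D_FF)) * 0.1,
     rng.standard_normal((D_FF, D_MODEL)) * 0.1)
    for _ in range(NUM_EXPERTS)
]
router = rng.standard_normal((D_MODEL, NUM_EXPERTS)) * 0.1  # gating weights

def moe_layer(x):
    """Route one token vector x through only its top-k experts."""
    logits = x @ router                      # one routing score per expert
    top = np.argsort(logits)[-TOP_K:]        # indices of the k best experts
    gates = np.exp(logits[top])
    gates /= gates.sum()                     # softmax over selected experts only
    out = np.zeros_like(x)
    for gate, idx in zip(gates, top):
        w_in, w_out = experts[idx]
        out += gate * (np.maximum(x @ w_in, 0.0) @ w_out)  # ReLU expert
    return out, top

token = rng.standard_normal(D_MODEL)
y, chosen = moe_layer(token)
print(f"experts used: {sorted(chosen.tolist())} of {NUM_EXPERTS} "
      f"({TOP_K / NUM_EXPERTS:.0%} of expert weights active)")
```

Because only 2 of 8 expert blocks run per token, the compute per token is a fraction of what a dense 314B-parameter model would require, even though all weights must still be stored.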