
xAI open-sources Grok’s base model, but without any training code

Elon Musk’s xAI has released the base model of Grok as open source on GitHub, but without any training code. The company describes Grok-1 in a blog post as a 314-billion-parameter Mixture-of-Experts model and clarifies that it has not been optimized for any particular application, such as conversational use. Grok-1 was trained on a “custom” stack, though xAI did not disclose specifics. The model is licensed under the Apache License 2.0, which permits commercial use. Musk had previously announced on X that xAI would open-source Grok, which launched in chatbot form last year for Premium+ subscribers of the X social network. Notably, while the chatbot has access to some X data, the open-source model ships without any connection to the social network.
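For readers who want to experiment with the release, the weights can be fetched programmatically. Here is a minimal sketch using the Hugging Face Hub client, assuming the checkpoint is mirrored under the repo ID xai-org/grok-1 in a ckpt-0/ subfolder (both names are assumptions; check xAI’s GitHub README for the canonical download instructions):

```python
# Minimal sketch for fetching the open-sourced Grok-1 weights.
# Assumptions: the checkpoint is mirrored on Hugging Face under
# "xai-org/grok-1" in a "ckpt-0/" subfolder -- verify against xAI's
# GitHub README before running. Requires: pip install huggingface_hub
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="xai-org/grok-1",       # assumed repo ID
    local_dir="checkpoints",        # where the weights land locally
    allow_patterns=["ckpt-0/*"],    # assumed checkpoint layout
)
```

Be warned that a 314-billion-parameter checkpoint weighs in at several hundred gigabytes, so downloading and running it requires substantial disk space and GPU memory.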

Several prominent players have open-sourced some of their AI models, including Meta (LLaMA), Mistral, TII (Falcon), and AI2. In February, Google also released two new open models, Gemma 2B and Gemma 7B. Some AI tool developers are already discussing integrating Grok into their products: Aravind Srinivas, CEO of Perplexity, announced on X that the company plans to optimize Grok for conversational search and offer it to Pro users. Musk, meanwhile, is embroiled in a legal dispute with OpenAI; he filed a lawsuit against the organization earlier this month, alleging a breach of its nonprofit AI mission, and has since criticized OpenAI and Sam Altman on X on multiple occasions.
