Elon Musk just open-sourced Grok: what you need to know (March 2024)

Elon Musk just open-sourced Grok, making it the largest open-source large language model to date, but it might not be what you expect.


With 314 billion parameters, it's nearly five times larger than Meta's Llama 2. Training a model at this scale costs tens to hundreds of millions of dollars, which makes the release a significant achievement for the open-source community and paves the way for a genuine open-source ChatGPT competitor.

However, there's a catch. The released model is a base model trained to encapsulate as much knowledge as possible, similar to the GPT-3 that OpenAI released in 2020, but it isn't fine-tuned for chat. It's not the fine-tuned model currently deployed on X (formerly Twitter). So despite its scale, it may not give satisfactory answers to questions.

This means we're still several million dollars of fine-tuning away from a truly open-source ChatGPT competitor.

Nonetheless, the challenge is set, and it's exciting to see what the open-source community will develop next.

Yours, Pieter
