👀 Way ahead! Musk open-sources the Grok-1 model code and weight files

Early this morning, xAI, Musk's large-model company, announced that it has officially open-sourced the 314-billion-parameter Mixture-of-Experts (MoE) model "Grok-1", including the model weights and network architecture. Good news for the AI sector!

This also makes Grok-1 the open-source large language model with the largest parameter count to date. The code is available on GitHub, and the weight files total 318.24 GB. I plan to play with it soon.
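If you just want to grab the weights ahead of time, here is a minimal Python sketch using the huggingface_hub client. The repo id xai-org/grok-1 and the ckpt-0 folder name are my assumptions about where the checkpoint is mirrored; verify against xAI's official GitHub repo before running anything.

```python
# Minimal sketch: download the Grok-1 checkpoint from a Hugging Face mirror.
# Assumptions: the weights live under repo id "xai-org/grok-1" in a folder
# named "ckpt-0" -- check xAI's GitHub README for the authoritative links.
from huggingface_hub import snapshot_download

checkpoint_dir = snapshot_download(
    repo_id="xai-org/grok-1",      # assumed mirror of the released weights
    local_dir="./checkpoints",     # the files total ~318 GB on disk
    allow_patterns=["ckpt-0/*"],   # only fetch the checkpoint shards
)
print(f"Checkpoint downloaded to {checkpoint_dir}")
```

Expect this to take a while: it is a single ~318 GB checkpoint, so make sure the target disk has room.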

Previously, Musk publicly dissed OpenAI on Twitter for not open-sourcing its large models, and then turned around and open-sourced his own Grok. Keep in mind that Grok had been a paid-subscription feature on Twitter (X); now that it is open source, you can deploy your own copy of the model locally, provided your machine has enough GPU memory to hold a 314B-parameter checkpoint.
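As a quick preview of what local deployment involves (ahead of the full tutorial), here is a rough Python sketch that drives xAI's published sample code. The repo URL is the official one, but the checkpoints/ layout and what run.py does are assumptions based on the public repo; the inference code is JAX-based and needs serious multi-GPU hardware, not a laptop.

```python
# Rough sketch of running xAI's sample inference code end to end.
# Assumption: run.py expects the ckpt-0 directory under grok-1/checkpoints/.
import subprocess

# Grab xAI's open-source inference code (JAX-based).
subprocess.run(
    ["git", "clone", "https://github.com/xai-org/grok-1.git"], check=True
)

# Install the repo's Python dependencies.
subprocess.run(
    ["pip", "install", "-r", "requirements.txt"], cwd="grok-1", check=True
)

# run.py loads the checkpoint and prints a sample completion; the 314B MoE
# model will not fit on a single consumer GPU.
subprocess.run(["python", "run.py"], cwd="grok-1", check=True)
```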

🔶 Like and follow. If there is demand, I will put together a tutorial later showing how to deploy your own Grok assistant.

#TrendingTopics #grok $FET $AGIX $WLD