According to Foresight News, Musk announced on Twitter that xAI's Grok 2 model will launch in August and will focus on "cleaning" LLMs from internet training data.

Large language models (LLMs) are deep learning models pre-trained on vast amounts of data. The underlying transformer architecture is a neural network built from encoder and decoder blocks with self-attention. The encoder and decoder extract meaning from a sequence of text and capture the relationships among the words and phrases in it.
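As an illustration, the sketch below implements the scaled dot-product self-attention at the heart of those encoder and decoder blocks in plain NumPy. The function name, dimensions, and random weights are illustrative assumptions, not the configuration of Grok or any other specific model.

```python
# Minimal sketch of scaled dot-product self-attention (single head).
# Each output vector is a relevance-weighted mix of every token in the sequence,
# which is how the transformer captures relationships between words and phrases.
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """x: (seq_len, d_model) token embeddings; w_q/w_k/w_v: (d_model, d_head) projections."""
    q = x @ w_q                                       # queries: what each token looks for
    k = x @ w_k                                       # keys: what each token offers
    v = x @ w_v                                       # values: the information to be mixed
    scores = q @ k.T / np.sqrt(k.shape[-1])           # pairwise relevance, scaled
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over the sequence
    return weights @ v                                # contextualized vector per token

# Toy usage: 4 tokens, 8-dimensional embeddings, one 8-dimensional attention head.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
w_q, w_k, w_v = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)         # (4, 8)
```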

Transformer LLMs are capable of unsupervised training, though a more precise description is that they perform self-supervised learning. Through this process, the model learns basic grammar, language, and general knowledge.
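The sketch below, assuming a naive whitespace tokenizer, shows what makes that training self-supervised: the labels are simply the next tokens in the raw text itself, so no human annotation is required.

```python
# Minimal sketch of how self-supervised pre-training derives its own labels
# from raw text via next-token prediction. The sentence and the whitespace
# tokenizer are purely illustrative.
text = "large language models learn patterns from raw text"
tokens = text.split()

# Build (context, next-token) training pairs directly from the sequence.
pairs = [(tokens[:i], tokens[i]) for i in range(1, len(tokens))]

for context, target in pairs:
    print(f"input: {context}  ->  predict: {target}")
```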