According to Foresight News, decentralized all-chain data network Chainbase has announced the open-sourcing of its cryptocurrency-focused large language model, Theia-Llama-3.1-8B. The model's training data is drawn from CoinMarketCap and detailed project research reports. Chainbase claims the model exhibits lower perplexity and higher BERTScore than other mainstream models. It was efficiently fine-tuned with LoRA (Low-Rank Adaptation), allowing it to adapt to specific tasks at lower computational cost. This fine-tuning significantly reduced the model's memory usage and increased inference speed while maintaining acceptable accuracy.
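To illustrate why LoRA cuts memory and compute, here is a minimal NumPy sketch of the core idea (the shapes and rank below are hypothetical, not Chainbase's actual configuration): the pretrained weight matrix stays frozen, and only two small low-rank factors are trained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes; r is the LoRA rank, with r << d_in, d_out
d_out, d_in, r, alpha = 4096, 4096, 8, 16

W = rng.standard_normal((d_out, d_in))     # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, r))                   # trainable up-projection, init to 0

def lora_forward(x):
    # Output = frozen path + scaled low-rank update (alpha/r) * B @ A @ x
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
y = lora_forward(x)

full_params = W.size          # parameters touched by full fine-tuning
lora_params = A.size + B.size # parameters touched by LoRA
print(f"full fine-tune params: {full_params:,}")
print(f"LoRA params:           {lora_params:,} "
      f"({100 * lora_params / full_params:.2f}% of full)")
```

Because only A and B receive gradients, optimizer state and gradient memory shrink by the same ratio as the parameter count, and the low-rank factors can be merged into W after training so inference pays no extra cost.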

Previously, Foresight News reported that Chainbase completed a $15 million Series A funding round in July this year. The investment was led by Tencent Investment Group and Matrix Partners China.