ChainCatcher news: Chainbase, the decentralized omnichain data network, has announced the open-sourcing of Theia-Llama-3.1-8B, a crypto-focused large language model. Its training data is drawn from CoinMarketCap and detailed research reports on individual projects. Chainbase stated that the model achieves lower perplexity and a higher BERT score than other mainstream models. It is fine-tuned with LoRA, which adapts a large pre-trained model to a specific task at lower compute cost, significantly reducing the model's memory footprint and speeding up inference while maintaining acceptable accuracy.
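
For readers unfamiliar with the technique, the sketch below shows what LoRA fine-tuning of a Llama-style base model typically looks like with Hugging Face `transformers` and `peft`. The base model name, LoRA hyperparameters, and target modules are illustrative assumptions, not Chainbase's actual training configuration.

```python
# Minimal LoRA fine-tuning sketch (assumed setup, not Chainbase's pipeline).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model, TaskType

base_model = "meta-llama/Llama-3.1-8B"  # assumed base checkpoint; requires access

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(
    base_model,
    torch_dtype=torch.bfloat16,  # half precision to cut memory use
    device_map="auto",
)

# LoRA freezes the base weights and trains small low-rank adapter matrices,
# which is what keeps compute and memory requirements low.
lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=16,                  # adapter rank (assumed value)
    lora_alpha=32,         # scaling factor (assumed value)
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # attention projections
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only a small fraction of weights are trainable

# From here, training proceeds with a standard causal-LM loop (e.g. transformers.Trainer)
# on a domain corpus such as project research reports; the resulting adapter weights
# can be merged into the base model or shipped separately.
```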

In addition, Chainbase completed a US$15 million Series A financing round in July this year, led by Tencent Investment Group and Matrix Partners China.