Near Protocol has announced plans to create the world's largest open-source artificial intelligence model. The platform unveiled the initiative on the opening day of the Redacted conference in Bangkok, Thailand.
The planned model, with 1.4 trillion parameters, would be larger than Meta's open-source Llama. Near Protocol said the effort will incorporate crowdsourced research from the researchers and participants of the new Near AI center, and that participants can join the training of a smaller 500-million-parameter model starting November 10.
Near Protocol presents its artificial intelligence model plan
Near Protocol said the project will grow in scale across seven models, with only the best researchers and participants retained at each stage. The model will also be monetized while preserving privacy: Near Protocol intends to run it in an encrypted trusted execution environment and to reward participants in a way that encourages ongoing updates throughout the process.
Near Protocol co-founder Illia Polosukhin said at the Bangkok event that the firm intends to fund the costly training through token sales. He estimated the model will cost about $160 million to train, a large sum, but not one that cannot be raised in the crypto market.
He explained that token holders will recoup their investments through the inference revenue generated when the model is used. "So we have a business model, a way to monetize it, a way to raise funds, and a way to launch it into a cycle. And therefore, people can also reinvest back into the next model."
Near may well be able to raise the funds, considering that Polosukhin was one of the co-authors of the transformer research that underpins ChatGPT. Co-founder Alex Skidanov also worked at OpenAI before ChatGPT's breakout release in 2022. Skidanov, who is now one of Near's leaders, said the task is feasible but will come with challenges.
Decentralized AI to address privacy issues
The firm will have to pour considerable resources into the project to realize its ambitions. Training a model of this size would normally mean gathering a huge number of graphics processors in one place, which runs counter to a decentralized approach; distributing the computation across a decentralized network instead would require technology that does not yet exist.
Such distributed training also requires fast interconnects, but Skidanov added that DeepMind's research shows it is possible. Polosukhin noted that he has yet to engage with similar projects such as the Artificial Superintelligence Alliance, but said he would be glad if both efforts follow the same path.
He said that whatever happens, decentralized artificial intelligence must prevail so that everyone can benefit from it. Guest speaker Edward Snowden touched on the same theme, discussing how centralized AI could turn the world into a vast surveillance state.
"This is probably the most important technology right now and possibly in the future. The reality is that if AI is controlled by one company, we will effectively be doing everything that company says," he explained.
"If all AI and, essentially, the entire economy are managed by one company, then at this stage there will be no decentralization. So, essentially, this is the only way to keep Web3 relevant from a philosophical point of view if we have AI that also follows the same principles," Snowden added.