
A Crypto Project Just Trained an AI Model From Scratch
AI Summary
AI is poised to be a long-term dominant force, and this will extend to cryptocurrency. However, most crypto AI projects are widely regarded as scams. A notable exception is Bittensor, which has trained a large language model, codenamed Covenant 72B, from scratch on one of its subnets. This process, known as pre-training, involves training a model on raw text data to predict the next token. The "72B" in the model's name refers to its 72 billion parameters, a measure of the model's size and capability.
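As a rough illustration of what pre-training means in practice, the sketch below runs a single next-token prediction step on placeholder data. The tiny model, sizes, and random tokens are assumptions made purely for illustration; they are not the Covenant 72B architecture or its actual training pipeline.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical sketch of the pre-training objective: predict the next token
# in raw text. The model, sizes, and data here are placeholders, not the
# actual Covenant 72B architecture or corpus.
vocab_size, d_model = 32_000, 512

model = nn.Sequential(
    nn.Embedding(vocab_size, d_model),   # token id -> vector
    nn.Linear(d_model, vocab_size),      # vector -> next-token logits
)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

# A random token batch stands in for tokenized raw text.
tokens = torch.randint(0, vocab_size, (8, 129))
inputs, targets = tokens[:, :-1], tokens[:, 1:]   # shift by one position

logits = model(inputs)
loss = F.cross_entropy(logits.reshape(-1, vocab_size), targets.reshape(-1))
loss.backward()        # gradients flow back through the whole model
optimizer.step()       # one pre-training step
```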
Decentralized model pre-training is not entirely new. For instance, another project previously trained a 40 billion parameter model using a decentralized network. However, that model was less competitive, and the training run required participants to be pre-selected.
The key innovation in Bittensor's approach is its fully permissionless and collaborative nature. Instead of a single data center, Bittensor uses a distributed network in which individuals worldwide connect their computers to train the model. This method aims for both efficiency and a more powerful final model. Unlike previous attempts that required whitelisting participants, Bittensor's subnet 3 allowed anyone to contribute to the training process.
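To make the idea of collaborative training concrete, here is a minimal sketch in which many contributors compute updates locally and have them averaged into a shared model. This is an assumed, simplified data-parallel scheme for illustration only, not Bittensor's actual subnet protocol, and it omits the validation and incentive mechanisms that score and reward contributions.

```python
import torch

# Hypothetical sketch, not Bittensor's actual protocol: independent
# contributors each compute an update on their own machine, and the updates
# are averaged into a shared model. The scoring/reward layer is omitted.

def local_update(shared_weights: torch.Tensor, local_batch: torch.Tensor) -> torch.Tensor:
    """One contributor's step: gradient of a toy loss on its own data."""
    w = shared_weights.clone().requires_grad_(True)
    loss = ((local_batch @ w) ** 2).mean()   # placeholder loss for illustration
    loss.backward()
    return w.grad

shared = torch.randn(16)                                  # current shared model weights
contributors = [torch.randn(32, 16) for _ in range(5)]    # anyone can join with data and compute

grads = [local_update(shared, batch) for batch in contributors]
shared = shared - 0.01 * torch.stack(grads).mean(dim=0)   # aggregate updates and apply
```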
The Covenant 72B model was compared to Meta's Llama 2 70B, a model released two and a half years ago. While not state-of-the-art compared to current models like GPT-5, the result demonstrates that decentralized model training is technically feasible and can be competitive with older, established models.