Project update
0G Labs Begins Public Retrain of 107B Parameter AI Model

While the market celebrated Bittensor’s 72B model this week, 0G Labs had already trained a 107B-parameter model eight months earlier, over standard 1 Gbps internet connections.


Jensen Huang (NVIDIA CEO) recently validated decentralized AI on the All-In Podcast, but 0G had proven frontier-scale decentralized training possible months before that. Bittensor builds a model; 0G builds the infrastructure.


The team has now begun publicly retraining the model with full transparency and an open-source commitment. At 107B parameters, 48% larger than Bittensor’s Covenant-72B, it is the largest decentralized AI model on record.


DiLoCoX-107B runs on 0G’s full-stack blockchain for AI agents, including an EVM-compatible L1 chain, decentralized compute, distributed storage, and a data availability layer that is 50,000x faster and 100x cheaper than Ethereum's DA.
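The post does not publish training code, but the reason commodity 1 Gbps links can suffice is the DiLoCo family of low-communication training methods that DiLoCoX's name points to: each worker runs many optimizer steps locally and only the parameter delta is synchronized, once per round instead of once per step. A minimal toy sketch of that idea (the least-squares task, function names, and hyperparameters here are illustrative, not from 0G's actual DiLoCoX implementation):

```python
import numpy as np

def local_update_round(theta, shards, inner_steps=32, inner_lr=0.1):
    """One DiLoCo-style round (illustrative sketch, not 0G's code):
    each worker takes many local gradient steps on its private shard,
    then transmits only its parameter delta -- one communication per
    round instead of per step, cutting bandwidth by ~inner_steps x."""
    deltas = []
    for X, y in shards:                        # each worker's private data
        w = theta.copy()
        for _ in range(inner_steps):           # local steps: no network traffic
            grad = X.T @ (X @ w - y) / len(y)  # least-squares gradient
            w -= inner_lr * grad
        deltas.append(theta - w)               # "pseudo-gradient" to send
    outer_grad = np.mean(deltas, axis=0)       # single all-reduce per round
    return theta - outer_grad                  # outer step (plain SGD here)

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
shards = [(X := rng.normal(size=(64, 2)), X @ true_w) for _ in range(4)]

theta = np.zeros(2)
for _ in range(10):                            # 10 communication rounds total
    theta = local_update_round(theta, shards)
```

With 32 local steps per round, the workers exchange parameters ~32x less often than lock-step data parallelism would, which is the property that makes training over ordinary internet links plausible; production systems layer compression, overlap, and fault tolerance on top of this basic pattern.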