
Latest news with #0GLabs

AI Training Gets 10x Faster, 95% Cheaper With Decentralized Strategy

Forbes

01-08-2025



A quiet shift in the foundations of artificial intelligence (AI) may be underway, and it is not happening in a hyperscale data center. 0G Labs, the first decentralized AI protocol (AIP), in collaboration with China Mobile, recently announced a technical breakthrough that could have sweeping implications for how businesses access and deploy large language models. Their innovation is a new method of training massive AI models with over 100 billion parameters, without needing the ultra-high-speed internet or expensive centralized infrastructure typically required. At first glance, this might sound like a win for the engineering world. But the real story is economic and strategic. What 0G Labs has achieved could lower the cost of building AI, put more control back into the hands of enterprises, and open the door for new players to enter the space.

What It Means For AI Training

To understand the shift, it helps to revisit how large-scale AI models are currently trained. Models like OpenAI's GPT-4 or Anthropic's Claude require vast computing power and network throughput. Traditionally, this means training them on powerful GPUs connected across high-speed, centralized data centers owned or rented from companies like Amazon Web Services, Google Cloud, or Microsoft Azure. As of early 2025, OpenAI's leadership, including Sam Altman, had publicly stated that training GPT-4 cost over $100 million, a figure supported both by official statements and by multiple cost models in recent AI analysis reports. It is a model that demands capital, talent, and infrastructure that few organizations can afford.

0G Labs Is Challenging That Assumption For AI Training

Their newly published framework, called DiLoCoX, introduces a low-communication training method that dramatically reduces the need for high-bandwidth connectivity. In practical terms, they successfully trained a 107 billion parameter model on a 1 Gbps network using decentralized clusters.
This is roughly a 10x improvement on the previous record, made possible by a speed-up of over 300x in distributed training, and 1 Gbps is roughly the bandwidth of a typical office internet connection. Instead of building everything in one giant compute center, their approach links together smaller, distributed machines and optimizes how information is shared between them. The result is a highly scalable, cost-efficient way to train massive models outside the traditional cloud.

In speaking with 0G Labs founder and CEO Michael Heinrich, he said, 'DiLoCoX marks a pivotal step in democratizing LLM training: bridging the gap between massive foundation models and decentralized clusters connected by slow, unreliable networks. By combining pipeline parallelism, delay-tolerant communication overlap, and adaptive gradient compression, the framework delivers scale and speed previously thought exclusive to high-bandwidth data centers. This will usher in a new era where large-scale AI training is no longer tethered to centralized infrastructure.'

Why Does AI Training Matter for Business?

At a time when every enterprise is under pressure to do more with AI, infrastructure is quickly becoming the bottleneck. Building large models remains expensive, exclusive, and largely confined to companies with deep resources or strategic cloud partnerships, and some businesses are starting to look at decentralized AI by design. 0G's breakthrough opens up a third path. This is not just a story of cost savings. It is a story of optionality and control.

1. Lowering the Barrier to Entry

DiLoCoX's approach reduces the infrastructure required to participate in the LLM race by up to 95%. For startups, this means the ability to experiment and scale without burning through venture capital on GPU spend. For mid-sized enterprises, it offers the possibility of training models in-house without making large cloud commitments.
For governments and research labs, it means more accessible and sovereign development of AI capabilities.

2. Strategic Independence from Hyperscalers

Most AI training today depends on three cloud providers. That concentration carries risk in terms of cost escalation, vendor lock-in, and compliance. If your business depends on AI but also operates in a sensitive sector like healthcare, defense, or finance, the ability to train or fine-tune models independently becomes a powerful strategic lever. Decentralized AI offers a route toward digital autonomy. By breaking the assumption that cutting-edge AI must be trained inside centralized cloud platforms, 0G's model creates new room for competition and for innovation.

3. Aligning with Data Privacy and Compliance Needs

Many companies are cautious about uploading proprietary data to cloud-based models or training environments. With decentralized training, it becomes possible to keep data local, within jurisdiction, within the firewall, or even on edge devices, while still participating in large-scale AI development. This is particularly attractive in regions with strict data sovereignty laws, such as the European Union, or in countries building their own AI ecosystems. The 0G network never sees any of the private data.

4. Accelerating Innovation in Underserved Markets

The high cost of entry has kept many countries and industries on the sidelines of advanced AI development. DiLoCoX lowers that threshold. A university in Kenya, a telecom provider in Southeast Asia, or a regional bank in Latin America may not have access to the same compute as Silicon Valley, but they may soon have the tools to train and deploy their own intelligent systems on existing infrastructure.

5. Geopolitical and Regulatory Risks

While the technical achievement is impressive, the involvement of China Mobile raises questions.
As tensions between the United States and China continue to escalate over technology leadership and national security, businesses must weigh the potential regulatory scrutiny, data governance concerns, and reputational risks associated with partnerships involving Chinese state-affiliated entities. For companies based in the United States or operating in allied markets, any integration of infrastructure or research tied to China could face export controls, legal restrictions, or public backlash. Organizations exploring decentralized AI solutions will need to consider not just performance and cost, but also political alignment, compliance frameworks, and long-term viability. However, because DiLoCoX runs on decentralized, trustless infrastructure, this is less of a concern: China Mobile never sees your data, and the system does not rely on it for results.

Reframing the Business Model of AI

If DiLoCoX is widely adopted, it could create ripple effects across the broader AI ecosystem. Cloud revenue models, currently boosted by AI workloads, could face new pricing pressure. AI-as-a-service platforms may need to re-architect to support hybrid or decentralized deployments. Open-source frameworks might grow in influence as decentralization emphasizes interoperability and local control. Enterprise software vendors may need to rethink their AI strategies to reflect a more distributed compute landscape. This shift also aligns with the broader trend of AI for everyone. From low-code agent builders to edge-based inferencing, the movement is toward more accessible, modular, and customizable AI stacks. Decentralized training is the natural extension of that philosophy.

An AI Signal for CIOs and CTOs

For enterprise leaders, 0G's work serves as a signal not of immediate disruption, but of near-future opportunity. AI infrastructure is still early in its evolution, and now is the time to reevaluate strategy.
Should your organization continue investing in cloud-based model hosting, or begin exploring decentralized alternatives? Could your internal data center serve as a node in a distributed training system? Decentralized federated learning, for example, offers a way to tap into private data held by different parties on a network, such as hospitals jointly training a cancer diagnostic model. Might you partner with others in your sector to co-develop models using decentralized protocols?

Even if the answer is not yes today, the emergence of frameworks like DiLoCoX should push AI infrastructure planning higher on the strategic agenda. Businesses that prepare for this shift, by building internal capacity, evaluating partners, and understanding the technical stack, will be best positioned to move when the economics tip in their favor.

A Future Where AI is Built Differently

What 0G Labs and China Mobile have demonstrated is more than just a technical proof of concept. It is a new way of thinking about how intelligence is built, trained, and distributed. By showing that it is possible to train 100 billion parameter models without centralized supercomputers, they are not just pushing the boundaries of scale. They are expanding access. For business, that means AI may soon be less about who owns the biggest data center and more about who can build the smartest systems with the most flexibility. That is an AI future worth preparing for.
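To see why low-communication methods matter at this scale, consider a back-of-envelope sketch of the bandwidth arithmetic. The precision, synchronization interval, and compression ratio below are illustrative assumptions, not figures from the paper; the point is only that naive per-step gradient exchange is hopeless on a 1 Gbps link, while infrequent, compressed synchronization is not.

```python
# Illustrative bandwidth arithmetic (assumed values, not the paper's figures):
# why full gradient synchronization is infeasible over a 1 Gbps link, and how
# syncing rarely and compressing what is sent changes the picture.

PARAMS = 107e9            # 107B-parameter model
BYTES_PER_GRAD = 2        # fp16 gradients (assumption)
LINK_BPS = 1e9            # 1 Gbps network

# Naive approach: ship every gradient on every training step.
full_sync_bytes = PARAMS * BYTES_PER_GRAD          # 2.14e11 bytes (~214 GB)
full_sync_secs = full_sync_bytes * 8 / LINK_BPS    # 1712 s per step

# Low-communication approach (hypothetical settings): synchronize only every
# LOCAL_STEPS steps, and compress the payload by a factor of COMPRESSION.
LOCAL_STEPS = 100
COMPRESSION = 100
amortized_secs = full_sync_secs / (LOCAL_STEPS * COMPRESSION)

print(f"full sync per step: {full_sync_secs:,.0f} s")
print(f"amortized comms:    {amortized_secs:.4f} s per step")
print(f"reduction factor:   {full_sync_secs / amortized_secs:,.0f}x")
```

Under these assumed settings, communication drops from about half an hour per step to a fraction of a second, which is the kind of gap that makes decentralized training over commodity links plausible.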

0G Labs Achieves Breakthrough in Decentralized AI Training With 100 Billion+ Parameters

Business Insider

22-07-2025



0G Labs has released a research paper in collaboration with China Mobile demonstrating the efficacy of AI model training using decentralized clusters. The paper introduces DiLoCoX, a cutting-edge framework designed to train large language models (LLMs) exceeding 100 billion parameters in decentralized environments with limited network bandwidth. It marks a breakthrough in decentralized model training.

'DiLoCoX: A Low-Communication Large-Scale Training Framework for Decentralized Cluster' proposes training on slow networks using a low-communication, large-scale decentralized framework. By chaining a number of complementary technologies together, 0G has been able to overcome the shortfalls of incumbent decentralized models. 0G's researchers have demonstrated that DiLoCoX is capable of pre-training a 107B foundation model over a 1 Gbps network for the first time. The solution achieves 357x greater speeds in distributed training versus AllReduce while maintaining negligible degradation in model convergence. This is believed to be the first decentralized training framework successfully applied to models with over 100 billion parameters.

In a field where centralized data centers dominate AI training due to their high-speed connectivity, DiLoCoX breaks new ground by enabling efficient, verifiable training on slower networks. The framework ingeniously combines Pipeline Parallelism, a Dual Optimizer Policy, One-Step-Delay Overlap of Communication and Local Training, and an Adaptive Gradient Compression Scheme. These innovations not only scale up model sizes dramatically but also deliver significant speed improvements while preserving model convergence with minimal degradation. In experiments detailed in the paper, 0G pre-trained a 107 billion parameter model, a scale roughly 10x larger than the recently released Intellect-1 model from PrimeIntellect, demonstrating its potential to democratize access to advanced AI infrastructure.
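The general pattern behind such frameworks can be sketched in a few lines: workers run many cheap local update steps with no communication, then exchange only compressed "pseudo-gradients" (the change in their local weights) at rare synchronization points. The sketch below is a minimal, illustrative toy, assuming a simple quadratic loss, top-k compression, and made-up hyperparameters; it is not DiLoCoX's actual implementation, which additionally uses pipeline parallelism, a dual optimizer, and delayed communication overlap.

```python
# Minimal sketch of low-communication decentralized training: local updates
# between rare syncs, plus gradient compression. All names, the toy loss, and
# the hyperparameters are illustrative assumptions, not DiLoCoX's design.
import numpy as np

rng = np.random.default_rng(0)

def top_k_compress(delta, k_frac=0.01):
    """Keep only the largest-magnitude k% of entries (a common compression scheme)."""
    k = max(1, int(delta.size * k_frac))
    idx = np.argpartition(np.abs(delta), -k)[-k:]
    sparse = np.zeros_like(delta)
    sparse[idx] = delta[idx]
    return sparse

def local_train(params, steps=50, lr=0.01):
    """Stand-in for a worker's inner loop: many cheap local SGD steps, no comms."""
    p = params.copy()
    for _ in range(steps):
        grad = 2 * p + rng.normal(0, 0.01, size=p.shape)  # toy quadratic loss
        p -= lr * grad
    return p

# Global model shared by all workers; communication happens only once per
# outer round instead of once per gradient step.
global_params = rng.normal(size=1000)
for outer_round in range(20):
    deltas = []
    for worker in range(4):                    # 4 decentralized workers
        new_p = local_train(global_params)     # purely local computation
        deltas.append(top_k_compress(new_p - global_params))
    # Outer update: average the compressed pseudo-gradients from all workers.
    global_params += np.mean(deltas, axis=0)

print(f"final param norm: {np.linalg.norm(global_params):.4f}")
```

Each outer round here moves one small sparse message per worker instead of a dense gradient per step, which is the trade that makes slow links workable, at the cost of some convergence degradation that real frameworks work hard to keep negligible.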
This solution addresses key challenges in decentralized AI, including bandwidth constraints and the need for verifiable processing, setting the stage for more powerful yet decentralized AI development.

'DiLoCoX is both a proof of concept and a statement of intent,' said Michael Heinrich, CEO of 0G Labs. 'By making it possible to train enormous models in truly decentralized settings, we're not just pushing technical boundaries, but are unlocking a future where AI serves as a public good. This is about building an open ecosystem where anyone can contribute to and benefit from intelligent systems.'

The release of the paper supports 0G's commitment to advancing verifiable, democratized AI. As the foundation for high-performance infrastructure that unifies storage, compute, and data availability, 0G is on a mission to empower developers and researchers to create the next wave of AI-native applications. The research paper can be read in full here.

About 0G

0G is the first decentralized AI protocol (AIP). A modular, infinitely scalable layer 1, 0G makes decentralized AI apps possible, bringing about a truly democratized future of intelligence. Designed for AI execution at scale, 0G unifies decentralized storage, compute, and data availability (DA) to support the next generation of AI-native applications. With high-performance infrastructure, verifiable AI processing, and a permissionless agent ecosystem, 0G is building the foundation for an open, unstoppable AI economy.

Contact
CMO
0G Labs

0G Labs Sponsors ETHGlobal Cannes Hackathon and Supports $275K Prize Pool

Business Insider

09-07-2025



0G Labs was an official sponsor and partner of ETHGlobal Cannes, the major hackathon hosted at EthCC in the South of France. From July 4-6 in Cannes, the hackathon nurtured a global pool of web3 developer talent, and 0G Labs supported the $275K prize pool. More than 800 attendees were present at ETHGlobal Cannes, including members of at least 27 protocols, who participated in dozens of workshops throughout the hackathon.

0G Labs committed $5,000 in prizes to the three hackathon teams that showcased the most innovative use of the 0G ecosystem. This track rewarded developers who demonstrated creative and impactful use cases leveraging 0G's unique capabilities, such as its storage throughput, infinitely scalable DA layer, decentralized AI compute network, and EVM-compatible chain. 0G welcomed submissions from teams developing solutions addressing AI-powered dapps with verifiable computation; decentralized machine learning platforms; storage solutions for massive datasets; onchain gaming with AI components; and DeFi protocols enhanced with AI. Submissions that integrated at least one 0G core component and included a fully functional demo on the 0G testnet were eligible for judging.

The hackathon attracted a diverse range of teams and concepts, with AInfluencer, Warriors AI-rena, and PrivyCycle all innovating with 0G's tech stack. In addition, a number of 0G ecosystem projects were awarded prizes by hackathon judges, including Mind Network and Tingz, which placed third and fourth, respectively, in the AKINDO Showcase and shared in a total prize pool worth $13K.

0G Labs CEO Michael Heinrich said, prior to the event, 'We're excited to be involved with ETHGlobal Cannes this year, which promises to be much more than just a hackathon. With hundreds of the industry's best developers working side-by-side for three days straight, great things are sure to happen, and we can't wait to see what teams create using 0G's tech stack.'
In addition to 0G Labs, ETHGlobal Cannes' $275K prize pool included funding and other incentives provided by LayerZero, 1inch, World, Oasis Protocol, and Chainlink. Other partners for the event included Hedera, Ledger, Circle, and The Graph. ETHGlobal united solo developers and teams seeking to experiment with cutting-edge web3 technologies and develop new applications with real-world utility. Key themes from the hackathon ranged from zero-knowledge proofs to AI. Other verticals explored by developers during ETHGlobal included Layer 2s, interoperability, privacy & security, data availability, and identity.

About 0G

0G is the first decentralized AI protocol (AIP). A modular, infinitely scalable layer 1, 0G makes decentralized AI apps possible, bringing about a truly democratized future of intelligence. Designed for AI execution at scale, 0G unifies decentralized storage, compute, and data availability (DA) to support the next generation of AI-native applications. With high-performance infrastructure, verifiable AI processing, and a permissionless agent ecosystem, 0G is building the foundation for an open, unstoppable AI economy.

Contact
CMO
