Crypto M - Crypto News
🚀 Gradient Unveils Echo-2 Framework to Enhance AI Training Efficiency

Gradient, a distributed AI laboratory, has launched Echo-2, a distributed reinforcement learning framework aimed at overcoming efficiency barriers in AI research training. According to ChainCatcher, Echo-2 cuts the post-training cost of a 30-billion-parameter model from $4,500 to $425, allowing more than a tenfold increase in research throughput on the same budget.
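The reported figures imply the throughput multiplier directly; a quick check of the arithmetic (numbers from the ChainCatcher report):

```python
# Cost figures reported by ChainCatcher for a 30B-parameter post-training run.
cost_before = 4500  # USD per run, before Echo-2
cost_after = 425    # USD per run, with Echo-2

multiplier = cost_before / cost_after
print(f"{multiplier:.1f}x")  # → 10.6x, consistent with "more than tenfold"
```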

The framework employs compute-storage separation for asynchronous training, offloading the sampling-heavy workload to unstable (preemptible) GPU instances and heterogeneous GPUs via Parallax. It combines bounded staleness, instance-level fault-tolerant scheduling, and the proprietary Lattica communication protocol to raise training efficiency while preserving model accuracy. Alongside the framework's release, Gradient is set to introduce Logits, an RLaaS platform intended to shift AI research from a capital-intensive to an efficiency-driven paradigm. Logits is now open for reservations by students and researchers worldwide.
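The bounded-staleness idea above can be sketched in a few lines: samplers running on preemptible, heterogeneous GPUs may lag the trainer by several policy versions, and the trainer accepts their rollouts only within a fixed staleness bound. This is a minimal illustrative sketch, not Gradient's actual API; the class, field names, and the bound of 4 are all assumptions.

```python
# Hypothetical sketch of bounded-staleness rollout collection for
# asynchronous RL training. Names and the staleness bound are
# illustrative assumptions, not Echo-2's real interface.
from collections import deque

MAX_STALENESS = 4  # assumed bound: accept rollouts at most 4 policy versions old


class BoundedStalenessBuffer:
    """Collects rollouts from detached sampler instances; drops any
    whose policy version lags the trainer beyond the bound."""

    def __init__(self, max_staleness: int):
        self.max_staleness = max_staleness
        self.rollouts = deque()

    def submit(self, rollout: dict, sampler_version: int, trainer_version: int) -> bool:
        # Samplers on unstable/heterogeneous GPUs can fall behind;
        # bounded staleness keeps their data usable without letting
        # arbitrarily old samples bias the policy update.
        if trainer_version - sampler_version <= self.max_staleness:
            self.rollouts.append(rollout)
            return True
        return False  # too stale: discard rather than train on it


buf = BoundedStalenessBuffer(MAX_STALENESS)
print(buf.submit({"obs": [0.1, 0.2]}, sampler_version=10, trainer_version=12))  # → True
print(buf.submit({"obs": [0.3, 0.4]}, sampler_version=3, trainer_version=12))   # → False
```

The same bound is what lets the trainer tolerate sampler instances disappearing and reappearing: a recovered instance simply resumes submitting, and only its sufficiently fresh rollouts are kept.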


#Echo2 #AITraining #DistributedReinforcementLearning #AIResearch #Efficiency #GPU #Parallax #RLaaS #Logits #AIFramework #TrainingCostReduction