Qwen Releases 35B Sparse Mixture-of-Experts Model, Keeps Open-Source Pressure On
Alibaba's Qwen team released Qwen3.6-35B-A3B, a sparse mixture-of-experts model that activates only 3 billion of its 35 billion total parameters per token — continuing China's aggressive open-source AI push.