MediaTek and Alibaba’s Qwen Lab have collaborated to deploy the open-source Qwen3 large language model (LLM) on the flagship Dimensity 9400 series smartphone platforms. Leveraging high-performance edge AI capabilities, Qwen3 provides a robust foundation for next-generation generative AI applications that move beyond basic AI assistance to managing local devices and executing complex cross-app tasks, even in low-connectivity or offline scenarios, or where security and privacy are critical.

The Dimensity 9400 and Dimensity 9400+ feature MediaTek’s latest 8th-generation AI processor, the NPU 890, which supports the widest range of leading global LLMs. These platforms support a range of innovative model technologies, including hybrid Mixture of Experts (MoE), Multi-Head Latent Attention (MLA), and Multi-Token Prediction (MTP), while also accelerating FP8 inference. Equipped with MediaTek’s upgraded speculative decoding technology, SpD+, the Dimensity 9400+ achieves 20% faster inference for AI agent tasks compared to standard SpD.
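Speculative decoding pairs a small, fast draft model with the larger target model: the draft cheaply proposes several tokens ahead, and the target verifies them, keeping the agreed prefix and correcting the first mismatch. The Python sketch below is a minimal, hypothetical illustration of that accept/reject loop using toy lookup-table “models”; it is not MediaTek’s SpD+ implementation, which is not publicly documented.

```python
# Minimal sketch of greedy speculative decoding (illustrative only; the toy
# lookup-table "models" below stand in for a small draft LLM and a large
# target LLM -- this is not MediaTek's SpD+ implementation).

DRAFT_TABLE = {"the": "cat", "cat": "sat", "sat": "on", "on": "a", "a": "mat"}
TARGET_TABLE = {"the": "cat", "cat": "sat", "sat": "on", "on": "the", "a": "mat"}

def draft_next(token: str) -> str:
    """Cheap draft model: proposes the next token quickly."""
    return DRAFT_TABLE.get(token, "<eos>")

def target_next(token: str) -> str:
    """Expensive target model: the output we ultimately trust."""
    return TARGET_TABLE.get(token, "<eos>")

def speculative_decode(prompt: list, k: int = 4, max_len: int = 12) -> list:
    out = list(prompt)
    while len(out) < max_len and out[-1] != "<eos>":
        # 1) The draft model speculates k tokens ahead of the current context.
        proposal, cur = [], out[-1]
        for _ in range(k):
            cur = draft_next(cur)
            proposal.append(cur)
        # 2) The target model verifies the proposal; in a real system all k
        #    positions are scored in one batched forward pass, which is where
        #    the speed-up comes from.
        prev = out[-1]
        for tok in proposal:
            expected = target_next(prev)
            if tok == expected:
                out.append(tok)        # accepted: draft and target agree
                prev = tok
            else:
                out.append(expected)   # first mismatch: keep the target's token
                break                  # and resume speculation from here
    return out

print(speculative_decode(["the"]))
```

Because every accepted token skips a full sequential pass through the large model, the more often the draft agrees with the target, the greater the end-to-end speed-up.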
Qwen3 variants such as Qwen3-0.6B, 1.7B, and 4B strike an ideal balance between performance and power efficiency for on-device use. These models natively support the Model Context Protocol (MCP), enabling strong cross-agent interoperability.
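MCP standardizes how a model-driven agent discovers and invokes external tools over JSON-RPC 2.0, which is what makes cross-agent and cross-app tasks composable. The snippet below is only a schematic of the shape of a tools/call exchange, written as Python dictionaries; the get_weather tool and its arguments are hypothetical, and exact field names should be checked against the MCP specification.

```python
import json

# Schematic shape of an MCP "tools/call" exchange (JSON-RPC 2.0).
# The tool name and arguments below are hypothetical examples.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",            # a tool exposed by an MCP server
        "arguments": {"city": "Taipei"},  # arguments defined by the tool's schema
    },
}

# A successful response carries the tool output back to the calling agent.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "content": [{"type": "text", "text": "27°C, partly cloudy"}],
    },
}

print(json.dumps(request, indent=2))
print(json.dumps(response, indent=2))
```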
Enabling Qwen3 is part of MediaTek’s ongoing effort to foster a global AI ecosystem that brings the benefits of edge AI to smartphone users everywhere. The company collaborates closely with developers of LLMs, SLMs, and other language models to ensure they are supported on its smartphone platforms, so developers can draw on a wide variety of global language models and popular frameworks to efficiently deploy multimodal generative AI applications and bring their products and services to market more quickly.
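As one concrete path, the smaller Qwen3 checkpoints can be prototyped with off-the-shelf frameworks before being ported to an on-device runtime. The sketch below assumes the Hugging Face transformers library and the public Qwen/Qwen3-0.6B checkpoint; it illustrates workstation-side prototyping only, not MediaTek’s on-device deployment toolchain.

```python
# Prototype sketch: run a small Qwen3 checkpoint with Hugging Face
# transformers on a workstation before porting to an on-device runtime.
# The model ID "Qwen/Qwen3-0.6B" is assumed to be the public checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen3-0.6B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

# Build a chat-formatted prompt and generate a short reply.
messages = [{"role": "user", "content": "Summarize today's calendar in one sentence."}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)

# Decode only the newly generated tokens, skipping the echoed prompt.
new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```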