- 30-plus automakers and autonomy firms using the cloud service provider’s products for R&D
- Company touts full-stack advantage across chips, cloud and foundation models
More than 30 automakers and autonomous driving firms are using Alibaba Cloud for intelligent driving research, with deployment of its in-house AI chips surpassing 100,000 units, a senior executive said at a conference on April 11.
The figure marks what the company described as a record for self-developed AI chips deployed on a public cloud platform in China’s automotive sector, according to Li Qiang, vice president of Alibaba Cloud Intelligence’s public cloud division and head of its automotive AI business, speaking at the Intelligent Electric Vehicle Forum 2026.
Li highlighted the rapid adoption of the company’s “Zhenwu” Parallel Processing Unit (PPU) chips, developed by its semiconductor arm Pingtouge, as demand for autonomous driving compute accelerates.
The forum, held April 11–12, comes as Chinese cloud providers race to integrate computing infrastructure with increasingly complex autonomous driving workloads, with Alibaba positioning its chip deployment as the largest of its kind in the industry.
The company bases that claimed advantage on a “full-stack” technology system combining proprietary chips, large-scale cloud infrastructure across the Asia-Pacific region, and its flagship open-source foundation model Qwen, arguing that co-optimization across the chip, cloud and model layers improves efficiency in both training and inference workloads.
Chip-to-chip
According to technical specifications published by Pingtouge, the “Zhenwu 810E” chip is built on a proprietary parallel computing architecture with chip-to-chip interconnect technology and a fully in-house software stack.
It features 96GB of HBM2e memory and interconnect bandwidth of up to 700 GB/s, supporting AI training, inference and autonomous driving workloads.

At Alibaba’s March 19 earnings call, executives said cumulative shipments of its AI chips had exceeded 470,000 units, with more than 60% deployed by external customers across over 400 enterprises spanning internet services, financial institutions and autonomous driving companies.
The company said the Zhenwu PPU’s large-scale commercial deployment validates its stability and scalability in complex, demanding real-world AI workloads.
Industry observers note that only a handful of global players — notably Alibaba and Google — currently maintain integrated capabilities across foundation models, cloud infrastructure and AI chip design.
