CES 2026: Rubber hits the road for Qualcomm automotive
Mobile tech leader uses CES to outline advances in automotive through key collaborations with a Chinese technology startup, an IT behemoth and a manufacturing group to boost ADAS, IVI and AI compute
Qualcomm has unveiled a slew of products and partnerships for the connected automotive market that are seeing real-life application.
Specifically, at the CES 2026 trade show, the firm announced that it has teamed up with Chinese startup Leapmotor on what it calls the world’s first cross-domain integrated service powered by its Snapdragon Cockpit Elite and Snapdragon Ride Elite automotive platforms; expanded its technology relationship with Google to provide what the firms call a leading foundation for transforming the automotive industry; and entered a collaboration with manufacturing group ZF to provide “cutting-edge”, scalable advanced driver assistance systems (ADAS) that combine advanced artificial intelligence (AI) compute and perception capabilities.
In the first of these partnerships, Leapmotor’s flagship model, the D19, will become the first mass-production vehicle powered by a dual Snapdragon Elite (SA8797P) automotive platform. The collaboration will support the car manufacturer’s move towards a more centralised vehicle architecture, making cars easier and more efficient to build and delivering more responsive features for drivers and passengers.
Qualcomm sees the collaboration as highlighting the growing value of deep chipmaker-automaker integration at the vehicle‑architecture level, and offering a scalable blueprint as the industry accelerates towards centralised computing and fully software‑defined vehicles.
The in-vehicle technology has been designed to unify cockpit, driver assistance, body control and connectivity – including Wi-Fi 6 and 5G mobile comms – on one system. Making its debut at CES 2026, the dual‑chipset architecture is claimed to deliver “exceptional” compute performance to streamline vehicle electronics, reduce system complexity and enable more advanced AI capabilities across the entire vehicle.
The central domain controller can unify key vehicle domains – such as the intelligent cockpit; driver assistance; body controls including lighting, climate, doors and windows; and the vehicle gateway – into a single high‑performance system. The dual‑chipset setup also provides the compute headroom needed for real‑time coordination and advanced AI, including emerging agentic AI workloads.
With the Qualcomm Oryon central processing unit, Qualcomm Adreno graphics processing unit and Qualcomm Hexagon neural processing unit working in parallel, the platform can run both a full‑modality large AI model for the cockpit and a vision-language-action multimodal model for driver assistance. The result is said to be more intelligent, responsive and future‑ready driving experiences.
Read more about software-defined vehicles
- LG looks to accelerate in-vehicle experience with Xbox, Zoom: CE giant continues its drive into automotive, claiming to redefine software-defined vehicle era through partnerships offering in-car gaming and meetings delivered over standard content platform.
- Software-defined vehicles drive next-gen auto architectures: Research highlights trend in automotive industry towards software-defined vehicles where functionality, user experience and monetisation opportunities are governed increasingly by software rather than hardware.
- Arm teams with Nvidia to boost software-defined vehicles: Leading processor and AI graphics processing unit providers join forces to advance processing and connectivity in a sector where GenAI applications are becoming paramount.
- Renault charges Ampere to drive future with software-defined vehicles: With software-defined vehicles key to bringing it closer to customers, auto manufacturer deploys digital chassis solution to accelerate the time it takes to roll out new features to cars.
Among other key system capabilities is the ability to support up to eight displays, including multiple 3K and 4K screens, and up to 18‑channel audio for immersive in‑car entertainment. The system also enables over-the-air updates, remote diagnostics and remote vehicle control, with its service‑oriented architecture offering more than 200 modular capabilities for flexible, user‑defined experiences.
Driver assistance is designed to support up to 13 cameras and multiple sensors – including lidar, millimetre‑wave radar, ultrasonic sensors and a high‑precision inertial measurement unit (IMU) – to deliver reliable Level 2 (L2) driver assistance. Other features include parking‑to‑parking assistance, with the controller engineered to help vehicles handle complex daily and urban driving scenarios.
In-vehicle connectivity allows reliable communication between all of the vehicle’s systems, while also supporting voice calling, emergency services, Bluetooth, Wi‑Fi and precise location services such as the global navigation satellite system (GNSS).
Beginning with Snapdragon-powered embedded Android infotainment systems, Qualcomm’s relationship with Google has lasted over a decade, and the latest chapter of this partnership will see the firms aim to establish end-to-end automotive technology that integrates Snapdragon Digital Chassis with Google’s automotive software.
Overall, Qualcomm and Google are setting out to establish a unified reference platform aimed at accelerating development cycles, strengthening quality assurance and streamlining production for vehicle manufacturers. This, say the companies, will empower automakers to create next-generation vehicles that better anticipate, react and adapt to driver needs with agentic AI.
Specifically, they will be working on simplifying the deployment of advanced, next-generation AI agents with Gemini Enterprise for automotive – an evolution of the Automotive AI Agent announced at the IAA Mobility show in Munich in 2025. By aligning Snapdragon Cockpit Platforms with Google’s AAOS roadmaps, starting with Android 17, the companies say they are creating a foundation for next-generation software-defined vehicles (SDVs) and in-vehicle infotainment systems.
Intelligent mobility
The intelligent mobility technology is said to have been redefined for the generative AI era, connecting vehicles to the cloud using a flexible architecture that blends on-device and cloud models. This approach is designed to enable real-time personalisation for drivers and help speed up the roll-out of new features such as advanced voice-driven and proactive assistants.
For drivers, the intended benefits include smarter, safer and more adaptive vehicles – with dynamic personalisation, and multimodal interfaces with always-on AI-driven features that can help enhance convenience and safety.
ZF and Qualcomm Technologies are collaborating to provide an ADAS service that combines advanced AI compute and perception capabilities, powered by the latter’s Snapdragon Ride system-on-chips.
ZF’s ProAI supercomputer will integrate the Snapdragon Ride Pilot and Vision stack for faster time-to-market and deliver turnkey systems to automotive manufacturers. Bringing together automotive computing and real-time perception, the offering enables automakers to deploy scalable ADAS services across a wide range of vehicle types and automation levels – from regulatory functions up to Level 3, whereby the vehicle handles all driving tasks in specific scenarios, letting the driver divert attention from the road.
With Snapdragon Ride, the ZF ProAI supercomputer is capable of serving as a domain, zone or central controller, while supporting enhanced computer vision, sensor fusion and decision-making control logic, or all of these functions in a single end-to-end AI model.
The companies want to extend their cooperation to the development of a multi-domain mixed criticality solution for ADAS and in-vehicle infotainment systems.
