NXP Rolls Out AI-Enabled Applications Processor Targeting Automotive
NXP Semiconductors has announced the i.MX 952 applications processor. An extension of its i.MX 9 series for AI-powered automotive interiors, the device integrates the company’s eIQ Neutron neural processing unit (NPU) to support driver monitoring, child presence detection, and adaptive vehicle interaction through on-device machine learning.

The i.MX 952 applications processor. Image used courtesy of NXP
Compatible with existing i.MX 95 platforms, the i.MX 952 is said to let developers scale designs across performance tiers while maintaining a consistent hardware and software foundation.
i.MX 952 Architecture and Features
NXP built the i.MX 952 around a heterogeneous multi-domain architecture that merges real-time, low-power, and high-performance compute clusters.
The application domain features a quad-core Arm Cortex-A55 complex with 32 kB L1 instruction and data caches, a 64 kB L2 cache, and a coherent 512 kB L3 cache with ECC protection. NXP complements the application domain with an Arm Cortex-M7 core for real-time control tasks and an Arm Cortex-M33 core that manages system safety and low-power operations. Together, the cores support ISO 26262 ASIL B and IEC 61508 SIL 2 platforms for automotive and industrial safety compliance.

Block diagram of the i.MX 952 applications processor. Image used courtesy of NXP
NXP also embedded an eIQ Neutron NPU into the design to accelerate neural network inference for workloads such as sensor fusion, image classification, and anomaly detection. A companion image signal processor (ISP) handles up to 500 Mpixel/s of throughput and supports RGB-IR camera inputs for driver and occupant monitoring. An additional multimedia subsystem includes a 4K video processing unit, MIPI-CSI/DSI interfaces, LVDS outputs, and a 3D/2D Arm Mali GPU for HMI rendering.
The device also features memory interfaces that support LPDDR5 at up to 6,000 MT/s or LPDDR4X at up to 4,266 MT/s with inline ECC and encryption, along with xSPI flash featuring inline cryptography. Storage expansion is available through triple uSDHC (3.0) and eMMC 5.1 interfaces. Connectivity includes 2.5 Gbps and 1 Gbps Ethernet with time-sensitive networking (TSN), PCIe Gen 3.0 (1 lane), dual USB 2.0 ports, and multiple UART, SPI, I2C, and CAN-FD controllers.
Understanding Sensor Fusion for Automotive
Sensor fusion is the process by which systems computationally combine data from disparate sensor sources to reach a more accurate and complete understanding of the vehicle’s environment and occupants. While each sensing modality offers distinct advantages, they also have unique limitations. Sensor fusion engines use probabilistic or neural-network-based algorithms to combine sensor outputs, reduce uncertainty, and improve reliability.

Comparing sensor modalities. Image used courtesy of Dell
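To make the uncertainty-reduction idea concrete, the sketch below (a hypothetical illustration, not NXP code) fuses two noisy estimates of the same quantity with inverse-variance weighting, the principle behind Kalman-style probabilistic fusion.

```python
# Hypothetical sketch: inverse-variance fusion of two noisy estimates
# of the same quantity (e.g., distance to an object from two sensors).
def fuse(estimate_a: float, var_a: float,
         estimate_b: float, var_b: float) -> tuple[float, float]:
    """Combine two independent estimates; the fused variance is always
    smaller than either input variance."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * estimate_a + w_b * estimate_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Illustrative numbers: sensor A reports 12.4 m with 0.25 m^2 variance,
# sensor B reports 13.1 m with 1.0 m^2 variance.
distance, variance = fuse(12.4, 0.25, 13.1, 1.0)
print(f"fused distance: {distance:.2f} m, variance: {variance:.2f} m^2")
```

The fused estimate leans toward the lower-variance sensor, and its variance (0.2 m² here) is smaller than either input's, which is exactly the reliability gain fusion is meant to deliver.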
Interior sensing systems leverage sensor fusion to realize accurate driver attention tracking and occupant classification. These systems merge infrared imaging with depth and motion data, actively monitoring head orientation, eyelid movements, and body position to identify signs of fatigue or distraction. Exterior systems leverage fusion to strengthen ADAS capabilities. Radar measures distances while cameras recognize objects. Algorithms unite these inputs to improve perception and enable smarter decision-making.
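For the interior case, one common simplified heuristic is to combine an eyelid-closure ratio (a PERCLOS-style metric) with head-pose deviation into a single drowsiness score. The hypothetical sketch below assumes the vision pipeline has already extracted per-frame eye openness and head yaw; the thresholds and weights are illustrative, not calibrated values.

```python
# Hypothetical sketch: fuse eyelid closure and head pose into a drowsiness score.
# Assumes per-frame eye openness (0..1) and head yaw (degrees) are already
# available from the camera pipeline; thresholds are illustrative only.
from collections import deque

class DrowsinessMonitor:
    def __init__(self, window: int = 300):          # e.g., ~10 s at 30 fps
        self.eye_open = deque(maxlen=window)
        self.head_yaw = deque(maxlen=window)

    def update(self, eye_openness: float, head_yaw_deg: float) -> float:
        self.eye_open.append(eye_openness)
        self.head_yaw.append(head_yaw_deg)
        # PERCLOS-style metric: fraction of frames with eyes mostly closed.
        perclos = sum(e < 0.2 for e in self.eye_open) / len(self.eye_open)
        # Fraction of frames with the head turned well away from the road.
        off_road = sum(abs(y) > 30.0 for y in self.head_yaw) / len(self.head_yaw)
        # Weighted combination; weights are illustrative.
        return 0.7 * perclos + 0.3 * off_road

monitor = DrowsinessMonitor()
score = monitor.update(eye_openness=0.15, head_yaw_deg=5.0)
print(f"drowsiness score: {score:.2f}")
```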
Implementing sensor fusion at the edge cuts latency and keeps data local, but it also raises the bar for the compute platform. You need a mix of CPUs, DSP-class resources, and a neural accelerator that can run vision and signal-processing jobs side by side. Clean fusion depends on precise time alignment across sensors, so the system must provide deterministic clocks, fast on-chip and off-chip links, and memory with ECC. The models cannot be static either, as performance drifts with lighting, temperature, and sensor wear. That reality is moving designs toward on-device training and continual learning so the system can recalibrate as conditions change.
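Time alignment itself is straightforward to illustrate: before fusing, samples from a slower sensor must be resampled onto the faster sensor's timestamps. A minimal sketch, assuming each stream carries its own hardware timestamps and using illustrative values:

```python
# Hypothetical sketch: align a lower-rate radar stream to camera frame
# timestamps by linear interpolation before fusing the two streams.
import numpy as np

# Timestamps in seconds; values are illustrative.
radar_t = np.array([0.00, 0.05, 0.10, 0.15, 0.20])      # 20 Hz radar
radar_range = np.array([12.6, 12.5, 12.4, 12.4, 12.3])  # meters

camera_t = np.arange(0.0, 0.2, 1 / 30)                  # 30 fps camera frames

# Resample radar ranges onto camera frame times so each fused update
# compares measurements taken at (approximately) the same instant.
radar_at_camera_t = np.interp(camera_t, radar_t, radar_range)

for t, r in zip(camera_t, radar_at_camera_t):
    print(f"t={t:.3f} s  radar range at frame time: {r:.2f} m")
```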
Sized and Prepped for AI-Driven Vehicle Interiors
The processor is offered in 19 mm x 19 mm and 15 mm x 15 mm FCBGA packages with 0.7 mm and 0.5 mm pitch options, qualified for automotive junction temperatures from -40°C to 125°C. Samples are expected in the first half of 2026, with automotive-qualified production to follow shortly after.