NXP Semiconductors eIQ™ Auto Deep Learning (DL) Toolkit
NXP Semiconductors eIQ™ Auto Deep Learning (DL) Toolkit is an automotive-grade inference engine for the S32 Embedded Processors. The eIQ Auto Toolkit is designed to let designers move quickly from a development environment to AI application implementations that meet stringent automotive standards. eIQ Auto enables deep-learning-based algorithms for vision, driver replacement, sensor fusion, driver monitoring, and other evolving automotive applications.
Development & Deployment Agility
With the NXP eIQ Auto Toolkit, designers can transition seamlessly from a development environment to full implementation, converting and fine-tuning their AI models while continuing to work in familiar training frameworks such as TensorFlow, Caffe, and PyTorch, then porting the resulting networks to a high-performance, automotive-grade NXP processing platform. Neural networks can be optimized for maximum efficiency using pruning and compression techniques.
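As a rough illustration of the optimization step described above, the sketch below performs post-training quantization of a small model using standard TensorFlow/TFLite tooling. This shows the general workflow (train in a standard framework, then quantize for deployment); it is not the eIQ Auto API itself, and the toy model and calibration data are placeholders.

```python
# Sketch: post-training quantization with standard TensorFlow/TFLite tooling.
# The model and calibration data below are placeholders, not a real workload.
import numpy as np
import tensorflow as tf

# A small stand-in model; in practice this would be a trained network.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(32, 32, 3)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10),
])

def representative_data():
    # Calibration samples used to choose quantization ranges.
    for _ in range(8):
        yield [np.random.rand(1, 32, 32, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
tflite_model = converter.convert()  # serialized model with quantized weights
```

The quantized artifact (`tflite_model`, a `bytes` object here) is what would then be validated against the original float model before deployment.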
API Advantages
NXP provides a unified API that enables the same application code and neural network models to be used across multiple development stages. Once a model has been quantized, it can be run on the target device or on a bit-exact simulator, greatly accelerating the development process.
Quality & Reliability
NXP’s achievement of Automotive SPICE compliance ensures that the eIQ Auto Toolkit meets the stringent international automotive development standards established by leading vehicle manufacturers. In contrast to competing inference engines built with open-source tools, the eIQ Auto Toolkit helps enable conformance to these standards for safety-critical automotive applications.
Features
- Training frameworks - interfaces to standard frameworks such as TensorFlow, PyTorch, Caffe, and ONNX
- Optimization - prunes, quantizes, and compresses the neural network
- Embedded deployment - automated neural net layer deployment to the optimum available compute resource
- Auto-quality inference engine - A-SPICE qualified inference engine
- Supported networks
- Detection, classification, and segmentation
- Includes optimized support for the following networks: MobileNetV1, MobileNetV2, SqueezeNet1.1, SSDMobileNet, ResNet-50, DeepLab v3, and SqueezeSeg
Applications
- Driver/occupant monitoring systems
- LiDAR segmentation
- Object detection, classification, and tracking
- Surround view
- Front view
- Advanced park assist
System Requirements
- Ubuntu 16.04 LTS, 64-bit
- NXP Vision SDK Software for S32V234
- SBC-S32V234 S32V Vision and Sensor Fusion Evaluation Board
Development Block Diagram
Implementation Flow Chart
