The expansion builds on the company’s growing ML offerings, including the recently announced i.MX 8M Plus applications processors with a dedicated NPU.
The Ethos-U55 machine learning accelerator works in concert with the Cortex-M core to achieve a small footprint while delivering a greater than 30x improvement in inference performance compared to high-performing MCUs.
The Ethos-U55 is highly configurable and specifically designed to accelerate ML inference in area-constrained embedded and IoT devices. Its advanced compression techniques save power and significantly reduce ML model sizes, enabling execution of neural networks that previously ran only on larger systems.
NXP’s portfolio of ML compute elements (CPU, GPU, DSP and NPU) is enabled by its eIQ machine learning development environment, which provides a choice of popular open-source inference engines matched to the performance needs of each compute element.
Using NXP’s edge processors and eIQ tools, customers can easily build applications such as object detection, face and gesture recognition, natural language processing, and predictive maintenance.
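For illustration, the sketch below shows what running inference through TensorFlow Lite, one of the open-source engines available in eIQ, can look like on an i.MX board running Linux. The model file name, input shape, and the presence of the tflite_runtime package on the target are assumptions for this example, not details from the announcement.

```python
# Minimal sketch: running inference with the TensorFlow Lite interpreter.
# Model file name and dummy input are illustrative assumptions.
import numpy as np
import tflite_runtime.interpreter as tflite

# Load a quantized model (hypothetical file name).
interpreter = tflite.Interpreter(model_path="mobilenet_v1_quant.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy frame shaped and typed to match the model's expected input.
frame = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], frame)

# Run inference and read back the class scores.
interpreter.invoke()
scores = interpreter.get_tensor(output_details[0]["index"])
print("Top class:", int(np.argmax(scores)))
```

The same application code can be pointed at different delegates or runtimes depending on which compute element (CPU, GPU, DSP or NPU) is the best fit for the workload.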