DigiKey at embedded world 2026 with Texas Instruments

At embedded world 2026, on the DigiKey booth, Lucy Barnard speaks with Vinay Agarwal, Vice President and General Manager of MSP Microcontrollers at Texas Instruments, about TI’s approach to making Edge AI accessible across its entire MCU portfolio.

Edge AI, Agarwal explained, is crucial because many embedded applications operate locally on sensors with limited power and memory. “A lot of those decisions are being made locally at the node,” he said. “To make those decisions using cloud computing, you would have to transmit a lot of data to the cloud and bring it back, which is more power hungry and inefficient.” By enabling AI at the Edge, TI allows developers to process data locally, improving efficiency and reducing latency.

A key advancement highlighted at the event was TI’s TinyEngine, a hardware accelerator optimised for low-power microcontrollers. It can operate on devices as small as the MSPM0 line. Agarwal noted, “If you compare it with a similar MCU for running a similar AI workload, we have seen more than 90 times improvement in power consumption and around 120 times improvement in latency.” The engine runs AI models independently of the main CPU, allowing low-memory devices to perform meaningful AI tasks without straining system resources.

Hardware NPUs offer additional advantages, including faster response times and enhanced security. Agarwal said, “The AI engine allows you to make a decision in a lot faster response time… and running Edge AI locally is a lot more secure because you’re not transmitting data to the cloud.” TI also supports AI workloads on devices without dedicated accelerators through its AI ecosystem, although these require larger MCUs and more power.

Applications for Edge AI span industrial automation, automotive diagnostics, and smart sensor networks. Agarwal described the strategy as “adding scale to AI”: from tiny NPU engines for low-power devices to high-performance C7x DSPs capable of 1,200 TOPS. This allows developers to integrate AI across a wide spectrum of embedded systems.

To address barriers in development, TI has enhanced its software tools. Generative AI capabilities in the Code Composer Studio IDE allow developers to describe tasks in plain language, automatically generating code. Meanwhile, Edge AI Studio streamlines the full AI workflow – from data collection and annotation to model training, optimisation, and deployment. “We have full flexibility entry points for customers,” Agarwal said, emphasising that TI’s ecosystem supports both beginners and experienced developers.

Looking ahead, Agarwal expects Edge AI adoption to expand as development becomes more accessible. “Eventually, engineers will be a lot more open to using these Edge AI-based enhancements in their design,” he said. By integrating AI from low-power MCUs to high-performance processors, TI aims to enable applications across industrial, smart home, automotive, and infrastructure domains, allowing developers to embed intelligence wherever it adds value.

Find out more in the interview below.
