At embedded world 2026, at the DigiKey booth, Lucy Barnard speaks with Marta Barbero, Lead Product Manager at Arduino, about the new Arduino product announcement.
Arduino has unveiled its latest platform, the VENTUNO Q – a powerful, AI-capable development board built on a Qualcomm processor following Arduino’s recent acquisition by Qualcomm.
The VENTUNO Q is the latest expression of the company’s mission of “[Making] electronics accessible also by users that are not super expert engineers” – bringing professional-grade AI capabilities to a wide audience of makers and developers.
Powerful hardware at the core
VENTUNO Q is built on Qualcomm’s IQ8-8275 processor — an eight-core CPU combined with a GPU for advanced computer vision and a Neural Processing Unit (NPU) capable of up to 40 TOPS. This means the board can run large language models (LLMs), visual language models (VLMs), and YOLO-style object detection entirely at the edge, with no cloud connectivity required.
Alongside the Qualcomm chip, the board integrates an STM32H5 microcontroller from STMicroelectronics, designed for real-time and motion control tasks.
“Imagine you would like to build your next robot, which is AI capable,” said Barbero. “You can do that with this platform.” Connectivity is comprehensive: HDMI, USB, Wi-Fi, CAN interfaces, UNO shield headers, and MIPI CSI and DSI connectors for cameras and displays are all included on board.
App Lab: bridging Arduino and AI
Accompanying the hardware is App Lab, Arduino’s new development framework, first introduced alongside the UNO Q in October. App Lab combines familiar Arduino C++ programming with Python-based Linux development and a low-code AI layer. Barbero described it as giving developers “the simplification of AI” — with ready-to-run examples that require no coding, as well as the flexibility to go deeper as skills grow.
App Lab integrates with Edge Impulse Studio for training custom AI models and with Qualcomm AI Hub, a repository of pre-optimised models for the platform. Developers wanting to move beyond the framework can access the board’s full Linux environment and use PyTorch or any other advanced AI tooling. App Lab runs either directly on the VENTUNO Q with a connected monitor, or as a downloadable desktop application.
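The division of labour described above — a Linux-side Python process making high-level AI decisions while the STM32H5 microcontroller handles real-time control — can be illustrated with a minimal sketch. This is purely hypothetical: the queue below stands in for the board’s actual inter-processor bridge (whose API is not detailed here), and the detection labels and commands are invented for illustration.

```python
import queue
import threading

# A queue stands in for the real Linux-to-MCU bridge on the board.
commands = queue.Queue()

def linux_side(detections):
    """Hypothetical Linux/Python side: turn vision detections into motion commands."""
    for label in detections:
        commands.put("stop" if label == "obstacle" else "forward")
    commands.put(None)  # sentinel: no more commands

def mcu_side(log):
    """Stand-in for STM32H5 firmware: consume commands in order, as a control loop would."""
    while True:
        cmd = commands.get()
        if cmd is None:
            break
        log.append(cmd)  # a real firmware loop would drive motors here

log = []
t = threading.Thread(target=mcu_side, args=(log,))
t.start()
linux_side(["person", "obstacle", "person"])
t.join()
print(log)  # → ['forward', 'stop', 'forward']
```

The point of the split is that the control loop never waits on a model: the AI side can take tens of milliseconds per inference while the microcontroller keeps servicing motors at a fixed rate.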
Real-world use cases
Among the range of applications Arduino showcased at embedded world was a “smart mirror” running a visual language model that analyses a person’s outfit and offers style recommendations — entirely on-device. Other demonstrated use cases include offline LLM chatbots with automatic speech recognition and text-to-speech, gesture recognition for touchless interfaces, object and person tracking, and pose estimation.
Also on display was the company’s autonomous mobile robot (AMR), which runs entirely on a single VENTUNO Q, handling motor control, battery management, localisation, environment mapping, and obstacle avoidance all in one place.
To find out more, watch the full interview below.