Moving forward with the AIoT
Artificial intelligence is not just a trend – it's trendy. The concept, and the handily vague term 'smart', have been used to unconvincingly promote a range of products claimed to have gained sentience – whether it's our cars, our homes, or even our appliances. Aneet Chopra, VP Business Development, Marketing & Product Management at XMOS, discusses.
Marketing tactics aside, it’s clear that the AI label represents the future of technical and technological innovation for many, with a new generation of electronics that can learn independently and adapt to our convenience.
That said, this potential can only be harnessed via significant processing abilities, alongside sensor arrays to collect diverse forms of data.
At present, most devices rely on the cloud for their intelligence, with data being transferred to servers thousands of miles away from the local environment. This presents a greater threat surface when it comes to security, not to mention increased latency and overreliance on internet connectivity.
Rather than staying the course with this convoluted process, it would make more sense to embed AI into the device itself. In-built intelligence will make it possible to develop networks of electronics that can make decisions both on their own, and collaboratively with others.
Fortunately, this is becoming possible through a new technology infrastructure known as the Artificial Intelligence of Things (AIoT), where IoT devices combine with automated learning to offer a new range of innovative gadgets.
What's more, XMOS' Edge of Now report – the third in our series of explorations into what engineers think of the AIoT market – suggests the historical market barriers of power and cost are finally being surmounted, which means manufacturers will need to capitalise on the potential this infrastructure offers.
The price is finally right
Development of the silicon needed to support AI once focused on cloud technologies, facilitating server farms inundated with billions of daily requests. Prices scaled rapidly upwards with these requirements.
Moreover, semiconductor shortages related to the pandemic meant that cutting-edge chips were less available than those built on established nodes, driving the cost up even more.
These costs, along with inefficient design specifications, were an impossible balancing act for many – despite potential innovations that could advance an entire industry.
However, a truly smart home means having an infrastructure that can perform simple analysis and mundane tasks, whether it’s turning the lights on or adjusting the temperature. It doesn’t need to be capable of sophisticated big data analysis.
As hardware begins to speak to this need, our research shows that the cost of buy-in is falling. Just 24% of the engineers we surveyed mentioned cost as an impediment to improving on-device processing – only half the 48% who said the same last year, and just over a third of the 64% in 2020. Moreover, only 16% think cost issues complicate adoption of the AIoT.
When power trips come to an end
Cost and energy concerns are clearly related. AI chips are high spec, which means they tend to consume a lot of energy – a problem that can be compounded by other design decisions. Technologies like 'always-on' sensors create headaches for engineers who want to ensure their designs remain energy efficient.
Minimising power consumption means engineers are faced with a difficult choice: either functionality is removed, or the price is increased given the need to procure high-end hardware.
But recent data from our report shows a decline in consumption-related concerns, with just 23% of engineers seeing efficiency as an impediment to increased on-board processing. This marks a significant drop from 53% last year, and 65% in 2020.
What this also implies is that engineers are seeing alternatives to power-hungry system-on-chip solutions that require multiple watts. New products are beginning to emerge that can serve edge processing needs while consuming well below 0.5W, negating many of the power-based challenges that previous hardware presented – which, of course, has a knock-on effect on the price tag.
Paving the way
With both power and cost barriers increasingly feasible to overcome, there is growing optimism towards edge solutions. As a result, we'll be seeing a range of new, innovative use cases in the years to come.
We've found almost two thirds of engineers are developing ideas due for release over the next 12 months, featuring the integrated processing required for AI support. These evolutions will become the norm, with almost two fifths (38%) of engineers confirming their blueprints will allow for the processing required for AI integration. Moreover, 26% of respondents said this will apply to at least 70% of their product ranges.
With the era of AI-enabled creations having finally arrived, it's vital that engineers and designers get access to flexible and affordable components. They'll also need silicon that's cost-effective, complementing ambitions for mass production, while remaining applicable to evolving designs. These developments will help engineers to address shifting market trends without significant supply-related issues.
Successfully bringing embedded intelligence to market will afford manufacturers opportunities worth billions, with a new generation of truly intelligent products that'll make our lives more convenient.
Users will benefit from smart car parking to prevent fines, healthcare instruments to keep us safe, and a range of truly innovative and exceptional applications. With this in mind, it’s genuinely exciting to see the growing level of confidence in the AIoT.
Electronic Specifier has previously covered XMOS's Edge of Now report series – an annual report exploring the barriers that electronics engineers must navigate in order to push the boundaries of on-device AI.