We are excited to bring Transform 2022 back in-person July 19 and virtually July 20 - 28. Join AI and data leaders for insightful talks and exciting networking opportunities. Register today!


Artificial intelligence (AI) is generally viewed as a big-computing solution as it makes the leap from the lab to production environments. In the public consciousness, AI means complex algorithms crunching vast amounts of data drawn from hyperscale cloud resources, all of which promises profound, transformative changes to business processes and models.

Lately, however, a different form of AI has emerged: narrower in focus and smaller in footprint. It’s called embedded AI, and because it lives on the device, the SoC or even the processor itself, it is by nature broadly distributed, particularly out on the edge. This gives it the potential to be an even more significant advancement than enterprise AI, supporting life-changing applications ranging from autonomous vehicles to the metaverse.

Positive outlook

The global embedded AI platforms market is expected to grow 5.4% per year to reach a market value of $38.8 billion by 2026, according to Maximize Market Research. The healthcare sector is expected to be the leading consumer, with applications in key initiatives like digital consultation and robotic surgery. Retail and ecommerce will also be key drivers, where embedded AI can streamline web search and order fulfillment for customers.

These may just be the most visible examples, however. Fierce Electronics’ Cabe Atwell notes that embedded AI has applications across the full economic spectrum, from agriculture to manufacturing to finance. By pushing AI beyond the cloud and onto devices, it becomes more democratized and better able to handle the specific functions each device was designed for. It also streamlines architectures and boosts overall performance by speeding up calculation times and lowering energy consumption, while at the same time providing the data that more centralized platforms need to oversee predictive maintenance and increase operational efficiency.

It’s important to note that embedded AI will not replace cloud platforms. In many ways, it will only add to the cloud’s ability to leverage AI at its own layer, since embedded AI on the edge will need strong support from intelligent cloud resources in order to drive new services and create value.


Indeed, we should see embedded solutions make their way into cloud and hyperscale environments as well. IBM recently added on-chip AI to its z16 mainframe, specifically trained to speed payments to the point where all transactions can be run through fraud analysis. With earlier technology, latency and throughput constraints allowed only about 10% of transactions to be scrutinized. The aim is not only to flag suspicious activity, but to more quickly and accurately approve legitimate requests.
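The practical difference is where the scoring happens. This is an illustrative sketch only (not IBM's actual design, and the scoring rule is invented): when inference is fast enough to sit in the approval path itself, every transaction can be scored inline rather than a sampled fraction being sent off for later analysis.

```python
# Toy sketch of inline fraud scoring. The fraud_score rule below is a
# made-up stand-in for a call to an on-chip inference engine.

def fraud_score(txn):
    """Flag large transfers to previously unseen payees."""
    score = 0.0
    if txn["amount"] > 10_000:
        score += 0.6
    if txn["new_payee"]:
        score += 0.5
    return min(score, 1.0)

def process(txn, threshold=0.8):
    # With low-latency embedded inference, scoring happens inside the
    # approval path -- 100% coverage instead of a sampled subset.
    return "review" if fraud_score(txn) >= threshold else "approve"

decisions = [process(t) for t in [
    {"amount": 50, "new_payee": False},
    {"amount": 25_000, "new_payee": True},
]]
```

The key design point is latency: inline scoring only works if the model answers within the transaction's time budget, which is exactly what on-chip acceleration buys.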

Device learning and embedded AI

Perhaps the most significant advance in embedded AI in recent years has been the ability to streamline machine learning models to the point where they no longer consume substantial processing and data resources. Systems engineering firm DAC notes that embedded machine learning (E-ML) lays the groundwork for an entirely new edge that can employ deep learning and neural networking to become smarter and more responsive over time. This will make edge services less costly while at the same time improving performance and enhancing critical functions like security and privacy.
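One common streamlining step is quantization: storing model weights as 8-bit integers instead of 32-bit floats, cutting memory roughly fourfold. The sketch below shows only the core idea with a single per-tensor scale; production toolchains add calibration and per-channel scales.

```python
import numpy as np

# Minimal post-training quantization sketch: float32 weights -> int8
# plus one scale factor, then reconstruction for inference.

def quantize_int8(w):
    """Map float weights onto the int8 range [-127, 127]."""
    scale = np.abs(w).max() / 127.0
    q = np.round(w / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

w = np.array([0.5, -1.2, 0.03, 0.9], dtype=np.float32)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)
# int8 storage is a quarter the size of float32, and the
# reconstruction error is bounded by half the scale factor.
```

The trade-off is a small loss of precision in exchange for a model that fits in the kilobytes of memory an embedded device actually has.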

Already, device manufacturers are looking to leverage E-ML and other technologies to bring unsupervised learning to the edge. French designer Cartesiam recently launched a software platform capable of building AI algorithms for ARM processors with as little as 4 KB of RAM. Now, instead of having to shuttle data back to a massive, central AI engine, the device can learn on its own using data from its own embedded sensors. As with other applications, this lowers the cost of implementing AI on distributed architectures and speeds up the learning process by containing it entirely within the device.

For the most part, embedded AI will operate behind the scenes, quietly smoothing out system hiccups that would otherwise go unnoticed. It might lend some support to flashier applications like natural language processing and intelligent analytics, but mostly it will focus on the finite responsibilities of its host device.

Taken together, however, embedded AI could be a powerhouse – changing the world in ways that aren’t even on the drawing board yet.
