Gaining an Edge with AI

By Dan Zhang

Mouser Electronics

June 12, 2019


AI is starting to move from the cloud to the edge. AI research now offers the intriguing possibility of using thousands of hours of powerful computing resources to train and refine a neural network that can then run on cheap, low-power devices. This makes edge AI devices possible: devices that keep working even when offline, so a network connection is not essential. The approach leverages the power of evolutionary-style neural network training techniques, such as adversarial neural networks.

Training a neural network under constraints on energy and storage naturally produces a network that focuses its limited resources on the most useful pathways while pruning less critical ones; this is ideal for creating a network to run on low-power, low-cost devices. Afterwards, neural network compression techniques can squeeze that trained network down further, reducing memory requirements by as much as 90 percent while barely affecting performance; similar techniques also help reduce the computational load.
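
To make the pruning idea concrete, here is a minimal sketch in Python (NumPy only) of magnitude-based weight pruning: the smallest-magnitude weights in a layer are zeroed until a target sparsity is reached. The weight matrix is random stand-in data rather than a trained model, and the 0.9 sparsity target simply echoes the 90 percent figure above.

    import numpy as np

    def prune_by_magnitude(weights, sparsity):
        """Zero out the smallest-magnitude weights until `sparsity` of them are zero."""
        flat = np.abs(weights).ravel()
        k = int(sparsity * flat.size)
        if k == 0:
            return weights.copy()
        threshold = np.partition(flat, k - 1)[k - 1]      # k-th smallest magnitude
        return np.where(np.abs(weights) <= threshold, 0.0, weights)

    rng = np.random.default_rng(0)
    w = rng.normal(size=(256, 128)).astype(np.float32)    # stand-in layer weights
    w_pruned = prune_by_magnitude(w, sparsity=0.9)        # aim to zero roughly 90 percent
    print(f"zeroed weights: {np.mean(w_pruned == 0):.1%}")

In a real workflow the network would typically be fine-tuned after pruning so the remaining weights can compensate for the ones that were removed.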

Neural network compression can be as simple as reducing the precision of variables, but it also includes analyzing, reorganizing and pruning the network. The surprising result is that massive computing power (which perhaps could also be described as ‘intelligence and experience’) may be distilled down into a form that runs on a cheap, low-power device. (In fact, very broadly speaking, we could draw parallels between this approach and the billions of years of adversarial evolution that eventually encapsulated intelligent life in tiny strands of DNA.)
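
As an illustration of the "reducing the precision of variables" point, here is a hedged sketch of symmetric linear quantization from 32-bit floats to 8-bit integers, again with random stand-in weights; production toolchains use more sophisticated schemes, but the 4x storage saving and the small reconstruction error are representative of the idea.

    import numpy as np

    def quantize_int8(weights):
        """Map float32 weights onto int8 using one shared scale factor."""
        scale = float(np.max(np.abs(weights))) / 127.0    # largest weight maps to +/-127
        q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
        return q, scale

    def dequantize(q, scale):
        """Recover approximate float32 weights from the int8 representation."""
        return q.astype(np.float32) * scale

    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.05, size=(256, 128)).astype(np.float32)
    q, scale = quantize_int8(w)
    w_restored = dequantize(q, scale)

    print(f"storage: {w.nbytes} bytes -> {q.nbytes} bytes")             # 4x smaller
    print(f"max absolute error: {np.max(np.abs(w - w_restored)):.5f}")  # small versus weight scale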

The goals of this edge computing strategy may seem counter-intuitive if you assume that AI implies a human-style intelligence that learns and improves. That’s because many simple edge applications are best served by devices that do not change or improve significantly after deployment – we just want them to perform their tasks reliably and predictably.

For developers of low-cost edge and IoT devices, the AI toolkit is continuously expanding and prices are falling. New products range from AI-capable FPGAs to neural network accelerators such as Redpine’s QueSSence Intelligent Connected Platform and the AAEON UP AI Core X series, which add AI acceleration to typical low-power computing and connectivity modules and are ideal for IoT edge devices.

AI Makes Revolutionary New Products Possible

Imagine how adding AI could enhance everyday devices and the IoT. Onboard, low-cost AI can transform the most mundane devices – for example, adding reliable speech recognition to switches for hands-free operation – but let's also look at some more sophisticated potential applications:

An AI-equipped security camera can quickly learn to recognize familiar faces, pets and so on, and trigger an alert when it sees an unexpected person or animal. The ability to decide whether to sound an alarm even when the network is unreachable, or when it is running on backup power, is obviously a plus for a security device! With machine learning techniques, the role of such a smart security camera could be extended far beyond traditional automated cameras: for example, it could recognize and warn of fires, leaks, structural failures (such as a damaged roof or window) and a variety of other hazards and incidents.
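
To make the "recognize familiar faces on the device" idea concrete, here is a minimal, self-contained sketch: a compressed face network (not shown; random vectors stand in for its output embeddings) turns each face into a feature vector, and the camera keeps a local gallery of known embeddings and compares new ones by cosine similarity. The names, vector size and 0.6 threshold are illustrative assumptions, not a product design.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical on-device gallery: embeddings of familiar faces stored locally.
    gallery = {name: rng.normal(size=128) for name in ("alice", "bob", "cat")}

    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    def check_face(embedding, threshold=0.6):
        """Return the best-matching known identity, or flag an unexpected visitor."""
        best_name, best_score = max(
            ((name, cosine(embedding, ref)) for name, ref in gallery.items()),
            key=lambda item: item[1],
        )
        return best_name if best_score >= threshold else "unexpected - raise alert"

    # Enrolling a new familiar face is just storing its embedding locally.
    gallery["delivery_robot"] = rng.normal(size=128)

    print(check_face(gallery["alice"] + rng.normal(scale=0.05, size=128)))  # matches alice
    print(check_face(rng.normal(size=128)))                                 # likely unexpected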

A trained engine monitoring system can use its onboard AI power to detect anomalies and optimize performance by integrating sensor readings, analyzing patterns of vibration and other cues. Such analysis from broad sensor fusion is beyond the abilities of traditional algorithms and may surpass the performance of a human engineer in some situations. The ability of a localized AI to work without network connectivity allows ultra-low latency response for fine-tuning performance in real-time and is a huge benefit where network connectivity is difficult, due to remote location or radio frequency interference, and in mobile applications, such as vehicles and aircraft.
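
The paragraph above is about fusing several sensor channels and flagging departures from normal behaviour. A neural network would usually do this job on the device, but the idea can be sketched with a much simpler statistical stand-in: fit a Gaussian to feature vectors collected during normal operation (vibration level, temperature and RPM are assumed example channels) and score new readings by Mahalanobis distance.

    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in training data: fused sensor vectors logged while the engine ran normally.
    normal = rng.normal(loc=[0.2, 80.0, 3000.0], scale=[0.02, 2.0, 50.0], size=(500, 3))

    mean = normal.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(normal, rowvar=False))

    def anomaly_score(reading):
        """Mahalanobis distance of one fused reading from normal behaviour."""
        d = reading - mean
        return float(np.sqrt(d @ cov_inv @ d))

    THRESHOLD = 4.0                                   # illustrative, tuned offline

    for reading in (np.array([0.21, 81.0, 3010.0]),   # typical operation
                    np.array([0.45, 95.0, 2800.0])):  # unusual vibration and temperature
        score = anomaly_score(reading)
        print(f"score={score:6.1f} -> {'ANOMALY' if score > THRESHOLD else 'ok'}")

Because everything runs locally, a check like this can execute on every sensor sample with no network round trip.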

Ghosts in the Machine: Issues and Solutions

AI can be unpredictable and may go awry in edge cases that we have not had the opportunity to test. Consider methods of mitigating this, such as providing users with a fall-back mode: switching off the 'smart' AI lets the device fall back to simpler, old-fashioned algorithms, so it is still functional and useful, albeit without the advanced AI features. Moreover, while on-board intelligence is a key selling point, developers can still weigh the pros and cons of when to make decisions on the edge and when to push them back to the cloud or request human guidance (in fact, that choice itself might be guided in real time by AI on the device).
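
A minimal sketch of that fall-back pattern, assuming the on-device model reports a confidence score: use the AI path when the user has it enabled and the model is confident, otherwise drop back to a fixed-threshold rule. run_model here is a placeholder, not a real inference API.

    import random

    CONFIDENCE_FLOOR = 0.8     # illustrative cut-off for trusting the AI path

    def run_model(reading):
        """Placeholder for on-device neural network inference: (label, confidence)."""
        confidence = random.uniform(0.5, 1.0)
        return ("alert" if reading > 0.7 else "ok"), confidence

    def simple_rule(reading):
        """Old-fashioned fixed-threshold fallback that works with the AI switched off."""
        return "alert" if reading > 0.9 else "ok"

    def classify(reading, ai_enabled=True):
        if ai_enabled:
            label, confidence = run_model(reading)
            if confidence >= CONFIDENCE_FLOOR:
                return label
        return simple_rule(reading)   # user disabled the AI, or the model was unsure

    print(classify(0.95))                     # AI decision if confident, else the rule
    print(classify(0.95, ai_enabled=False))   # user-selected fall-back mode

The same gate is a natural place to decide when to defer a decision to the cloud or to a human rather than to the local rule.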

Dumb Devices Are About to Get Smart

The potential for packaging powerful, pre-trained AI into low-power, low-cost devices offers almost unlimited opportunities to transform mundane commodity devices, add value to them, and develop new products and new markets.