Are graphics finally important in embedded systems?

March 18, 2016

For decades, graphics capability was the also-ran in embedded systems, required only for development or for driving a crude, basic GUI. Now expectations have moved on to immersive multi-touch and animated/video experiences, even in the embedded and industrial space.

Arguably, what used to be known as “embedded” meant a headless system, one invisible to the user and deeply embedded into a device. Through the decades, the user interface has evolved from basic power-on/activity LEDs, to calculator-esque liquid crystal displays, to full-color LCDs, and now to e-ink displays that slash power consumption.

As such, it’s only fairly recently that embedded computing solutions have required any real graphical prowess. Before then, applications that demanded it either suffered cumbersome industrial systems that supported high-speed peripheral bus expansion to host a commercial graphics card, or chanced a completely commercial PC and accepted the consequences.

With power consumption always a driving factor in embedded systems, the problem wasn’t the ability to integrate high-performance graphics: various high-speed peripheral buses have long been available to embedded designers. However, they not only multiplied wattage requirements, but also created a thermal dissipation challenge – so most designers didn’t bother.

Intel led the way by investing heavily in improving their own integrated graphics chipsets. The Intel HD Graphics chipsets we see today are impressive enough to cover the majority of today’s embedded graphical applications – struggling only with rendering complex 3D graphics at high resolutions. For those remaining applications, the behemoth graphics card manufacturers scrambled to reduce power consumption and make their top-end products accessible to the embedded space. AMD made an interesting decision in choosing not to pit their embedded processor, the G-Series, against the Intel Atom purely on raw CPU performance; instead, they focused on graphical capability, claiming eightfold the graphical performance of their rival. This was enabled by their shrewd purchase of ATI, driving Radeon technology into embedded systems as part of their G-Series and R-Series ranges.

However, in my experience the AMD alternative hasn’t shaken the ubiquitous popularity of the Intel Atom. Is it that graphical performance remains insufficiently important across our industry to make any real impact yet? Or is it that familiarity and comfort with the omnipresent Intel Atom is such that designers shy away from change? Debatably, it’s the same burden of familiarity that held back the tidal wave that is ARM; for so long, it felt alien to those who’d spent their careers developing under x86 architectures.

The popularity of smartphones employing projected capacitive (PCAP) touchscreens drove that technology into our industry, pushing aside the resistive counterpart that had reigned supreme for years before. Could it be that the expectation for high-resolution displays born of the same devices (mine is now 1080p on a 5.5-inch display) is what drives HD displays in embedded and industrial systems? Is that what will truly advance the necessity of high-performance graphics in our factories and workplaces?

In the past few months I’ve seen a sudden surge of interest in dual displays in such systems, often employing one display to control (via touchscreen) and the second purely to monitor. Such applications, all else being equal, require at least double the graphics bandwidth of yesteryear’s single-screen designs.
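
To put a rough number on that, here is a minimal back-of-the-envelope sketch in Python. It assumes uncompressed 24-bit RGB at a 60 Hz refresh rate and counts active pixels only (real links such as HDMI or DisplayPort carry additional blanking overhead); the function name and figures are illustrative, not drawn from any specific design.

```python
# Back-of-the-envelope display bandwidth estimate. Illustrative assumptions:
# uncompressed 24-bit RGB, 60 Hz refresh, active pixels only (real links such
# as HDMI or DisplayPort add blanking overhead on top of these figures).

def display_bandwidth_gbps(width, height, bits_per_pixel=24, refresh_hz=60):
    """Raw pixel bandwidth in gigabits per second."""
    return width * height * bits_per_pixel * refresh_hz / 1e9

single = display_bandwidth_gbps(1920, 1080)  # ~2.99 Gbit/s per 1080p panel
dual = 2 * single                            # ~5.97 Gbit/s for two panels

print(f"Single 1080p@60: {single:.2f} Gbit/s")
print(f"Dual 1080p@60:   {dual:.2f} Gbit/s")
```

Under those assumptions, a single 1080p panel at 60 Hz needs roughly 3 Gbit/s of raw pixel data, and a dual-display design roughly 6 Gbit/s – before any 3D rendering load is even considered.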

Rory Dear, European Editor/Technical Contributor