Modern vehicles need to provide drivers with timely, accurate ADAS and navigation data, but not so much that it distracts from the road. Passengers want to be entertained by an ever-increasing number of smartphone-like infotainment apps. Both drivers and passengers now expect that virtual assistant capabilities are always available.
All of this is changing automotive HMI design at a time when automotive engineering itself is changing. The software-defined car concept is promoting the release of new features and capabilities through over-the-air firmware updates, drastically shortening traditional time-to-market windows. More and more suppliers are now involved in the creation of a single vehicle model, meaning that software from multiple vendors must be both reliable and interoperable.
To navigate this environment in a way that satisfies drivers’ safety needs, passengers’ entertainment expectations, and automakers’ brand requirements, automotive UI developers need flexible, end-to-end HMI design solutions.
These are being enabled by more integrated, consolidated automotive subsystems.
The Multimodal Vehicle Experience
As more powerful multicore processor platforms are qualified for automotive use, subsystems that were traditionally isolated are being consolidated onto the same physical hardware. In the case of automotive HMIs, you can see this happening with the convergence of infotainment systems and vehicle instrument clusters, sometimes on the same display.
What this means from a UI development standpoint is that fewer pieces of hardware must be able to support an increasing amount of mixed-criticality content that provides both safety information and entertainment. To achieve this while remaining open and scalable, software engineering has to become more modular.
And that’s not just in terms of housing multiple applications on a single display.
“Screens are getting bigger, allowing for more content. And these apps will have to be integrated with our traditional automotive UIs as well that have different lifecycles,” said Bruno Grasset, Head of Product Management, User Experience, Elektrobit. “HMIs need to be more modular. But modularization is not only needed when you create the UI on your development machine or create it for certain subsystems, it is also needed in the car at runtime.”
Elektrobit’s EB GUIDE 6.8 is an HMI development tool designed for the creation of multimodal in-vehicle head units, head-up displays (HUDs), instrument clusters, and IVI platforms (Figure 1). The tool supports 2D and 3D graphics, animations, and visual effects, and is packaged with testing and monitoring software to accelerate the simulation and debugging of robust HMI stacks.
On the modularization front, EB GUIDE 6.8 replaces the practice of building large, monolithic models that contain all of a platform’s required functionality; instead, projects are divided into smaller, more manageable blocks. These smaller models run independently within the EB GUIDE Graphics Target Framework (GTF), a runtime environment that executes them in integrated HMIs.
The models are tied to each other and the rest of the system using a well-defined interface, allowing developers to create, update, manage, and maintain subsets of the overall HMI design rather than making minute changes that may impact the entire stack.
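The decoupling idea can be sketched in code. The following C++ example is purely illustrative: the class names (`VehicleDataSource`, `OilTempWidget`, `StubSource`) are hypothetical and are not part of EB GUIDE’s actual API. It shows how two modules that agree only on a small interface can be developed and tested independently.

```cpp
#include <cassert>
#include <memory>
#include <string>

// Hypothetical sketch -- not EB GUIDE's API. The agreed-upon contract
// between two HMI sub-models is captured in one small interface.
struct VehicleDataSource {
    virtual ~VehicleDataSource() = default;
    virtual double oilTempCelsius() const = 0;  // the shared contract
};

// The cluster widget depends only on the interface, never on the
// concrete provider, so the providing module can change freely.
class OilTempWidget {
public:
    explicit OilTempWidget(std::shared_ptr<VehicleDataSource> src)
        : src_(std::move(src)) {}

    std::string render() const {
        return "Oil: " +
               std::to_string(static_cast<int>(src_->oilTempCelsius())) +
               " C";
    }

private:
    std::shared_ptr<VehicleDataSource> src_;
};

// A stub provider lets one team build and test its widget before the
// real data module from another supplier even exists.
struct StubSource : VehicleDataSource {
    double oilTempCelsius() const override { return 92.0; }
};
```

Once the interface is frozen, either side can be swapped, updated over the air, or owned by a different supplier without touching the other.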
The models can be executed in the same process, different partitioned processes, separate devices altogether, or a combination of the three. With the correct hardware and virtualization support, this allows safety- and security-critical visuals to run alongside infotainment-centric software on the same processor and display.
Just as important, it allows automakers and their suppliers to develop separate portions of a display unit simultaneously – even collaboratively – to reduce overhead and fast-track product delivery.
“And this feature has been implemented for the purpose of providing more independent content on the same screen or multiple screens, but having different life cycles, different requirements in terms of updates including safety and security,” Grasset explained.
“So, if you want to present oil temperature in a different way tomorrow, modularization enables that by allowing OEM, Tier 1, or UI developers to isolate the HMI into multiple pieces so that they can focus on one piece of information,” he continued. “The developers just agree on the interface and what needs to be shared with the rest of the system. But once the interface is defined, each developer can focus on what they need to do without worrying about the rest of the system.”
This gives development teams far more independence, and greater productivity, because no team has to wait for another to finish its design.
To assist with such multi-tenant development scenarios, Elektrobit has also introduced a feature called “namespaces” in EB GUIDE 6.8 that allows software engineers to group objects so that they don’t conflict with objects from other projects.
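The effect is analogous to namespaces in a programming language. This sketch uses C++ namespaces as an analogy only (the namespace and class names are invented for illustration, not taken from EB GUIDE): two teams can each define an identically named object without a collision.

```cpp
#include <cassert>
#include <string>

// Analogy only: EB GUIDE's "namespaces" group model objects per
// project so identically named objects from different teams do not
// collide -- much like C++ namespaces do for types.
namespace cluster_hmi {
struct SpeedLabel {
    std::string text() const { return "120 km/h"; }
};
}  // namespace cluster_hmi

namespace media_hmi {
struct SpeedLabel {  // same name, no conflict: different namespace
    std::string text() const { return "Track 3 of 12"; }
};
}  // namespace media_hmi
```

Each team works inside its own group; only fully qualified names are visible to the integrated project, so independently developed blocks can be merged without renaming.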
AR: Another Gear in Automotive HMIs
To further accelerate HMI development, Elektrobit plans to release a cloud-based version of EB GUIDE in the future. Beyond that, the company is looking at how automotive HMIs are evolving in general.
Currently, EB GUIDE supports voice, touch, and gesture interfaces, which the company sees as critical components on the path toward augmented reality-powered automotive UIs. And probably sooner than you’d think.
“There is already hardware that can execute augmented reality being integrated into cars,” Grasset said. “We’re not talking about what you see on your smartphone or on TV, we’re talking about technology that can display something in your field of vision that you interpret as something being highlighted. For example, an indication on the road. That is something that is coming in the mid-term for automotive applications. This really has a lot of potential for new use cases in the car, and EB GUIDE already supports them.”
If you’d like to learn more about EB GUIDE 6.8, visit https://www.elektrobit.com/ebguide/.