Human machine interfaces (HMIs) have become omnipresent in new vehicle models. Used initially as a replacement for the car radio and cassette/CD entertainment unit, they quickly added satellite navigation capabilities, turning them into infotainment systems. Designs vary: some opt for a tablet-style, landscape-oriented display, while others are seamlessly integrated into the vehicle’s instrument cluster. Typically, they share the sleek glass touchscreen approach used by our smartphones and tablets, making the HMI feel familiar and easy to use. Today, an infotainment system further integrates smartphone functions through apps such as Android Auto and Apple CarPlay. As more technology is added to our vehicles, from advanced driver assistance systems (ADAS) to changing the color of the interior trim lighting, the infotainment HMI has become a core part of the vehicle’s cockpit and control mechanism. Integrating more control functions has also allowed vehicle manufacturers to remove traditional mechanical buttons and switches from the dashboard, further advancing the sleek and stylish aesthetics of a vehicle’s interior design. Here, we’ll look at various automotive HMI user interfaces and what they mean for the driver’s user experience:
Critical Factors of Automotive HMI Design
Several primary requirements drive the design of automotive HMIs, of which safety is the most important. For an HMI, safety includes preventing driver distraction. Ready access to a plethora of functions and features via the HMI can result in the driver momentarily taking their eyes off the road to select a different music track or read a text message. The design of the HMI user interface (UI) should be intuitive and easy to operate so that the driver is distracted for the absolute minimum of time. The development of standards specific to automotive HMIs is still in its infancy. However, the international standard ISO 15005:2017 stipulates that glances of 1.5 seconds should be long enough for a driver to gather information from a vehicle’s transport information and control system. Some HMIs monitor driver interaction while the vehicle is moving. If it appears the driver is spending too much time selecting different functions or accessing specific features, such as news headlines, the HMI issues a warning to the driver. In this scenario, the touchscreen might have been the input method, although using a voice assistant would be a safer option for certain features.
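The interaction-monitoring behavior described above can be sketched as a simple watchdog. This is an illustrative sketch only: the 1.5-second figure comes from the ISO 15005 guidance cited above, but the cumulative 6-second task budget, the class name, and the warning strings are hypothetical.

```python
# Single-glance budget reflects the 1.5 s figure cited from ISO 15005;
# the cumulative per-task budget is a hypothetical illustrative value.
GLANCE_BUDGET_S = 1.5
TASK_BUDGET_S = 6.0


class DistractionMonitor:
    """Tracks how long the driver interacts with the HMI during one task."""

    def __init__(self):
        self.task_total_s = 0.0

    def record_glance(self, duration_s):
        """Record one touch interaction; return any warnings it triggers."""
        warnings = []
        self.task_total_s += duration_s
        if duration_s > GLANCE_BUDGET_S:
            warnings.append("single-glance limit exceeded")
        if self.task_total_s > TASK_BUDGET_S:
            warnings.append("cumulative interaction limit exceeded")
        return warnings

    def task_complete(self):
        """Reset the accumulator when the driver finishes the task."""
        self.task_total_s = 0.0
```

A real system would tie the reset to gaze tracking or vehicle state rather than an explicit task-complete call; the sketch only shows the budgeting logic.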
Another driver of HMI adoption is the need to provide a more straightforward interface to a host of sophisticated features. This requirement applies equally to semi-autonomous and fully autonomous vehicles, where the controls needed are very different from those required by a human driver. Likewise, the gauges and dials for a fossil fuel-powered vehicle differ from those required for an electric or hybrid vehicle. LCD touchscreens provide an extremely flexible HMI on which to create an intuitive UI that is informative and easy to operate. The overall user experience (UX), when carefully planned and designed, yields an engaging visual architecture that eases navigation.
For the vehicle manufacturer, the HMI provides a way to differentiate their vehicle platforms from those of other manufacturers. The UX has a large part to play in this, where a standard UI is developed across all the manufacturer’s different vehicle marques. Developing custom icons, symbols, and fonts can be a chore for the embedded developer. Still, this attention to detail is easy for consumers to compare and contrast, and is increasingly part of the vehicle selection process.
For the vehicle owner, a flexible HMI permits the personalization of the vehicle’s various features, creating an experience that suits their mood and the length of the journey ahead, and that brings the manufacturer’s brand experience to life. Comfort and interior lighting controls are popular personalization features, as is the driving mode (normal, eco, sport, etc.), and in high-end marques, the firmness of the suspension.
The HMI designer has a wide range of tried and tested technologies readily available on which the UI can be built and controlled. Projected capacitive touch (PCAP) sensing controls are the norm, closely followed
by voice-operated controls. Complementing these input methods are haptics, gestures (touch and in-air), proximity and force detection, eye-tracking, and machine learning.
The use of machine learning-based voice recognition applications is predicted to grow significantly in the coming years. Estimates by Tractica, a research company, forecast that artificial intelligence-based voice-enabled smart assistants will play a major role in HMI, with the global market worth $4.6 billion (USD) by 2025. Within just a few years, 80 percent of vehicle HMIs will integrate a voice recognition system, and that doesn’t include smartphone assistant applications such as Google Assistant and Apple’s Siri. Voice commands are most commonly used to control the media player, set the desired destination in the navigation function, and operate an attached smartphone to make a call. As machine-learning algorithms advance, voice recognition will become reliable enough to control ADAS functions and comfort settings. The availability of high-performance automotive-qualified application processors and inference engines will permit the use of more sophisticated natural-language algorithms that more readily accommodate regional accents and a more comprehensive range of voices. In addition to neural networks, digital voice processing techniques will help remove background noise from other vehicle occupants, tire and wind noise, and reject spurious commands from radio programs and other media sources. The voice UI also requires a configurable voice output for confirming spoken commands, asking pertinent questions, and holding other two-way dialogues. From a safety perspective, using voice controls makes perfect sense, allowing the driver to keep their eyes on the road rather than becoming distracted by options on a touchscreen.
Projected capacitive touch technologies are well advanced, with the ability to manage large display surfaces. For example, Tesla uses a 15-inch portrait-oriented touchscreen. Screens need to accommodate a wide range of ambient light conditions, from bright sunshine to darkness, so high-contrast, high-brightness display panels are critical to deploying a successful UI. Small touchscreens are not recommended in vehicles since their cramped layouts demand more of the driver’s attention; that said, large screens that present too much information at once can be equally distracting. UI/UX design is crucial and requires an experienced team that understands how a driver will interact with the UI with minimal distraction. Safety and usability are paramount. PCAP functions are routinely available integrated into a microcontroller or application processor. Multi-touch capacitive touch controllers are the norm, so finger gestures such as pinch and stretch are accommodated. Touch button, slider, and wheel UI features are implemented as touch surfaces, and audible tones give the user feedback during operation.
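As a simplified illustration of how host software above a multi-touch controller might distinguish a pinch from a stretch, the sketch below compares the distance between two touch points at the start and end of a gesture. The 10 percent threshold and the function name are illustrative assumptions, not any particular vendor’s API.

```python
import math


def classify_two_finger_gesture(start, end, threshold=0.1):
    """Classify a two-finger gesture from its start and end touch points.

    start, end: pairs of (x, y) coordinates, one per finger.
    Returns "pinch", "stretch", or None if the distance change is below
    the (illustrative) 10 % threshold.
    """
    def finger_distance(points):
        (x1, y1), (x2, y2) = points
        return math.hypot(x2 - x1, y2 - y1)

    d0, d1 = finger_distance(start), finger_distance(end)
    if d0 == 0:
        return None                     # degenerate touch, ignore
    change = (d1 - d0) / d0             # relative distance change
    if change > threshold:
        return "stretch"                # fingers moved apart (zoom in)
    if change < -threshold:
        return "pinch"                  # fingers moved together (zoom out)
    return None
```

A production touch stack would classify continuously from a stream of touch reports rather than just the endpoints, but the underlying distance comparison is the same.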
Enhancements to touchscreens include the use of vibrating haptic actuators to provide user feedback, proximity detection, and force detection. Proximity detection can activate the touchscreen or a set of options as a finger approaches the screen. Force detection is a relatively new development that creates a 3D user experience suited to some applications more than others. For example, it allows a more positive virtual push of a button icon rather than just placing the finger on the touchscreen. Multi-touch force detection is also beneficial when the driver might be wearing gloves.
Haptic technology for automotive applications is still advancing, and some high-end vehicle manufacturers are incorporating haptics in their HMIs. One example is in-air haptic technology, where hand-tracking techniques and an array of ultrasonic speakers are used to form and track an ultrasonic pattern that the human hand can feel. In this way, various ultrasonic 3D shapes, such as a circular control knob, can be formed in the air that the fingertips can feel and appear to hold. Typically, the shapes are formed a couple of inches above a control surface and create a virtual touch experience. Hand-tracking algorithms then detect which way the knob is turned or pushed to complete the interaction.
Technical Design Considerations for Automotive HMIs
Several aspects of an HMI’s operating environment need particular attention in automotive applications. These include changeable environmental conditions, such as extremes of temperature and humidity, and physical factors such as dirt, dust, and vibration. Touchscreens are prone to erratic behavior should a film of condensation form on the glass. Likewise, operation with damp or wet fingers can result in the UI freezing until any remaining moisture is removed. At the other extreme of humidity, dry air increases the chances of a static discharge occurring as the driver touches the screen. Any automotive HMI application needs thorough testing across a wide range of temperature and humidity conditions to ensure safe, robust, and reliable operation. HMI touchscreens should be easy to clean, allowing everything from sticky-sweet residue and soft-drink spills to marks from moisturized hands to be wiped away. The display brightness needs to compensate quickly and automatically when driving through tunnels and into bright sunshine.
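Automatic brightness compensation is often implemented as a mapping from an ambient-light-sensor reading to a backlight level. A minimal sketch, assuming a logarithmic mapping (to mirror the eye’s roughly logarithmic response) and illustrative breakpoints of 1 lux (dark cabin) to 100,000 lux (direct sunlight):

```python
import math


def lux_to_backlight(lux, min_pct=5.0, max_pct=100.0):
    """Map an ambient-light-sensor reading (lux) to a backlight duty cycle.

    The logarithmic mapping and the 1 .. 100,000 lux breakpoints are
    illustrative assumptions, not values from any particular datasheet.
    """
    lux = max(lux, 1.0)                  # clamp: treat < 1 lux as darkness
    decades = math.log10(100_000)        # span of five decades of ambient light
    frac = min(math.log10(lux) / decades, 1.0)
    return min_pct + frac * (max_pct - min_pct)
```

In practice, the sensor reading would also be low-pass filtered so the backlight ramps smoothly through a tunnel entrance instead of flickering.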
Road and cabin noise are other considerations for AI-based voice-recognition interfaces. Cabin noise can include conversations between other occupants of the vehicle as well as sound from the infotainment system itself. The microphones that listen for voice commands need to be optimally placed and physically isolated from the vehicle’s body so that road noise, such as low-frequency rumble and droning tire noise, is kept to a minimum. Audible noise from heating and air-conditioning vents can also impact voice-command performance, and if any of the vehicle’s windows are open, high-frequency rushing background noise becomes prevalent. Digital signal processing techniques and filters can help isolate the desired human speech signals for subsequent processing by the voice-recognition neural network algorithms.
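As a minimal illustration of the kind of filtering involved, the sketch below applies a first-order high-pass filter that attenuates low-frequency road rumble while passing the speech band. A production voice front end would use far more sophisticated multi-stage filtering, echo cancellation, and beamforming; the cutoff frequency here is an illustrative assumption.

```python
import math


def highpass(samples, cutoff_hz, sample_rate_hz):
    """First-order (single-pole) high-pass filter over a list of samples.

    Attenuates content well below cutoff_hz (e.g., tens-of-Hz road rumble)
    while passing the speech band largely unchanged. Illustrative sketch only.
    """
    rc = 1.0 / (2 * math.pi * cutoff_hz)   # filter time constant
    dt = 1.0 / sample_rate_hz              # sample period
    alpha = rc / (rc + dt)
    out = [0.0]
    for i in range(1, len(samples)):
        # Standard discrete RC high-pass recurrence.
        out.append(alpha * (out[-1] + samples[i] - samples[i - 1]))
    return out
```

With a 200 Hz cutoff at an 8 kHz sample rate, a 30 Hz rumble tone is attenuated to roughly 15 percent of its amplitude, while a 1 kHz tone passes nearly untouched.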
Other technical considerations include ensuring the HMI conforms to various safety standards and regulatory requirements, such as those covering EMI and EMC. With the increasing use of wireless communication in the vehicle, including cellular, Wi-Fi, and Bluetooth®, the potential for unwanted interference is high. Also, in electric and hybrid vehicles, high dV/dt transients from the electric motor drive chain can be a significant source of noise.
Some aspects of a vehicle’s ADAS functions will fall under the remit of the automotive functional safety standard ISO 26262. This standard serves to highlight risks in any software-based system that controls a vehicle’s operation and examines three criteria: the severity of the potential harm, the probability of exposure to it, and how controllable the hazardous situation is. Risk assessments are divided into four automotive safety integrity levels (ASIL), with ASIL A representing the lowest risk and ASIL D the highest.
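The classification can be illustrated in code. ISO 26262 assigns an ASIL from three classes: severity (S1 to S3), probability of exposure (E1 to E4), and controllability (C1 to C3); the sum-based shortcut below reproduces the standard’s risk-graph table for those classes. This is a simplified sketch: the standard also defines S0, E0, and C0 classes, which always map to QM (quality management, i.e., no ASIL requirement).

```python
def asil(severity, exposure, controllability):
    """Map ISO 26262 risk-graph classes to an ASIL.

    severity: 1..3 (S1..S3), exposure: 1..4 (E1..E4),
    controllability: 1..3 (C1..C3).
    A combined score of 7 gives ASIL A, rising to 10 for ASIL D;
    anything lower is QM (quality management only).
    """
    total = severity + exposure + controllability
    if total < 7:
        return "QM"
    return {7: "ASIL A", 8: "ASIL B", 9: "ASIL C", 10: "ASIL D"}[total]
```

For example, a severe hazard (S3) in a very common driving situation (E4) that is difficult to control (C3) sums to 10 and is classed ASIL D.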
Automotive HMIs have transformed the way we interact with our vehicles. As cars are fitted with more sophisticated infotainment, communication, and ADAS control features, the HMI has become the principal control panel for drivers and passengers. The use of voice recognition is predicted to increase rapidly in the coming years, improving driver safety and made possible by a new generation of powerful AI processors and inference engines.
Robert Huntley is an HND-qualified engineer and technical writer. Drawing on his background in telecommunications, navigation systems, and embedded applications engineering, he writes a variety of technical and practical articles on behalf of Mouser Electronics.