Simplicity should be a key objective of future human-machine interfaces (HMIs) in ACES (autonomous, connected, electric, shared) vehicles. Simplicity is the way to provide an optimal user experience, speed introduction by shortening development time, build and bolster trust, and drive up feature usage, acceptance, and demand.
According to panel discussions and expert presentations at the recent WardsAuto User Experience conference near Detroit, new technology needs to be presented to users as simply as possible to avoid causing cognitive stress. That applies across the board: occupant monitoring, safety systems, voice assistants, smart surfaces, 3D imaging, and links to connected services, including MaaS (mobility as a service).
User trust in the system is crucial to helping users behave safely, especially during the transition between manual and assisted or automated driving.
Level 3, the first "hands-off" level, will need full driver acceptance before drivers actually remove their hands from the wheel, and that level of trust will take time to build. Autoliv senior engineering director Rich Matsu says his company expects L3 and L4 vehicles, introduced slowly, to account for only about 10% of global vehicle sales by 2030: "We see Level 2 as being dominant for the foreseeable future. It's a big focus of our organization and the industry overall," he says. At L2 the driver must remain alert enough to quickly take over from the automated system when needed, so driver trust and a reliable driver monitoring system are critical.
Matsu thinks at least 7 seconds will be needed between first alerting the driver to resume control of the vehicle and the driver doing so satisfactorily. That's a challenge as L2 systems begin to approach L3 capabilities; think of the advanced adaptive cruise control systems from Audi, BMW, Tesla, Cadillac, and others that take much of the highway driving load off the human driver. As these highway-pilot systems grow more sophisticated and allow drivers to focus away from the road for longer periods, regaining their attention and control becomes a longer and more difficult task.
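Matsu's 7-second figure can be pictured as a takeover-request budget that the vehicle spends on escalating alerts. The sketch below is purely illustrative: the class, stage names, and stage thresholds are invented for this article, not Autoliv's design.

```python
from enum import Enum, auto


class AlertStage(Enum):
    NONE = auto()     # driver has control; no alert needed
    VISUAL = auto()   # e.g. steering-wheel light
    AUDIBLE = auto()  # chime added
    HAPTIC = auto()   # seat/wheel vibration; nearing the takeover deadline


class TakeoverRequest:
    """Escalating takeover-request timer.

    Budgets at least TAKEOVER_BUDGET_S seconds between the first alert
    and the moment the driver must have control, per the 7-second figure
    Matsu cites. The 2 s / 4.5 s escalation points are hypothetical.
    """

    TAKEOVER_BUDGET_S = 7.0

    def __init__(self, now: float):
        self.started = now  # timestamp of the first alert, in seconds

    def stage(self, now: float, hands_on: bool, eyes_on_road: bool) -> AlertStage:
        # If driver monitoring confirms the driver is back, stand down.
        if hands_on and eyes_on_road:
            return AlertStage.NONE
        elapsed = now - self.started
        if elapsed < 2.0:
            return AlertStage.VISUAL
        if elapsed < 4.5:
            return AlertStage.AUDIBLE
        return AlertStage.HAPTIC  # approaching the 7 s budget
```

The point of the sketch is the shape of the logic, not the numbers: the alert escalates over the budget window, and a driver monitoring system (hands plus gaze) is what ends the request.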
Matsu and Michael Godwin, Osram's North American director of visible-light LED products, both say the car's steering wheel is emerging as the new HMI hub.
Take GM's Super Cruise, available on various Cadillac models. It's an L2+ ACC system that monitors the driver's eyes with an infrared sensor and signals system status through lights along the steering wheel, whether that's "all systems go" or a warning that the system is about to disengage because the driver's eyes aren't on the road. Autoliv's Matsu says steering-wheel interactivity like this is "going to be a more prevalent feature in the future."
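The light-bar behavior described above amounts to a small state mapping from system engagement and driver gaze to a display state. The following is a hypothetical illustration only, not GM's actual Super Cruise logic; the state names and the eyes-off-road thresholds are invented.

```python
from enum import Enum, auto
from typing import Optional


class LightBar(Enum):
    GREEN = auto()           # system engaged, all systems go
    FLASHING_GREEN = auto()  # system requesting the driver's attention
    FLASHING_RED = auto()    # system disengaging: take over now


def light_bar_state(engaged: bool, eyes_off_road_s: float) -> Optional[LightBar]:
    """Map engagement and time with eyes off the road to a light-bar state.

    The 2 s and 4 s thresholds are invented for illustration; the
    production calibration is not public.
    """
    if not engaged:
        return None  # light bar off when the system is not active
    if eyes_off_road_s >= 4.0:
        return LightBar.FLASHING_RED
    if eyes_off_road_s >= 2.0:
        return LightBar.FLASHING_GREEN
    return LightBar.GREEN
```

What matters for the HMI argument is that the output is a single unambiguous signal in the driver's natural line of sight, driven directly by the gaze sensor.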
Occupant/driver monitoring, knowing precisely where occupants are and what they are doing, is rapidly shaping up as a critical HMI element. Driver monitoring can already help address vigilance and drowsiness problems in existing vehicles. And biometrics that measure the health and task-fitness (including sobriety) of the driver will be needed in a world where the driver isn't necessarily driving all the time.
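One established metric such drowsiness monitoring can use is PERCLOS: the percentage of time, over a rolling window, that the eyelids are mostly closed. The sketch below is a minimal illustration of the idea; the window size and alert threshold are assumptions, not any supplier's calibration.

```python
from collections import deque


class PerclosEstimator:
    """Rolling PERCLOS (percentage of eyelid closure) estimate.

    PERCLOS is a widely used drowsiness indicator: the fraction of
    frames in a rolling window where the eyes are more than ~80%
    closed. Window length and threshold here are illustrative.
    """

    def __init__(self, window_frames: int = 1800):  # e.g. 60 s at 30 fps
        self.samples = deque(maxlen=window_frames)

    def update(self, eyelid_closure: float) -> float:
        """Record one frame's closure ratio (0.0 open .. 1.0 shut)
        and return the current PERCLOS value."""
        self.samples.append(1.0 if eyelid_closure > 0.8 else 0.0)
        return sum(self.samples) / len(self.samples)

    def drowsy(self, threshold: float = 0.15) -> bool:
        """Flag drowsiness once PERCLOS crosses the (assumed) threshold."""
        if not self.samples:
            return False
        return sum(self.samples) / len(self.samples) >= threshold
```

A camera-based driver monitor would feed per-frame eyelid measurements into something like this and escalate to the HMI (lights, chimes) when the flag trips.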
Asaphus Vision, headquartered in Berlin, Germany, is a university spinoff with strong links to the research community; it develops face-identification and driver-monitoring software. Much development work is under way on more advanced biometrics, on sensing from other locations (in seats and head restraints, for example), and on forms of sensing that better capture occupants' positions and their mental and physical condition.
Another challenge is to better inform the driver. Osram's Godwin says forthcoming models will have "more vivid head-up displays with wider fields of view that will keep the driver informed and engaged." And he notes that interior lighting is being employed in new ways, such as signaling drivers when the auto-piloting vehicle needs attention or intervention. A lot of human-factors study is under way to determine which kinds and colors of light are most effective, where they should be placed, whether and how fast they should blink, and what audible signals should accompany them.
New kinds of controls are rapidly emerging, too. CGT, based in Ontario, Canada, introduced its Reveal material last year, developed in coordination with audio supplier Harman. The material integrates touch-sensitive electronic controls into interior panels without physical switches.
A VUI (voice user interface) opens the door to spoken occupant interaction, using speech recognition for voice commands. Amazon's Alexa is one of the leading technologies in this field, and GM recently signed a major deal to bring Alexa Automotive to a wide array of Chevrolet, Buick, GMC, and Cadillac models.
Arianna Walker is Amazon's "chief evangelist" for Alexa Automotive. At the WardsAuto UX conference, she predicted that voice assistants will play a role in the autonomous cars of the future. Alexa could aid in the handoff between autonomous and manual driving modes, for example, and Walker says, "You can imagine all sorts of other ways voice can help." Amazon has been working to get automakers and big Tier 1 suppliers to embed the Alexa voice assistant in vehicles. Ford did it with Sync App Link, which requires the Alexa app to work, and Faurecia did it for its latest Cockpit of the Future CES demonstrator.
But here again, it has to be simple, and must simply work—all the time, every time. Panelists at the conference all agreed that frictionless usability of a user interface, including a voice-command interface, strongly influences usage rate, and eventually take rates. Another point of agreement among the panelists: today’s first equipped models are not yet at that level.
Still, the technology is promising. VUI could prepare the car before you go: heating or cooling it to suit, charging it, or handling maintenance, much as public transport companies are trying to do autonomously with buses that are out of service. VUI also opens a gateway to connected services; Walker says innovations enabling requests like "Alexa, please close the garage door and add milk to the grocery list" are under development. Or if there are empty seats in the car, VUI could help you find people in the neighborhood looking for the same commute, along the lines of French carpooling service BlaBlaCar.
One of the biggest challenges for VUI systems is tailoring them to the individual operating systems of the various automakers.
All in all, it's increasingly clear that a simple, intuitive user experience, with seamless continuity to other life segments (home, office, grocery, etc.), is critical to the success of vehicle HMI. Buyers won't buy if they have to spend too long figuring out what a feature is and how it works; if it takes too long to respond, or responds wrongly (nobody likes to feel they're not being listened to); if it looks strange compared with what's expected or what's been experienced; and so on.
Success will help the auto industry create new revenue opportunities and relationships with vehicle buyers via connectivity and mobility. Those new revenues will be all the more necessary because the industry has no real idea yet how to cost and price these new features, while buyers, for their part, don't yet know what they'll be willing to pay. Stay tuned!