Although voice command systems have become more capable over the years, there is still plenty of room for improvement. The BMW iNext electric vehicle will feature an improved voice control system with expanded gesture control capabilities and gaze recognition technology. Dubbed BMW Natural Interaction, the system also allows drivers to point to a building and find out more information such as what time it closes.
Drivers can interact with the car via voice controls, gestures, or by looking in a certain direction, depending on their preference at the time. A driver talking to a passenger, for instance, might choose gesture or gaze controls instead of voice commands. Functions that the driver can control include opening and closing the windows or sunroof, adjusting the air vents, and changing a selection on the central screen. Drivers can also point to a button in the car and ask what it does.
Perhaps the most interesting part of the new technology is the driver’s ability to interact with his or her surroundings. Drivers can point at a building and ask what it is, what its opening hours are, and how customers have rated it. They can also point to an area and ask, “Can I park here and what does it cost?” Essentially, BMW wants drivers to interact with their cars more like they would interact with their passengers.
“BMW Natural Interaction is also an important step for the future of autonomous vehicles, when interior concepts will no longer be geared solely towards the driver’s position and occupants will have more freedom,” said Christoph Grote, senior vice president of BMW Group Electronics, in a statement. BMW is showing off the new technology at MWC Barcelona this week.
BMW introduced gesture controls on the 7 Series back in 2015, but they were pretty limited. Now, thanks to improved sensor technologies, BMW can capture hand and finger movements in three dimensions throughout the driver’s entire operating area and determine a directional vector, allowing drivers to point at the screen and issue a command. A high-definition camera integrated into the instrument cluster can detect head and eye direction. The information transmitted between the driver and vehicle is evaluated with the help of artificial intelligence.
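To give a sense of the underlying geometry, here is a minimal, hypothetical sketch (not BMW’s actual implementation) of how a pointing gesture can be turned into a directional vector: take 3-D wrist and fingertip positions from a gesture sensor, form a ray from wrist through fingertip, and intersect that ray with a flat plane standing in for the screen or a map surface. All function names and coordinates are illustrative assumptions.

```python
import math

def normalize(v):
    """Scale a vector to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def pointing_ray(wrist, fingertip):
    """Build a ray (origin, unit direction) from wrist through fingertip."""
    direction = tuple(t - w for w, t in zip(wrist, fingertip))
    return wrist, normalize(direction)

def intersect_plane(origin, direction, plane_point, plane_normal):
    """Return where the ray meets the plane, or None if it points away or is parallel."""
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the plane
    t = sum((p - o) * n for o, p, n in zip(origin, plane_point, plane_normal)) / denom
    if t < 0:
        return None  # plane is behind the hand
    return tuple(o + t * d for o, d in zip(origin, direction))

# Example: driver's hand points toward a display lying in the plane x = 1.0
origin, direction = pointing_ray(wrist=(0.0, 0.0, 0.0), fingertip=(0.5, 0.1, 0.0))
hit = intersect_plane(origin, direction,
                      plane_point=(1.0, 0.0, 0.0),
                      plane_normal=(1.0, 0.0, 0.0))
# hit is the 3-D point on the display the driver is pointing at
```

In a real system the same ray would presumably be combined with the vehicle’s GPS position and a map database to resolve which building or parking area lies along it.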