When Your Car Becomes Your Travel Guide

The car of the future will understand us: it will respond to our language and gestures. If there are problems, we’ll get the right tip at the right time. The experts at Volkswagen Electronics Development are working hard to make this goal a reality. Part 3 of our series on user experience trends.

All she has left to do is pop her small bag in the trunk and presto, everything’s ready for her weekend trip to visit her best friend in Berlin. Stephanie watches as the tailgate closes. She gets in the car and leans back. Her seat automatically shifts into a relaxed travel position. She points to the display with her index finger and swipes right until her favorite series appears. “Start,” says Stephanie – and the entertainment is set. “Your trip today will take about three hours. Should we depart?” asks the personal assistant. “Yes, please,” replies Stephanie.

Developer Astrid Kassner is working on enabling future vehicles to understand our language and gestures.

In the future, automated driving and connectivity will fundamentally change our entertainment options on weekend excursions. But that’s not all: in Volkswagen’s Electronics Development department, experts are working on enabling our vehicles to become personal assistants who, with the help of artificial intelligence, understand our needs. Stephanie, who is visiting her friend in Berlin, is a fictitious test person created to help developers with their work. In just a few years though, the new functions will be available to real-life customers as well.

Astrid Kassner is one of the development team’s experts for voice and gesture control. Their goal is to ensure that future vehicles understand – and carry out – wishes expressed with just a small gesture of the finger. This will be particularly important, says Kassner, when the person in a self-driving car no longer has to be responsible for steering. At that point, “we will lean back and no longer be able to reach the cockpit with our arms. So we are developing voice commands and gestures to complement touch operation via the display.”

An infrared camera captures the gestures

Stefan Henze is an expert in giving good advice at the right moment. He explains: “We don’t want to pester the driver with constant tips.”

Even today, the developer can effortlessly control the interior lighting of her cockpit model, for example, with her right hand and a few words. With swiping motions, she navigates between streaming portals and messaging options without touching the display. Tests with volunteer subjects have yielded encouraging results. “Many need just a few minutes to get used to the operating concept,” says Astrid Kassner. “That certainly has something to do with the fact that we are accustomed to using similar gestures with our smartphones.”
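For illustration, the sketch below shows how recognized swipe gestures and a voice keyword could be dispatched to cockpit functions such as media navigation and interior lighting. The gesture labels, function names and actions are assumptions made for this example, not Volkswagen’s actual interface.

```python
# Illustrative sketch only: gesture labels and actions are assumptions,
# not Volkswagen's actual operating concept.
from typing import Callable, Dict

def next_streaming_portal() -> None:
    print("Switching to the next streaming portal")

def previous_streaming_portal() -> None:
    print("Switching to the previous streaming portal")

def open_messages() -> None:
    print("Opening messaging options")

def set_ambient_light(setting: str) -> None:
    print(f"Setting interior lighting to {setting}")

# Map recognized gesture labels to cockpit actions (labels are hypothetical).
GESTURE_ACTIONS: Dict[str, Callable[[], None]] = {
    "swipe_right": next_streaming_portal,
    "swipe_left": previous_streaming_portal,
    "swipe_up": open_messages,
}

def handle_gesture(label: str) -> None:
    action = GESTURE_ACTIONS.get(label)
    if action is not None:
        action()

# Example: the recognizer reports a swipe to the right, then a voice command
# such as "warm light" adjusts the ambient lighting.
handle_gesture("swipe_right")
set_ambient_light("warm white")
```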

In technical terms, gesture control utilizes an infrared camera that captures the passenger’s hand motions. “The camera is continuously measuring how long the invisible infrared rays take to travel to the person’s hand and back again. With that information, it is possible to determine the position and the motion of the hand,” explains Kassner.
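The underlying arithmetic is simple: the distance to the hand is half the measured round-trip time multiplied by the speed of light. A minimal sketch with made-up numbers:

```python
# Minimal sketch of the time-of-flight principle with made-up numbers:
# the camera measures the round-trip time of the infrared light per pixel,
# and the distance to the hand is speed_of_light * time / 2.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Convert a measured round-trip time into a one-way distance in metres."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A hand held about half a metre from the camera returns the light after
# roughly 3.34 nanoseconds.
print(f"{distance_from_round_trip(3.34e-9):.3f} m")  # prints about 0.501 m

# Repeating this for every pixel yields a depth image; the closest coherent
# region in front of the display is treated as the hand, and its movement
# across successive frames is interpreted as a gesture such as a swipe.
```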

The cars of tomorrow must not only understand our speech and gestures, but also give us the right tips at the right moment. That’s what developer Stefan Henze is working on. When the car encounters heavy traffic, for example, it could recommend that the driver use Adaptive Cruise Control (ACC), which regulates the distance to the vehicle in front. “What makes it special is that the vehicle offers ACC precisely at the moment when the driver could actually make use of the function,” says Stefan Henze. If needed, the car can explain exactly what ACC means. If the customer is interested, they can then activate the function with a voice command.
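One way to picture that offer-explain-activate flow is the hedged sketch below; the function names and dialogue strings are illustrative assumptions, not the actual assistant.

```python
# Hypothetical sketch of the offer-explain-activate flow described above.
# Function names and dialogue strings are assumptions, not the real assistant.

def say(text: str) -> None:
    print(f"Car: {text}")

def listen() -> str:
    # Stand-in for the speech recognizer; here we simply simulate a reply.
    return "yes"

def activate_acc() -> None:
    print("Car: Adaptive Cruise Control is now active.")

def offer_acc(heavy_traffic: bool, acc_already_active: bool) -> None:
    """Offer ACC exactly when the driver could actually make use of it."""
    if not heavy_traffic or acc_already_active:
        return
    say("Heavy traffic ahead. Would you like to use Adaptive Cruise Control?")
    reply = listen()
    if reply == "what is that":
        say("ACC automatically keeps a set distance to the vehicle in front.")
        reply = listen()
    if reply == "yes":
        activate_acc()

offer_acc(heavy_traffic=True, acc_already_active=False)
```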

The driver is in the driver’s seat

A point of the finger suffices – if desired, the car of the future will talk about sights along the route. Developer Astrid Kassner demonstrates.

The biggest challenge with such recommendations is finding the right dose, says Stefan Henze. “We don’t want to pester the driver with constant tips.” To provide the right tip at the right moment, artificial intelligence in future vehicles will evaluate vehicle data such as speed and GPS position, explains the developer. “The recommendation to use ACC will only be issued when an extended standstill is imminent.” Another benefit is flexibility: in the vehicle of tomorrow, the driver will be able to choose between buying the function outright and using it only for a limited time. Henze: “Let’s assume I only drive a long distance once a year while on vacation. Then I would subscribe to the traffic jam assistant just for that specific time period.”
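As a rough illustration, the trigger described here can be thought of as a small rule over recent vehicle data: issue the tip only when speed is dropping, the traffic data suggest a standstill is imminent, and no tip has been given yet. All thresholds, window sizes and field names in the sketch are assumptions.

```python
# Illustrative only: thresholds, window size and field names are assumptions.
# The idea from the text: evaluate speed and route traffic data, and issue
# the ACC tip only when an extended standstill is imminent, and only once.

from dataclasses import dataclass, field
from statistics import mean
from typing import List

@dataclass
class TipGate:
    speed_history_kmh: List[float] = field(default_factory=list)
    already_recommended: bool = False  # the "dose": at most one tip per trip

    def update(self, speed_kmh: float, jam_ahead_km: float) -> bool:
        """Return True if the ACC recommendation should be issued right now."""
        self.speed_history_kmh.append(speed_kmh)
        recent = self.speed_history_kmh[-3:]  # last few samples (assumed window)

        slowing_down = mean(recent) < 30.0        # assumed threshold in km/h
        standstill_imminent = jam_ahead_km < 2.0  # from GPS and traffic data

        if slowing_down and standstill_imminent and not self.already_recommended:
            self.already_recommended = True
            return True
        return False

gate = TipGate()
for speed in [80, 60, 40, 25, 20, 15]:
    if gate.update(speed_kmh=speed, jam_ahead_km=1.5):
        print("Tip: Adaptive Cruise Control could take over in this traffic jam.")
```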

Once at the destination, the car of the future could become an impromptu tour guide – above all when it has taken over the driving, leaving the occupants to observe the surroundings in peace. To develop such functions, expert Astrid Kassner hung a picture of Berlin on the wall. She takes a seat in her cockpit model, points to the picture and asks: “What’s that?” The car’s reply: “You are looking at the French Cathedral. It was built more than 200 years ago and is a popular tourist attraction. Should I stop so that you can take a closer look?” In the future, customers like Stephanie will be able to simply say “yes.”
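Purely as an assumption-laden sketch, such a “What’s that?” query could be resolved by combining the car’s GPS position with the direction of the pointing gesture and matching it against nearby points of interest; the sample data and matching rule below are illustrative only.

```python
# Assumption-laden sketch: resolve "What's that?" by combining the car's GPS
# position with the pointing direction and matching against nearby points of
# interest. The POI list, coordinates and matching rule are illustrative only.

import math
from typing import List, Optional, Tuple

# (name, description, latitude, longitude): sample data, not a real database.
POIS: List[Tuple[str, str, float, float]] = [
    ("French Cathedral",
     "It was built more than 200 years ago and is a popular tourist attraction.",
     52.5139, 13.3926),
    ("Brandenburg Gate", "Berlin's best-known landmark.", 52.5163, 13.3777),
]

def bearing_deg(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Approximate compass bearing from point 1 to point 2, in degrees."""
    d_lon = math.radians(lon2 - lon1)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    y = math.sin(d_lon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(d_lon)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def what_is_that(car_lat: float, car_lon: float, pointing_bearing: float,
                 tolerance_deg: float = 20.0) -> Optional[str]:
    """Return a description of the POI closest to the pointing direction."""
    best, best_diff = None, tolerance_deg
    for name, description, lat, lon in POIS:
        diff = abs((bearing_deg(car_lat, car_lon, lat, lon)
                    - pointing_bearing + 180.0) % 360.0 - 180.0)
        if diff < best_diff:
            best, best_diff = f"You are looking at the {name}. {description}", diff
    return best

# The passenger points roughly east from a position near Gendarmenmarkt.
print(what_is_that(52.5136, 13.3903, pointing_bearing=75.0))
```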