By 2022, almost all new cars are expected to come with a voice recognition system, and these systems will evolve to interpret the driver’s tone of voice and facial expressions. SYNC 3, Ford’s infotainment and connectivity system, already includes voice control for managing in-car devices and selecting smartphone apps, and supports the popular Apple CarPlay and Android Auto communication standards; soon it will also leverage the potential of Alexa, Amazon’s virtual assistant. Ford is working with RWTH Aachen University on a project to develop speech recognition technology that, among other things, uses multiple microphones to improve voice processing, reduce the effect of external noise and limit interference.
Today we drive cars capable of understanding what we say; in the future, cars will evolve to understand our state of mind without our having to say a word.
By 2022, Ford expects 90% of all new cars to be equipped with a voice recognition system*. The next step for the cars of the future might be to understand, from changes in facial expression and inflections of the voice, how to help drivers make the most of the time they spend behind the wheel.
The most advanced systems, with microphones and cameras integrated inside the car, could learn which songs ease the driver’s stress and recognize when silence is preferred instead. Even the interior lighting could be adapted to the driver’s mood.
“We are on track to develop an ‘empathetic’ car, one that could tell the driver a joke to lift the mood, offer advice when needed, remember birthdays and help the driver stay awake during a long journey,” said Fatima Vital, Automotive Marketing Director, Nuance Communications, which helped develop the voice recognition for Ford’s SYNC connectivity system.
Cloud-based voice control technology is expected to be available in 75% of new cars by 2022, and future systems are expected to evolve into personal assistants, able to reschedule appointments or order takeaway meals while drivers are stuck in traffic.
Film fans will remember Samantha, the character voiced by Scarlett Johansson in the movie “Her”, who responded to Theodore Twombly’s requests through a voice recognition system and could understand his mood and feelings from his tone of voice alone. Perhaps, one day not too far away, even our cars might be able to understand us.
Starting next summer, SYNC 3, Ford’s infotainment and connectivity system, will enable connection to Alexa, Amazon’s virtual assistant, in 23 different languages and with support for many local accents. Through access to cloud resources, the cars of the future will allow drivers to speak in their native language, a fitting tribute to those celebrating International Mother Language Day today**.
“The ability to recognize natural forms of language, such as the voice commands ‘I am hungry’ or ‘I need a coffee’, has already taken SYNC 3 into the territory of personal assistants,” said Mareike Sauer, Voice Control Engineer, Connectivity Application Team, Ford of Europe. “The next step is to enable drivers not only to speak in their native language, with their own accent, but also to use their own vocabulary, for even more natural interaction.”
The Ford SYNC 3 system supports the most common communication standards: Apple CarPlay™, which lets drivers use the iPhone interface on the car’s touchscreen and access Siri Eyes-Free voice commands, and Android Auto™, which gives simplified access to Google Maps on the car’s screen and lets music, phone calls and messages be handled directly by voice command***.
Ford is currently collaborating with RWTH Aachen University on a project to develop speech recognition technology that, among other things, uses multiple microphones for better voice processing, reducing the effect of external noise and limiting interruptions. Nuance adds that, within the next two years, voice control systems will surprise us with phrases like “Would you like to order some flowers for Mother’s Day?”, “Shall I choose a route home with less traffic, even if it is longer?” or “You have finished your favorite chocolate; there is a shop nearby where you can buy more. Shall we go there?”
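The multi-microphone approach mentioned above is described only at a high level, but the underlying idea can be illustrated with a short sketch. The Python/NumPy snippet below is not Ford’s or Nuance’s implementation; it is a minimal delay-and-sum example, with a synthetic stand-in for the driver’s voice and hypothetical per-microphone delays, showing how averaging time-aligned channels reinforces the speech while uncorrelated cabin noise partially cancels out.

import numpy as np

def delay_and_sum(signals, delays_samples):
    # signals: array of shape (n_mics, n_samples), one row per microphone.
    # delays_samples: per-microphone delay, in samples, of the driver's voice
    # (assumed known here; a real system would have to estimate it).
    n_mics, n_samples = signals.shape
    aligned = np.zeros((n_mics, n_samples))
    for m, d in enumerate(delays_samples):
        if d > 0:
            aligned[m, :n_samples - d] = signals[m, d:]   # shift back to undo the delay
        elif d < 0:
            aligned[m, -d:] = signals[m, :n_samples + d]
        else:
            aligned[m] = signals[m]
    # Averaging reinforces the speech, which is coherent across microphones,
    # while uncorrelated cabin noise partially cancels out.
    return aligned.mean(axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 16000)
    speech = np.sin(2 * np.pi * 220 * t)          # synthetic stand-in for the driver's voice
    delays = [0, 3, 5, 8]                          # hypothetical arrival delays per microphone
    mics = np.stack([np.roll(speech, d) + 0.5 * rng.standard_normal(t.size) for d in delays])
    enhanced = delay_and_sum(mics, delays)
    print("noise power, single mic:", np.var(mics[0] - speech))
    print("noise power, combined:  ", np.var(enhanced - speech))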
In the future, the ability to analyze and recognize facial gestures and eye movements could allow drivers to answer calls with a nod of the head, adjust the volume with almost imperceptible gestures, and set a destination with a quick glance at the map.
Will there be a danger, as in the film “Her”, of developing a crush on our advanced speech recognition systems?
“Many people already have an emotional relationship with their car, but with new systems based on advanced speech recognition that will be able to ‘feel’, we expect these relationships to become even more intense,” commented Dominic Watt, Senior Lecturer, Department of Language and Linguistic Science, University of York. “The car will soon become our personal assistant and a pleasant traveling companion we can chat with and ask anything, until one day we forget we are talking to a car!”
At the Mobile World Congress, to be held in Barcelona next week, Ford will reveal its latest news on mobility and connectivity.