Tactile Mobility adds missing sense of touch to autonomous vehicles

"Where other companies enable smart and autonomous vehicles to see the road, we enable those vehicles to feel the road," Tactile Mobility CEO Amit Nisenbaum said.

Tactile Mobility CEO Amit Nisenbaum (photo credit: ASAF HABER)
If the world’s automotive giants have their way, autonomous vehicles will gradually roll out onto the world’s streets by the beginning of next decade.
A combination of sophisticated cameras, radar and lidar (light detection and ranging) sensors will enable smart and self-driving vehicles of varying autonomy to “watch” the road ahead and deliver their passengers safely to their destinations.
Of course, soon-to-be-outdated human drivers do not only look for hazards ahead; they also feel the dynamics of every bump in the road with their bodies and incorporate that feedback, often subconsciously, into their decision-making.
Founded in 2012 by serial entrepreneur Boaz Mizrachi, Haifa-based start-up Tactile Mobility aims to add that missing human-like sense of touch to smart and autonomous vehicles, and provide data to make vehicles smarter and safer.
“Where other companies enable smart and autonomous vehicles to see the road, we enable those vehicles to feel the road,” Tactile Mobility CEO Amit Nisenbaum told The Jerusalem Post.
“We do it with software only and based on data generated by multiple, non-visual existing sensors,” said Nisenbaum, who splits his time between Haifa and the company’s Silicon Valley office.
Using those signals, the software module runs algorithms and artificial intelligence to generate real-time insights into the physical factors that affect any ride, augmenting existing sensory data.
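Tactile Mobility has not published its algorithms, but the general idea of inferring grip from non-visual signals can be illustrated. The sketch below is a hypothetical, simplified example: it estimates a tire's longitudinal slip ratio from wheel-speed and vehicle-speed signals and derives a rough friction coefficient from measured acceleration. The signal names and thresholds are illustrative assumptions, not the company's actual method.

```python
# Hypothetical sketch: slip-based grip estimation from non-visual signals.
# This does NOT reflect Tactile Mobility's proprietary algorithms; names
# and thresholds are illustrative assumptions only.

from dataclasses import dataclass

G = 9.81  # gravitational acceleration, m/s^2

@dataclass
class SensorFrame:
    wheel_speed: float    # driven-wheel linear speed, m/s
    vehicle_speed: float  # vehicle ground speed, m/s
    accel_long: float     # measured longitudinal acceleration, m/s^2

def slip_ratio(frame: SensorFrame) -> float:
    """Longitudinal slip: how much faster the tire spins than the car moves."""
    if frame.vehicle_speed < 1.0:  # avoid division blow-up near standstill
        return 0.0
    return (frame.wheel_speed - frame.vehicle_speed) / frame.vehicle_speed

def estimate_grip(frame: SensorFrame) -> float:
    """Rough friction-coefficient estimate, mu ~ a / g, valid only while the
    tire is actually transmitting force (non-trivial slip)."""
    if abs(slip_ratio(frame)) < 0.02:  # too little slip to say anything
        return float("nan")
    return abs(frame.accel_long) / G

frame = SensorFrame(wheel_speed=21.0, vehicle_speed=20.0, accel_long=2.5)
print(f"slip = {slip_ratio(frame):.3f}, estimated mu ~ {estimate_grip(frame):.2f}")
```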
The company’s mapping cloud module applies big-data and machine-learning technologies to data gathered from multiple vehicles; the results can then be downloaded back to vehicles’ computers for improved performance and used for analysis by third parties, including road authorities and insurers.
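In the same spirit, the crowdsourced mapping side can be pictured as aggregating many vehicles' grip reports per road segment. The snippet below is a minimal illustration with invented data and segment IDs, not the company's actual cloud pipeline.

```python
# Minimal illustration: aggregate crowdsourced grip estimates per road
# segment (invented data and segment IDs; not the actual cloud pipeline).
from collections import defaultdict
from statistics import median

reports = [  # (road_segment_id, estimated_mu) from many vehicles
    ("A2-km17", 0.81), ("A2-km17", 0.78), ("A2-km17", 0.45),  # one outlier
    ("A2-km18", 0.30), ("A2-km18", 0.28),                     # wet/icy patch
]

by_segment = defaultdict(list)
for segment, mu in reports:
    by_segment[segment].append(mu)

# Median resists single-vehicle outliers; a map like this is what would be
# pushed back to vehicles and shared with road authorities or insurers.
grip_map = {seg: round(median(vals), 2) for seg, vals in by_segment.items()}
print(grip_map)  # {'A2-km17': 0.78, 'A2-km18': 0.29}
```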
“There is always a compromise between safety and user experience in autonomous vehicles. They rely on cameras and lidar sensors, know the vehicle speed, the speed of the vehicle in front, its distance from your vehicle, and calculate the safe distance of the vehicle given relative velocities, but they don’t know the road grip level,” said Nisenbaum.
“They need to assume that there is a low grip level in order to be safe. But that’s sub-par from the user experience, because a user is used to driving in a certain way and at a certain speed.”
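Nisenbaum's trade-off can be made concrete with basic braking physics. Assuming a simple point-mass model where stopping distance is v²/(2μg), the sketch below compares the worst-case following gap a planner must keep on dry asphalt (μ ≈ 0.8) against the gap it must keep when grip is unknown and assumed low (μ ≈ 0.3). The numbers and the model are illustrative, not Tactile Mobility's planner.

```python
# Illustrative braking physics (not Tactile Mobility's planner): the gap a
# vehicle must keep grows sharply when it has to assume low road grip.
G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance(v: float, mu: float) -> float:
    """Point-mass braking distance d = v^2 / (2 * mu * g), in meters."""
    return v**2 / (2 * mu * G)

def safe_gap(v: float, mu: float, t_react: float = 0.5) -> float:
    """Worst-case gap: reaction-time travel plus full stopping distance."""
    return v * t_react + stopping_distance(v, mu)

v = 27.8  # ~100 km/h in m/s
for mu, label in ((0.8, "dry asphalt, grip known"),
                  (0.3, "grip unknown, assumed low")):
    print(f"{label}: keep >= {safe_gap(v, mu):.0f} m")
# dry asphalt, grip known: keep >= 63 m
# grip unknown, assumed low: keep >= 145 m
```

Knowing the actual grip level lets the planner use the shorter, more natural-feeling gap without sacrificing safety, which is the compromise Nisenbaum describes.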
Tactile Mobility is currently working to implement its technology with six leading car manufacturers in Europe and North America, including American automobile giant Ford. While the company’s software can be embedded in existing vehicles, it hopes to see its technology included in mass-market production as soon as the 2020 and 2021 model years.
“The more computerized vehicles become, and not just fully autonomous vehicles, the more their computers need an additional sense of tactility as well,” said Nisenbaum.