
Last summer, Ford acquired a small company in Saline, Mich., that specializes in high-end simulation work for robotics. Given that Ford is developing highly automated vehicles (HAVs) that it hopes to start deploying sometime next year, the addition of Quantum Signal AI (QSAI) was timely. At the time, QSAI had about 40 employees working with CEO Mitchell Rohde. Over the past nine months, Ford has found so many projects for Rohde’s team that it is now trying to double its staff.
In the development of automated driving (AD), actually making the vehicle understand and navigate its environment is only one piece of the puzzle. Back in late 2017, when I first took a ride in one of the GM Cruise development vehicles in San Francisco, it managed to maneuver safely without issue. But the quality of the ride left something to be desired, with jerky motions as it braked too hard and steering inputs that were a bit too aggressive. It felt like riding shotgun with my kids when they were learning to drive. Even this past January, when I rode in one of Yandex’s HAVs in Las Vegas, the ride was smoother but still more aggressive than expected.
The challenge for AD developers is to calibrate these systems to meet or exceed the expectations of the customers who will be paying to ride in these vehicles. The quality of the ride needs to be there to instill confidence that the system knows what it is doing. It also needs to avoid making riders feel ill, a real risk when you take control away from them.
“You're inside the vehicle and you're riding an AV and you're no longer actually the driver. You might be in the backseat experiencing this, so your visual may or may not match what you're feeling. And certainly more than that is what the self-driving system will do in terms of stopping or starting or braking or handling or whatever. The algorithms may not emulate exactly how a human would drive, and that may create some sort of cognitive dissonance in terms of how people feel, or what they expect the car should feel like when it's being driven, versus what the algorithms are doing,” said Rohde.
That cognitive dissonance may be further exacerbated depending on the vehicle configuration. We don’t yet know what Ford’s HAVs will look like, but many of the others like the Cruise Origin and Zoox’s upcoming vehicle will feature carriage-style seating with seats facing both forward and back. Other concepts that have been shown feature darkened windows or have display screens inside to simulate different environments.
That disconnect between what the body feels versus what the eyes and brain perceive is what can lead to motion sickness or just general discomfort. Among the key areas of research for the QSAI team is ensuring that the HAV is a comfortable and welcoming environment for riders.
Rohde explained that the QSAI team is working closely with Ford’s ride and handling team, which has spent decades quantifying what feels good, and not so great, in a vehicle. QSAI is using its simulation platform to evaluate the nature of the HAV’s control in differing environments, and how customers perceive it, based on the metrics developed by the Ford development team.
Earlier this year at CES, I got to take a ride in Audi’s AI:ME HAV concept. During a portion of the ride, we put on virtual reality headsets that simulated entirely different environments from the parking lot we were navigating. In the past I’ve had genuine discomfort when using VR systems. However, in the Audi, the movement in the VR display was matched directly to the motion of the vehicle. As the vehicle made a turn or accelerated, our motion in the virtual environment followed. This helped to mitigate that cognitive dissonance and made for a more pleasant experience.
Rohde couldn’t discuss specifics of the work QSAI is doing, but creating a better understanding of the sort of environment riders will need in HAVs is at the heart of it. This includes the nature of the vehicle control, how people interact with the vehicle both inside and out, and the sort of feedback that is provided while riding.
Most HAVs today feature some sort of display that indicates what the HAV’s sensors are “seeing” and its planned path down the road. This is meant to reassure riders that the vehicle is detecting what their own eyes are seeing, and to build that trust. The challenge is to provide enough information to give that reassurance without overwhelming the rider.
Finding the right people to tackle these challenges is a huge challenge in itself.
“There's a never-ending need for talent. I mean, it is a bottomless thirst that all technology companies have, and we're no exception. So we're looking right now at doubling our staff, and we've been interviewing furiously and picking up some good people here and there,” added Rohde. “We're looking forward to continuing. We're gonna be interviewing all year long and picking up people, probably for the next year or so. We're loving it. We're seeing a lot of good responses, a lot of good resumes, but you know, there's never enough, because we've always got more and more challenges. The more people we get, the more projects we have, and the more projects we have, the more people we need.”
“It's certainly a different time than it was back in the day, when a lot of engineers were being laid off or there were problems finding jobs. Nowadays it seems like the sky’s the limit in terms of interviewing folks and getting the right kind of folks.”
With the challenges of living in Silicon Valley getting worse all the time, Michigan suddenly doesn’t seem like such a bad destination anymore. There are certainly some great opportunities for developers and engineers who want to help make the mobility transformation a reality.
https://www.forbes.com/sites/samabuelsamid/2020/03/05/michigans-quantum-signal-on-the-hunt-for-more-engineers/
2020-03-05T16:50:00+00:00