The automobile is at the center of just about every advanced technology currently in development, from artificial intelligence to next-generation wide-bandgap semiconductors. Bringing these disparate yet complementary functionalities together in one solution is the hard part. Powerful software simulation is one way to address this demanding evaluation task.
NI recently acquired monoDrive, a leader in high-fidelity simulation software for advanced driver-assistance systems (ADAS) and autonomous vehicle development. NI will leverage monoDrive’s signal processing and advanced simulation technology to create high-fidelity driving environments that model multiple sensors across thousands of real-time scenarios. Combined with NI’s software-connected systems, the technology will better integrate simulation, lab-based, and physical testing environments.
NI also announced a strategic collaboration with Ansys, whose simulation solution enables sensor vendors to simulate radar, LiDAR, and cameras, recreating real-world conditions to validate sensors and inject data into software and hardware under test in real time.
To get a better insight into these developments and what they can mean for the industry, Machine Design's sister brand, Evaluation Engineering, recently spoke with Noah Redding, senior director, Solutions and Offering Management at NI. What follows are some highlights of the conversation; the full interview can be found here.
On the interrelation of the cloud, IoT, edge computing and AI in autonomous vehicle development:
You’ve got different parts of the vehicle that have to talk to each other and make new types of decisions that the vehicle has never made before. It creates this challenge where there’s essentially an infinite number of scenarios and situations that automotive companies are having to figure out. How do we test and validate that the vehicle is going to be able to process this type of situation?
For example, a car is driving on the freeway, and there’s a situation where there’s a dog, and then there’s a car, and then there’s construction. What does the car decide? There’s this infinite state space of scenarios, so you have to bring these different areas of expertise together. In this ADAS and autonomy space, what we see is a real workflow challenge that people are working through.
It’s not just the different disciplines, like you mentioned. If you’re going to produce a high-quality vehicle or component, a lot of companies are trying to figure out, “How do I actually test the right amount, from a pure simulation environment, to lab-based testing, on to road testing?” Finding the right mix is a real challenge. So, what we’re doing is working with our customers to unify that workflow and make moving between those different phases as seamless and easy as possible.
On tackling this challenge:
We’ve partnered with monoDrive for several years now. What they provide is really powerful, high-fidelity virtual environments capable of modeling dozens of sensors in real time, which can be used for cloud-based simulation as well as hardware-in-the-loop or lab-based tests. Their capabilities, combined with our platform connecting physical I/O interfaces through a real-time operating system, allow our customers to do high-fidelity simulation and modeling and actually connect to the device itself.
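To make the hardware-in-the-loop arrangement Redding describes a bit more concrete, here is a minimal closed-loop sketch: a simulated environment generates sensor frames, those frames are injected into the device under test, and the device’s control outputs are fed back into the simulation on a fixed real-time cadence. Every name below (SensorSimulator, DeviceUnderTest, run_hil) is a hypothetical placeholder, not a monoDrive or NI API.

```python
# Minimal sketch of a hardware-in-the-loop (HIL) cycle: a simulated driving
# environment renders sensor data each timestep, the frames are injected into
# the device under test (DUT), and the DUT's control outputs are fed back
# into the simulation. All names are hypothetical placeholders, not actual
# monoDrive or NI APIs.
import time

TIMESTEP_S = 0.02  # 50 Hz loop rate, a common cadence for real-time HIL rigs

class SensorSimulator:
    """Stands in for a high-fidelity virtual environment (camera/radar/LiDAR)."""
    def __init__(self, scenario):
        self.scenario = scenario
        self.t = 0.0

    def step(self, vehicle_command):
        """Advance the virtual world by one timestep and return sensor frames."""
        self.t += TIMESTEP_S
        # A real tool would render camera images, radar returns, point clouds, etc.
        return {"camera": f"frame@{self.t:.2f}", "radar": [], "lidar": []}

class DeviceUnderTest:
    """Stands in for the ADAS controller receiving injected sensor data."""
    def process(self, frames):
        # Real hardware would receive frames over physical I/O and return actuation.
        return {"steering": 0.0, "throttle": 0.1, "brake": 0.0}

def run_hil(scenario, duration_s=10.0):
    sim = SensorSimulator(scenario)
    dut = DeviceUnderTest()
    command = {"steering": 0.0, "throttle": 0.0, "brake": 0.0}
    for _ in range(int(duration_s / TIMESTEP_S)):
        start = time.perf_counter()
        frames = sim.step(command)     # virtual world -> sensor frames
        command = dut.process(frames)  # DUT decides -> actuation commands
        # Sleep off the remainder of the timestep to mimic real-time pacing.
        time.sleep(max(0.0, TIMESTEP_S - (time.perf_counter() - start)))
    return command

if __name__ == "__main__":
    print(run_hil(scenario="freeway_cut_in"))
```

The same loop structure applies whether the "device" is a software model (pure simulation) or real silicon wired into physical I/O, which is the workflow continuity the interview describes.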
On the fundamental problem of having infinite scenarios to test:
The challenge is that when you’re on a test track, you’ve got a few known scenarios, but it’s basically impossible to cover the full gamut of driving situations and capabilities. How do you simulate all of the different traffic scenarios and all of the different downtown-driving scenarios with pedestrians and crosswalks and things like that? You can do a lot of development, test, and validation, and create some scenarios in those types of controlled environments, whether that’s indoors with a full vehicle or out on a track.
But that’s only part of the problem, because you can only recreate a subset of real-world scenarios. Inside this test track, I’m going to do these 100 scenarios, when in reality, once the car is on the road, you’re going to have millions of different situations it could come across. You could be really good in those 100 scenarios, but how do you get confidence in the hundreds of thousands of others?
That’s where we’re going to go out and drive on the road, record a bunch of data, and then re-create some scenarios virtually; or we’re going to do a pure software-based simulation in the cloud, where we know we can run through millions of scenarios and just keep the simulations running constantly. It’s that balance of simulation versus real, and the utilization of assets, that companies are working through right now.
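As a rough illustration of the cloud-scale side of that balance, the sketch below treats a scenario as a small bundle of parameters, samples a large batch of them, and runs a toy software-only simulation over the batch in parallel. The scenario fields and the pass/fail check are illustrative assumptions, not any vendor’s scenario schema.

```python
# Minimal sketch of "millions of scenarios in the cloud": sample parameterized
# scenarios and evaluate a software-only model over the batch in parallel.
# The parameters and pass/fail check are illustrative assumptions only.
import random
from concurrent.futures import ProcessPoolExecutor
from dataclasses import dataclass

@dataclass(frozen=True)
class Scenario:
    ego_speed_mps: float      # ego vehicle speed
    pedestrian_gap_m: float   # distance at which a pedestrian enters the road
    friction: float           # road surface friction coefficient

def sample_scenario(rng: random.Random) -> Scenario:
    return Scenario(
        ego_speed_mps=rng.uniform(5.0, 35.0),
        pedestrian_gap_m=rng.uniform(5.0, 60.0),
        friction=rng.uniform(0.3, 1.0),
    )

def simulate(scenario: Scenario) -> bool:
    """Toy stand-in for a full simulation: does the ego vehicle stop in time?"""
    # Braking distance v^2 / (2 * mu * g), plus a fixed 0.5 s reaction delay.
    braking_distance = scenario.ego_speed_mps ** 2 / (2 * scenario.friction * 9.81)
    reaction_distance = 0.5 * scenario.ego_speed_mps
    return braking_distance + reaction_distance < scenario.pedestrian_gap_m

def run_batch(n: int, seed: int = 0) -> float:
    rng = random.Random(seed)
    scenarios = [sample_scenario(rng) for _ in range(n)]
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(simulate, scenarios))
    return sum(results) / n  # fraction of scenarios passed

if __name__ == "__main__":
    print(f"pass rate over 10,000 sampled scenarios: {run_batch(10_000):.1%}")
```

Scaling this pattern out across cloud compute, and replaying recorded road data as additional scenarios, is the trade-off between simulated and real-world coverage that the interview highlights.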