MSC Software Corporation (MSC) has released Adams-ready VTD, a software package that combines vehicle dynamics and virtual test drive simulation to speed up the development of next-generation advanced driver-assistance systems (ADAS) and safe autonomous vehicles.
Passenger vehicles can already read traffic signs or detect passing traffic, but these Level 2+ ADAS functions depend on improved sensor fusion: the process of merging data from multiple sensors with near certainty so that electronic systems can make safe decisions. At the same time, future autonomous driving algorithms require realistic test data for research and model training. Adams-ready VTD accelerates development on both fronts by simulating how a moving vehicle and its sensors behave in complex road environments.
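To make the idea concrete, a common textbook approach to sensor fusion is inverse-variance weighting: estimates from several sensors are combined so that the more precise sensor carries more weight, and the fused estimate is more certain than either input. The sketch below is illustrative only; the sensor names and noise figures are assumptions, not part of the VTD product.

```python
# Minimal inverse-variance weighted fusion of two hypothetical distance
# estimates (camera and radar) for the same tracked object.

def fuse_estimates(measurements):
    """Fuse (value, variance) pairs into a single lower-variance estimate."""
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    return value, 1.0 / total  # fused value and fused variance

camera = (25.4, 4.0)   # metres; higher variance (camera less precise at range)
radar = (24.8, 0.25)   # metres; lower variance (radar more precise)

fused, var = fuse_estimates([camera, radar])
```

The fused distance lands between the two readings, pulled towards the more precise radar value, and its variance is smaller than either sensor's alone, which is precisely why fusing sensors supports safer decisions than trusting any single one.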
Using Adams simulation software, automotive manufacturers have validated vehicle dynamics models with road tests to understand a vehicle’s movements and handling. By means of an open interface, it is now possible to ‘drive’ these vehicles in a simulated road environment provided by the Virtual Test Drive (VTD) platform.
The platform simulates a vehicle's movements based on road conditions (such as slope or friction) to determine vehicle behaviour, such as skidding or rolling, and then evaluates the best course of action, such as whether to change lane or how much braking force to apply.
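The friction part of that evaluation reduces to a well-known back-of-envelope check: the road surface limits achievable deceleration to roughly μg, and the minimum stopping distance on a flat road is v²/(2μg). The sketch below uses typical textbook friction coefficients, not VTD parameters, purely to show why surface condition changes the "best course of action".

```python
# Friction-limited braking on a flat road: a deceleration request beyond
# mu*g will cause the wheels to skid rather than shorten the stop.

G = 9.81  # gravitational acceleration, m/s^2

def max_braking_decel(mu):
    """Maximum deceleration the tyre-road contact can supply, m/s^2."""
    return mu * G

def stopping_distance(speed_ms, mu):
    """Minimum stopping distance v^2 / (2*mu*g), metres."""
    return speed_ms ** 2 / (2.0 * mu * G)

dry, wet = 0.9, 0.4      # assumed tyre-road friction coefficients
v = 100 / 3.6            # 100 km/h expressed in m/s

d_dry = stopping_distance(v, dry)
d_wet = stopping_distance(v, wet)
```

On the assumed coefficients, the wet-road stop takes more than twice the dry-road distance from the same speed, so a braking command that is safe on dry asphalt can demand more friction than a wet surface provides.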
Luca Castignani, automotive strategist at MSC Software, commented, “Simulation must be accurate to centimetres, not metres, because a split second makes the difference in the most complex of circumstances. With Adams-ready VTD, we have brought software development and automotive engineering together so the industry can move from ‘what should the vehicle do?’ to ‘can the vehicle cope with this command?’ and develop the next generation of safe vehicles.”
Sensor perception
ADAS rely on accurate information from cameras, radar, LiDAR and satellite navigation to make safety-critical decisions. According to MSC, the package can identify blind spots caused by vehicle-road dynamics, helping engineers determine which sensors to rely upon and when — for example, ensuring that a car driving over a speed bump can still perceive a pedestrian even when camera vibrations prevent tracking.
Vehicle OEMs can evaluate how sensors function when subjected to vibrations or changes in orientation, so they can cost-effectively develop sensor fusion between road tests. Castignani added, “The perception of a camera mounted on a truck cabin can change significantly relative to radar measurements during a braking manoeuvre – so what’s the proximity to the car in front? We are enabling ADAS engineers to develop robust test cases like this to improve confidence in the decisions they make and develop accurate sensor fusion.”
Adams models can now be used directly in VTD 2019.1 using the open Functional Mock-up Interface (FMI), with a flexible configuration to simulate any vehicle including trucks with more than four wheels, and trailers. Users can now “bring their own AI” using an open interface to insert their driver-in-the-loop into VTD, then test and train their self-driving algorithms in an accurate simulation.
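The FMI coupling described above follows the standard do-step co-simulation pattern: each fixed time step, the environment passes inputs (for example a commanded acceleration) to the vehicle model, which advances its internal state and returns outputs. The toy loop below illustrates only that pattern; the class and variable names are invented for illustration and are not the FMI or VTD API.

```python
# Toy co-simulation loop in the FMI do-step style. A real setup would load
# a compiled FMU exported from Adams; here a trivial stand-in integrates
# vehicle speed from a commanded acceleration.

class ToyVehicleModel:
    """Stand-in for an FMU: one state (speed), one input (acceleration)."""
    def __init__(self):
        self.speed = 0.0  # m/s

    def do_step(self, accel_cmd, dt):
        """Advance the model by one communication step of length dt."""
        self.speed = max(0.0, self.speed + accel_cmd * dt)
        return self.speed

model = ToyVehicleModel()
dt = 0.01                 # communication step size, seconds
history = []
for step in range(500):   # 5 s of simulated time
    accel = 2.0 if step < 300 else -3.0   # accelerate, then brake
    history.append(model.do_step(accel, dt))
```

The key design point of this pattern is the fixed communication step: the environment and the vehicle model exchange values only at step boundaries, which is what lets independently developed tools — a dynamics solver on one side, a scenario engine or a user's own driving AI on the other — be coupled through an open interface.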
VTD 2019.1 supports the OpenDRIVE 1.5 and OpenSCENARIO 0.9 interoperability standards and features enhanced LiDAR simulation with more accurate GPU-accelerated ray-tracing and the ability to simulate surface interaction. It is available for Red Hat Linux 7.3, with optional Docker containerisation of modules to aid the integration of VTD into customers’ virtual test environments and to simplify deployment to cloud or on-premise infrastructure.