The Brains Behind the "Dream" Developers of Dreams4cars
DFKI is an essential partner in the project. To better understand its role and the project overall, Rafael Math, from DFKI Saarbrücken, kindly gave a written interview in July 2018, detailing his role and that of Elmar Berghöfer and Mehmed Yüksel at DFKI Bremen.
What is DFKI’s role in Dreams4Cars?
The main role of DFKI is the development of the simulation environment, the organisation of the pilot sites, and the hardware/software preparation of the test vehicles (WP4). Furthermore, DFKI designs and implements the fault detection system (WP2) and contributes to the on-the-fly inverse model adaptation (WP2), the forward emulators (WP3), and the instantiation of internal simulations/dreams (WP3). In WP5, DFKI's main part is monitoring the evolution of the system across test cycles on the test track in Germany and operating the simulation environment.
Tell us a little about the development of OpenDS and when it will be ready?
The development of OpenDS is a continuous process that began several years ago (2012) and has been fostered by a series of EU projects. In each project, development focused on a specific feature, e.g., driver assistance systems or truck driver performance. In Dreams4Cars, the focus is on autonomous driving and the generation of intelligent roads. Usually, the latest state of development is released biannually to the scientific community under an open-source license. Each release collects the work of the core development (driven by DFKI's projects) and the contributions of OpenDS users. The latest non-public version was released on 28th June 2018, right before the due date of the Dreams4Cars simulation environment, and extends the previous version with an automated terrain and road generation process, a highly accurate physics engine, the simulation of virtual sensors and V2X communication, and a communication interface for external vehicle control by an artificial driver ("codriver agent").
How does testing happen?
Validation: OpenDS can be validated by comparing its simulation results with the output of IPG CarMaker, a professional vehicle dynamics simulator that has been used with the codriver agent since before the launch of Dreams4Cars. For this purpose, both simulators need to be provided with exactly the same driving scenario, which is achieved by manually recreating the same road map and the same traffic/obstacle interaction in each simulator's format. The behaviour of the codriver-controlled vehicle can be monitored through log files and the real-time visualization of the renderer. There are several driving scenarios comprising obstacle avoidance, overtaking, and emergency braking on straight and curvy roads in flat and undulating terrain.
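As an illustration, comparing the logs of two simulators can be reduced to a trajectory error metric such as the one below. This is a minimal sketch: the `(t, x, y)` log format and the function name are assumptions for illustration, not the actual OpenDS or CarMaker log layout.

```python
import math

def trajectory_rmse(traj_a, traj_b):
    """Root-mean-square position error between two logged trajectories.

    Each trajectory is a list of (t, x, y) samples; both logs are assumed
    to be recorded on the same time grid (hypothetical log format).
    """
    assert len(traj_a) == len(traj_b), "logs must cover the same time grid"
    sq_sum = 0.0
    for (_, xa, ya), (_, xb, yb) in zip(traj_a, traj_b):
        sq_sum += (xa - xb) ** 2 + (ya - yb) ** 2
    return math.sqrt(sq_sum / len(traj_a))

# Synthetic example: the "same" scenario as logged by two simulators
opends_log   = [(0.00, 0.0, 0.0), (0.05, 1.0, 0.1), (0.10, 2.0, 0.2)]
carmaker_log = [(0.00, 0.0, 0.0), (0.05, 1.0, 0.0), (0.10, 2.0, 0.0)]
print(round(trajectory_rmse(opends_log, carmaker_log), 4))  # → 0.1291
```

A per-sample error of this kind can be computed directly from the log files mentioned above, alongside visual inspection in the renderer.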
Simulation: A typical test run starts with the dream generation, i.e., the automated creation of a terrain and road model, the setup of road users and obstacles, and the definition of their behaviour at simulation runtime. During the simulation, approx. 200 vehicle and environmental parameters (currently 50 in use) are continuously sent to the codriver agent (the "scenario message"), providing sensor and V2X data of the codriver-controlled vehicle. In the other direction, approx. 80 parameters are returned from the codriver to the simulator (the "manoeuvre message"), e.g., trajectory and motor primitive parameters, which are needed to visualize the planned trajectory of the codriver-controlled vehicle and to compute the longitudinal and lateral vehicle control. A new pair of scenario and manoeuvre messages is sent every 50 milliseconds. At the end of the simulation, the results are analysed by the dream generation component in order to, e.g., repeat the simulation with modified codriver settings or run the codriver on a modified driving scenario, which starts the loop from the beginning.
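The 50-millisecond message exchange described above can be sketched roughly as follows. This is a minimal illustration: the class names, fields, and the placeholder codriver are assumptions, not the actual OpenDS message format.

```python
from dataclasses import dataclass, field

@dataclass
class ScenarioMessage:
    """Simulator -> codriver: sensor/V2X snapshot (field names hypothetical)."""
    sim_time: float
    params: dict = field(default_factory=dict)

@dataclass
class ManoeuvreMessage:
    """Codriver -> simulator: planned trajectory/control (names hypothetical)."""
    sim_time: float
    steering: float
    acceleration: float

def run_episode(codriver, steps, dt=0.05):
    """Exchange one scenario/manoeuvre pair per tick; dt = 0.05 s = 50 ms."""
    log = []
    for i in range(steps):
        scenario = ScenarioMessage(sim_time=i * dt, params={"speed": 10.0})
        manoeuvre = codriver(scenario)  # codriver computes its reply
        log.append((scenario, manoeuvre))
    return log

# Trivial placeholder codriver that keeps the wheel straight
log = run_episode(lambda s: ManoeuvreMessage(s.sim_time, 0.0, 0.0), steps=3)
print(len(log))  # → 3
```

In the real system each tick carries roughly 200 parameters one way and 80 the other; the dict and three fields here merely stand in for those payloads.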
How accurate and realistic will the simulations be for Dreams4Cars? Are there any aspects of real driving conditions that OpenDS won’t be able to simulate?
Due to the integration of the Chrono physics library and the OpenDRIVE road specification standard, the accuracy and realism of the simulation have improved considerably.
Since the Chrono physics library is very resource-intensive, it cannot be executed in real time on ordinary computing hardware. In those cases, one has to use Bullet, an alternative physics engine which is less accurate but runs efficiently on less powerful machines. Bullet implements a basic vehicle dynamics model based on ray-cast wheels and a friction model that applies separate impulses to each wheel where the ray contacts the ground. The simulation of microscopic road surfaces (e.g. driving on gravel roads) and of complex tyre-road interaction is not possible with this approach. Furthermore, minor forces (e.g. aerodynamic forces) are not considered by either of the two physics engines.
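The ray-cast wheel idea can be illustrated with a toy calculation: cast a ray straight down from the wheel hub and, if it reaches the ground within the suspension's rest length, apply a force at the contact point. This is a deliberately simplified sketch with a linear spring; Bullet's actual model additionally handles damping, slip, and the per-wheel friction impulses mentioned above, and all parameter values here are made up.

```python
def raycast_wheel_force(hub_height, rest_length, stiffness, ground_height=0.0):
    """Toy ray-cast wheel: ray goes straight down from the hub; a hit within
    the suspension rest length yields a linear spring force (illustrative
    model only, not Bullet's actual btRaycastVehicle implementation).
    """
    ray_length = hub_height - ground_height   # straight-down ray to the ground
    if ray_length > rest_length:
        return 0.0                            # wheel airborne: no contact force
    compression = rest_length - ray_length
    return stiffness * compression            # upward force magnitude (N)

print(raycast_wheel_force(hub_height=0.25, rest_length=0.5, stiffness=20000.0))
# → 5000.0
```

Because the contact is a single ray per wheel, a surface feature smaller than the ray spacing (gravel, small bumps) is invisible to the model, which is exactly the limitation described above.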
The OpenDRIVE format allows the specification of (almost) any road geometry; however, not all features are currently supported by OpenDS, e.g. crossfall, superelevation, etc. Moreover, when using the automated road generation process to create an OpenDRIVE description, further restrictions apply: roads must have 1, 2, or 4 lanes; intersections must have exact 90-degree angles (3-way, 4-way); and roads can only be defined in a tree-like structure (beginning at the root; no cycles allowed). Manually generated OpenDRIVE files simulated with OpenDS are not affected by the latter restrictions.
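The generator restrictions listed above lend themselves to a simple sanity check. The sketch below is illustrative only: the data layout, error messages, and function name are assumptions, not part of OpenDS or the OpenDRIVE toolchain.

```python
def validate_generated_network(roads, junction_angles):
    """Check a generated road network against the generator's restrictions:
    lane counts of 1/2/4, exact 90-degree intersections, tree-like structure.

    roads: list of (road_id, parent_id_or_None, lane_count)
    junction_angles: list of intersection angles in degrees
    """
    errors = []
    parent = {rid: p for rid, p, _ in roads}
    for rid, _, lanes in roads:
        if lanes not in (1, 2, 4):
            errors.append(f"road {rid}: {lanes} lanes (must be 1, 2 or 4)")
    # Tree-like structure: following parent links must end at a root, no cycles
    for rid in parent:
        seen, node = set(), rid
        while node is not None:
            if node in seen:
                errors.append(f"cycle involving road {node}")
                break
            seen.add(node)
            node = parent.get(node)
    for angle in junction_angles:
        if angle != 90:
            errors.append(f"intersection angle {angle} deg (must be exactly 90)")
    return errors

roads = [("r0", None, 2), ("r1", "r0", 4), ("r2", "r0", 3)]
print(validate_generated_network(roads, [90, 90]))
# → ['road r2: 3 lanes (must be 1, 2 or 4)']
```

Representing each road by a single parent link mirrors the "beginning at the root" constraint: a valid network is simply a tree of roads.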
How do you expect OpenDS to help in developing and ensuring the safety aspect of Dreams4Cars?
OpenDS provides a safe environment for simulating critical driving situations, which are rare in the real world. By letting the codriver agent drive and learn in an artificial environment, OpenDS helps reduce the large number of kilometres that would otherwise have to be driven under real-world conditions to encounter critical situations, which might also be dangerous for the human passengers of the vehicles involved. A further advantage of simulation is that situations experienced once can simply be repeated (e.g. with modified codriver settings), and modifications of the driving scenario can be simulated in order to explore different outcomes.
Once real road testing begins what will the role of OpenDS be?
OpenDS is used to iteratively improve the behaviour of the codriver agent with the help of artificial simulations. When real road testing begins (wake state), OpenDS will be used to train a copy of the codriver by driving in a virtual world (dream state) in situations that recombine the salient cases annotated in the wake state. By these means, the agent passes through several optimizations of the sensorimotor system and discovers new actions that improve its capabilities. After simulation, the updated agent is transferred back to the host vehicle, where it is used for the next iteration of real driving and collecting salient situations. This closes the feedback loop.
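The wake/dream feedback loop just described can be summarised in a few lines. All function names here are hypothetical placeholders; in the real system the dream phase runs OpenDS simulations and sensorimotor optimisation rather than the toy update shown.

```python
def wake_dream_cycle(agent, real_drive, dream, iterations=3):
    """Sketch of the wake/dream feedback loop (function names hypothetical).

    real_drive(agent) -> salient situations annotated on the real road (wake)
    dream(agent, situations) -> improved copy of the agent, trained offline
                                on recombinations of those situations
    """
    for _ in range(iterations):
        salient = real_drive(agent)    # wake state: collect salient cases
        agent = dream(agent, salient)  # dream state: train a copy in simulation
    return agent                       # updated agent returns to the vehicle

# Toy stand-in: the "agent" is a counter that grows with each dream phase
agent = wake_dream_cycle(agent=0,
                         real_drive=lambda a: ["near-miss", "cut-in"],
                         dream=lambda a, salient: a + len(salient))
print(agent)  # → 6
```

The key point the sketch captures is that learning happens on a copy during the dream phase, and only the updated agent is transferred back to the host vehicle.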
How will the general public be able to use OpenDS and test Dreams4Cars for themselves?
The simulation environment (developed in WP4.3), including the road generation tool chain and the latest version of the OpenDS driving simulation, will be made available to the public at the end of the project. Using this software and a set of ready-to-use test cases, both of which will be available for download, the user can experience the features of Dreams4Cars on a local computer. For more information and downloads go to https://opends.dfki.de/
Is there anything else that we should know about OpenDS?
- OpenDS is an open-source driving simulator with a growing community (> 4000 users).
- Everybody can freely use the simulator and adapt it to their needs.
- Everybody can contribute to the success of OpenDS.
- OpenDS implements several hardware interfaces (motion platform, authentic car environments (CAN bus), professional steering wheel/pedals, eye tracker, Oculus Rift) and software interfaces (OpenDRIVE, external data consumer, traffic simulator, multi-driver environment, multi-screen video output).