Longitudinal Vehicle Dynamics: A Comparison of Physical and Data-Driven Models Under Large-Scale Real-World Driving Conditions

17/07/2019

Submitted to Vehicle System Dynamics

Abstract

Mathematical models of vehicle dynamics will form essential components of future autonomous vehicles. They may be used within inverse or forward control loops, or within predictive learning systems. Often, nonlinear physical models are used in this context, which, though conceptually simple (especially for decoupled, longitudinal dynamics), may be computationally costly to parameterise and inaccurate if they omit vehicle-specific dynamics. In this study we sought to determine the relative merits of a commonly used nonlinear physical model of vehicle dynamics versus data-driven models under large-scale real-world driving conditions. To this end, we compared the performance of a standard nonlinear physical model with that of a linear state-space model and a neural network model. The large-scale experimental data were obtained from two vehicles: a Lancia Delta car and a Jeep Renegade SUV. The vehicles were driven on regular, public roads, during normal human driving, across a range of road gradients. Both data-driven models outperformed the physical model. The neural network model performed best for both vehicles; the state-space model performed almost as well as the neural network for the Lancia Delta, but fell short for the Jeep Renegade, whose dynamics were more strongly nonlinear. Our results suggest that the linear data-driven model offers a good trade-off between accuracy and simplicity, that the neural network model is the most accurate and extends to more strongly nonlinear operating conditions, and that the widely used physical model may not be the best choice for control design.
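For context, the "commonly used nonlinear physical model" of decoupled longitudinal dynamics is, in standard treatments, a point-mass force balance: m dv/dt = F_x - F_aero - F_roll - F_grade, i.e. tractive force against aerodynamic drag, rolling resistance, and road-grade load. The Python sketch below illustrates that standard formulation only; the parameter names and values (m, Cd_A, Cr, and so on) are assumed placeholders for illustration, not the values identified in this study.

    import numpy as np

    def longitudinal_accel(v, F_x, theta,
                           m=1400.0,    # vehicle mass [kg] (assumed)
                           rho=1.225,   # air density [kg/m^3]
                           Cd_A=0.7,    # drag coeff. x frontal area [m^2] (assumed)
                           Cr=0.012,    # rolling-resistance coefficient (assumed)
                           g=9.81):     # gravitational acceleration [m/s^2]
        """Point-mass longitudinal dynamics: m*dv/dt = F_x - F_aero - F_roll - F_grade."""
        F_aero = 0.5 * rho * Cd_A * v**2      # aerodynamic drag
        F_roll = Cr * m * g * np.cos(theta)   # rolling resistance
        F_grade = m * g * np.sin(theta)       # road-grade load, theta = road angle [rad]
        return (F_x - F_aero - F_roll - F_grade) / m

    # Example: forward-Euler rollout at 2 kN tractive force on a 3% upgrade
    v, dt, theta = 10.0, 0.01, np.arctan(0.03)
    for _ in range(1000):
        v += longitudinal_accel(v, F_x=2000.0, theta=theta) * dt
    print(f"speed after 10 s: {v:.1f} m/s")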




This project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 731593.