Confidence-driven weighted retraining for predicting safety-critical failures in autonomous driving systems.

Journal: Journal of Software (Malden, MA)

Abstract

Safe handling of hazardous driving situations is a task of high practical relevance for building reliable and trustworthy cyber-physical systems such as autonomous driving systems. This task requires an accurate predictor of the vehicle's confidence to prevent potentially harmful system failures when unexpected conditions make it less safe to drive. In this paper, we discuss the challenges of adapting a misbehavior predictor with knowledge mined during the execution of the main system. We then present a framework for the continual learning of misbehavior predictors, which records in-field behavioral data to determine which data are appropriate for adaptation. Our framework guides adaptive retraining through a novel combination of in-field confidence metric selection and reconstruction error-based weighting. We evaluate our framework by improving a misbehavior predictor from the literature on the Udacity simulator for self-driving cars. Our results show that our framework reduces the false positive rate by a large margin and adapts to nominal behavior drifts while maintaining the original capability to predict failures up to several seconds in advance.
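
The abstract describes retraining that weights in-field samples by reconstruction error. The sketch below is a minimal illustration of that general idea, not the authors' implementation: the autoencoder, predictor, threshold, and weight bounds are hypothetical placeholders, and any trainer that accepts per-sample weights (e.g., a Keras-style fit) would slot in the same way.

```python
# Minimal sketch (assumed, not the paper's code): weight in-field frames by
# autoencoder reconstruction error before retraining a misbehavior predictor.
import numpy as np

def reconstruction_errors(autoencoder, frames):
    """Mean squared reconstruction error per frame."""
    reconstructed = autoencoder.predict(frames)
    return np.mean((frames - reconstructed) ** 2,
                   axis=tuple(range(1, frames.ndim)))

def retraining_weights(errors, threshold):
    """Assign higher weight to frames the autoencoder reconstructs poorly,
    i.e., frames likely drawn from unseen (drifted) driving conditions."""
    weights = errors / threshold          # >1 for anomalous frames, <1 for nominal ones
    return np.clip(weights, 0.1, 10.0)    # hypothetical bounds to keep weights stable

def weighted_retrain(predictor, frames, labels, autoencoder, threshold, epochs=5):
    """Retrain the predictor with per-sample weights derived from in-field data."""
    errors = reconstruction_errors(autoencoder, frames)
    weights = retraining_weights(errors, threshold)
    # Keras-style fit with sample weights; the confidence-metric selection step
    # mentioned in the abstract would decide which frames enter this call.
    predictor.fit(frames, labels, sample_weight=weights, epochs=epochs)
    return predictor
```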

Authors

  • Andrea Stocco
    Software Institute, USI Lugano.
  • Paolo Tonella
    Software Institute, USI Lugano.
