On the Robustness of Out-of-Distribution Detection Methods for Camera-based Systems

Christian Huber, Bernhard Lehner, Claus Hofmann, Bernhard A. Moser, Reinhard Feger

Research output: Chapter in Book/Report/Conference proceeding › Conference proceedings › peer-review

Abstract

Out-of-distribution (OOD) detection refers to recognizing instances that lie outside the scope of what a machine learning model has been exposed to during training. In safety-critical domains like autonomous driving, OOD detection is paramount for enhancing the reliability and safety of machine learning systems. To investigate the robustness of OOD detection methods, we conduct experiments tailored to camera-based autonomous driving scenarios, focusing on realistic challenges these systems may encounter. Our experimental setup benchmarks these methods under various types of corruption, such as image sensor degradation, lens contamination, adverse weather conditions, and motion blur. The findings suggest intrinsic weaknesses across all tested state-of-the-art OOD detection methods. Unexpectedly, even single-pixel alterations corresponding to image sensor degradation over time can result in notable changes in performance.
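The abstract does not specify which detection scores or corruption implementations were used, but the single-pixel finding can be illustrated with a minimal, hypothetical sketch: a simulated "hot pixel" (a stuck-at-white sensor defect) combined with the maximum softmax probability (MSP), a standard OOD baseline score. The toy linear classifier `W` stands in for a real vision model and is an assumption, not the paper's setup.

```python
import numpy as np

def add_hot_pixel(image, rng):
    """Simulate image-sensor degradation: force one random pixel to saturation."""
    corrupted = image.copy()
    h, w = corrupted.shape[:2]
    y, x = rng.integers(0, h), rng.integers(0, w)
    corrupted[y, x] = 1.0  # stuck-at-white ("hot") pixel
    return corrupted

def msp_score(logits):
    """Maximum softmax probability: higher values suggest in-distribution input."""
    z = logits - logits.max()          # shift for numerical stability
    p = np.exp(z) / np.exp(z).sum()
    return p.max()

rng = np.random.default_rng(0)
image = rng.random((8, 8))             # toy grayscale image in [0, 1]
corrupted = add_hot_pixel(image, rng)

W = rng.normal(size=(10, 64))          # toy linear "model" (hypothetical)
clean_score = msp_score(W @ image.ravel())
corrupt_score = msp_score(W @ corrupted.ravel())
# Comparing clean_score and corrupt_score shows how even a one-pixel
# change can shift an OOD score; the paper observes analogous effects
# on real detectors and images.
```

A robustness benchmark in this spirit would sweep such corruptions over a test set and report how detection metrics (e.g., AUROC) degrade relative to the clean images.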
Original language: English
Title of host publication: Proceedings of the Asilomar Conference on Signals, Systems, and Computers (ACSSC 2024)
Number of pages: 7
Publication status: Published - 2024

Fields of science

  • 102019 Machine learning
  • 202037 Signal processing

JKU Focus areas

  • Digital Transformation
