On the Robustness of Out-of-Distribution Detection Methods for Camera-based Systems

  • Christian Huber (Speaker)

Activity: Talk or presentation › Poster presentation › science-to-science

Description

Out-of-distribution (OOD) detection refers to recognizing instances that lie outside the scope of what a machine learning model has been exposed to during training. In safety-critical domains such as autonomous driving, OOD detection is paramount for enhancing the reliability and safety of machine learning systems. To investigate the robustness of OOD detection methods, we conduct experiments tailored to camera-based autonomous driving scenarios, focusing on realistic challenges these systems may encounter. Our experimental setup benchmarks the methods under various types of corruption, such as image sensor degradation, lens contamination, adverse weather conditions, and motion blur. The findings suggest intrinsic weaknesses across all tested state-of-the-art OOD detection methods. Unexpectedly, even single-pixel alterations, corresponding to image sensor degradation over time, can cause notable changes in performance.
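To make the single-pixel probe concrete, the following is a minimal sketch of how such a corruption and its effect on an OOD score could be measured. It assumes a classifier exposed as a callable `model(image) -> logits` and uses the maximum softmax probability (MSP) as a representative OOD score; the function names, the HWC image layout, and the choice of MSP are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: `model`, the MSP score, and the stuck-pixel
# corruption are assumptions standing in for the paper's actual setup.
import numpy as np

def msp_score(logits: np.ndarray) -> float:
    """Maximum softmax probability: lower values look more OOD-like."""
    z = logits - logits.max()            # shift for numerical stability
    p = np.exp(z) / np.exp(z).sum()
    return float(p.max())

def stuck_pixel(image: np.ndarray, y: int, x: int,
                value: float = 1.0) -> np.ndarray:
    """Simulate sensor degradation: one pixel stuck at `value`.
    Expects an HWC image with values in [0, 1]."""
    out = image.copy()
    out[y, x, :] = value
    return out

def score_shift(model, image: np.ndarray, y: int, x: int) -> float:
    """Change in the OOD score caused by a single corrupted pixel."""
    clean = msp_score(model(image))
    corrupted = msp_score(model(stuck_pixel(image, y, x)))
    return corrupted - clean
```

Sweeping `score_shift` over pixel positions on in-distribution test images would quantify how sensitive a given OOD detector is to this kind of degradation.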
Period: 28 Oct 2024
Event title: Asilomar Conference on Signals, Systems, and Computers
Event type: Conference
Location: United States

Fields of science

  • 202037 Signal processing
  • 102019 Machine learning

JKU Focus areas

  • Digital Transformation