Revealing Inherent and Counterintuitive Sensitivities of Out-Of-Distribution Detection Methods

Christian Huber, Bernhard Lehner, Claus Hofmann, Wei Lin, Reinhard Feger, Sepp Hochreiter, Bernhard A. Moser

Research output: Chapter in Book/Report/Conference proceeding › Conference proceedings › peer-review

Abstract

Out-of-distribution (OOD) detection identifies samples that fall outside the data distribution used to train a machine learning model and is crucial in safety-critical domains such as autonomous driving. While the robustness of neural networks has advanced considerably, its effect on OOD detectors remains understudied. To address dataset limitations caused by unknown preprocessing artifacts, we introduce Shapetastic, a framework for generating annotated images, along with ShapetasticOOD, a novel synthetic dataset produced with it. We propose incorporating robustness into OOD detection benchmarks by applying various image interventions such as rotation, resizing, and compression. Our experiments reveal inherent and counterintuitive sensitivities in state-of-the-art OOD detectors, highlighting gaps in current research. Code and dataset are available at https://github.com/chuber1986/ood-robustness
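The linked repository contains the authors' actual implementation; purely as an illustration of the kind of image interventions the abstract names (rotation, resizing, compression), a minimal Python sketch using Pillow might look like the following. The function names, parameter values, and the `ood_detector` placeholder are illustrative assumptions, not part of the paper's code.

```python
# Minimal sketch (not the authors' code): example image interventions
# of the kind used to probe OOD-detector robustness. All parameters
# are illustrative defaults.
import io
from PIL import Image


def rotate(img: Image.Image, degrees: float = 15.0) -> Image.Image:
    """Rotate the image, keeping the original canvas size."""
    return img.rotate(degrees)


def resize(img: Image.Image, scale: float = 0.5) -> Image.Image:
    """Downscale then upscale back, discarding high-frequency detail."""
    w, h = img.size
    small = img.resize((max(1, int(w * scale)), max(1, int(h * scale))))
    return small.resize((w, h))


def jpeg_compress(img: Image.Image, quality: int = 30) -> Image.Image:
    """Round-trip the image through JPEG encoding at the given quality."""
    buf = io.BytesIO()
    img.convert("RGB").save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    return Image.open(buf)


if __name__ == "__main__":
    img = Image.new("RGB", (64, 64), color="white")  # placeholder sample
    for intervention in (rotate, resize, jpeg_compress):
        perturbed = intervention(img)
        # score = ood_detector(perturbed)  # hypothetical detector call
        print(intervention.__name__, perturbed.size)
```

In a benchmark of this shape, the OOD score of each detector would be compared before and after each intervention to expose sensitivities that in-distribution accuracy alone does not reveal.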
Original language: English
Title of host publication: Out Of Distribution Generalization in Computer Vision Workshop
Number of pages: 5
Publication status: Published - 2024

Fields of science

  • 102019 Machine learning
  • 202037 Signal processing

JKU Focus areas

  • Digital Transformation