Walk-through the OPPORTUNITY dataset for activity recognition in sensor rich environments

Daniel Roggen, Alberto Calatroni, Mirco Rossi, Thomas Holleczek, Kilian Förster, Gerhard Tröster, Paul Lukowicz, David Bannach, Gerald Pirkl, Florian Wagner, Alois Ferscha, Jakob Doppler, Clemens Holzmann, Marc Kurz, Gerald Holl, Ricardo Chavarriaga, Marco Creatura, Jose Millan

Research output: Chapter in Book/Report/Conference proceeding › Conference proceedings › peer-review

Abstract

We aim at activity and context recognition in opportunistic sensor setups: the system should make use of whatever sensor modalities happen to be available, rather than rely on a specific sensor deployment. To assess opportunistic activity recognition methods, we collected a large-scale dataset of complex activities in a highly sensor-rich environment, with 72 sensors of 10 modalities placed in the environment, on objects and on the body. The dataset contains composite and atomic activities in large numbers (>28,000 hand interactions). We present the activity scenario and the sensor setup, and show the user's activities and the corresponding sensor signals side by side. We argue that such a visualization can be an efficient form of dataset documentation, especially when the dataset is shared, as it gives insight into the complexity of the activities and the richness of the sensor setup.
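As a rough illustration (not part of the publication record), the sketch below shows one way such a side-by-side view of a sensor signal and its activity annotation could be produced. The file name, column names (time_s, acc_x, activity), and plotting layout are hypothetical assumptions, not the authors' tooling or the actual OPPORTUNITY file format.

```python
# Minimal sketch: plot one body-worn sensor channel with the annotated
# activity intervals aligned underneath. All names below are placeholders.
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical export of one recording: a timestamp column, one accelerometer
# channel, and a per-sample activity label.
df = pd.read_csv("opportunity_run_example.csv")  # placeholder file name

fig, (ax_sig, ax_act) = plt.subplots(
    2, 1, sharex=True, figsize=(10, 4), gridspec_kw={"height_ratios": [3, 1]}
)

# Top panel: raw sensor signal over time.
ax_sig.plot(df["time_s"], df["acc_x"], linewidth=0.5)
ax_sig.set_ylabel("acc_x (a.u.)")

# Bottom panel: activity labels shown as shaded spans, aligned with the signal.
for label in df["activity"].unique():
    ax_act.fill_between(df["time_s"], 0, 1,
                        where=df["activity"].eq(label),
                        step="post", alpha=0.6, label=str(label))
ax_act.set_yticks([])
ax_act.set_xlabel("time (s)")
ax_act.legend(loc="upper right", ncol=3, fontsize="small")

plt.tight_layout()
plt.show()
```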
Original language: English
Title of host publication: Eighth International Conference on Pervasive Computing
Number of pages: 4
Publication status: Published - May 2010

Fields of science

  • 102 Computer Sciences
  • 102009 Computer simulation
  • 102013 Human-computer interaction
  • 102019 Machine learning
  • 102020 Medical informatics
  • 102021 Pervasive computing
  • 102022 Software development
  • 102025 Distributed systems
  • 202017 Embedded systems
  • 211902 Assistive technologies
  • 211912 Product design