Abstract
The eyes are gaining increasing interest within the HCI (human-computer interaction) community, as they are a fast and accurate input modality. However, the applicability of mobile eye-based HCI has so far been restricted by several issues, such as calibration or the Midas Touch Problem [5]. In this work we propose the idea of contour-guided gaze gestures, which overcome these problems by relying on relative eye movements as users trace the contours of (interactive) objects within a smart environment. Matching the trajectory of the eye movements against the contour's shape makes it possible to estimate which object was interacted with and to trigger the corresponding actions. We describe the concept of the system and illustrate several application scenarios that demonstrate its value.
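The matching step described in the abstract — comparing the relative shape of a gaze trajectory against the contours of known objects — could be sketched as below. This is a minimal illustration only, assuming a simple resample-normalize-compare scheme (arc-length resampling, translation/scale normalization, mean point-wise distance); the approach and all function names are assumptions for illustration, not the authors' actual implementation.

```python
import math

def resample(path, n=32):
    """Resample a 2D polyline to n evenly spaced points along its arc length."""
    d = [0.0]  # cumulative arc lengths
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        d.append(d[-1] + math.hypot(x1 - x0, y1 - y0))
    total = d[-1]
    out = []
    for i in range(n):
        t = total * i / (n - 1)
        j = 0  # find the segment containing arc length t
        while j < len(d) - 2 and d[j + 1] < t:
            j += 1
        seg = d[j + 1] - d[j] or 1.0
        a = (t - d[j]) / seg
        (x0, y0), (x1, y1) = path[j], path[j + 1]
        out.append((x0 + a * (x1 - x0), y0 + a * (y1 - y0)))
    return out

def normalize(path):
    """Translate to the centroid and scale to unit RMS radius, so only
    relative movement (shape) remains -- no calibration needed."""
    cx = sum(x for x, _ in path) / len(path)
    cy = sum(y for _, y in path) / len(path)
    pts = [(x - cx, y - cy) for x, y in path]
    scale = math.sqrt(sum(x * x + y * y for x, y in pts) / len(pts)) or 1.0
    return [(x / scale, y / scale) for x, y in pts]

def match_score(gaze, contour, n=32):
    """Mean point-to-point distance between normalized, resampled paths
    (lower means a better shape match)."""
    g = normalize(resample(gaze, n))
    c = normalize(resample(contour, n))
    return sum(math.hypot(gx - cx, gy - cy)
               for (gx, gy), (cx, cy) in zip(g, c)) / n

def best_match(gaze, contours):
    """Return the name of the object contour whose shape best matches the gaze trajectory."""
    return min(contours, key=lambda name: match_score(gaze, contours[name]))
```

For example, a noisy gaze trace shaped like a square, recorded at an arbitrary position and scale, would score closer to a square contour than to a triangular one, letting the system pick the square-shaped object as the interaction target.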
Original language | English |
---|---|
Title of host publication | Proceedings of the 7th International Conference on the Internet of Things |
Editors | Simon Mayer, Stefan Schneegass, Bernhard Anzengruber-Tánase, Alois Ferscha, Gabriele Anderst-Kotsis, Joseph Paradiso |
Publisher | ACM DL |
Number of pages | 2 |
DOIs | |
Publication status | Published - 2017 |
Fields of science
- 102 Computer Sciences
- 102009 Computer simulation
- 102013 Human-computer interaction
- 102019 Machine learning
- 102021 Pervasive computing
- 102022 Software development
- 102025 Distributed systems
JKU Focus areas
- Computation in Informatics and Mathematics
- Engineering and Natural Sciences (in general)
Projects
- 1 Finished
EyeControl: Eye-Controlled Machines
Amrouche, S. (Researcher), Campos, Y. (Researcher), Elancheliyan, P. (Researcher), Gollan, B. (Researcher), Haslgrübler-Huemer, M. (Researcher), Jungwirth, F. (Researcher), Murauer, M. (Researcher), Timofeev, M. (Researcher), Wirth, C. (Researcher) & Ferscha, A. (PI)
01.09.2016 → 28.02.2020
Project: Funded research › FFG - Austrian Research Promotion Agency