- Work in Progress, May 2024
Leveraging Nonverbal Signals for Hands-Free Input in Digital Surveys
CHI EA '24: Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, Article No.: 224, Pages 1–6, https://doi.org/10.1145/3613905.3650821
Collecting end-user feedback is crucial for assessing the acceptance and viability of any product or service. In the digital realm, Likert scale surveys are the most common method for gathering feedback, with end users explicitly selecting their ...
- research-article, October 2023
QuantiBike: Quantifying Perceived Cyclists' Safety via Head Movements in Virtual Reality and Outdoors
SUI '23: Proceedings of the 2023 ACM Symposium on Spatial User Interaction, Article No.: 27, Pages 1–12, https://doi.org/10.1145/3607822.3614532
The current level of road safety for cyclists is estimated mainly based on police reports and self-reports collected during surveys. However, the former focuses on post-accident situations, while the latter is based on subjective perception and focuses ...
- research-article, April 2023, Honorable Mention
Leveraging driver vehicle and environment interaction: Machine learning using driver monitoring cameras to detect drunk driving
- Kevin Koch,
- Martin Maritsch,
- Eva Van Weenen,
- Stefan Feuerriegel,
- Matthias Pfäffli,
- Elgar Fleisch,
- Wolfgang Weinmann,
- Felix Wortmann
CHI '23: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, Article No.: 322, Pages 1–32, https://doi.org/10.1145/3544548.3580975
Excessive alcohol consumption causes disability and death. Digital interventions are promising means to promote behavioral change and thus prevent alcohol-related harm, especially in critical moments such as driving. This requires real-time information ...
- research-article, June 2021
VXSlate: Exploring Combination of Head Movements and Mobile Touch for Large Virtual Display Interaction
DIS '21: Proceedings of the 2021 ACM Designing Interactive Systems Conference, Pages 283–297, https://doi.org/10.1145/3461778.3462076
Virtual Reality (VR) headsets can open opportunities for users to accomplish complex tasks on large virtual displays using compact and portable devices. However, interacting with such large virtual displays using existing interaction techniques might ...
- poster, May 2021
Quantified Cycling Safety: Towards a Mobile Sensing Platform to Understand Perceived Safety of Cyclists
CHI EA '21: Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, Article No.: 262, Pages 1–6, https://doi.org/10.1145/3411763.3451678
Today's level of cyclists' road safety is primarily estimated using accident reports and self-reported measures. However, the former is focused on post-accident situations and the latter relies on subjective input. In our work, we aim to extend the ...
- poster, March 2020
Detecting Learner Drowsiness Based on Facial Expressions and Head Movements in Online Courses
- Shogo Terai,
- Shizuka Shirai,
- Mehrasa Alizadeh,
- Ryosuke Kawamura,
- Noriko Takemura,
- Yuki Uranishi,
- Haruo Takemura,
- Hajime Nagahara
IUI '20 Companion: Companion Proceedings of the 25th International Conference on Intelligent User Interfaces, Pages 124–125, https://doi.org/10.1145/3379336.3381500
Drowsiness is a major factor that hinders learning. To improve learning efficiency, it is important to understand students' physical status, such as wakefulness, during online coursework. In this study, we have proposed a drowsiness estimation method based ...
- short-paper, March 2019
Engagement estimation based on synchrony of head movements: application to actual e-learning scenarios
IUI '19 Companion: Companion Proceedings of the 24th International Conference on Intelligent User Interfaces, Pages 25–26, https://doi.org/10.1145/3308557.3308660
In video-based learning, engagement estimation is important for increasing learning efficiency. Changes in learners' appearance (facial expressions or posture, such as closed eyes or looking away) captured by a web camera are often used to estimate ...
- short-paper, June 2018
Eye movements and viewer's impressions in response to HMD-evoked head movements
COGAIN '18: Proceedings of the Workshop on Communication by Gaze Interaction, Article No.: 2, Pages 1–5, https://doi.org/10.1145/3206343.3206346
The relationships between eye and head movements were compared during the viewing of various visual stimuli using a head-mounted display (HMD) and a large flat display. The visual sizes of the images displayed were adjusted virtually, using an image ...
- short-paper, June 2016
The Role of the Vocal Stream in Telepresence Communication
PETRA '16: Proceedings of the 9th ACM International Conference on PErvasive Technologies Related to Assistive Environments, Article No.: 64, Pages 1–4, https://doi.org/10.1145/2910674.2910706
Most of the work in affective computing within telepresence robot platforms focuses on research and knowledge generation as opposed to application. The main reason behind this is that the actual capabilities we have in the real world do not match the ...
- Article, September 2014
A time series feature of variability to detect two types of boredom from motion capture of the head and shoulders
ECCE '14: Proceedings of the 2014 European Conference on Cognitive Ergonomics, Article No.: 37, Pages 1–5, https://doi.org/10.1145/2637248.2743000
Boredom and disengagement metrics are key to accurately timed adaptive interventions in interactive systems. Psychological research suggests that boredom is a composite state incorporating cycles of lethargy and restlessness. Here we present innovative ...
- Article, August 2014
Smart Tablet Monitoring by a Real-Time Head Movement and Eye Gestures Recognition System
FICLOUD '14: Proceedings of the 2014 International Conference on Future Internet of Things and Cloud, Pages 393–398, https://doi.org/10.1109/FiCloud.2014.70
Various research efforts are exploring new ways of interacting with mobile devices such as smartphones and tablets to provide a natural and easy mode of communication. In this paper we propose a new system for monitoring tablets ...
- research-article, March 2014
Look and lean: accurate head-assisted eye pointing
ETRA '14: Proceedings of the Symposium on Eye Tracking Research and Applications, Pages 35–42, https://doi.org/10.1145/2578153.2578157
Compared to the mouse, eye pointing is inaccurate. As a consequence, small objects are difficult to point at by gaze alone. We suggest using a combination of eye pointing and subtle head movements to achieve accurate hands-free pointing in a conventional ...
- Article, July 2013
Multimodal feedback in first encounter interactions
HCI'13: Proceedings of the 15th international conference on Human-Computer Interaction: interaction modalities and techniques - Volume Part IV, Pages 262–271, https://doi.org/10.1007/978-3-642-39330-3_28
Human interactions are predominantly conducted via verbal communication, which allows presentation of sophisticated propositional content. However, much of the interpretation of the utterances and the speaker's attitudes are conveyed using multimodal ...
- research-article, March 2012
Multimodal recognition of reading activity in transit using body-worn sensors
ACM Transactions on Applied Perception (TAP), Volume 9, Issue 1, Article No.: 2, Pages 1–21, https://doi.org/10.1145/2134203.2134205
Reading is one of the most well-studied visual activities. Vision research traditionally focuses on understanding the perceptual and cognitive processes involved in reading. In this work we recognize reading activity by jointly analyzing eye and head ...
- Article, July 2011
Head movements, facial expressions and feedback in Danish first encounters interactions: a culture-specific analysis
This study deals with non-verbal behaviour in a video-recorded and manually annotated corpus of first encounters in Danish. It presents an analysis of head movements and facial expressions in the data, in particular their use to express feedback, and it ...
- research-article, March 2010
Visual search in the (un)real world: how head-mounted displays affect eye movements, head movements and target detection
- Tobit Kollenberg,
- Alexander Neumann,
- Dorothe Schneider,
- Tessa-Karina Tews,
- Thomas Hermann,
- Helge Ritter,
- Angelika Dierker,
- Hendrik Koesling
ETRA '10: Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications, Pages 121–124, https://doi.org/10.1145/1743666.1743696
Head-mounted displays (HMDs) that use a see-through display method allow for superimposing computer-generated images upon a real-world view. Such devices, however, normally restrict the user's field of view. Furthermore, low display resolution and ...
- Article, April 2006
Listening heads
In [1] we discussed functions of head movements and gaze. In this paper, we will go into more depth in the classification of various head movements: how they are distinguished in both formal and functional terms. We look at the distribution of a ...
- Article, May 2005
Bangarama: creating music with headbanging
Bangarama is a music controller using headbanging as the primary interaction metaphor. It consists of a head-mounted tilt sensor and a guitar-shaped controller that does not require complex finger positions. We discuss the specific challenges of ...
- Article, May 2002
Visual Prosody: Facial Movements Accompanying Speech
As we articulate speech, we usually move the head and exhibit various facial expressions. This visual aspect of speech aids understanding and helps communicate additional information, such as the speaker's mood. In this paper we analyze quantitatively ...