research-article
PACMHCI V7, ETRA, May 2023 Editorial
Article No.: 156, Page 1, https://doi.org/10.1145/3591125

In 2022, ETRA moved its publication of full papers to a journal-based model, and we are delighted to present the second issue of the Proceedings of the ACM on Human-Computer Interaction to focus on contributions from the Eye Tracking Research and ...

research-article
Classification of Alzheimer's using Deep-learning Methods on Webcam-based Gaze Data
Article No.: 157, Pages 1–17, https://doi.org/10.1145/3591126

There has been increasing interest in non-invasive predictors of Alzheimer's disease (AD) as an initial screen for this condition. Previously, successful attempts leveraged eye-tracking and language data generated during picture narration and reading ...

research-article
DynamicRead: Exploring Robust Gaze Interaction Methods for Reading on Handheld Mobile Devices under Dynamic Conditions
Article No.: 158, Pages 1–17, https://doi.org/10.1145/3591127

Enabling gaze interaction in real-time on handheld mobile devices has attracted significant attention in recent years. An increasing number of research projects have focused on sophisticated appearance-based deep learning models to enhance the precision ...

research-article
Exploring Dwell-time from Human Cognitive Processes for Dwell Selection
Article No.: 159, Pages 1–15, https://doi.org/10.1145/3591128

In order to develop future implicit interactions, it is important to understand the duration a user needs to recognize a visual object. By providing interactions that are triggered after a user recognizes an object, confusion resulting from the ...

research-article
Exploring Gaze-assisted and Hand-based Region Selection in Augmented Reality
Article No.: 160, Pages 1–19, https://doi.org/10.1145/3591129

Region selection is a fundamental task in interactive systems. In 2D user interfaces, users typically use a rectangle selection tool to formulate a region using a mouse or touchpad. Region selection in 3D spaces, especially in Augmented Reality (AR) Head-...

research-article
Open Access
Exploring the Effects of Scanpath Feature Engineering for Supervised Image Classification Models
Article No.: 161, Pages 1–18, https://doi.org/10.1145/3591130

Image classification models are becoming a popular method of analysis for scanpath classification. To implement these models, gaze data must first be reconfigured into a 2D image. However, this step gets relatively little attention in the literature as ...

research-article
Open Access
Eyettention: An Attention-based Dual-Sequence Model for Predicting Human Scanpaths during Reading
Article No.: 162, Pages 1–24, https://doi.org/10.1145/3591131

Eye movements during reading offer insights into both the reader's cognitive processes and the characteristics of the text being read. Hence, the analysis of scanpaths in reading has attracted increasing attention across fields, ranging from ...

research-article
G-DAIC: A Gaze Initialized Framework for Description and Aesthetic-Based Image Cropping
Article No.: 163, Pages 1–19, https://doi.org/10.1145/3591132

We propose a new gaze-initialised optimisation framework to generate aesthetically pleasing image crops based on user description. We extended the existing description-based image cropping dataset by collecting user eye movements corresponding to the ...

research-article
Investigating Privacy Perceptions and Subjective Acceptance of Eye Tracking on Handheld Mobile Devices
Article No.: 164, Pages 1–16, https://doi.org/10.1145/3591133

Although eye tracking brings many benefits to users of mobile devices and developers of mobile applications, it poses significant privacy risks to both the users of mobile devices and the bystanders around them, who are within the front-facing ...

research-article
Public Access
Practical Perception-Based Evaluation of Gaze Prediction for Gaze Contingent Rendering
Article No.: 165, Pages 1–17, https://doi.org/10.1145/3591134

This paper proposes a novel evaluation framework, termed "critical evaluation periods," for evaluating continuous gaze prediction models. This framework emphasizes prediction performance when it is most critical for gaze prediction to be accurate ...

research-article
Public Access
Studying Developer Eye Movements to Measure Cognitive Workload and Visual Effort for Expertise Assessment
Article No.: 166, Pages 1–18, https://doi.org/10.1145/3591135

Eye movement data provides valuable insights that help test hypotheses about a software developer's comprehension process. The pupillary response is successfully used to assess mental processing effort and attentional focus. Relatively little is known ...

research-article
Towards Modeling Human Attention from Eye Movements for Neural Source Code Summarization
Article No.: 167, Pages 1–19, https://doi.org/10.1145/3591136

Neural source code summarization is the task of generating natural language descriptions of source code behavior using neural networks. A fundamental component of most neural models is an attention mechanism. The attention mechanism learns to connect ...

research-article
Unconscious Frustration: Dynamically Assessing User Experience using Eye and Mouse Tracking
Article No.: 168, Pages 1–17, https://doi.org/10.1145/3591137

Eye-tracking has become easier to deploy in user experience (UX) studies to get a sense of where users attend to during interactions. Additionally, mouse tracking grants insights into the cognition driving the user's behaviours and end goals, as can ...

research-article
A Unified Look at Cultural Heritage: Comparison of Aggregated Scanpaths over Architectural Artifacts
Article No.: 169, Pages 1–17, https://doi.org/10.1145/3591138

This paper contributes to scanpath bundling methods. We propose an analytical approach for statistical comparisons of aggregated scanpath visualizations by means of second-order gaze analysis metrics. The present study explores differences in attention ...
