• Li Y, Chen N, Zhao G and Shen Y. (2025). KD-Eye: Lightweight Pupil Segmentation for Eye Tracking on VR Headsets via Knowledge Distillation. Wireless Artificial Intelligent Computing Systems and Applications. 10.1007/978-3-031-71464-1_18. (209-220).

    https://link.springer.com/10.1007/978-3-031-71464-1_18

  • Kwok T, Kiefer P, Schinazi V, Hoelscher C and Raubal M. (2024). Gaze-based detection of mind wandering during audio-guided panorama viewing. Scientific Reports. 10.1038/s41598-024-79172-x. 14:1.

    https://www.nature.com/articles/s41598-024-79172-x

  • Kwok T, Kiefer P and Raubal M. (2023). Unobtrusive interaction: a systematic literature review and expert survey. Human–Computer Interaction. 10.1080/07370024.2022.2162404. 39:5-6. (380-416). Online publication date: 1-Nov-2024.

    https://www.tandfonline.com/doi/full/10.1080/07370024.2022.2162404

  • Privitera A, Fontana F and Geronazzo M. (2024). The Role of Audio in Immersive Storytelling: a Systematic Review in Cultural Heritage. Multimedia Tools and Applications. 10.1007/s11042-024-19288-4.

    https://link.springer.com/10.1007/s11042-024-19288-4

  • Artizzu V, Luyten K, Ruiz G and Spano L. (2024). ViRgilites: Multilevel Feedforward for Multimodal Interaction in VR. Proceedings of the ACM on Human-Computer Interaction. 8:EICS. (1-24). Online publication date: 17-Jun-2024.

    https://doi.org/10.1145/3658645

  • Roth R, Çöltekin A, Delazari L, Denney B, Mendonça A, Ricker B, Shen J, Stachoň Z and Wu M. (2024). Making maps & visualizations for mobile devices: A research agenda for mobile-first and responsive cartographic design. Journal of Location Based Services. 10.1080/17489725.2023.2251423. (1-71).

    https://www.tandfonline.com/doi/full/10.1080/17489725.2023.2251423

  • Privitera A, Fontana F and Geronazzo M. On the Effect of User Tracking on Perceived Source Positions in Mobile Audio Augmented Reality. Proceedings of the 15th Biannual Conference of the Italian SIGCHI Chapter. (1-9).

    https://doi.org/10.1145/3605390.3605422

  • Rubab S, Yu L, Tang J and Wu Y. (2023). Exploring Effective Relationships Between Visual-Audio Channels in Data Visualization. Journal of Visualization. 10.1007/s12650-023-00909-3. 26:4. (937-956). Online publication date: 1-Aug-2023.

    https://link.springer.com/10.1007/s12650-023-00909-3

  • Chen K, Prouzeau A, Langmead J, Whitelock-Jones R, Lawrence L, Dwyer T, Hurter C, Weiskopf D and Goodwin S. Gazealytics: A Unified and Flexible Visual Toolkit for Exploratory and Comparative Gaze Analysis. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications. (1-7).

    https://doi.org/10.1145/3588015.3589844

  • Keskin M and Kettunen P. (2023). Potential of eye-tracking for interactive geovisual exploration aided by machine learning. International Journal of Cartography. 10.1080/23729333.2022.2150379. 9:2. (150-172). Online publication date: 4-May-2023.

    https://www.tandfonline.com/doi/full/10.1080/23729333.2022.2150379

  • Chen Z, Yang Q, Shan J, Lin T, Beyer J, Xia H and Pfister H. iBall: Augmenting Basketball Videos with Gaze-moderated Embedded Visualizations. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. (1-18).

    https://doi.org/10.1145/3544548.3581266

  • Plopski A, Hirzle T, Norouzi N, Qian L, Bruder G and Langlotz T. (2022). The Eye in Extended Reality: A Survey on Gaze Interaction and Eye Tracking in Head-worn Extended Reality. ACM Computing Surveys. 55:3. (1-39). Online publication date: 31-Mar-2023.

    https://doi.org/10.1145/3491207

  • Kwok T, Kiefer P and Raubal M. Two-Step Gaze Guidance. Proceedings of the 2022 International Conference on Multimodal Interaction. (299-309).

    https://doi.org/10.1145/3536221.3556612

  • Zhang Q, Huang Y, Chernyshov G, Li J, Pai Y and Kunze K. GazeSync: Eye Movement Transfer Using an Optical Eye Tracker and Monochrome Liquid Crystal Displays. Companion Proceedings of the 27th International Conference on Intelligent User Interfaces. (54-57).

    https://doi.org/10.1145/3490100.3516469

  • Liao H, Dong W and Zhan Z. (2021). Identifying map users with eye movement data from map-based spatial tasks: user privacy concerns. Cartography and Geographic Information Science. 10.1080/15230406.2021.1980435. 49:1. (50-69). Online publication date: 2-Jan-2022.

    https://www.tandfonline.com/doi/full/10.1080/15230406.2021.1980435

  • Tabbaa L, Searle R, Bafti S, Hossain M, Intarasisrisawat J, Glancy M and Ang C. (2021). VREED. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies. 5:4. (1-20). Online publication date: 27-Dec-2022.

    https://doi.org/10.1145/3495002

  • Mahajan A, Maidullah S and Hossain M. (2021). Experience Toward Smart Tour Guide Apps in Travelling. Planning and Managing the Experience Economy in Tourism. 10.4018/978-1-7998-8775-1.ch014. (255-273).

    https://services.igi-global.com/resolvedoi/resolve.aspx?doi=10.4018/978-1-7998-8775-1.ch014

  • Vinnikov M, Motahari K, Hamilton L and Altin B. Understanding Urban Devotion through the Eyes of an Observer. ACM Symposium on Eye Tracking Research and Applications. (1-6).

    https://doi.org/10.1145/3448018.3458003

  • Lu H, Ren H, Feng Y, Wang S, Ma S and Gao W. (2021). GazeTance Guidance: Gaze and Distance-Based Content Presentation for Virtual Museum. 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW). 10.1109/VRW52623.2021.00113. 978-1-6654-4057-8. (462-463).

    https://ieeexplore.ieee.org/document/9419356/

  • Raubal M, Bucher D and Martin H. (2021). Geosmartness for Personalized and Sustainable Future Urban Mobility. Urban Informatics. 10.1007/978-981-15-8983-6_6. (59-83).

    https://link.springer.com/10.1007/978-981-15-8983-6_6

  • Bruni L, Dini H and Simonetti A. (2021). Narrative Cognition in Mixed Reality Systems: Towards an Empirical Framework. Virtual, Augmented and Mixed Reality. 10.1007/978-3-030-77599-5_1. (3-17).

    https://link.springer.com/10.1007/978-3-030-77599-5_1

  • Millecamp M, Htun N, Conati C and Verbert K. What's in a User? Towards Personalising Transparency for Music Recommender Interfaces. Proceedings of the 28th ACM Conference on User Modeling, Adaptation and Personalization. (173-182).

    https://doi.org/10.1145/3340631.3394844

  • Goebel F, Kurzhals K, Schinazi V, Kiefer P and Raubal M. Gaze-Adaptive Lenses for Feature-Rich Information Spaces. ACM Symposium on Eye Tracking Research and Applications. (1-8).

    https://doi.org/10.1145/3379155.3391323

  • Kurzhals K, Göbel F, Angerbauer K, Sedlmair M and Raubal M. A View on the Viewer: Gaze-Adaptive Captions for Videos. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. (1-12).

    https://doi.org/10.1145/3313831.3376266

  • Smit D. Augmenting Embodied Sensemaking using VR-Enabled New and Unusual perspectives. Proceedings of the Fourteenth International Conference on Tangible, Embedded, and Embodied Interaction. (917-923).

    https://doi.org/10.1145/3374920.3374962

  • Kiefer P, Adams B, Kwok T and Raubal M. (2020). Modeling Gaze-Guided Narratives for Outdoor Tourism. HCI Outdoors: Theory, Design, Methods and Applications. 10.1007/978-3-030-45289-6_17. (307-318).

    https://link.springer.com/10.1007/978-3-030-45289-6_17