• Tabbarah M, Cao Y, Abu Shamat A, Fang Z, Li L and Jeon M. Novel In-Vehicle Gesture Interactions: Design and Evaluation of Auditory Displays and Menu Generation Interfaces. Proceedings of the 15th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. (224-233).

    https://doi.org/10.1145/3580585.3607164

  • Hirsch L, Li J, Mayer S and Butz A. A Survey of Natural Design for Interaction. Proceedings of Mensch und Computer 2022. (240-254).

    https://doi.org/10.1145/3543758.3543773

  • Alnfiai M. (2021). Evaluating the accessibility and usability of a universal CAPTCHA based on gestures for smartphones. Universal Access in the Information Society. 20:4. (817-831). Online publication date: 1-Nov-2021.

    https://doi.org/10.1007/s10209-020-00730-x

  • Li J, George C, Ngao A, Holländer K, Mayer S and Butz A. (2021). Rear-Seat Productivity in Virtual Reality: Investigating VR Interaction in the Confined Space of a Car. Multimodal Technologies and Interaction. 5:4. (15).

    https://www.mdpi.com/2414-4088/5/4/15

  • Jeon M. (2019). Exploring Design Constructs in Sound Design with a Focus on Perceived Affordance. Proceedings of the Human Factors and Ergonomics Society Annual Meeting. 63:1. (1199-1203). Online publication date: 1-Nov-2019.

    https://journals.sagepub.com/doi/10.1177/1071181319631340

  • Mathur R, Sheth A, Vyas P and Bolchini D. Typing Slowly but Screen-Free. Proceedings of the 21st International ACM SIGACCESS Conference on Computers and Accessibility. (427-439).

    https://doi.org/10.1145/3308561.3353789

  • Gross M, Dara J, Meyer C and Bolchini D. Exploring Aural Navigation by Screenless Access. Proceedings of the 15th International Web for All Conference. (1-10).

    https://doi.org/10.1145/3192714.3192815

  • Grussenmeyer W and Folmer E. (2017). Accessible Touchscreen Technology for People with Visual Impairments. ACM Transactions on Accessible Computing. 9:2. (1-31). Online publication date: 17-Jan-2017.

    https://doi.org/10.1145/3022701

  • Lee A, Song K, Ryu H, Kim J and Kwon G. (2015). Fingerstroke time estimates for touchscreen-based mobile gaming interaction. Human Movement Science. 44. (211-224). Online publication date: 1-Dec-2015.

    https://linkinghub.elsevier.com/retrieve/pii/S0167945715300373

  • Jeon M, Gable T, Davison B, Nees M, Wilson J and Walker B. (2014). Menu Navigation With In-Vehicle Technologies: Auditory Menu Cues Improve Dual Task Performance, Preference, and Workload. International Journal of Human-Computer Interaction. 31:1. (1-16). Online publication date: 2-Jan-2015.

    http://www.tandfonline.com/doi/abs/10.1080/10447318.2014.925774

  • Gable T, Walker B, Moses H and Chitloor R. Advanced auditory cues on mobile phones help keep drivers' eyes on the road. Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. (66-73).

    https://doi.org/10.1145/2516540.2516541

  • Jeon M and Lee J. The ecological AUI (auditory user interface) design and evaluation of user acceptance for various tasks on smartphones. Proceedings of the 15th International Conference on Human-Computer Interaction: Interaction Modalities and Techniques - Volume Part IV. (49-58).

    https://doi.org/10.1007/978-3-642-39330-3_6

  • Shirali-Shahreza S, Penn G, Balakrishnan R and Ganjali Y. SeeSay and HearSay CAPTCHA for mobile interaction. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. (2147-2156).

    https://doi.org/10.1145/2470654.2481295

  • Riener A, Reiner M, Jeon M and Chalfoun P. Methodical approaches to prove the effects of subliminal perception in ubiquitous computing environments. Proceedings of the 2012 ACM Conference on Ubiquitous Computing. (1120-1121).

    https://doi.org/10.1145/2370216.2370453