
“Spindex” (Speech Index) Enhances Menus on Touch Screen Devices with Tapping, Wheeling, and Flicking

Published: 01 July 2012

Abstract

Users interact with many electronic devices through menus, which may be visual, auditory, or both; auditory menus can either complement or replace visual menus. We investigated how advanced auditory cues enhance auditory menus on a smartphone operated with tapping, wheeling, and flicking input gestures. The study evaluated a spindex (speech index), in which brief audio cues inform users where they are in a menu; 122 undergraduates navigated through a menu of 150 songs. Study variables included auditory cue type (text-to-speech (TTS) alone vs. TTS plus spindex), visual display mode (on or off), and input gesture (tapping, wheeling, or flicking). Target search time and subjective workload were lower with the spindex than without it for all input gestures, regardless of visual display mode, and participants rated the spindex condition more favorably than plain TTS. The effects of input method and display mode on navigation behavior were analyzed with the two-stage navigation strategy model. Results are discussed in relation to theories of attention and in terms of practical applications.
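Although the article reports an empirical study rather than an implementation, the spindex mechanism lends itself to a compact sketch. The following is a minimal, hypothetical illustration, assuming that a spindex cue is a brief spoken rendering of the selected item's first letter played during rapid scrolling, with full TTS of the item reserved for slower navigation; the `SpindexMenu` class, the `speak` stub, and the 250 ms threshold are illustrative choices of ours, not details from the paper.

```python
import time

# A minimal, hypothetical sketch of the spindex idea (not the authors' code):
# during rapid scrolling, play only a brief spoken initial letter for each item
# passed; fall back to full text-to-speech (TTS) when navigation slows down.

FAST_SCROLL_THRESHOLD_S = 0.25  # illustrative cutoff; the paper does not prescribe one

def speak(text: str) -> None:
    """Stand-in for a real TTS engine call (e.g., a platform speech API)."""
    print(f"[audio] {text}")

class SpindexMenu:
    def __init__(self, items: list[str]) -> None:
        self.items = sorted(items)   # spindex assumes an alphabetically ordered menu
        self.index = 0
        self._last_move = float("-inf")

    def move(self, step: int) -> None:
        """Advance the selection, choosing between a spindex cue and full TTS."""
        now = time.monotonic()
        fast = (now - self._last_move) < FAST_SCROLL_THRESHOLD_S
        self._last_move = now
        self.index = max(0, min(len(self.items) - 1, self.index + step))
        item = self.items[self.index]
        speak(item[0] if fast else item)  # initial letter vs. whole item name

# Flicking quickly through a song list: the first (slow) move speaks the full
# title; the rapid moves that follow yield only initial-letter cues.
menu = SpindexMenu(["Aqualung", "Badge", "Creedence", "Creep", "Daydream"])
for _ in range(4):
    menu.move(+1)
```

In this sketch the cue choice hinges only on inter-gesture timing; a production version would instead tie it to the tapping, wheeling, or flicking gesture recognizers examined in the study.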

Supplementary Material

JPG File (tochi_spindex_2528flicking_2529.jpg)
JPG File (tochi_spindex_2528tapping_2529v2_08312011.jpg)
JPG File (tochi_spindex_2528wheeling_2529v2.jpg)
WMV File (tochi_spindex_2528flicking_2529.wmv)
WMV File (tochi_spindex_2528tapping_2529v2_08312011.wmv)
WMV File (tochi_spindex_2528wheeling_2529v2.wmv)

These supplementary files are available in an electronic appendix, online in the ACM Digital Library.




Published In

ACM Transactions on Computer-Human Interaction, Volume 19, Issue 2 (July 2012), 226 pages.
ISSN: 1073-0516
EISSN: 1557-7325
DOI: 10.1145/2240156

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 01 July 2012
Accepted: 01 March 2012
Revised: 01 November 2011
Received: 01 May 2011
Published in TOCHI Volume 19, Issue 2


Author Tags

  1. Auditory menus
  2. flicking
  3. input gestures
  4. spindex
  5. tapping
  6. touch screen
  7. wheeling

Qualifiers

  • Research-article
  • Research
  • Refereed


Article Metrics

  • Downloads (last 12 months): 14
  • Downloads (last 6 weeks): 3

Reflects downloads up to 09 Jan 2025.

Cited By

  • (2023) Novel In-Vehicle Gesture Interactions: Design and Evaluation of Auditory Displays and Menu Generation Interfaces. Proceedings of the 15th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 224-233. https://doi.org/10.1145/3580585.3607164. Online publication date: 18-Sep-2023.
  • (2022) A Survey of Natural Design for Interaction. Proceedings of Mensch und Computer 2022, 240-254. https://doi.org/10.1145/3543758.3543773. Online publication date: 4-Sep-2022.
  • (2021) Rear-Seat Productivity in Virtual Reality: Investigating VR Interaction in the Confined Space of a Car. Multimodal Technologies and Interaction 5, 4, article 15. https://doi.org/10.3390/mti5040015. Online publication date: 26-Mar-2021.
  • (2021) Evaluating the accessibility and usability of a universal CAPTCHA based on gestures for smartphones. Universal Access in the Information Society 20, 4, 817-831. https://doi.org/10.1007/s10209-020-00730-x. Online publication date: 1-Nov-2021.
  • (2019) Exploring Design Constructs in Sound Design with a Focus on Perceived Affordance. Proceedings of the Human Factors and Ergonomics Society Annual Meeting 63, 1, 1199-1203. https://doi.org/10.1177/1071181319631340. Online publication date: 20-Nov-2019.
  • (2019) Typing Slowly but Screen-Free. Proceedings of the 21st International ACM SIGACCESS Conference on Computers and Accessibility, 427-439. https://doi.org/10.1145/3308561.3353789. Online publication date: 24-Oct-2019.
  • (2018) Exploring Aural Navigation by Screenless Access. Proceedings of the 15th International Web for All Conference, 1-10. https://doi.org/10.1145/3192714.3192815. Online publication date: 23-Apr-2018.
  • (2017) Accessible Touchscreen Technology for People with Visual Impairments. ACM Transactions on Accessible Computing 9, 2, 1-31. https://doi.org/10.1145/3022701. Online publication date: 17-Jan-2017.
  • (2015) Fingerstroke time estimates for touchscreen-based mobile gaming interaction. Human Movement Science 44, 211-224. https://doi.org/10.1016/j.humov.2015.09.003. Online publication date: Dec-2015.
  • (2014) Menu Navigation With In-Vehicle Technologies: Auditory Menu Cues Improve Dual Task Performance, Preference, and Workload. International Journal of Human-Computer Interaction 31, 1, 1-16. https://doi.org/10.1080/10447318.2014.925774. Online publication date: 22-Oct-2014.
