DOI: 10.1145/3343413.3377950

Mobile Voice Query Reformulation by Visually Impaired People

Published: 14 March 2020

Abstract

Voice search has the potential to assist people who cannot look at their device, whether because of a situational constraint such as driving, a device without a visual interface, or a visual impairment. This study will compare the mobile query reformulation patterns of visually impaired and sighted people using a voice user interface. Query reformulation is a cognitively demanding stage of the information search process; it has been studied with sighted users and, to a limited extent, with visually impaired people, but comparative data are needed to explore the differences between these two groups in terms of their interaction patterns and the potential effects of voice query reformulation on task completion.

Published In

CHIIR '20: Proceedings of the 2020 Conference on Human Information Interaction and Retrieval
March 2020
596 pages
ISBN: 9781450368926
DOI: 10.1145/3343413
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. mobile web search
  2. query reformulation
  3. user study
  4. visually impaired people
  5. voice search

Qualifiers

  • Abstract

Conference

CHIIR '20

Acceptance Rates

Overall Acceptance Rate: 55 of 163 submissions, 34%
