
Talking about Thinking Aloud: Perspectives from Interactive Think-Aloud Practitioners

Published: 20 June 2023

Abstract

It is widely reported in the literature that intervening during usability testing sessions affects user behavior and compromises the validity of the test. However, this contrasts with the ongoing popularity of Interactive Think-Aloud (ITA) amongst practitioners. We report an in-depth qualitative study that explored this tension between theory and practice through nine interviews with ITA practitioners. Our findings add nuance to many established ideas about ITA but also reveal novel practices and attitudes. For example, ITA is sometimes used to slow down users as they navigate through a system, to manage external pressures such as recruitment difficulties, and to reframe a session as a kind of interview or participatory study. We also found that participants (ITA practitioners) experienced unexpected difficulties with ITA, including the risk that it results in overly reflective think-aloud and creates challenges in team working. Participants understood that ITA causes reactivity, and they reported taking steps to reduce it. However, overall, they did not see the traditional positivist objective of valid problem discovery as a realistic or high-priority goal for usability testing. They believed that ITA data can be useful and valid even if user behavior is not wholly realistic. Based on this, we argue against the narrow problem-counting approach often employed in the comparative usability evaluation studies that have sometimes seemed to discredit ITA. We also make the case for broadening how we think about the validity of usability testing data, and we argue that forms of ITA may be appropriate in some situations.


Cited By

  • (2023) Designing SafeMap Based on City Infrastructure and Empirical Approach: Modified A-Star Algorithm for Earthquake Navigation Application. Proceedings of the 1st ACM SIGSPATIAL International Workshop on Advances in Urban-AI, pp. 61-70. https://doi.org/10.1145/3615900.3628788. Online publication date: 13-Nov-2023.



Published In

Journal of User Experience, Volume 18, Issue 3
May 2023, 65 pages

Publisher

Usability Professionals' Association, Bloomingdale, IL


Author Tags

  1. usability testing
  2. traditional think-aloud
  3. interactive think-aloud (ITA)
  4. active intervention think-aloud
  5. relaxed think-aloud

Qualifiers

  • Research-article

Article Metrics

  • Downloads (last 12 months): 50
  • Downloads (last 6 weeks): 9

Reflects downloads up to 16 Dec 2024.
