What happened to remote usability testing?: an empirical study of three methods

Published: 29 April 2007

Abstract

The idea of conducting usability tests remotely emerged ten years ago. Since then, it has been studied empirically, and some software organizations employ remote methods. Yet there are still few comparisons involving more than one remote method. This paper presents results from a systematic empirical comparison of three methods for remote usability testing and a conventional laboratory-based think-aloud method. The three remote methods are a remote synchronous condition, where testing is conducted in real time but the test monitor is separated spatially from the test subjects, and two remote asynchronous conditions, where the test monitor and the test subjects are separated both spatially and temporally. The results show that the remote synchronous method is virtually equivalent to the conventional method. Thereby, it has the potential to conveniently involve broader user groups in usability testing and support new development approaches. The asynchronous methods are considerably more time-consuming for the test subjects and identify fewer usability problems, yet they may still be worthwhile.

Published In

CHI '07: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2007
1654 pages
ISBN: 9781595935939
DOI: 10.1145/1240624
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. empirical study
  2. remote testing
  3. usability testing

Qualifiers

  • Article

Conference

CHI '07: CHI Conference on Human Factors in Computing Systems
April 28 - May 3, 2007
San Jose, California, USA

Acceptance Rates

CHI '07 Paper Acceptance Rate: 182 of 840 submissions, 22%
Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%
