Rocket Surgery Made Easy: The Do-It-Yourself Guide to Finding and Fixing Usability Problems
December 2009
Publisher:
  • New Riders Publishing
  • Post Office Box 4846, Thousand Oaks, CA
  • United States
ISBN: 978-0-321-65729-9
Published: 18 December 2009
Pages: 168
Abstract

It's been known for years that usability testing can dramatically improve products. But with a typical price tag of $5,000 to $10,000 for a usability consultant to conduct each round of tests, it rarely happens. In this how-to companion to Don't Make Me Think: A Common Sense Approach to Web Usability, Steve Krug spells out an approach to usability testing that anyone can easily apply to their own web site, application, or other product. (As he said in Don't Make Me Think, "It's not rocket surgery.")

In this new book, Steve explains how to:

  • Test any design, from a sketch on a napkin to a fully-functioning web site or application
  • Keep your focus on finding the most important problems (because no one has the time or resources to fix them all)
  • Fix the problems that you find, using his "The least you can do" approach

By paring the process of testing and fixing products down to its essentials (A morning a month, that's all we ask), Rocket Surgery makes it realistic for teams to test early and often, catching problems while it's still easy to fix them. Rocket Surgery Made Easy adds demonstration videos to the proven mix of clear writing, before-and-after examples, witty illustrations, and practical advice that made Don't Make Me Think so popular.

Cited By

  1. Reeves S (2019). How UX Practitioners Produce Findings in Usability Testing, ACM Transactions on Computer-Human Interaction, 26:1, (1-38), Online publication date: 28-Feb-2019.
  2. Schörkhuber D, Seitner F, Salzbrunn B, Gelautz M and Braun G, Intelligent Film Assistant for Personalized Video Creation on Mobile Devices, Proceedings of the 15th International Conference on Advances in Mobile Computing & Multimedia, (210-215).
  3. Arroyo I, Giné F, Roig C and Granollers T (2016). Analyzing Google Earth application in a heterogeneous commodity cluster display wall, Multimedia Tools and Applications, 75:18, (11391-11416), Online publication date: 1-Sep-2016.
  4. Moroni A, Talamo M and Dimitri A, Adoption factors of NFC Mobile Proximity Payments in Italy, Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices and Services, (393-399).
  5. Rose E and Tenenberg J (2015). UX as Disruption, International Journal of Sociotechnology and Knowledge Development, 7:3, (1-19), Online publication date: 1-Jul-2015.
  6. Dighe S and Joshi A, An Autoethnographic Study of HCI Effort Estimation in Outsourced Software Development, Proceedings of the 5th IFIP WG 13.2 International Conference on Human-Centered Software Engineering - Volume 8742, (19-35).
  7. Bond R, Finlay D, Nugent C, Moore G and Guldenring D (2014). A usability evaluation of medical software at an expert conference setting, Computer Methods and Programs in Biomedicine, 113:1, (383-395), Online publication date: 1-Jan-2014.
  8. Redish J, Content as conversation in government websites, Proceedings of the Second international conference on Design, User Experience, and Usability: web, mobile, and product design - Volume Part IV, (294-303).
  9. Friess E (2012). Do usability evaluators do what we think usability evaluators do?, Communication Design Quarterly Review, 13:1, (9-13), Online publication date: 1-Mar-2012.
  10. Chalil Madathil K and Greenstein J, Synchronous remote usability testing, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, (2225-2234).
  11. Friess E (2011). Discourse Variations Between Usability Tests and Usability Reports, Journal of Usability Studies, 6:3, (102-116), Online publication date: 1-May-2011.
Reviews

Will Wallace

Usability testing involves watching people actually use whatever is being designed or developed, with the goal of gaining insights that will let you improve the product. While some experts believe that only trained professionals should conduct usability testing, Krug asserts that anyone can do it. The book centers on testing Web sites. During a usability test, a facilitator gives the test participant tasks to perform and asks him or her to think out loud while performing them. Members of the development team and others interested in the project use screen-sharing software to observe the session from another room.

Early in the book, the reader is encouraged to watch a video of a demo test. The video shows what the people in the observation room see on the screen and what they hear, and it is a good introduction to what a test involves. The reader is asked to write down any usability problems noticed during the video and, at the end, to identify the top three. The author then lists his own top three usability problems and suggests simple solutions for two of them.

Testing should begin as soon as design or development starts; even having someone look at a sketch on a napkin can provide insight. The do-it-yourself testing described requires a minimum of one morning a month. In half a day, there is normally time to go through the script with three test participants, each taking a little less than an hour. Although the actual testing takes only half a day, preparation can take a total of two to three full days; the detailed checklist included in the book makes that process easier. According to the checklist, the process begins three weeks before the actual testing, with a list of the five to ten most important tasks people accomplish on the Web site.
Then, the list is narrowed down to the tasks people must be able to do, and each task is expanded into a scenario that gives the test participant some background. A day or two before the actual test, it's a good idea to make sure the scenarios are complete by running a pilot test: have someone sit down with whatever you're testing, give him or her a scenario, and have him or her start the task. The test facilitator is responsible for providing the scenarios, keeping the participants on task, and getting them to verbalize what they are thinking; having the test participants think aloud is essential for effective usability testing.

One of the author's maxims is "Make it a spectator sport." It's important to get as many people as possible to observe the testing in person, including not only those working directly on the project, but also stakeholders and even executives. Krug suggests providing snacks in the observation room to encourage attendance. The observers should take notes while watching; at the end of each session, each observer sorts through the usability problems they noticed and identifies the top three. Observers should also suggest questions for the facilitator to ask the test participant before the session is completed.

A debriefing session should be held to go over the information gathered. The author recommends serving lunch (the "good" pizza) during the debriefing, but limiting attendance to those who attended at least one of the test sessions, since this encourages people to attend the sessions themselves. The purpose of the debriefing is to create a list of the most serious usability problems the test participants encountered, and to identify which of them to fix before next month's testing. The focus must stay on the most serious problems, because there are always more problems to deal with than there are resources.
Once everyone has described the usability problems they observed, a prioritized list of the top ten problems is created. Then, starting at the top of the list and skipping none of them, fixes for each problem are discussed. As Krug states, "When fixing problems, try to do the least you can do": instead of a complete redesign, tweak the current design. The author gives a list of nine reasons why tweaking is better than redesigning. After the debriefing, an email summarizes what was tested, along with the list of problems to be fixed before next month's testing. Testing can also be done remotely, using the same screen-sharing software used in local testing.

I highly recommend this detailed guide to do-it-yourself usability testing.

Online Computing Reviews Service
