
Mixing Evaluation Methods for Assessing the Utility of an Interactive InfoVis Technique

  • Conference paper
Human-Computer Interaction. Interaction Design and Usability (HCI 2007)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 4550)


Abstract

We describe the results of an empirical study comparing an interactive Information Visualization (InfoVis) technique called Gravi++ (GRAVI), Exploratory Data Analysis (EDA), and Machine Learning (ML). The application domain is the psychotherapeutic treatment of anorectic young women. The three techniques are intended to support therapists in identifying the variables that influence success or failure of the therapy.

To evaluate the utility of the three techniques, we developed, on the one hand, a report system that helped subjects formulate and document, in a self-directed manner, the insights they gained while using the three techniques. On the other hand, focus groups were held with the subjects. Combining these very different evaluation methods prevents jumping to false conclusions and enables a comprehensive assessment of the tested techniques.

The combined results indicate that the three techniques (EDA, ML, and GRAVI) are complementary and therefore should be used in conjunction.
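
As a purely illustrative aside, the sketch below shows one way a Machine Learning step of the kind named in the abstract could rank candidate variables by their influence on a binary success/failure outcome. It uses a decision-tree classifier on synthetic data; the variable names, sample size, and simulated outcome are assumptions made for illustration and are not taken from the study.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    # Synthetic questionnaire data for 120 hypothetical patients (all names illustrative).
    rng = np.random.default_rng(0)
    variables = ["self_esteem", "family_support", "body_image", "motivation"]
    X = rng.random((120, len(variables)))

    # Simulate a therapy outcome that depends mostly on two of the variables.
    y = (0.6 * X[:, 0] + 0.4 * X[:, 3] + 0.1 * rng.standard_normal(120)) > 0.5

    # Fit a shallow decision tree and rank the variables by impurity-based importance.
    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
    for name, score in sorted(zip(variables, tree.feature_importances_), key=lambda p: -p[1]):
        print(f"{name:15s} {score:.2f}")

A ranking of this kind only points at candidate variables; in line with the abstract's conclusion, such ML results are meant to be cross-checked against the insights gained with EDA and GRAVI rather than used in isolation.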




Editor information

Julie A. Jacko


Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Rester, M. et al. (2007). Mixing Evaluation Methods for Assessing the Utility of an Interactive InfoVis Technique. In: Jacko, J.A. (eds) Human-Computer Interaction. Interaction Design and Usability. HCI 2007. Lecture Notes in Computer Science, vol 4550. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-73105-4_67


  • DOI: https://doi.org/10.1007/978-3-540-73105-4_67

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-73104-7

  • Online ISBN: 978-3-540-73105-4

  • eBook Packages: Computer Science (R0)
