DOI: 10.1145/2414536.2414557

Application of domain specific heuristics to an innovative computer based assessment strategy

Published: 26 November 2012

Abstract

Students undertaking summative and formative assessment are acutely aware of the importance of the outcome and often find themselves in a stressful situation demanding a high level of concentration. Usability testing of assessment tool interfaces is hindered by the difficulty of sufficiently replicating the exam environment: evaluators must either disrupt, watch or monitor a student while they are undertaking an exam, or rely on students' memories to reproduce their concerns at a later stage. This research demonstrates how a set of heuristics was adapted and redefined to enable an iterative approach to improving a computer-aided online assessment tool. The revised set of heuristics offers future developers a tool to assist in the development of online assessment interfaces.



    Published In

    OzCHI '12: Proceedings of the 24th Australian Computer-Human Interaction Conference
    November 2012
    692 pages

    Sponsors

    • New Zealand Chapter of ACM SIGCHI
    • Human Factors & Ergonomics Society

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. MCQ
    2. MCQCM
    3. design
    4. experimentation
    5. heuristics
    6. online assessment
    7. reliability

    Qualifiers

    • Research-article

    Conference

    OzCHI '12
    Sponsor:
    • Human Factors & Ergonomics Soc

    Acceptance Rates

    Overall Acceptance Rate 362 of 729 submissions, 50%

    Article Metrics

    • Total Citations: 0
    • Total Downloads: 77
    • Downloads (last 12 months): 0
    • Downloads (last 6 weeks): 0
    Reflects downloads up to 12 Jan 2025
