DOI: 10.1145/2462476.2465586

Assessment of programming: pedagogical foundations of exams

Published: 01 July 2013

Abstract

Previous studies of assessment of programming via written examination have focused on analysis of the examination papers and the questions they contain. This paper reports the results of a study that investigated how these final exam papers are developed, how students are prepared for these exams, and what pedagogical foundations underlie the exams. The study involved interviews of 11 programming lecturers. From our analysis of the interviews, we find that most exams are based on existing formulas that are believed to work; that the lecturers tend to trust in the validity of their exams for summative assessment; and that while there is variation in the approaches taken to writing the exams, all of the exam writers take a fairly standard approach to preparing their students to sit the exam. We found little evidence of explicit references to learning theories or models, indicating that the process is based largely on intuition and experience.




Published In

ITiCSE '13: Proceedings of the 18th ACM Conference on Innovation and Technology in Computer Science Education
July 2013, 384 pages
ISBN: 9781450320788
DOI: 10.1145/2462476
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. CS1
  2. assessment
  3. examination papers
  4. programming pedagogy

Qualifiers

  • Research-article

Conference

ITiCSE '13

Acceptance Rates

ITiCSE '13 Paper Acceptance Rate: 51 of 161 submissions, 32%
Overall Acceptance Rate: 552 of 1,613 submissions, 34%



Article Metrics

  • Downloads (last 12 months): 27
  • Downloads (last 6 weeks): 2
Reflects downloads up to 13 Dec 2024


Cited By

  • (2024) "A Feasibility Study on Automated SQL Exercise Generation with ChatGPT-3.5", Proceedings of the 3rd International Workshop on Data Systems Education: Bridging education practice with education research, pp. 13-19. DOI: 10.1145/3663649.3664368. Published online 9 June 2024.
  • (2024) "A Proposal for a Standard Evaluation Method for Assessing Programming Proficiency in Assembly Language", Intelligent Sustainable Systems, pp. 183-192. DOI: 10.1007/978-981-99-8031-4_17. Published online 23 February 2024.
  • (2024) "A Design of Experiments Approach for Assessing Programming Performance in Assembly Language", ICT for Intelligent Systems, pp. 253-261. DOI: 10.1007/978-981-97-5799-2_23. Published online 27 September 2024.
  • (2022) "A Qualitative Study of Experienced Course Coordinators' Perspectives on Assessment in Introductory Programming Courses for Non-CS Majors", ACM Transactions on Computing Education, 22(4), pp. 1-29. DOI: 10.1145/3517134. Published online 15 September 2022.
  • (2022) "Predicting Student Success in CS2", Proceedings of the 53rd ACM Technical Symposium on Computer Science Education - Volume 1, pp. 140-146. DOI: 10.1145/3478431.3499276. Published online 22 February 2022.
  • (2021) "A systematic review of flipped classroom and collaborative learning supported by artificial intelligence for learning programming" (in Spanish), Tecnura, 25(69), pp. 196-214. DOI: 10.14483/22487638.16934. Published online 1 July 2021.
  • (2020) "Choosing Code Segments to Exclude from Code Similarity Detection", Proceedings of the Working Group Reports on Innovation and Technology in Computer Science Education, pp. 1-19. DOI: 10.1145/3437800.3439201. Published online 17 June 2020.
  • (2020) "Creative Assessment in Programming", Proceedings of the 4th Conference on Computing Education Practice, pp. 1-4. DOI: 10.1145/3372356.3372369. Published online 9 January 2020.
  • (2020) "Securing Bring-Your-Own-Device (BYOD) Programming Exams", Proceedings of the 51st ACM Technical Symposium on Computer Science Education, pp. 880-886. DOI: 10.1145/3328778.3366907. Published online 26 February 2020.
  • (2020) "Software Development Processes Designed for First Year Computing Undergraduates", Encyclopedia of Education and Information Technologies, pp. 1564-1576. DOI: 10.1007/978-3-030-10576-1_180. Published online 14 June 2020.
