DOI: 10.1145/2843043.2843062

Automatically assessed electronic exams in programming courses

Published: 01 February 2016

Abstract

Educational technology is nowadays frequently utilized in programming courses. Still, final exams are mostly conducted using the traditional pen-and-paper approach. In this paper, we present the adoption of automatically assessed electronic exams in two programming courses: an introductory programming course taught in Java and an advanced course on object-oriented programming. Electronic exams offer several potential benefits for students, including, for example, the possibility to compile, test, and debug program code during the exam. To study the adoption of electronic exams, we observed two instances of the courses described above. Individual scores, submission counts, and the time spent on each task were analyzed. These data enabled us to classify the exam exercises according to their difficulty level, information that can be used to design future exams that measure students' knowledge and skills adequately. The analyzed data and the student feedback seem to confirm that electronic exams are an excellent tool for evaluating students in programming courses, and they can be recommended to other educators as well.
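
As an illustration of the difficulty classification the abstract mentions, the Python sketch below buckets exam tasks into rough difficulty levels from per-student scores, submission counts, and solving times. The paper's actual procedure is not given on this page, so the TaskStats fields and the threshold values are purely hypothetical assumptions, not the authors' method.

# Minimal illustrative sketch, NOT the authors' method: field names and
# thresholds below are assumptions chosen for demonstration only.
from dataclasses import dataclass
from statistics import mean

@dataclass
class TaskStats:
    """Aggregated per-exercise exam data of the kind the abstract describes."""
    name: str
    scores: list[float]      # each student's score as a fraction of max points
    submissions: list[int]   # number of submissions per student
    minutes: list[float]     # time spent on the task per student

def classify(task: TaskStats) -> str:
    """Bucket an exam exercise into a rough difficulty level.

    Heuristic: a low average score, many submissions, or a long solving
    time all hint at a harder task. Cutoff values are arbitrary examples.
    """
    avg_score = mean(task.scores)
    avg_subs = mean(task.submissions)
    avg_time = mean(task.minutes)

    if avg_score >= 0.8 and avg_subs <= 3 and avg_time <= 20:
        return "easy"
    if avg_score < 0.5 or avg_time > 45:
        return "hard"
    return "medium"

if __name__ == "__main__":
    demo = TaskStats("string reversal", [0.9, 1.0, 0.7], [2, 1, 4], [12.0, 8.0, 20.0])
    print(demo.name, "->", classify(demo))   # prints: string reversal -> easy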

Published In

ACSW '16: Proceedings of the Australasian Computer Science Week Multiconference
February 2016
654 pages
ISBN: 9781450340427
DOI: 10.1145/2843043

Publisher

Association for Computing Machinery, New York, NY, United States

Author Tags

  1. automatic assessment
  2. electronic exams
  3. exercise difficulty
  4. programming

Qualifiers

  • Research-article

Conference

ACSW '16: Australasian Computer Science Week
February 1-5, 2016
Canberra, Australia

Acceptance Rates

ACSW '16 paper acceptance rate: 77 of 172 submissions (45%)
Overall acceptance rate: 204 of 424 submissions (48%)

Cited By

  • (2024) Transforming Computer-Based Exams with BYOD: An Empirical Study. Proceedings of the 24th Koli Calling International Conference on Computing Education Research, 1-11. DOI: 10.1145/3699538.3699560. Online publication date: 12-Nov-2024.
  • (2023) Computer Aided Design and Grading for an Electronic Functional Programming Exam. Electronic Proceedings in Theoretical Computer Science, 382, 22-44. DOI: 10.4204/EPTCS.382.2. Online publication date: 14-Aug-2023.
  • (2023) Pseudocode vs. Compile-and-Run Prompts: Comparing Measures of Student Programming Ability in CS1 and CS2. Proceedings of the 2023 Conference on Innovation and Technology in Computer Science Education V. 1, 519-525. DOI: 10.1145/3587102.3588834. Online publication date: 29-Jun-2023.
  • (2022) A Systematic Review of Deep Learning Based Online Exam Proctoring Systems for Abnormal Student Behaviour Detection. International Journal of Scientific Research in Science, Engineering and Technology, 192-209. DOI: 10.32628/IJSRSET229428. Online publication date: 5-Jul-2022.
  • (2022) Parsons Problems and Beyond. Proceedings of the 2022 Working Group Reports on Innovation and Technology in Computer Science Education, 191-234. DOI: 10.1145/3571785.3574127. Online publication date: 27-Dec-2022.
  • (2022) Lessons Learned from Asynchronous Online Assessment Formats in CS0 and CS3. Proceedings of the 53rd ACM Technical Symposium on Computer Science Education - Volume 1, 640-646. DOI: 10.1145/3478431.3499386. Online publication date: 22-Feb-2022.
  • (2021) A Systematic Review of Online Exams Solutions in E-Learning: Techniques, Tools, and Global Adoption. IEEE Access, 9, 32689-32712. DOI: 10.1109/ACCESS.2021.3060192. Online publication date: 2021.
  • (2020) Securing Bring-Your-Own-Device (BYOD) Programming Exams. Proceedings of the 51st ACM Technical Symposium on Computer Science Education, 880-886. DOI: 10.1145/3328778.3366907. Online publication date: 26-Feb-2020.
  • (2020) Measuring the Score Advantage on Asynchronous Exams in an Undergraduate CS Course. Proceedings of the 51st ACM Technical Symposium on Computer Science Education, 873-879. DOI: 10.1145/3328778.3366859. Online publication date: 26-Feb-2020.
  • (2019) Experiences from Digital Learning Analytics in Finland and Sweden: A Collaborative Approach. 2019 42nd International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO), 627-632. DOI: 10.23919/MIPRO.2019.8757204. Online publication date: May-2019.
