DOI: 10.1145/3501385.3543979
Research article · Open access

A Pair of ACES: An Analysis of Isomorphic Questions on an Elementary Computing Assessment

Published: 03 August 2022

Abstract

Background and Context. With increasing efforts to bring computing education opportunities into elementary schools, there is a growing need for assessments, with arguments for validity, to support research evaluation at these grade levels. After successfully piloting a 10-question computational thinking assessment (Assessment of Computing for Elementary Students – ACES) for 4th graders in Spring 2020, we used our analyses of item difficulty and discrimination to iterate on the assessment.

Objectives. To increase the number of potential items for ACES, we created isomorphic versions of existing questions. The nature of the changes varied from incidental changes that we did not believe would impact student performance to more radical changes that seemed likely to influence question difficulty. We sought to understand the impact of these changes on student performance.

Method. Using these isomorphic questions, we created two versions of our assessment and piloted them in Spring 2021 with 235 upper-elementary (4th grade) students. We analyzed the reliability of the assessments using Cronbach’s alpha. We used Chi-squared tests to analyze questions that were identical across the two assessments to form a baseline of comparison and then ran Chi-squared and Kruskal-Wallis H tests to analyze the differences between the isomorphic copies of the questions.

Findings. Both assessment versions demonstrated good reliability, with identical Cronbach’s alphas of 0.868. We found statistically similar performance on the identical questions between our two groups of students, allowing us to compare their performance on the isomorphic questions. Students performed differently on the isomorphic questions, indicating the changes to the questions had a differential impact on student performance.

Implications. This paper builds on existing work by presenting methods for creating isomorphic questions. We provide valuable lessons learned, both on those methods and on the impact of specific types of changes on student performance.
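
As a concrete illustration of the analyses named above, the sketch below shows how Cronbach’s alpha, a Chi-squared test on a single question’s correct/incorrect counts, and a Kruskal-Wallis H test on total-score distributions might be computed with NumPy and SciPy. This is a minimal sketch, not the authors’ code: all scores, counts, and group sizes in it are hypothetical stand-ins for the paper’s data.

```python
import numpy as np
from scipy.stats import chi2_contingency, kruskal

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_students, n_items) matrix of item scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                           # number of items
    item_vars = scores.var(axis=0, ddof=1)        # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of students' totals
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
version_a = rng.integers(0, 2, size=(118, 10))    # hypothetical 0/1 item scores
version_b = rng.integers(0, 2, size=(117, 10))    # for two assessment versions
print(f"alpha (A) = {cronbach_alpha(version_a):.3f}")

# Chi-squared test on one question: correct/incorrect counts by version
# (counts invented for illustration only).
table = np.array([[80, 38],     # Version A: correct, incorrect
                  [61, 56]])    # Version B: correct, incorrect
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")

# Kruskal-Wallis H test comparing total-score distributions across versions.
H, p_kw = kruskal(version_a.sum(axis=1), version_b.sum(axis=1))
print(f"H = {H:.2f}, p = {p_kw:.3f}")
```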

Information

Published In

ICER '22: Proceedings of the 2022 ACM Conference on International Computing Education Research - Volume 1
August 2022
372 pages
ISBN:9781450391948
DOI:10.1145/3501385
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 03 August 2022

Author Tags

  1. assessment
  2. computational thinking
  3. elementary education

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

ICER 2022: ACM Conference on International Computing Education Research
August 7-11, 2022
Lugano and Virtual Event, Switzerland

Acceptance Rates

Overall Acceptance Rate 189 of 803 submissions, 24%

Upcoming Conference

ICER 2025
ACM Conference on International Computing Education Research
August 3 - 6, 2025
Charlottesville, VA, USA

Article Metrics

  • Downloads (Last 12 months): 275
  • Downloads (Last 6 weeks): 22
Reflects downloads up to 05 Jan 2025

Cited By
  • (2024) Prevalence of Programming Misconceptions in Primary School Students. Proceedings of the 24th Koli Calling International Conference on Computing Education Research, 1-11. https://doi.org/10.1145/3699538.3699568. Online publication date: 12-Nov-2024.
  • (2024) Quickly Producing "Isomorphic" Exercises: Quantifying the Impact of Programming Question Permutations. Proceedings of the 2024 on Innovation and Technology in Computer Science Education V. 1, 178-184. https://doi.org/10.1145/3649217.3653617. Online publication date: 3-Jul-2024.
  • (2024) Evaluating LLM-generated Worked Examples in an Introductory Programming Course. Proceedings of the 26th Australasian Computing Education Conference, 77-86. https://doi.org/10.1145/3636243.3636252. Online publication date: 29-Jan-2024.
  • (2023) How are primary school computer science curricular reforms contributing to equity? Impact on student learning, perception of the discipline, and gender gaps. International Journal of STEM Education, 10:1. https://doi.org/10.1186/s40594-023-00438-3. Online publication date: 10-Oct-2023.
  • (2023) Generating Multiple Choice Questions for Computing Courses Using Large Language Models. 2023 IEEE Frontiers in Education Conference (FIE), 1-8. https://doi.org/10.1109/FIE58773.2023.10342898. Online publication date: 18-Oct-2023.
