Research Article | Public Access

An Improved Grade Point Average, With Applications to CS Undergraduate Education Analytics

Published: 13 September 2018

Abstract

We present a methodological improvement for calculating Grade Point Averages (GPAs). Heterogeneity in grading between courses systematically biases observed GPAs for individual students: the GPA observed depends on course selection. We show how a logistic model can account for course selection by simulating how every student in a sample would perform if they took all available courses, giving a new “modeled GPA.” We then use 10 years of grade data from a large university to demonstrate that this modeled GPA is a more accurate predictor of student performance in individual courses than the observed GPA. Using Computer Science (CS) as an example learning analytics application, we find that required CS courses give significantly lower grades than the average course. This depresses the recorded GPAs of CS majors: modeled GPAs are 0.25 points higher than observed GPAs. The modeled GPA also correlates much more closely with standardized test scores than the observed GPA: the correlation with Math ACT is 0.37 for the modeled GPA versus 0.20 for the observed GPA. This implies that standardized test scores are much better predictors of student performance than might otherwise be assumed.
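
To make the modeling idea concrete, the sketch below fits a toy multinomial logistic model of grade outcomes on one-hot student and course indicators, predicts an expected grade for every student in every course, and averages over all courses to obtain a "modeled GPA" that is not biased by course selection. This is an illustrative reconstruction on synthetic data, not the authors' implementation: the data generator, the grade cut points, and names such as `ability`, `harshness`, and `modeled_gpa` are assumptions made only for the example.

```python
# Illustrative sketch (assumption): a multinomial logistic model of grades on
# one-hot student and course indicators, used to estimate how each student
# would perform if they took *every* course -- the "modeled GPA" idea.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import OneHotEncoder

rng = np.random.default_rng(0)
n_students, n_courses = 200, 30
grade_points = np.array([0.0, 1.0, 2.0, 3.0, 4.0])

# Hypothetical data generator: latent student ability and course harshness.
ability = rng.normal(0.0, 1.0, n_students)
harshness = rng.normal(0.0, 0.5, n_courses)

records = []
for s in range(n_students):
    for c in rng.choice(n_courses, size=10, replace=False):  # 10 courses per student
        latent = ability[s] - harshness[c] + rng.normal(0.0, 0.5)
        records.append((s, c, grade_points[np.digitize(latent, [-1.5, -0.5, 0.5, 1.5])]))

student_id = np.array([r[0] for r in records]).reshape(-1, 1)
course_id = np.array([r[1] for r in records]).reshape(-1, 1)
grade = np.array([r[2] for r in records])

# Predictors are student and course identity only; the fitted coefficients
# separate "how strong is this student" from "how harshly does this course grade".
enc = OneHotEncoder(categories=[np.arange(n_students), np.arange(n_courses)])
X = enc.fit_transform(np.hstack([student_id, course_id]))
model = LogisticRegression(max_iter=2000).fit(X, grade)

# Modeled GPA: expected grade for every (student, course) pair, averaged over
# all courses, so course selection no longer biases the comparison.
all_pairs = np.array([(s, c) for s in range(n_students) for c in range(n_courses)])
expected = model.predict_proba(enc.transform(all_pairs)) @ model.classes_
modeled_gpa = expected.reshape(n_students, n_courses).mean(axis=1)

# Observed GPA uses only the courses each student actually took.
observed_gpa = np.array([grade[student_id.ravel() == s].mean() for s in range(n_students)])
print(f"mean observed GPA {observed_gpa.mean():.2f}, mean modeled GPA {modeled_gpa.mean():.2f}")
```

The paper applies the same idea to ten years of real transcript data; the sketch only shows the mechanics of predicting over the full student-by-course grid and averaging.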

Published In

ACM Transactions on Computing Education, Volume 18, Issue 4
Special Issue on Learning Analytics and Regular Papers
December 2018, 118 pages
EISSN: 1946-6226
DOI: 10.1145/3278118
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 13 September 2018
Accepted: 01 October 2017
Revised: 01 April 2017
Received: 01 September 2016
Published in TOCE Volume 18, Issue 4

Author Tags

  1. GPA
  2. learning analytics
  3. gender disparity
  4. women in computing

Qualifiers

  • Research-article
  • Research
  • Refereed

Cited By

  • (2023) Analysis of the Factors Affecting Student Performance Using a Neuro-Fuzzy Approach. Education Sciences 13, 3 (313). DOI: 10.3390/educsci13030313. Online publication date: 17-Mar-2023.
  • (2023) Enhancing English Proficiency Test Evaluation: Leveraging Artificial Intelligence for Result Classification. 2023 10th International Conference on Soft Computing & Machine Intelligence (ISCMI), 183-187. DOI: 10.1109/ISCMI59957.2023.10458530. Online publication date: 25-Nov-2023.
  • (2023) Genetic Algorithm-Based Approach for Predicting Student Academic Success. 2023 24th International Arab Conference on Information Technology (ACIT), 1-5. DOI: 10.1109/ACIT58888.2023.10453789. Online publication date: 6-Dec-2023.
  • (2023) A neuro-fuzzy model for predicting and analyzing student graduation performance in computing programs. Education and Information Technologies 28, 3 (2455-2484). DOI: 10.1007/s10639-022-11205-2. Online publication date: 1-Mar-2023.
  • (2022) Development and Use of Domain-specific Learning Theories, Models, and Instruments in Computing Education. ACM Transactions on Computing Education 23, 1 (1-48). DOI: 10.1145/3530221. Online publication date: 29-Dec-2022.
  • (2021) Particle Swarm Model for Predicting Student Performance in Computing Programs. Intelligent Systems and Applications, 84-96. DOI: 10.1007/978-3-030-82196-8_7. Online publication date: 3-Aug-2021.
  • (2020) Using Spatio-Algorithmic Problem Solving Strategies to Increase Access to Data Structures. Proceedings of the 2020 ACM Conference on Innovation and Technology in Computer Science Education, 577-578. DOI: 10.1145/3341525.3394004. Online publication date: 15-Jun-2020.
  • (2019) Learning Analytics in Mathematics: A Systematic Review. International Journal of Academic Research in Progressive Education and Development 8, 4. DOI: 10.6007/IJARPED/v8-i4/6563. Online publication date: 4-Dec-2019.
