Research article · Open access
DOI: 10.1145/3531146.3533129

Don’t let Ricci v. DeStefano Hold You Back: A Bias-Aware Legal Solution to the Hiring Paradox

Published: 20 June 2022

Abstract

Companies that try to address inequality in employment face a hiring paradox. Failing to address workforce imbalance can result in legal sanctions and scrutiny, but proactive measures to address these issues might result in the same legal conflict. Recent run-ins of Microsoft and Wells Fargo with the Labor Department’s Office of Federal Contract Compliance Programs (OFCCP) are not isolated and are likely to persist. To add to the confusion, existing scholarship on Ricci v. DeStefano often deems solutions to this paradox impossible. Circumventive practices such as the 4/5ths rule further illustrate tensions between too little action and too much action.
In this work, we give a powerful way to solve this hiring paradox that tracks both legal and algorithmic challenges. We unpack the nuances of Ricci v. DeStefano and extend the legal literature arguing that certain algorithmic approaches to employment are allowed by introducing the legal practice of banding to evaluate candidates. We thus show that a bias-aware technique can be used to diagnose and mitigate “built-in” headwinds in the employment pipeline. We use the machinery of partially ordered sets to handle the presence of uncertainty in evaluations data. This approach allows us to move away from treating “people as numbers” to treating people as individuals—a property that is sought after by Title VII in the context of employment.


Cited By

• (2024) Wise Fusion: Group Fairness Enhanced Rank Fusion. Proceedings of the 33rd ACM International Conference on Information and Knowledge Management, 163–174. DOI: 10.1145/3627673.3679649. Online publication date: 21-Oct-2024.
• (2023) Equality, Equity, and Algorithms: Learning from Justice Rosalie Abella. University of Toronto Law Journal 73, Supplement 2, 163–178. DOI: 10.3138/utlj-2023-0064. Online publication date: 1-Sep-2023.
• (2023) Fair&Share: Fast and Fair Multi-Criteria Selections. Proceedings of the 32nd ACM International Conference on Information and Knowledge Management, 152–162. DOI: 10.1145/3583780.3614874. Online publication date: 21-Oct-2023.
• (2023) Phase Transitions of Diversity in Stochastic Block Model Dynamics. 2023 59th Annual Allerton Conference on Communication, Control, and Computing (Allerton), 1–8. DOI: 10.1109/Allerton58177.2023.10313364. Online publication date: 26-Sep-2023.


Published In

FAccT '22: Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency
June 2022, 2351 pages
ISBN: 9781450393522
DOI: 10.1145/3531146
This work is licensed under a Creative Commons Attribution 4.0 International License.

Publisher

Association for Computing Machinery, New York, NY, United States


        Author Tags

        1. anti-discrimination laws
        2. bias
        3. hiring
        4. resume screening
        5. uncertainty

        Qualifiers

        • Research-article
        • Research
        • Refereed limited

        Conference

        FAccT '22

Article Metrics

• Downloads (last 12 months): 1,653
• Downloads (last 6 weeks): 1,292

Reflects downloads up to 16 Jan 2025

