A little bird told me your gender: Gender inferences in social media

Published: 01 May 2021

Highlights

Inferential analytics may reinforce existing biases such as gender stereotyping.
In an N = 109 pilot sample, 19% of Twitter users were misgendered (see the sketch following these highlights).
Gender classifiers are binary and do not account for gender categories other than male/female.
Members of the LGBTQAI+ community and straight women may be misgendered more often than straight men on social media.
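The 19% figure above comes from a small pilot sample, so its statistical precision is worth making explicit. The following is a minimal back-of-the-envelope sketch, not taken from the paper: it assumes 21 of the 109 respondents (approximately 19%) were misgendered and computes a Wilson 95% confidence interval for that proportion.

```python
# Hedged sketch, not from the paper: assumes 21 of 109 pilot respondents
# (~19%) were misgendered; computes a Wilson 95% confidence interval.
import math

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score interval for a binomial proportion."""
    p_hat = successes / n
    denom = 1 + z ** 2 / n
    center = (p_hat + z ** 2 / (2 * n)) / denom
    half_width = z * math.sqrt(p_hat * (1 - p_hat) / n + z ** 2 / (4 * n ** 2)) / denom
    return center - half_width, center + half_width

low, high = wilson_ci(21, 109)
print(f"misgendered: 21/109 (~19%), 95% CI ~ [{low:.1%}, {high:.1%}]")
# -> roughly [13.0%, 27.7%]: wide enough that the pilot rate is indicative
#    rather than precise, consistent with the study's framing as a pilot.
```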

Abstract

Online and social media platforms employ automated recognition methods to infer user preferences, opinions, and sensitive attributes such as race, gender, and sexual orientation. These opaque methods can predict behaviors for marketing purposes and influence behavior for profit, serving the attention economy but also reinforcing existing biases such as gender stereotyping. Although two international human rights treaties include explicit obligations relating to harmful and wrongful stereotyping, these stereotypes persist online and offline. By identifying how inferential analytics may reinforce gender stereotyping and affect marginalized communities, opportunities for addressing these concerns, and thereby increasing privacy, diversity, and inclusion online, can be explored. This is important because misgendering reinforces gender stereotypes, accentuates gender binarism, undermines privacy and autonomy, and may cause feelings of rejection, impacting people's self-esteem, confidence, and authenticity. In turn, this may increase social stigmatization. This study brings into view concerns about discrimination and the exacerbation of existing biases that online platforms continue to replicate and that the literature is beginning to highlight. The implications of misgendering on Twitter are investigated to illustrate the impact of algorithmic bias on inadvertent privacy violations and the reinforcement of social prejudices about gender, through a multidisciplinary perspective that includes legal, computer science, and critical feminist media-studies viewpoints. An online pilot survey was conducted to assess how accurately Twitter infers its users' gender identities. This served as a basis for exploring the implications of this social media practice.
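To make concrete the class of systems the abstract critiques, the sketch below shows a toy text-based gender classifier of the kind surveyed in the gender-classification literature (an n-gram logistic-regression model over tweet text). The tweets, labels, and scikit-learn pipeline are illustrative assumptions, not the paper's method or Twitter's actual system; note how the hard binary label set builds in exactly the male/female reductionism the highlights flag.

```python
# Illustrative sketch only: a bag-of-n-grams logistic regression of the
# kind the paper critiques. The tweets and labels below are hypothetical,
# not data from the study, and the binary label set itself embodies the
# male/female reductionism the article identifies as a problem.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

tweets = [
    "loved the game last night, what a finish",
    "new nail art tutorial is up on my channel",
    "debugging this kernel driver all weekend",
    "brunch with the girls, then yoga",
]
labels = ["m", "f", "m", "f"]  # hypothetical binary labels

clf = make_pipeline(
    CountVectorizer(ngram_range=(1, 2)),  # unigram + bigram features
    LogisticRegression(max_iter=1000),
)
clf.fit(tweets, labels)

# The model now "infers" a gender for any new tweet, with no option to
# abstain and no category outside m/f -- and it generalizes from
# stereotyped word associations in the training data.
print(clf.predict(["spent the evening reading about algorithmic bias"]))
```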




Published In

Information Processing and Management: an International Journal, Volume 58, Issue 3
May 2021
1030 pages

Publisher

Pergamon Press, Inc.

United States

Publication History

Published: 01 May 2021

Author Tags

  1. Gender
  2. Twitter
  3. Social media
  4. Inference
  5. Gender classifier
  6. Automated gender recognition system
  7. Privacy
  8. Algorithmic bias
  9. Discrimination
  10. LGBTQAI+
  11. Gender stereotyping
  12. Online Behavioral Advertising

Qualifiers

  • Research-article

Article Metrics

  • Downloads (Last 12 months): 0
  • Downloads (Last 6 weeks): 0
Reflects downloads up to 19 Dec 2024

Citations

Cited By

  • (2024) Shielding or Silencing? An Investigation into Content Moderation during the Sheikh Jarrah Crisis. Proceedings of the ACM on Human-Computer Interaction, 8(GROUP), 1–21. DOI: 10.1145/3633071. Online publication date: 21-Feb-2024.
  • (2024) From words to gender. Information Processing and Management: an International Journal, 61(3). DOI: 10.1016/j.ipm.2024.103647. Online publication date: 2-Jul-2024.
  • (2023) Inclusive HRI II. Companion of the 2023 ACM/IEEE International Conference on Human-Robot Interaction, 956–958. DOI: 10.1145/3568294.3579965. Online publication date: 13-Mar-2023.
  • (2023) Feminist Human-Robot Interaction. Proceedings of the 2023 ACM/IEEE International Conference on Human-Robot Interaction, 72–82. DOI: 10.1145/3568162.3576973. Online publication date: 13-Mar-2023.
  • (2023) Fair and equitable AI in biomedical research and healthcare. Artificial Intelligence in Medicine, 144(C). DOI: 10.1016/j.artmed.2023.102658. Online publication date: 1-Oct-2023.
  • (2022) Inclusive HRI. Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction, 1247–1249. DOI: 10.5555/3523760.3523994. Online publication date: 7-Mar-2022.
  • (2022) Slanted Speculations. Proceedings of the 2022 ACM Designing Interactive Systems Conference, 85–99. DOI: 10.1145/3532106.3533449. Online publication date: 13-Jun-2022.
  • (2022) Demographic-Reliant Algorithmic Fairness: Characterizing the Risks of Demographic Data Collection in the Pursuit of Fairness. Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency, 1709–1721. DOI: 10.1145/3531146.3533226. Online publication date: 21-Jun-2022.
  • (2022) The Case for Establishing a Collective Perspective to Address the Harms of Platform Personalization. Proceedings of the 2022 Symposium on Computer Science and Law, 119–130. DOI: 10.1145/3511265.3550450. Online publication date: 1-Nov-2022.
  • (2022) No Humans Here. Proceedings of the ACM on Human-Computer Interaction, 6(GROUP), 1–13. DOI: 10.1145/3492857. Online publication date: 14-Jan-2022.
