DOI: 10.1145/3543758.3544213
Research article · Open access

“Es geht um Respekt, nicht um Technologie”: Erkenntnisse aus einem Interessensgruppen-übergreifenden Workshop zu genderfairer Sprache und Sprachtechnologie [“It’s About Respect, Not Technology”: Insights from a Cross-Stakeholder Workshop on Gender-Fair Language and Language Technology]

Published: 15 September 2022

Abstract

English: With the increasing attention non-binary people receive in Western societies, strategies of gender-fair language have started to move away from binary (female/male only) concepts of gender. Nevertheless, hardly any approaches exist so far that take these identities into account in machine translation models. A lack of understanding of the socio-technical implications of such technologies risks further reproducing linguistic mechanisms of oppression and mislabelling. In this paper, we describe the methods and results of a workshop on gender-fair language and language technologies, which was organised and led by ten researchers from TU Wien, St. Pölten UAS, FH Campus Wien and the University of Vienna and took place in Vienna in autumn 2021. A wide range of interest groups and their representatives were invited to ensure that the topic could be dealt with holistically. Accordingly, we aimed to include translators, machine translation experts and non-binary individuals (as “community experts”) on an equal footing. Our analysis shows that gender in machine translation requires a high degree of context sensitivity, that developers of such technologies need to position themselves cautiously in a process still under social negotiation, and that flexible approaches seem most adequate at present. We then outline the steps that follow from our results for the field of gender-fair language technologies, so that technological developments can adequately keep pace with social advancements.
German: As non-binary people gain broader recognition across society, concepts of gender-fair language have in recent years moved away from the previously used binary (female/male) framing. Nevertheless, there are so far only few approaches to representing these identities in machine translation. A missing understanding of the varied socio-technical implications of such technologies carries the risk of reproducing incorrect forms of address and designation as well as linguistic mechanisms of oppression. In this paper, we describe the methods and results of a workshop on gender-fair language in technological contexts, held in Vienna in autumn 2021. Ten researchers from TU Wien, FH St. Pölten, FH Campus Wien and the University of Vienna organised and led the workshop. A broad range of interest groups and their representatives were invited in order to ensure that the topic could be treated holistically. Accordingly, we set out to involve machine translation developers, translators, and non-binary individuals (as “lived-experience experts”) on an equal footing. Our analysis shows that gender in machine translation demands a decidedly context-sensitive approach, that the development of language technologies must position itself cautiously within a societal process still under negotiation, and that flexible approaches currently appear most adequate. We outline the next steps required in the field of gender-fair technologies so that technical developments can keep pace with social ones.
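
To make the context-sensitivity claim above concrete (an illustrative sketch, not an example from the paper): German, unlike English, forces a grammatical-gender choice for many role nouns, so a system translating a single sentence must either guess a binary form or expose the alternatives, including gender-fair ones. The noun forms and strategy labels below are ordinary German variants used purely for illustration; no real MT API is involved.

```python
# Illustrative sketch: why translating an English role noun into German
# underdetermines gender. All data here are hypothetical toy examples.

CANDIDATES: dict[str, dict[str, str]] = {
    "doctor": {
        "masculine": "Arzt",
        "feminine": "Ärztin",
        "gender star (gender-fair)": "Ärzt*in",
        "colon form (gender-fair)": "Ärzt:in",
    }
}

def german_renderings(english_noun: str) -> dict[str, str]:
    """Possible German renderings of an English role noun.

    A source sentence like "I am a doctor" gives a sentence-level
    translator no basis for choosing among these; the referent's
    gender is only recoverable from wider context or from the user.
    """
    return CANDIDATES.get(english_noun, {})

if __name__ == "__main__":
    for strategy, form in german_renderings("doctor").items():
        print(f"{strategy:25} -> Ich bin {form}.")
```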


Cited By

(2024) From Participation to Solidarity: A Case Study on Access of Maker Spaces from Deaf and Hearing Perspectives. In Proceedings of Mensch und Computer 2024, 140–155. https://doi.org/10.1145/3670653.3670670. Online publication date: 1 September 2024.

Index Terms

  1. “Es geht um Respekt, nicht um Technologie”: Erkenntnisse aus einem Interessensgruppen-übergreifenden Workshop zu genderfairer Sprache und Sprachtechnologie

        Comments

        Please enable JavaScript to view thecomments powered by Disqus.

Information

Published In

MuC '22: Proceedings of Mensch und Computer 2022
September 2022
624 pages
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

1. Gender
2. Language technology
3. Automated translation
4. Gender-fair language
5. Non-binary
6. Participatory research

Qualifiers

• Research-article
• Research
• Refereed limited

Funding Sources

• Center for Technology and Society (CTS)
• Austrian Science Fund (FWF)

Conference

MuC '22: Mensch und Computer 2022
September 4–7, 2022
Darmstadt, Germany
