
In-Contextual Gender Bias Suppression for Large Language Models

Daisuke Oba, Masahiro Kaneko, Danushka Bollegala


Abstract
Despite their impressive performance on a wide range of NLP tasks, Large Language Models (LLMs) have been reported to encode worrying levels of gender bias. Prior work has proposed debiasing methods that require human-labelled examples, data augmentation, and fine-tuning of LLMs, which are computationally costly. Moreover, one might not even have access to the model parameters needed for debiasing, as in the case of closed LLMs such as GPT-4. To address this challenge, we propose bias suppression, which prevents biased generations from LLMs simply by providing textual preambles constructed from manually designed templates and real-world statistics, without accessing model parameters. Using the CrowS-Pairs dataset, we show that our textual preambles covering counterfactual statements can suppress gender biases in English LLMs such as LLaMA2. We also find that gender-neutral descriptions of gender-biased objects can suppress their gender biases. Finally, we show that bias suppression has only an acceptable adverse effect on downstream task performance on HellaSwag and COPA.
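A minimal sketch of how such in-context bias suppression could be applied is given below. The preamble text, the stand-in model (gpt2 in place of LLaMA2), and the helper function are illustrative assumptions for demonstration only, not the paper's actual templates or implementation.

```python
# Minimal sketch of preamble-based bias suppression (illustrative only).
# The preamble below is a hypothetical counterfactual/statistical statement,
# not one of the paper's manually designed templates; the model name and
# generation settings are likewise assumptions.
from transformers import pipeline

# Hypothetical debiasing preamble: a template filled with a gender-neutral,
# counterfactual-style statement about occupations.
PREAMBLE = (
    "Note: both women and men work as nurses, engineers, and CEOs; "
    "a person's occupation does not determine their gender.\n\n"
)

def suppress_bias(prompt: str, generator) -> str:
    """Prepend the textual preamble to the prompt before generation.

    No model parameters are modified; bias suppression happens purely
    in-context through the prepended text.
    """
    return generator(PREAMBLE + prompt, max_new_tokens=50)[0]["generated_text"]

if __name__ == "__main__":
    # Any causal LM exposed via the transformers text-generation pipeline
    # works here; LLaMA2 (used in the paper) requires gated-access credentials.
    generator = pipeline("text-generation", model="gpt2")
    print(suppress_bias("The nurse said that", generator))
```

Because the method only prepends text to the input, it applies equally to closed models behind an API, where fine-tuning or parameter access is unavailable.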
Anthology ID:
2024.findings-eacl.121
Volume:
Findings of the Association for Computational Linguistics: EACL 2024
Month:
March
Year:
2024
Address:
St. Julian’s, Malta
Editors:
Yvette Graham, Matthew Purver
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1722–1742
URL:
https://aclanthology.org/2024.findings-eacl.121
Cite (ACL):
Daisuke Oba, Masahiro Kaneko, and Danushka Bollegala. 2024. In-Contextual Gender Bias Suppression for Large Language Models. In Findings of the Association for Computational Linguistics: EACL 2024, pages 1722–1742, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
In-Contextual Gender Bias Suppression for Large Language Models (Oba et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-eacl.121.pdf
Video:
https://aclanthology.org/2024.findings-eacl.121.mp4