
Why does the robot only select men? How women and men perceive autonomous social robots that have a gender bias

  • Abstract: Future social robots will act autonomously in the world. Autonomous behavior is usually realized with AI models built from real-world data, which often reflect existing inequalities and prejudices in society. Even if designers do not intend it, there is a risk of developing robots that discriminate against certain users, e.g., based on gender. In this work, we investigate the implications of a gender-biased robot that disadvantages women, a bias in AI that is unfortunately often reported. Our experiment shows that both men and women perceive the gender-biased robot as unfair. However, our work indicates that women are more aware that a gender bias causes this unfairness. We also show that gender bias changes how the robot is perceived: while the bias resulted in lower likability and intelligence ratings by women, men seem to lose trust in the robot if it behaves unfairly.

Metadata
Author: Sebastian Thomas Büttner (ORCiD), Maral Goudarzi (ORCiD), Michael Prilla (ORCiD)
DOI: https://doi.org/10.1145/3670653.3677492
ISBN: 979-8-4007-0998-2
Parent Title (English): MuC '24: Proceedings of Mensch und Computer 2024, Karlsruhe, Germany, September 1-4, 2024
Publisher: ACM Digital Library
Document Type: Conference Proceeding
Language: English
Date of Publication (online): 2024/09/01
Date of First Publication: 2024/09/01
Publishing Institution: Westfälische Hochschule Gelsenkirchen Bocholt Recklinghausen
Release Date: 2025/06/20
Tags: Empirical studies in HCI; Experiment; Fairness; Gender Bias; Human-Robot Interaction; Human-centered computing; Social Robot
Number of Pages: 6
First Page: 479
Last Page: 484
Licence (German): Es gilt das Urheberrechtsgesetz (German copyright law applies)
