Out of dataset, out of algorithm, out of mind: a critical evaluation of AI bias against disabled people

AI and Society: 1–11 (forthcoming)

Abstract

Generative AI models are shaping our future. In this work, we discover and expose the bias against physically challenged people in generative models. Generative models (Stable Diffusion XL and DALL·E 3) are unable to generate content related to the physically challenged, e.g., an inclusive washroom, even with very detailed prompts. Our analysis, built on a novel strategy for automatically discovering bias against underrepresented groups such as the physically challenged, reveals that this disability bias emanates from biased AI datasets. Finally, we track the root of this disability bias to search engines (Google, Bing, Yandex, and DuckDuckGo). Search engines exhibit disability bias for neutral prompts, and the standard strategy of using synonyms to retrieve diverse results does not automatically include the physically challenged: search engines require explicit mention of the underrepresented group to retrieve relevant results. Therefore, conscious effort is required to include underrepresented groups when scraping datasets from search engines. A future built by generative AI that has little understanding of physically challenged people will have serious implications. We hope that our effort lays the groundwork for future datasets and algorithms that include this underrepresented group.
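As an illustrative aside (this is not the authors' published code), the prompt-based audit the abstract describes can be sketched in a few lines of Python. The sketch assumes the Hugging Face diffusers package, the public stabilityai/stable-diffusion-xl-base-1.0 checkpoint, and hypothetical prompt wordings built around the paper's inclusive-washroom example, contrasting a neutral prompt, a detailed prompt, and one that names the underrepresented group explicitly:

    # Minimal sketch of a prompt-based audit of Stable Diffusion XL
    # (illustrative only; not the authors' published pipeline).
    # Assumes the `diffusers` and `torch` packages and a CUDA GPU.
    import torch
    from diffusers import StableDiffusionXLPipeline

    pipe = StableDiffusionXLPipeline.from_pretrained(
        "stabilityai/stable-diffusion-xl-base-1.0",
        torch_dtype=torch.float16,
    ).to("cuda")

    # Hypothetical prompt set: neutral, detailed, and explicit mention of the group.
    prompts = {
        "neutral": "a washroom",
        "detailed": "an inclusive washroom with grab bars, a lowered sink, "
                    "and space to turn a wheelchair",
        "explicit": "a washroom designed for wheelchair users and disabled people",
    }

    for label, prompt in prompts.items():
        # Generate a small sample per prompt; a real audit would use many seeds.
        for seed in range(4):
            generator = torch.Generator("cuda").manual_seed(seed)
            image = pipe(prompt, num_inference_steps=30,
                         generator=generator).images[0]
            image.save(f"audit_{label}_{seed}.png")

The paper's finding is that even the detailed prompt fails to yield accessible-washroom imagery; an audit of this shape surfaces that failure by comparing outputs across the neutral, detailed, and explicit prompt variants, judged manually or with a downstream detector.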


Links

PhilArchive


