Abstract
Generative AI models are shaping our future. In this work, we discover and expose the bias against physically challenged people in generative models. Generative models (Stable Diffusion XL and DALL·E 3) are unable to generate content related to the physically challenged, e.g., an inclusive washroom, even with very detailed prompts. Using a novel strategy to automatically discover bias against underrepresented groups such as the physically challenged, our analysis reveals that this disability bias emanates from biased AI datasets. Finally, we trace the root of this disability bias back to search engines (Google, Bing, Yandex, and DuckDuckGo). All four search engines exhibit disability bias for neutral queries, and the standard strategy of using synonyms to retrieve diverse results does not automatically include the physically challenged: search engines require an explicit mention of the underrepresented group to retrieve relevant results. Therefore, a conscious effort is required to include underrepresented groups when scraping datasets from search engines. A future built by generative AI that has little understanding of physically challenged people will have serious implications. We hope that our effort lays the groundwork for future datasets and algorithms that include this underrepresented group.