OpenAI’s Sora Is Plagued by Sexist, Racist, and Ableist Biases
Summary
A Wired investigation of OpenAI’s Sora tool has found that its AI-generated videos remain sexist, racist, and ableist, reinforcing harmful stereotypes.
Sora perpetuates gender roles by consistently depicting men as CEOs and women as flight attendants, while disabled people are shown only using wheelchairs and interracial relationships prove tricky to generate.
In the videos examined, 50% of the women depicted were smiling compared with 0% of the men, reflecting patriarchal expectations of women, says Amy Gaeta, a research associate at the University of Cambridge’s Leverhulme Centre for the Future of Intelligence.
Sora’s developers have said the tool is intended to produce less biased results, and OpenAI says it is researching changes to its training data and adjustments to user prompts to reduce bias in generated videos, but the company declined to give further details.