What happens when you ask an AI image generator for an ordinary photo of a person? Research from the University of Washington suggests that the result could be shaped by gender and racial stereotypes. One study found that the Stable Diffusion image generator overrepresented light-skinned men, underrepresented Indigenous people, and even sexualized some women of color when asked to create an image of a “person.” As AI becomes more prevalent in our daily lives, these types of human biases have the potential to spread and cause more harm.
Sourojit Ghosh is a fourth-year doctoral candidate in human-centered design and engineering at the University of Washington. Ramón Alvarado is an assistant professor of philosophy and a member of the Data Science Initiative at the University of Oregon. They both study the ethics of artificial intelligence and join us to talk about the challenges it poses.
Contact “Think Out Loud®”
If you would like to comment on any of the topics in this show or suggest your own topic, please contact us on Facebook, send an email to thinkoutloud@opb.org, or leave us a voicemail at 503-293-1983. The telephone number to call during the noon hour is 888-665-5865.