Discussion 6

M4.1 Bias in Modeling

Post

O’Neil’s article (2023) reports on the voices of five women from tech fields who have spent years exposing problems with bias and discrimination in algorithms. Hendrycks and colleagues (2023) released a report earlier this month warning that AI poses catastrophic risks to humanity. Answer the following:

  • After reading the O’Neil article, explore further the work of one of the following: Buolamwini, Chowdhury, Gangadharan, Gebru, or Noble. Describe the site/content you reviewed, why you chose it, and what you learned.

  • The report by Hendrycks and colleagues (2023) underpins the AI risk concerns of the “AI Doomers” described in the O’Neil article (2023). One of the signers of the Statement on AI Risk is Geoffrey Hinton, an Emeritus Professor of Computer Science at the University of Toronto. He is quoted by O’Neil as saying, “I believe that the possibility that digital intelligence will become much smarter than humans and will replace us as the apex intelligence is a more serious threat to humanity than bias and discrimination, even though bias and discrimination are happening now and need to be confronted urgently.” Do you agree? Why or why not?

  • Obermeyer and colleagues (2021) provide practical steps that organizations can take to diminish the harmful effects of AI. Which of these overlap with calls from Buolamwini, Chowdhury, Gangadharan, Gebru, Noble, and the “AI Doomers”? Which steps do you think are most urgent, and why?

Discussion posts are the primary assessment of your understanding and critical analysis of the readings. You must cite the readings you analyze in your posts using APA-style in-text citations. Posts should be 400-500 words.

Due by: 10/22 at 11:59 pm EST

Respond

This week you are assigned to small groups again to learn what content your peers explored and what they think about the risks of AI. Read two of your group members’ posts and describe how your viewpoints converge and diverge. Please make sure that everyone in your group receives at least one set of comments.

Due by: 10/26 at 11:59 pm EST

See here for Rubrics