Section 8.3
Research Algorithmic Bias
Learning Goals
Students will describe how computing innovations can reflect human biases because of biases written into the algorithms or biases in the data used by the innovation. (IOC-1.D.1)
Students will list actions programmers and analysts can take to prevent algorithmic bias. (IOC-1.D.2)
Students will explain how biases can be embedded at all levels of software development. (IOC-1.D.3)
Objectives and General Description
Algorithms are a part of our everyday lives. Even though algorithms are created with good intent, they may not be as objective and neutral as we think, and the impact of algorithmic bias has far-reaching consequences. An awareness of past mistakes and a shift in design approach can reduce this bias in the development of future computing innovations. This section will introduce students to examples of algorithmic bias and guide a refinement of the Culturally Authentic Protocols that the students developed in Section 4.7. Students will also learn how the Design Thinking approach incorporates an empathy component to help prevent future instances of algorithmic bias.
The students will begin by defining bias and watching a TED talk by Joy Buolamwini in which she discusses her experiences with algorithmic bias. She also shares her journey to counteract this bias. Students will then research and share other examples of algorithmic bias. Finally, the students will learn how the Design Thinking approach uses empathy as a preventive measure to reduce the likelihood of algorithmic bias.
Activities
Activity 8.3.1 (budget 25 minutes)
Ask students to define bias. Dictionary.com defines bias as prejudice: a strong inclination of the mind or a preconceived opinion about something or someone. A bias may be favorable or unfavorable: bias in favor of or against an idea.
Another term in common use is "implicit bias". Ask students what this means. Implicit bias refers to the attitudes or stereotypes that affect our understanding, actions, and decisions in an unconscious manner.
Now pose this question: Can computer programs be biased?
Show this TED talk from Joy Buolamwini, "How I'm Fighting Bias in Algorithms". This is an 8-minute talk. Don't have students take notes; just let them listen. It is a powerful story.
Before there is any class discussion, have students write down their personal thoughts. What was something they learned from the TED talk? How does it relate to their own lives? Have they personally experienced anything similar? Does this change their approach to creating artifacts/programs/analyses in our class? If so, how?
Ask for student volunteers to share their reflections. This could be a highly emotional conversation. Make sure you have developed norms for discussion in your classroom that facilitate an open, honest, and judgment-free conversation.
What were Joy's three recommendations for combating algorithmic bias?
Who codes matters: "Are we creating full-spectrum teams with diverse individuals who can check each other's blind spots?"
How we code matters: "Are we factoring in fairness as we are developing systems?" (See the sketch after this list for one way to put this into practice.)
Why we code matters: "We've used tools of computational creation to unlock immense wealth. We now have the opportunity to unlock even greater equality if we make social change a priority and not an afterthought."
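If you want to make the second recommendation concrete for students who code, below is a minimal sketch of one way to "factor in fairness" while developing a system: comparing a model's error rate across groups before shipping it. All of the group labels, predictions, and ground-truth values here are hypothetical, invented purely for illustration.

# A minimal per-group fairness audit -- all data below is hypothetical.
from collections import defaultdict

def error_rate_by_group(groups, predictions, actuals):
    # Return the fraction of wrong predictions for each group.
    wrong = defaultdict(int)
    total = defaultdict(int)
    for group, pred, actual in zip(groups, predictions, actuals):
        total[group] += 1
        if pred != actual:
            wrong[group] += 1
    return {g: wrong[g] / total[g] for g in total}

# Hypothetical results from some classifier, tagged by group:
groups      = ["A", "A", "A", "A", "B", "B", "B", "B"]
predictions = [1, 0, 1, 1, 0, 0, 1, 0]
actuals     = [1, 0, 1, 1, 1, 0, 0, 1]

print(error_rate_by_group(groups, predictions, actuals))
# A large gap between groups (here 0.0 vs. 0.75) is a red flag
# worth investigating before the system ships.

The check itself is trivial; the point for students is that fairness only gets measured if someone decides to measure it.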
Activity 8.3.2 (budget 30 minutes)
The previous activity illustrated the algorithmic bias in facial recognition programs. Have students brainstorm the implications of this bias across different industries.
Algorithmic bias extends well beyond facial recognition. Students will next research different examples of algorithmic bias.
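Before they begin researching, you may want to make one underlying mechanism concrete with a short demo. The sketch below is a toy example with synthetic, invented data, not a model of any real system: it generates data for two groups, "trains" the simplest possible model (a single threshold) on a dataset where one group is underrepresented, and then measures accuracy for each group separately.

# How skewed training data can produce biased results.
# All data here is synthetic and purely illustrative.
import random

random.seed(42)

def make_examples(n, pos_range, neg_range):
    # Generate (feature, label) pairs; half positive, half negative.
    pos = [(random.uniform(*pos_range), 1) for _ in range(n // 2)]
    neg = [(random.uniform(*neg_range), 0) for _ in range(n // 2)]
    return pos + neg

# Two groups whose feature values relate to the label differently.
def group_a(n):
    return make_examples(n, pos_range=(0.55, 1.00), neg_range=(0.00, 0.45))

def group_b(n):
    return make_examples(n, pos_range=(0.30, 0.70), neg_range=(0.00, 0.22))

# Skewed training set: 95% group A, only 5% group B.
train = group_a(380) + group_b(20)

def accuracy(data, threshold):
    # Classify "positive" when the feature exceeds the threshold.
    return sum((x > threshold) == bool(y) for x, y in data) / len(data)

# "Train" by picking the threshold that maximizes training accuracy.
best_t = max((t / 100 for t in range(100)), key=lambda t: accuracy(train, t))

# Evaluate on balanced test sets for each group.
print("learned threshold:", round(best_t, 2))
print("accuracy on group A:", round(accuracy(group_a(1000), best_t), 2))
print("accuracy on group B:", round(accuracy(group_b(1000), best_t), 2))

Because group A dominates the training data, the learned threshold fits group A almost perfectly while systematically misclassifying part of group B, even though the model was never written to treat the groups differently. The bias comes entirely from the data.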
Have students work with a partner to find additional examples of algorithmic bias. You can assign categories (judicial system, healthcare, law enforcement, etc.), have them find an example that pertains to their team project, or simply let them choose their own examples.
Resources that may be helpful:
http://mediashift.org/2017/02/how-to-enable-students-to-identify-bias-in-product-development/
https://blogs.scientificamerican.com/voices/i-know-some-algorithms-are-biased-because-i-created-one/
https://www.newscientist.com/article/2166207-discriminating-algorithms-5-times-ai-showed-prejudice/
https://www.forbes.com/sites/cognitiveworld/2020/02/07/biased-algorithms/#68ff889a76fc
Have students share their research findings by building a slideshow. Each pair of students adds a slide that describes the example they found and the impact it could have.
Revisit the class set of Culturally Authentic Protocols that were developed in Section 4.7. Analyze them for any potential bias issues, and modify or add protocols as needed.
Activity 8.3.3 (budget 15 minutes)
How Design Thinking in the Software Development Cycle Can Reduce the Likelihood of Algorithmic Bias
Now that students are familiar with algorithmic bias, share the development story of the MRI and its impact on children.
You can read about this transformation using these resources.
Share the story with the students using the images in this design thinking and MRI slideshow to make it more impactful.
Ask the students: What made the difference? Was it in how the machine technically worked?
The difference was that the intended audience wasn't taken into consideration; there was an empathy component missing. The technical solution was accurate. However, there was such a disconnect between the purpose of the machine and the children who needed to use it that children had to be sedated to use it. If the designers had looked at the product through the eyes of the user during the design stages, this could have been prevented.
Sometimes, we create a solution that WE think will work without talking to the people who will actually use our product to see what they need. This can also lead to the algorithmic bias that we have been studying.
Ask the students: How do you fix this? Where do you modify the software development life cycle?
Review the agile cycle.
The design stage needs to include feedback from the potential users. This can be gathered through interviews, surveys, or other forms of direct communication with potential users. The feedback needs to be taken into consideration at the design stage to prevent a disconnect between the user and the product.
This empathy component can be incorporated in our projects and will be brought in during Sprint 2 of our ARC Challenge #4.
Here is an excellent lesson plan on algorithmic bias from the Anti-Defamation League if you want to extend this topic.