AI algorithms are only as fair as the data used to train them. If the data includes human biases, such as racial, gender, or cultural stereotypes, then the AI can learn and repeat those biases in its decisions. For example, an AI used in hiring might reject applicants from certain groups if it has learned from past data that preferred one group over another. Bias in AI can have real-world consequences, especially in fields like health care and law enforcement.
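To see how this works, here is a minimal sketch, using made-up data and group names, of a hiring screener that "learns" only from past decisions. Because the historical records favoured one group, the screener repeats that preference automatically; merit never enters the calculation.

```python
# Toy sketch with HYPOTHETICAL data: a screener "trained" on past hiring
# records learns nothing but each group's historical acceptance rate.
past_hires = [
    ("group_a", "hired"), ("group_a", "hired"), ("group_a", "hired"),
    ("group_a", "rejected"),
    ("group_b", "hired"),
    ("group_b", "rejected"), ("group_b", "rejected"), ("group_b", "rejected"),
]

def learned_hire_rate(group):
    """Fraction of past applicants from `group` who were hired."""
    outcomes = [outcome for g, outcome in past_hires if g == group]
    return sum(1 for o in outcomes if o == "hired") / len(outcomes)

def screen(group, threshold=0.5):
    """Accept an applicant only if their group's historical hire rate
    clears the threshold -- the applicant's own merit is never checked."""
    return learned_hire_rate(group) >= threshold

print(screen("group_a"))  # True  -- favoured because history favoured them
print(screen("group_b"))  # False -- rejected by history, not by merit
```

Real screening systems are far more complex than this, but the core problem is the same: if the training data encodes a past preference, the model reproduces it.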
Ever heard the saying "You are what you eat"? AI is much the same: it becomes whatever its data says, and the majority view in that data wins. But can we change the algorithm to make AI less biased?
Things to consider:
Bias in AI algorithms
Data privacy and surveillance
Consent and transparency
Deepfakes and misinformation
Global ethical frameworks (e.g., OECD AI Principles)
Record your answer in your Student WORKBOOK.
Algorithm
Surveillance
Data Privacy
Do you really know what the following three terms are? You will need to understand them to gain a deeper understanding of bias and fairness in AI.
What is an Algorithm?
What is Surveillance?
What is Data Privacy?
Use your WORKBOOK to record important information from this video.
Video length (11:19)
Watch the following video and take notes as there are questions on the Quiz about this video.
In your WORKBOOK
In the video "What is Algorithmic Bias?", you explored how artificial intelligence systems, especially those that use algorithms to make decisions, can reflect and amplify human biases, even unintentionally. Bias in AI doesn’t necessarily come from malice; it often stems from flawed or incomplete data, a lack of diversity in design teams, or assumptions built into the algorithm itself.
As you prepare for future careers, it's crucial to understand that algorithmic bias isn't just a tech issue; it's a real-world challenge that affects every sector.
You’re a Grade 12 SHSM student in Technological Design. You’ve applied for a competitive co-op placement at a leading design firm. The company uses an AI-powered system to screen all applications and portfolios before a human ever sees them.
You don’t get selected. Several of your classmates are offered interviews, even though you know your portfolio is strong. When you ask about the process, you're told:
“The AI looks for portfolio elements and keywords that align with profiles of past successful applicants.”
This response leaves you with more questions than answers.
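Before you draft your questions, it may help to picture what "keywords that align with profiles of past successful applicants" could mean in practice. The sketch below is a hypothetical, deliberately simplified version of such a rule (the keyword list is invented): it scores a portfolio purely by overlap with words from past winners, so strong but unconventional work scores low.

```python
# HYPOTHETICAL sketch of a keyword-matching screener: score a portfolio
# by how many keywords it shares with past successful applicants.
past_winner_keywords = {"cad", "rendering", "prototyping", "sketchup"}

def keyword_score(portfolio_keywords):
    """Count how many of the applicant's keywords also appear in the
    keyword set drawn from past successful applicants."""
    return len(set(portfolio_keywords) & past_winner_keywords)

# A strong but unconventional portfolio scores poorly...
print(keyword_score(["3d-printing", "sustainable-design", "cad"]))  # 1
# ...while a conventional one that mirrors past winners sails through.
print(keyword_score(["cad", "rendering", "sketchup"]))              # 3
```

Notice that the rule never evaluates quality at all; it only rewards similarity to the past. Keep that idea in mind as you write your questions for the hiring team.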
In your WORKBOOK, write down 5 thoughtful questions you would ask the design firm’s hiring team to better understand why the AI system might not have selected you.
Try to uncover:
how the AI made its decision,
what data it used,
how fairness was ensured, and
whether bias might have played a role.