It predicted that black defendants were at a higher risk of reoffending. The analysis found that black defendants were 77% more likely than white defendants to be flagged as higher risk for committing a future violent crime.
Amazon's recruiting model was trained on resumes submitted to the company over a 10-year span. Most candidates in that period were men, and the model learned to discriminate against women as a result.
The algorithm favored assertive, dominant keywords that appeared mainly in men's resumes, so its ideal candidates were men, reflecting the bias of the training data.
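The mechanism can be illustrated with a toy sketch (all resumes, keywords, and weights here are hypothetical, not Amazon's actual model): a scorer whose word weights come from historical hires will reward whatever phrasing dominated the training set, not actual qualification.

```python
from collections import Counter

# Hypothetical training set: resumes of past hires, mostly men,
# skewed toward assertive wording like "executed" and "dominated".
historical_hires = [
    "executed aggressive growth strategy, captained debate team",
    "dominated regional sales, executed turnaround",
    "executed product launch, competitive chess captain",
    "collaborated with women's engineering society on outreach",
]

# "Train" by counting how often each word appears among past hires.
weights = Counter()
for resume in historical_hires:
    weights.update(resume.replace(",", "").split())

def score(resume: str) -> int:
    """Sum the learned weight of each word (unseen words score 0)."""
    return sum(weights[w] for w in resume.replace(",", "").split())

# Two candidates: the one echoing the majority's phrasing scores
# higher purely because of the skewed training data.
print(score("executed dominated strategy"))   # wording common among past hires
print(score("led women's coding initiative")) # wording rare among past hires
```

The point of the sketch is that no rule ever mentions gender: the disparity emerges entirely from word frequencies in the historical data.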
A study found that black patients had to be considerably sicker than white patients to receive the same level of care: the algorithm used healthcare spending as a proxy for medical need, and because of income disparity, black patients had historically spent less on healthcare than white patients.
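The proxy-label problem above can be sketched in a few lines (all names, numbers, and the threshold are hypothetical): when predicted spending stands in for need, two equally sick patients can receive different care decisions if one belongs to a group that has historically spent less.

```python
# Hypothetical patients: identical illness burden, different
# historical spending (e.g. due to income or access disparities).
patients = [
    {"name": "patient_a", "chronic_conditions": 4, "annual_spend": 9000},
    {"name": "patient_b", "chronic_conditions": 4, "annual_spend": 5400},
]

CARE_THRESHOLD = 7000  # hypothetical spend-based cutoff for extra care

def enrolled_in_care_program(patient: dict) -> bool:
    # Proxy label: spending stands in for medical need, so the
    # decision ignores chronic_conditions entirely.
    return patient["annual_spend"] >= CARE_THRESHOLD

for p in patients:
    print(p["name"], enrolled_in_care_program(p))
```

Equally sick by condition count, only patient_a clears the spend-based cutoff, which is how a facially race-blind score reproduces the historical disparity.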
An algorithm used in US healthcare revealed that black patients were 40% less likely, and Hispanic patients 25% less likely, than white patients to receive pain medication.