Special Topic: Fairness in Machine Learning Chao Lan

Source: clan/teach/ml20/ml20_fairness.pdf


Page 1

Special Topic: Fairness in Machine Learning

Chao Lan

Page 2

Machine learning is increasingly applied in sensitive areas.

Page 3

Fairness matters in recidivism prediction.

Machine Bias: There’s software used across the country to predict future criminals. And it’s biased against blacks.

- ProPublica, 2016.

Page 4

Fairness matters in auto health assessment.

Page 5

Amazon scraps secret AI recruiting tool that showed bias against women.

- Reuters, 2018.

Fairness matters in auto job hiring.

Page 6

Klare et al. Face recognition performance: Role of demographic information. IEEE Trans. Information Forensics and Security, 2012.

Other bias: commercial facial recognition.

Page 7

R. Tatman. Google’s speech recognition has a gender bias, 2016.

Other bias: Google speech recognition.

Page 8

R. Ehrenberg. Data-driven crime prediction fails to erase human bias. Science News, 2017.

Other bias: drug use prediction.

Page 9

M. Hutson. Even artificial intelligence can acquire biases against race and gender. Science, 2017.

Other bias: word embedding techniques.

Page 10

“To avoid exacerbating biases by encoding them into technology systems, we need to develop a principle of ‘equal opportunity by design’ -- designing data systems that promote fairness and safeguard against discrimination from the first step of the engineering process and continuing throughout their lifespan.”

Algorithmic fairness is a priority in the US AI R&D strategic plan (2016).

Page 11

And it remains a priority in the 2019 update.

“Beyond purely data-related issues, however, larger questions arise about the design of AI to be inherently just, fair, transparent, and accountable. Scientists must also study to what extent justice and fairness considerations can be designed into the system, and how to accomplish this within the bounds of current engineering techniques.”

Page 12

Many initiatives on fairness research.

Many workshops, papers, articles...

Page 13

What is “fairness” in algorithmic prediction?

Amazon scraps secret AI recruiting tool that showed bias against women.

- Reuters, 2018.

Statistical disparity means a large gap between

- Pr{ f(x) = hire | x is male }

- Pr{ f(x) = hire | x is female }

Equalized error rates means similar false-positive rates:

- Pr{ f(x) = hire | x is male & y = not hired }

- Pr{ f(x) = hire | x is female & y = not hired }

Individual fairness means

- f(x) = f(z) whenever x and z are equally qualified
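The two group notions above are directly measurable. A minimal sketch (my own illustration, with hypothetical data and function names) computing the statistical-disparity gap and the false-positive-rate gap on a tiny hiring example:

```python
import numpy as np

def statistical_disparity(pred, group):
    """Gap in positive-prediction (hire) rates between the two groups."""
    return pred[group].mean() - pred[~group].mean()

def fpr_gap(pred, y, group):
    """Gap in false-positive rates: Pr{ f(x) = hire | group, y = not hired }."""
    return pred[group & (y == 0)].mean() - pred[~group & (y == 0)].mean()

# Tiny worked example: 4 men, 4 women, none actually hired (y = 0),
# but the classifier hires 3/4 men and 1/4 women.
group = np.array([True, True, True, True, False, False, False, False])
y = np.zeros(8, dtype=int)
pred = np.array([1, 1, 1, 0, 1, 0, 0, 0])

print(statistical_disparity(pred, group))  # 0.5
print(fpr_gap(pred, y, group))             # 0.5
```

A perfectly group-fair classifier under either notion would drive the corresponding gap to zero.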

Page 14

Where does algorithmic bias come from?

Page 15

W. Norton. Cultural geography: Environments, landscapes, identities, inequalities. Oxford University Press, 2013.

Does hiding the sensitive attribute help?
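Usually not, because other features can redundantly encode the hidden attribute. A sketch of this "proxy" effect on synthetic data (all names hypothetical, not from the slides): even after dropping the sensitive attribute A, a correlated feature such as a zip-code indicator recovers A with high accuracy.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
a = rng.random(n) < 0.5  # sensitive attribute A (hidden from the model)
# Hypothetical proxy feature, strongly correlated with A (e.g. zip code).
zip_code = np.where(a, rng.random(n) < 0.9, rng.random(n) < 0.1)

# "Predict" A from the proxy alone, without ever observing A directly.
a_hat = zip_code
accuracy = (a_hat == a).mean()
print(accuracy)  # roughly 0.9
```

So a model trained without A can still discriminate by A through the proxy, which is why simply deleting the column is not enough.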

Page 16

Let’s design a fair learner for a linear model.
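One possible design (my own sketch under the statistical-parity notion above, not necessarily the method the slides develop): penalize least squares by the squared gap in average predicted scores between groups. Since that gap equals d·w with d the difference of group feature means, the objective min_w ||Xw - y||² + λ(d·w)² has the closed form w = (XᵀX + λ d dᵀ)⁻¹ Xᵀy.

```python
import numpy as np

def fair_linear_fit(X, y, a, lam):
    """Least squares with a statistical-parity penalty lam * (d @ w)**2,
    where d is the gap between the two groups' mean feature vectors."""
    d = X[a].mean(axis=0) - X[~a].mean(axis=0)
    return np.linalg.solve(X.T @ X + lam * np.outer(d, d), X.T @ y)

# Toy data where the second feature correlates with group membership a.
rng = np.random.default_rng(0)
n = 2000
a = rng.random(n) < 0.5
X = np.column_stack([rng.normal(size=n), rng.normal(size=n) + a])
y = X @ np.array([1.0, 1.0]) + rng.normal(scale=0.1, size=n)

# Gap in average predicted score between the two groups.
gap = lambda w: (X[a] @ w).mean() - (X[~a] @ w).mean()

w_plain = fair_linear_fit(X, y, a, lam=0.0)   # ordinary least squares
w_fair = fair_linear_fit(X, y, a, lam=1e6)    # heavy fairness penalty
print(abs(gap(w_plain)), abs(gap(w_fair)))    # fair gap is far smaller
```

Raising λ trades accuracy for a smaller group gap; at λ = 0 this reduces to ordinary least squares.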