ASE 2019
Sun 10 - Fri 15 November 2019 San Diego, California, United States
Fri 15 Nov 2019 09:15 - 10:30 at Cortez 1B - Welcome and Keynote Chair(s): Matthias Tichy

Modern software contributes to important societal decisions, and yet we know very little about its fairness properties. Can software discriminate? Evidence of software discrimination has been found in systems that recommend criminal sentences, grant access to loans and other financial products, transcribe YouTube videos, translate text, and perform facial recognition. Systems that select which ads to show users can similarly discriminate. For example, a professional social network site could, hypothetically, learn stereotypes and advertise only stereotypically female jobs to women and stereotypically male ones to men. Despite existing evidence of software bias, and significant potential for negative consequences, little technology exists to test software for such bias, to enforce lack of bias, and to learn fair models from potentially biased data. Even defining what it means for software to discriminate is a complex task. I will present recent research that defines software fairness and discrimination; develops a testing-based, causality-capturing method for measuring if and how much software discriminates and provides provable formal guarantees on software fairness; and demonstrates how framing problems as fairness-constrained contextual bandits can reduce not only bias but also the impact of bias. I will also describe open problems in software fairness and how recent advances in machine learning and natural language modeling can help address them. Overall, I will argue that enabling and ensuring software fairness requires solving research challenges across computer science, including in machine learning, software and systems engineering, human-computer interaction, and theoretical computer science.
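The causality-capturing idea behind the testing method can be illustrated in a few lines: hold every input attribute fixed, alter only a protected attribute, and count how often the software's decision changes. This is not the talk's actual tooling, only a minimal sketch of the idea; the toy loan model, its attribute names, and the planted bias are all hypothetical.

```python
# Sketch of causality-based fairness testing (hypothetical example, not
# the speaker's actual system): for each input, flip only the protected
# attribute and check whether the decision changes. The fraction of
# inputs whose outcome flips estimates causal discrimination.
import itertools

def toy_loan_model(applicant):
    """Illustrative biased model: approves based on income, but applies a
    higher threshold to one gender group (a deliberately planted bias)."""
    threshold = 50000 if applicant["gender"] == "male" else 60000
    return applicant["income"] >= threshold

def causal_discrimination(model, inputs, protected="gender",
                          values=("male", "female")):
    """Fraction of inputs whose outcome changes when only the protected
    attribute is altered, holding all other attributes fixed."""
    flipped = 0
    for x in inputs:
        outcomes = {model({**x, protected: v}) for v in values}
        if len(outcomes) > 1:
            flipped += 1
    return flipped / len(inputs)

# Exhaustive toy input space: 2 genders x 13 income levels.
inputs = [{"gender": g, "income": inc}
          for g, inc in itertools.product(("male", "female"),
                                          range(30000, 90001, 5000))]
score = causal_discrimination(toy_loan_model, inputs)
print(f"causal discrimination estimate: {score:.2f}")  # → 0.15
```

Only inputs with incomes in the 50000-60000 band flip when gender is altered, so the estimate localizes exactly where the planted threshold gap bites; a real testing tool would sample or search this space rather than enumerate it.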

My research is in software engineering. I am interested in improving our ability to build systems that are smart and self-adapt to their environment. I am particularly interested in ensuring fairness in software systems. Watch a video describing my latest work on software fairness testing here:

Fri 15 Nov
Times are displayed in time zone: (GMT-07:00) Tijuana, Baja California

09:00 - 10:30: EXPLAIN 2019 - Welcome and Keynote at Cortez 1B
Chair(s): Matthias Tichy, Ulm University, Germany
09:00 - 09:15  Day opening
09:15 - 10:30  Keynote: Yuriy Brun, University of Massachusetts Amherst