Impact of Bias on Hiring and School Admissions
There is ample evidence that implicit bias in candidate evaluations can have a significantly disparate impact on different demographic groups. Motivated by this, we present the first mathematical analysis of the impact of bias on: (i) online hiring of candidates and (ii) the rank of the schools that students are matched to. In each application, the candidates belong to disjoint groups, defined for instance by race, gender, nationality, or age. Each candidate has a “true” nonnegative utility Z \geq 0 and an “observed” utility \beta Z, where \beta is an unknown group-dependent bias factor, typically less than 1. In the hiring setting, we show that when the observed evaluations of candidates incorporate such a group-specific bias, algorithms developed for the unbiased online secretary problem (i.e., group-agnostic methods) can be suboptimal. We propose group-aware parallelization as an effective mechanism for counteracting bias, one that extends to fairly general settings. In the school matching setting, we show that candidates subject to bias are heavily penalized in terms of the rank of the school they are matched to, even for relatively small amounts of bias. We propose targeted interventions that use limited resources to de-bias different ranges of students, depending on the extent of the bias.
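The hiring setting can be illustrated with a small simulation. The sketch below is an assumption-laden illustration, not the talk's exact algorithm: it compares the classical 1/e secretary rule applied to biased observed scores \beta Z against a simple group-aware variant that compares each candidate only against the best observed score within their own group (so the group-dependent factor \beta cancels out). The group labels "A"/"B", the bias values, and the fallback of hiring the last candidate are all illustrative choices.

```python
import math
import random

def group_agnostic_hire(cands, beta):
    """Classical 1/e secretary rule run on the biased observed scores
    beta[g] * Z, ignoring group membership entirely."""
    n = len(cands)
    k = int(n / math.e)  # length of the learning phase
    best_obs = max(beta[g] * z for g, z in cands[:k])
    for g, z in cands[k:]:
        if beta[g] * z > best_obs:
            return g, z
    return cands[-1]  # fallback: hire the last candidate

def group_aware_hire(cands, beta):
    """Illustrative group-aware variant (a stand-in for the talk's
    parallelization mechanism): each candidate is compared only against
    the best observed score seen so far within their own group."""
    n = len(cands)
    k = int(n / math.e)
    best_obs = {}
    for g, z in cands[:k]:
        best_obs[g] = max(best_obs.get(g, 0.0), beta[g] * z)
    for g, z in cands[k:]:
        if beta[g] * z > best_obs.get(g, 0.0):
            return g, z
    return cands[-1]

def hire_rates(n_trials=200, n=200):
    """Fraction of trials in which each rule hires from the biased
    group B (observed scores halved by beta_B = 0.5)."""
    beta = {"A": 1.0, "B": 0.5}
    agnostic_b = aware_b = 0
    for t in range(n_trials):
        rng = random.Random(t)
        # candidate = (group, true utility Z), with Z uniform on [0, 1]
        cands = [(rng.choice("AB"), rng.random()) for _ in range(n)]
        agnostic_b += group_agnostic_hire(cands, beta)[0] == "B"
        aware_b += group_aware_hire(cands, beta)[0] == "B"
    return agnostic_b / n_trials, aware_b / n_trials
```

In this toy setup the group-agnostic rule almost never hires from group B except via the fallback, while the group-aware rule hires from each group at roughly the base rate, illustrating why group-agnostic secretary algorithms can be suboptimal under bias.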
This talk is based on ongoing work with Jad Salem, Yuri Faenza, and Xuan Zhang.