Affiliation:
1. Leavey School of Business, Santa Clara University, Santa Clara, California, USA
2. Haas School of Business, University of California, Berkeley, California, USA
Abstract
This article reviews the recent literature on algorithmic fairness, with a particular emphasis on credit scoring. We discuss human versus machine bias, bias measurement, group versus individual fairness, and a collection of fairness metrics. We then apply these metrics to the US mortgage market, analyzing Home Mortgage Disclosure Act data on mortgage applications between 2009 and 2015. We find evidence of group imbalance in the dataset for both gender and (especially) minority status, which can lead to poorer estimation and prediction for female and minority applicants. Loan applicants are treated mostly fairly at both the group and the individual level, though we find that some local male (nonminority) neighbors of otherwise similar rejected female (minority) applicants were granted loans, a pattern that warrants further study. Finally, modern machine learning techniques substantially outperform logistic regression (the industry standard), though at the cost of being substantially harder to explain to denied applicants, regulators, or the courts.
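To make the group fairness notion in the abstract concrete, the sketch below computes two common group metrics, the demographic parity difference and the disparate impact ratio of approval rates, on a small HMDA-style table of loan decisions. The column names (`approved`, `minority`) and the data are illustrative assumptions, not the paper's actual variables, metrics, or results.

```python
import pandas as pd

# Hypothetical HMDA-style decisions: 1 = loan approved, 0 = denied.
# The data and column names are illustrative only.
df = pd.DataFrame({
    "approved": [1, 0, 1, 1, 0, 1, 0, 1, 1, 0],
    "minority": [0, 1, 0, 0, 1, 0, 1, 1, 0, 1],
})

# Approval rates by group membership.
rate_nonminority = df.loc[df["minority"] == 0, "approved"].mean()
rate_minority = df.loc[df["minority"] == 1, "approved"].mean()

# Demographic parity difference: gap in approval rates between groups.
dp_difference = rate_nonminority - rate_minority

# Disparate impact ratio: protected-group rate over reference-group rate
# (the informal "80% rule" flags ratios below 0.8).
di_ratio = rate_minority / rate_nonminority

print(f"Approval rate (non-minority):  {rate_nonminority:.2f}")
print(f"Approval rate (minority):      {rate_minority:.2f}")
print(f"Demographic parity difference: {dp_difference:.2f}")
print(f"Disparate impact ratio:        {di_ratio:.2f}")
```

The same group-level calculation could be repeated for gender, or replaced by individual-level comparisons of similar applicants, as the article does when examining rejected applicants and their approved neighbors.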
Subject
Economics and Econometrics, Finance
Cited by
4 articles.