Affiliation:
1. Inria & École Normale Supérieure, Paris, France
2. New York University, Brooklyn, NY, USA
Abstract
The data revolution continues to transform every sector of science, industry, and government. Because of the enormous impact of data-driven technology on society, we are becoming increasingly aware of the imperative to use data and algorithms responsibly—in accordance with laws and ethical norms. In this article, we discuss three recent regulatory frameworks that aim to protect the rights of individuals impacted by data collection and analysis: the European Union's General Data Protection Regulation (GDPR), the New York City Automated Decision Systems (ADS) Law, and the Net Neutrality principle. These frameworks are prominent examples of a global trend: governments are starting to recognize the need to regulate data-driven algorithmic technology.
Our goal in this article is to bring these regulatory frameworks to the attention of the data management community and to underscore the technical challenges they raise—challenges that we, as a community, are well-equipped to address. The main takeaway of this article is that legal and ethical norms cannot be incorporated into data-driven systems as an afterthought. Rather, we must think in terms of responsibility by design, viewing it as a systems requirement.
Funder
Agence Nationale de la Recherche
National Science Foundation
Publisher
Association for Computing Machinery (ACM)
Subject
Information Systems and Management, Information Systems
Cited by
34 articles.