Accountable Information Use: Privacy and Fairness in Decision-Making Systems

Increasingly, decisions and actions affecting people's lives are determined by automated systems processing personal data. Excitement about these systems has been accompanied by serious concerns about their opacity and the threats that they pose to privacy, fairness, and other values. Recognizing these concerns, the investigators seek to make real-world automated decision-making systems accountable for privacy and fairness by enabling them to detect and explain violations of these values. The technical work is informed by, and applied to, online advertising, healthcare, and criminal justice, in collaboration with, and as advised by, domain experts.

Addressing privacy and fairness in decision systems requires providing formal definitional frameworks and practical system designs. The investigators provide new notions of privacy and fairness that deal with both protected information itself and proxies for it, while handling context-dependent, normative definitions of violations. A fundamental tension they address pits the access given to auditors of a system against the system owners' intellectual property protections and the confidentiality of the personal data used by the system. The investigators decompose such auditing into stages, where the level of access granted to an auditor is increased when potential (but not explainable) violations of privacy or fairness are detected. Workshops and public releases of code and data amplify the investigators' interactions with policy makers and other stakeholders. Their partnerships with outreach organizations encourage diversity.
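
The proxy-detection and staged-auditing ideas above can be illustrated with a minimal, hypothetical Python sketch. It flags an input feature as a potential proxy for a protected attribute when its statistical association (here, mutual information) with that attribute exceeds a chosen threshold; a flagged feature would then warrant a deeper auditing stage with greater access. The function name, threshold, and synthetic data are assumptions for illustration only and do not reproduce the formal definitions developed in the project's papers.

    import numpy as np
    from sklearn.metrics import mutual_info_score

    def flag_potential_proxies(X, protected, feature_names, threshold=0.1):
        """Return names of features whose mutual information with the
        protected attribute exceeds `threshold` (all names are hypothetical)."""
        flagged = []
        for j, name in enumerate(feature_names):
            # Discretize the feature so mutual_info_score compares two discrete columns.
            bins = np.quantile(X[:, j], [0.25, 0.5, 0.75])
            binned = np.digitize(X[:, j], bins)
            if mutual_info_score(binned, protected) > threshold:
                flagged.append(name)
        return flagged

    # Toy usage with synthetic data: "zip_code" is constructed to correlate
    # with the protected attribute, so it should be the only feature flagged.
    rng = np.random.default_rng(0)
    protected = rng.integers(0, 2, size=1000)
    zip_code = protected + rng.normal(scale=0.3, size=1000)  # correlated proxy
    income = rng.normal(size=1000)                           # unrelated feature
    X = np.column_stack([zip_code, income])
    print(flag_potential_proxies(X, protected, ["zip_code", "income"]))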

News

2017-12-20 The New Yorker article "New York City’s Bold, Flawed Attempt to Make Algorithms Accountable" mentions co-PIs Helen Nissenbaum and Tom Ristenpart.
2017-11-01 Two of our papers were presented at CCS 2017: "Use Privacy in Data-driven Systems" and "Machine Learning Models that Remember Too Much".
2017-08-04 CMU news release about the project: "Decision systems that respect privacy, fairness".
2017-08-01 Our paper "Proxy Discrimination in Data-driven Systems" has appeared on arXiv.

People

Anupam Datta, CMU (principal investigator)
Matt Fredrikson, CMU (co-principal investigator)
Ole Mengshoel, CMU (co-principal investigator)
Helen Nissenbaum, Cornell Tech (co-principal investigator)
Tom Ristenpart, Cornell Tech (co-principal investigator)
Michael C. Tschantz, ICSI (co-principal investigator)
Piotr Mardziel, CMU (senior person)
Alexandra Chouldechova, CMU (collaborator)
Saikat Guha, Microsoft Research (collaborator)
Daniel Neill, CMU (collaborator)
David Page, University of Wisconsin-Madison (collaborator, advisory board)
Alfred Blumstein, CMU (advisory board)
Anne Milgram, NYU Law (advisory board)
Jeannette M. Wing, Columbia University (advisory board)

Events

Acknowledgment

This material is based upon work supported by the National Science Foundation under Grant Number CNS 1704845.