No. 164 Differential Privacy and its Applications

Shonan Village Center

October 28 - November 1, 2024 (Check-in: October 27, 2024)

Organizers

  • Marco Gaboardi
    • Boston University, USA
  • Jun Sakuma
    • University of Tsukuba / RIKEN, Japan
  • Thomas Steinke
    • IBM Research at Almaden, USA

Overview

Note: the schedule has been updated; the seminar was originally planned for October 31 - November 4, 2022 (Check-in: October 30, 2022).

Differential privacy is a mathematically formalized standard for the privacy-preserving analysis and use of data: it permits a broad array of data analysis methods while simultaneously offering a strong, guaranteed bound on the increase in harm that any user suffers as a result of participating in a differentially private data analysis. Over the past decade, a rich literature has developed, and differential privacy has gradually been accepted as a “gold standard” for data privacy. This has culminated in its adoption by government agencies such as the US Census Bureau and by companies such as Apple, Google, and Microsoft.
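
For reference, this guarantee can be stated formally (this is the standard textbook definition, not specific to this seminar): a randomized algorithm M is (ε, δ)-differentially private if, for every pair of datasets D and D′ that differ in the data of a single individual, and every set S of possible outputs,

    \Pr[M(D) \in S] \le e^{\varepsilon} \cdot \Pr[M(D') \in S] + \delta

so the parameter ε (together with the slack term δ) bounds how much any one person's data can change the distribution over results.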

The development of differential privacy has been given impetus by the increasing scale of private data being collected and used. Traditional de-identification techniques have suffered high-profile failures as they struggle to keep pace with the abundance of information available to potential attackers. The distinguishing feature of differential privacy is that it makes no assumptions about the background information known to a potential attacker.

One of the reasons for the success of differential privacy is that it can be integrated into algorithmic and machine learning techniques to provide data analysis tools that are both private and accurate. Furthermore, these tools can be composed in a modular way to build sophisticated systems for using private data. Differential privacy has also found application in areas of data analysis where privacy is not necessarily a concern, such as ensuring the fairness of algorithms and guaranteeing statistical validity in adaptive data analysis.
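
As a minimal illustration of such a building block, the following Python sketch implements the classic Laplace mechanism for a counting query; the function name and parameters are illustrative rather than drawn from any particular library.

    import numpy as np

    def laplace_count(data, predicate, epsilon):
        """Release a differentially private count.

        A counting query has sensitivity 1: adding or removing one
        record changes the true count by at most 1, so Laplace noise
        with scale 1/epsilon yields epsilon-differential privacy.
        """
        true_count = sum(1 for record in data if predicate(record))
        noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
        return true_count + noise

    # Modular composition: by the basic composition theorem, two
    # releases made with epsilon = 0.5 each are together 1.0-DP.
    ages = [23, 35, 47, 51, 62, 29, 44]
    over_40 = laplace_count(ages, lambda a: a > 40, epsilon=0.5)
    under_30 = laplace_count(ages, lambda a: a < 30, epsilon=0.5)
    print(f"noisy count over 40: {over_40:.2f}")
    print(f"noisy count under 30: {under_30:.2f}")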

Differential privacy is studied by researchers in many areas of computer science, including algorithms, cryptography, machine learning, programming languages, security, and databases, as well as in several areas of statistics, data science, information theory, law and policy making, and social science. Researchers in this broad community often work in isolation or with limited communication between areas. We believe that for differential privacy to be successful, a concerted effort by researchers in all these areas is needed.

The overall goal of the meeting is to foster discussion between researchers in academia and industry who work on different aspects of differential privacy and its applications. The common ground among participants will be a shared interest in data privacy. On the applications side, we aim to stimulate discussion of the tools and assumptions needed to put differential privacy to work in practice. On the theoretical side, we aim to advance the discussion of foundational issues concerning algorithms and models for privacy. A further goal of the meeting is to explore the applicability of the most recent techniques developed for differential privacy to problems in other research areas, such as algorithmic fairness and guaranteeing statistical validity in adaptive data analysis.

An expected outcome of the meeting is for the community of researchers interested in data privacy to acquire a common understanding of the technical challenges and needs involved in making differential privacy practical. Moreover, we expect the meeting to offer early-career researchers an opportunity to present and discuss their work with established researchers in the field.

We plan to involve participants working on differential privacy, as well as participants with a broader interest in data privacy who can contribute to the understanding and use of differential privacy. In particular, we seek to include people with expertise in:

  • optimization problems
  • programming tools and systems
  • benchmarking data privacy tools
  • statistical analysis
  • information theory
  • privacy and overfitting attacks
  • algorithmic design
  • trusted security and privacy models
  • algorithmic fairness
  • generalization properties of data analyses

The meeting will encourage exchanges between researchers representing these different research areas and communities. Another expected outcome of the meeting is new collaborations between participants who traditionally work in separate research areas.

Comparison with previous workshops

The topic of the proposed seminar is related to two previous seminars: 116, “Anonymization methods and inference attacks: theory and practice”, and 069, “Logic and Verification Methods in Security and Privacy”. The main distinctive feature of our proposed seminar is its focus on the notion of differential privacy and its applications, rather than on other notions (e.g., anonymity) that were the focus of those previous seminars.

Similar events focusing on differential privacy have been held at the Banff International Research Station (April/May 2018, co-organized by Thomas Steinke) and at DIMACS at Rutgers University (October 2012). In addition, a related program was held in spring 2019 at the Simons Institute for the Theory of Computing at UC Berkeley (Marco Gaboardi organized part of this program). Our seminar would be the first event of this kind in Asia.