Contributions to Probabilistic and Statistical Foundations of Differential Privacy
Publisher
Université d'Ottawa / University of Ottawa
Abstract
It is undeniable that the rapid advancement of data analytics and artificial intelligence over the past decade has transformed many industries. However, these advancements have also highlighted the need for robust privacy-preserving techniques to protect personal data from misuse. Furthermore, increasing regulatory scrutiny and public awareness underscore the importance of protecting individual privacy as data-driven technologies evolve. This thesis addresses these concerns by exploring and advancing the field of differential privacy. Its primary goal is to provide a mathematical and statistical framework for differential privacy that better balances data utility and privacy. Indeed, the majority of research to date focuses on privacy guarantees, with little emphasis on data utility. Accordingly, the thesis investigates privacy guarantees across a range of settings and statistical problems. We propose several novel mechanisms that integrate concepts from statistical disclosure control, statistics, time series, and machine learning with classical differential privacy. To this end, we explore various extensions of ε- and (ε, δ)-differentially private mechanisms. We prove the validity of the proposed mechanisms, from both the privacy and the data utility perspective, within a rigorous mathematical framework. The theoretical results are complemented by a variety of numerical experiments that validate the underlying intuitions. The findings indicate that our contributions significantly improve data utility while offering strong privacy guarantees, and can therefore be practically implemented in real-world settings.
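For readers unfamiliar with the ε-differential-privacy mechanisms the abstract refers to, a minimal sketch of the classical Laplace mechanism (a standard textbook construction, not one of the thesis's proposed mechanisms) illustrates the basic privacy-utility trade-off: noise is calibrated to the query's sensitivity divided by the privacy parameter ε.

```python
import random


def laplace_mechanism(true_value, sensitivity, epsilon, rng=random):
    """Release true_value perturbed by Laplace(0, sensitivity/epsilon) noise.

    This achieves epsilon-differential privacy for a query whose output
    changes by at most `sensitivity` between neighbouring datasets.
    Smaller epsilon means more noise (stronger privacy, lower utility).
    """
    scale = sensitivity / epsilon
    # A Laplace(0, b) draw is the difference of two independent Exp(1/b) draws.
    noise = rng.expovariate(1.0 / scale) - rng.expovariate(1.0 / scale)
    return true_value + noise


# Example: privatize a count query (sensitivity 1) with epsilon = 0.5.
private_count = laplace_mechanism(true_value=42, sensitivity=1, epsilon=0.5)
```

The function names and parameters above are illustrative; the thesis's own mechanisms extend ε- and (ε, δ)-differential privacy in more specialized settings.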
Keywords
Differential Privacy, Privacy, Probability, Statistics, Disclosure Control
