Nikita Kalinin

PhD
Institute of Science and Technology Austria (ISTA)
Data-Adaptive Differential Privacy

In an era of growing demand for data, protecting the privacy of individuals is a principal concern. Differential Privacy (DP) offers a compromise between leveraging new data for analysis and protecting sensitive user information by adding noise to sensitive estimates, but this trade-off often falls short of the needs of data analysts and machine learning practitioners, preventing DP from being widely adopted. One promising direction for improving the effectiveness of these methods is data-adaptive DP. By adapting privacy guarantees to the dataset at hand, we can potentially achieve improved guarantees at the same noise level, under the assumption that real-world data is reasonably "nice". In this PhD topic proposal, our goal is to explore the realm of data-adaptive differential privacy, with a focus on expanding its applicability to machine learning.
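As a concrete illustration of the standard, non-adaptive baseline that data-adaptive methods aim to improve upon, below is a minimal sketch of the classic Laplace mechanism. The function name and parameters are illustrative, not part of the proposal: noise with scale sensitivity/epsilon is added to a query answer, giving epsilon-DP regardless of how "nice" the actual data is.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng=random):
    # Laplace mechanism: adding Laplace(0, sensitivity / epsilon) noise
    # to a query whose output changes by at most `sensitivity` between
    # neighboring datasets satisfies epsilon-differential privacy.
    scale = sensitivity / epsilon
    # Sample Laplace(0, scale) by inverse-CDF from u ~ Uniform(-0.5, 0.5).
    u = rng.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    noise = -scale * sign * math.log(1.0 - 2.0 * abs(u))
    return true_value + noise

# Example: privately release a count (sensitivity 1) with epsilon = 0.5.
private_count = laplace_mechanism(true_value=42, sensitivity=1.0, epsilon=0.5)
```

Note that the noise scale here is fixed in advance by the worst-case sensitivity; a data-adaptive approach would instead calibrate the noise (or the reported guarantee) to properties of the observed dataset.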

Track:
Academic Track