
DP-Finder: Finding Differential Privacy Violations by Sampling and Optimization

Published: 15 October 2018

Abstract

We present DP-Finder, a novel approach and system that automatically derives lower bounds on the differential privacy enforced by algorithms. Lower bounds are practically useful as they can show tightness of existing upper bounds or even identify incorrect upper bounds. Computing a lower bound involves searching for a counterexample, defined by two neighboring inputs and a set of outputs, that identifies a large privacy violation. This is an inherently hard problem as finding such a counterexample involves inspecting a large (usually infinite) and sparse search space. To address this challenge, DP-Finder relies on two key insights. First, we introduce an effective and precise correlated sampling method to estimate the privacy violation of a counterexample. Second, we show how to obtain a differentiable version of the problem, enabling us to phrase the search task as an optimization objective to be maximized with state-of-the-art numerical optimizers. This allows us to systematically search for large privacy violations. Our experimental results indicate that DP-Finder is effective in computing differential privacy lower bounds for a number of randomized algorithms. For instance, it finds tight lower bounds in algorithms that obfuscate their input in a non-trivial fashion.
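The first key insight, correlated sampling, can be illustrated with a small self-contained sketch. The snippet below is our own illustration, not the paper's implementation: it estimates the empirical privacy violation ln(Pr[A(x1) ∈ S] / Pr[A(x2) ∈ S]) for the Laplace mechanism, where the same uniform draw drives the mechanism on both neighboring inputs. This makes the two indicator sums highly correlated, so the estimated ratio is far less noisy than with independent samples. The mechanism, the output set S, and all function names are illustrative assumptions.

```python
import math
import random

def laplace_mech(x, eps, u):
    """Laplace mechanism for a sensitivity-1 query, sampled by inverse CDF
    from a shared uniform draw u in (0, 1)."""
    b = 1.0 / eps  # noise scale
    return x - b * math.copysign(1.0, u - 0.5) * math.log(1.0 - 2.0 * abs(u - 0.5))

def estimate_violation(x1, x2, eps, in_S, n=200_000, seed=0):
    """Estimate ln(Pr[A(x1) in S] / Pr[A(x2) in S]) by correlated sampling:
    the same draw u is reused for both neighboring inputs."""
    rng = random.Random(seed)
    c1 = c2 = 0
    for _ in range(n):
        # Clamp away from {0, 1} so the inverse CDF never takes log(0).
        u = min(max(rng.random(), 1e-12), 1.0 - 1e-12)
        c1 += in_S(laplace_mech(x1, eps, u))
        c2 += in_S(laplace_mech(x2, eps, u))
    return math.log((c1 / n) / (c2 / n))

# Neighboring inputs differing by the sensitivity 1; output set S = [1, inf).
lb = estimate_violation(1.0, 0.0, eps=0.5, in_S=lambda y: y >= 1.0)
```

For this witness the estimate converges to ln(0.5 / (0.5 * exp(-0.5))) = 0.5, matching the eps = 0.5 upper bound of the Laplace mechanism, i.e., the lower bound found is tight. The paper's second insight, replacing the indicator-based counts with a differentiable surrogate so that a numerical optimizer can search over (x1, x2, S), is not shown here.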

Supplementary Material

MP4 File (p508-bichsel.mp4)




      Published In

      CCS '18: Proceedings of the 2018 ACM SIGSAC Conference on Computer and Communications Security
      October 2018
      2359 pages
ISBN: 9781450356930
DOI: 10.1145/3243734
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery, New York, NY, United States



      Author Tags

      1. differential privacy
      2. lower bounds
      3. optimization
      4. sampling

      Qualifiers

      • Research-article

      Funding Sources

      • ERC Starting Grant 2015

      Conference

      CCS '18

      Acceptance Rates

CCS '18 paper acceptance rate: 134 of 809 submissions (17%)
Overall acceptance rate: 1,261 of 6,999 submissions (18%)


      Article Metrics

• Downloads (last 12 months): 86
• Downloads (last 6 weeks): 14
Reflects downloads up to 11 Dec 2024


Cited By

• (2025) Auditing privacy budget of differentially private neural network models. Neurocomputing 614, 128756. DOI: 10.1016/j.neucom.2024.128756. Online: Jan 2025.
• (2024) Curator Attack: When Blackbox Differential Privacy Auditing Loses Its Power. Proceedings of the 2024 ACM SIGSAC Conference on Computer and Communications Security, 3540-3554. DOI: 10.1145/3658644.3690367. Online: 2 Dec 2024.
• (2024) QueryCheetah: Fast Automated Discovery of Attribute Inference Attacks Against Query-Based Systems. Proceedings of the 2024 ACM SIGSAC Conference on Computer and Communications Security, 3451-3465. DOI: 10.1145/3658644.3690272. Online: 2 Dec 2024.
• (2024) PP-CSA: Practical Privacy-Preserving Software Call Stack Analysis. Proceedings of the ACM on Programming Languages 8 (OOPSLA1), 1264-1293. DOI: 10.1145/3649856. Online: 29 Apr 2024.
• (2024) DP-Discriminator: A Differential Privacy Evaluation Tool Based on GAN. Proceedings of the 21st ACM International Conference on Computing Frontiers, 285-293. DOI: 10.1145/3649153.3649211. Online: 7 May 2024.
• (2024) Distributed Differential Privacy via Shuffling Versus Aggregation: A Curious Study. IEEE Transactions on Information Forensics and Security 19, 2501-2516. DOI: 10.1109/TIFS.2024.3351474.
• (2024) Eureka: A General Framework for Black-box Differential Privacy Estimators. 2024 IEEE Symposium on Security and Privacy (SP), 913-931. DOI: 10.1109/SP54263.2024.00166. Online: 19 May 2024.
• (2024) Lower Bounds for Rényi Differential Privacy in a Black-Box Setting. 2024 IEEE Symposium on Security and Privacy (SP), 951-971. DOI: 10.1109/SP54263.2024.00134. Online: 19 May 2024.
• (2024) Synthesizing Tight Privacy and Accuracy Bounds via Weighted Model Counting. 2024 IEEE 37th Computer Security Foundations Symposium (CSF), 449-463. DOI: 10.1109/CSF61375.2024.00048. Online: 8 Jul 2024.
• (2024) Federated learning with differential privacy via fast Fourier transform for tighter-efficient combining. Scientific Reports 14 (1). DOI: 10.1038/s41598-024-77428-0. Online: 5 Nov 2024.
