Expanding our Differential Privacy Library
All developers have a responsibility to treat data with care and respect. Differential privacy helps organizations derive insights from data while simultaneously ensuring that those results do not allow any individual’s data to be distinguished or re-identified. This principled approach supports data computation and analysis across many of Google’s core products and features.
Last summer, Google open sourced our foundational differential privacy library so developers and organizations around the world can benefit from this technology. Today, we're announcing Go and Java versions of our library, Privacy on Beam, an end-to-end solution for differential privacy, and new tools to help developers implement this technology effectively.
We've listened to feedback from our developer community, and as of today developers can perform differentially private analysis in Java and Go. We're working to bring these two libraries to full feature parity with C++.
We want all developers to have access to differential privacy, regardless of their level of expertise. Our new Privacy on Beam framework captures years of Googler developer experience and efficiency improvements in a comprehensive, easy-to-use solution that handles computation end-to-end. Built on Apache Beam, Privacy on Beam reduces implementation mistakes and takes care of the steps essential to differential privacy, including noise addition, partition selection, and contribution bounding. If you're new to Apache Beam or differential privacy, our codelab can get you started.
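Privacy on Beam performs steps like these automatically inside the pipeline. To illustrate what one of them involves, here is a minimal, standard-library-only Go sketch of contribution bounding; the `Record` type and `boundContributions` function are hypothetical illustrations, not part of the Privacy on Beam API.

```go
package main

import "fmt"

// Record pairs a user ID with one value that the user contributed.
type Record struct {
	UserID string
	Value  float64
}

// boundContributions limits each user's influence on a sum in two ways:
// it keeps at most maxRecords records per user and clamps each kept value
// into [lower, upper]. Bounding contributions caps the sensitivity of the
// final aggregate, which in turn determines how much noise must be added.
func boundContributions(records []Record, maxRecords int, lower, upper float64) []Record {
	perUser := make(map[string]int)
	var bounded []Record
	for _, r := range records {
		if perUser[r.UserID] >= maxRecords {
			continue // drop extra records from this user
		}
		perUser[r.UserID]++
		v := r.Value
		if v < lower {
			v = lower
		} else if v > upper {
			v = upper
		}
		bounded = append(bounded, Record{r.UserID, v})
	}
	return bounded
}

func main() {
	records := []Record{
		{"alice", 3}, {"alice", 250}, {"alice", 7}, // alice contributed 3 records
		{"bob", -10},
	}
	// Keep at most 2 records per user, each clamped into [0, 100].
	for _, r := range boundContributions(records, 2, 0, 100) {
		fmt.Println(r.UserID, r.Value) // alice 3, alice 100, bob 0
	}
}
```

Doing this by hand is easy to get subtly wrong, which is exactly the kind of mistake an end-to-end framework is designed to prevent.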
Tracking privacy budgets is another challenge developers face when implementing differential privacy. So, we're also releasing a new Privacy Loss Distribution tool for tracking privacy budgets. With this tool, developers can maintain an accurate estimate of the total cost to user privacy for collections of differentially private queries, and better evaluate the overall impact of their pipelines. Privacy Loss Distribution supports widely used mechanisms (such as Laplace, Gaussian, and randomized response) and can scale to hundreds of compositions.
We hope these new languages, tools, and features unlock differential privacy for even more developers. Continue to share your stories and suggestions with us at email@example.com—your feedback will help inform our future differential privacy launches and updates.
Software Engineers: Yurii Sushko, Daniel Simmons-Marengo, Christoph Dibak, Damien Desfontaines, Maria Telyatnikova
Research Scientists: Pasin Manurangsi, Ravi Kumar, Sergei Vassilvitskii, Alex Kulesza, Jenny Gillenwater, Kareem Amin