To help developers train AI models with strong privacy guarantees, Google just released a machine learning library called TensorFlow Privacy. The library, which is open source, can be downloaded from GitHub.
Beyond training AI models with privacy protections, TensorFlow Privacy aims to “advance the state-of-the-art in machine learning with strong privacy guarantees.”
In a blog post published by Google Brain Product Manager Carey Radebaugh and Google Research Scientist Ulfar Erlingsson, the duo wrote:
“Modern machine learning is increasingly applied to create amazing new technologies and user experiences, many of which involve training machines to learn responsibly from sensitive data, such as personal photos or email. Ideally, the parameters of trained machine-learning models should encode general patterns rather than facts about specific training examples.”
Google plans to turn TensorFlow Privacy into a hub for the best techniques for training machine learning models with strong privacy guarantees.
Machine Learning Library
TensorFlow Privacy operates on the principle of differential privacy. This is a statistical technique that maximizes the accuracy of a model while limiting how much can be learned about any individual user’s data.
Differential privacy ensures that an AI model does not encode information unique to an individual user, reducing the risk that a breach of the model could expose a user’s identity.
Instead of memorizing and storing individual records, a model trained with differential privacy learns only from patterns that appear across many users’ data.
For instance, Gmail’s Smart Reply feature is trained on the text people type into their emails. The differential privacy techniques Google uses, however, prevent personal details typed by one person from surfacing as a Smart Reply suggestion in a stranger’s inbox.
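To make the idea concrete, the sketch below shows the core mechanism usually described for differentially private training: each example’s gradient is clipped to a fixed norm so no single user can dominate an update, and random noise is added before averaging. This is an illustrative NumPy sketch of the general technique, not code from the TensorFlow Privacy library; the function and parameter names are chosen here for explanation only.

```python
import numpy as np

def dp_average_gradients(per_example_grads, l2_norm_clip=1.0, noise_multiplier=1.1):
    """Illustrative differentially private gradient averaging:
    clip each example's gradient, sum, add Gaussian noise, then average."""
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Bound each example's influence on the update (its sensitivity).
        clipped.append(g * min(1.0, l2_norm_clip / (norm + 1e-12)))
    total = np.sum(clipped, axis=0)
    # Noise scaled to the clipping bound masks any single example's contribution.
    noise = np.random.normal(0.0, noise_multiplier * l2_norm_clip, size=total.shape)
    return (total + noise) / len(per_example_grads)

# Example: privately average gradients contributed by three "users".
grads = [np.array([0.5, -1.2]), np.array([3.0, 0.1]), np.array([-0.4, 0.8])]
print(dp_average_gradients(grads))
```

Because the update depends only on clipped, noised aggregates, swapping or removing any one user’s data changes the result very little, which is exactly the guarantee differential privacy formalizes.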
Google hopes that by sharing TensorFlow Privacy, developers will integrate it into other machine learning tools or even improve its capabilities. Using the library requires only simple code changes and some hyperparameter tuning, making it accessible to both new and experienced developers.
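In practice, the “simple code changes” typically amount to swapping a standard Keras optimizer for its differentially private counterpart and choosing a couple of privacy hyperparameters. The sketch below assumes the DPKerasSGDOptimizer exported by recent versions of the tensorflow_privacy package; exact import paths, class names, and argument defaults may differ across library versions, so treat this as an outline rather than copy-paste code.

```python
import tensorflow as tf
# Assumption: recent TensorFlow Privacy releases expose DPKerasSGDOptimizer at the top level;
# older releases place DP optimizers under different module paths.
from tensorflow_privacy import DPKerasSGDOptimizer

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(100,)),
    tf.keras.layers.Dense(10),
])

# The main code change: replace the standard SGD optimizer with its DP counterpart
# and set the privacy-related hyperparameters.
optimizer = DPKerasSGDOptimizer(
    l2_norm_clip=1.0,        # bound on each example's gradient norm
    noise_multiplier=1.1,    # amount of noise relative to the clipping bound
    num_microbatches=32,     # should evenly divide the batch size
    learning_rate=0.15,
)

# Per-example (unreduced) losses are needed so gradients can be clipped individually.
loss = tf.keras.losses.SparseCategoricalCrossentropy(
    from_logits=True, reduction=tf.keras.losses.Reduction.NONE)

model.compile(optimizer=optimizer, loss=loss, metrics=["accuracy"])
```

The hyperparameter tuning mentioned above mostly concerns the clipping norm and noise multiplier, which trade model accuracy against the strength of the privacy guarantee.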