Answer questions using data you cannot see.

We seek to make privacy-preserving AI easy by extending the major deep learning frameworks (PyTorch, TensorFlow, and Keras) with privacy techniques such as federated learning, homomorphic encryption, secure multi-party computation, and differential privacy.

Furthermore, we seek to make these privacy techniques easy to deploy by building integrations that allow training across Cloud, Android, iOS, CPU, GPU, and JavaScript (web) technologies.

Let's make the world more privacy-preserving!
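As a flavor of the secure multi-party computation mentioned above, here is a minimal sketch of additive secret sharing, the building block PySyft-style SMPC rests on. This is an illustrative toy, not the PySyft API: each party holds one random-looking share, and parties can add their shares locally so that the reconstructed sum equals the sum of the secrets, with no single party ever seeing a plaintext value.

```python
import random

PRIME = 2**61 - 1  # field modulus; a Mersenne prime chosen here purely for illustration

def share(secret, n_parties=3):
    """Split a secret into n additive shares that sum to it mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    """Recover the secret by summing all shares mod PRIME."""
    return sum(shares) % PRIME

# Each party adds its shares of x and y locally;
# the combined shares reconstruct to x + y.
x_shares, y_shares = share(5), share(7)
z_shares = [(a + b) % PRIME for a, b in zip(x_shares, y_shares)]
assert reconstruct(z_shares) == 12  # no single party ever saw 5 or 7
```

Addition of shares is entirely local, which is what makes operations like secure aggregation of model updates cheap; multiplication requires an extra protocol round and is where real SMPC frameworks do most of their work.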



  • deep learning
  • federated learning
  • homomorphic encryption
  • secure multi-party computation
  • differential privacy


  • Other
  • privacy
  • artificial intelligence

OpenMined 2020 Projects

  • Anshuman Singh
    Implement Auto-Scaling of PyGrid servers on Google Cloud
    The audience of PySyft largely consists of people who would like to train their model on private data that reside on other devices/locations. Right...
  • Ravikant Singh
    Implement Fan-Vercauteren Homomorphic Encryption Scheme in PySyft
    FV (Fan-Vercauteren) Homomorphic Encryption scheme is one of the leading approaches in homomorphic encryption. Homomorphic encryption is a form of...
  • Afzaal Hussain
    Performing a security audit on privacy-preserving, distributed learning methodologies running on PyGrid with respect to GDPR requirements
    In order to bridge the gap between industry and new technology, this project will supply governance models which are relevant to Privacy-Preserving...
  • ArchitG
    Wrap Open-License Zero-Knowledge Proof Library
    Zero-knowledge proofs have an important role to play in the future of verified machine learning prediction. However, no deep learning framework has...
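One of the projects above implements the FV (Fan-Vercauteren) scheme, which is lattice-based. As a much simpler illustration of the additive homomorphism such schemes provide, here is a toy Paillier cryptosystem (a different, classical scheme, shown only because it fits in a few lines): multiplying two ciphertexts yields an encryption of the sum of the plaintexts. The tiny primes are for demonstration only and offer no security.

```python
import math
import random

# Toy Paillier key material (tiny primes for illustration; real keys use 2048+ bit moduli)
p, q = 17, 19
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)  # lcm(p-1, q-1)
mu = pow(lam, -1, n)  # valid because g = n + 1

def encrypt(m):
    """Encrypt m < n with fresh randomness r coprime to n."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """Standard Paillier decryption: m = L(c^lam mod n^2) * mu mod n."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# Additive homomorphism: multiplying ciphertexts adds the plaintexts.
c = (encrypt(3) * encrypt(4)) % n2
assert decrypt(c) == 7
```

FV supports both additions and a bounded number of multiplications on encrypted data, which is why it (rather than Paillier) is the target for encrypted deep learning workloads.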