There are two complementary approaches to private and secure machine learning: differential privacy guarantees the privacy of the subjects of the training data with respect to the output of a differentially private learning algorithm, while cryptographic approaches guarantee secure operation of the learning process in a potentially distributed environment. The aim of this workshop is to bring together researchers interested in private and secure machine learning and to stimulate interaction that advances either perspective or combines the two.
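To make the first notion concrete, the classic Laplace mechanism answers a numeric query with ε-differential privacy by adding noise calibrated to the query's sensitivity. The sketch below is purely illustrative (the function names and parameters are our own, not part of any workshop material) and shows the mechanism for a counting query, whose sensitivity is 1:

```python
import random


def laplace_noise(scale):
    # The difference of two i.i.d. exponentials with mean `scale`
    # is distributed as Laplace(0, scale).
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)


def private_count(records, predicate, epsilon):
    """Answer a counting query with epsilon-differential privacy.

    A counting query has sensitivity 1: adding or removing one record
    changes the true count by at most 1, so Laplace noise with scale
    1/epsilon suffices for an epsilon-DP guarantee.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller ε means a stronger privacy guarantee but larger noise; the released count concentrates around the true count as ε grows.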
Keywords: differential privacy, encrypted data, homomorphic encryption
As discussed at the end of the workshop, the Google group Private Multi-Party Machine Learning serves as a channel for communication within the community.
PSML 2017 is a one-day workshop held on August 11, 2017. The program includes two invited talks and several contributed talks. See our CFP for details on abstract submission for the contributed talks. The workshop accepts both unpublished and published work; submission of unpublished work does not constitute publication.