Private and Secure Machine Learning 2017
@ICML 2017 Workshop, Sydney, August 11, 2017
There are two complementary approaches to private and secure machine learning: differential privacy guarantees the privacy of the subjects of the training data with respect to the output of a differentially private learning algorithm, while cryptographic approaches guarantee secure operation of the learning process in a potentially distributed environment. The aim of this workshop is to bring together researchers interested in private and secure machine learning, and to stimulate interaction that advances either perspective or combines the two.
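To make the first perspective concrete, here is a minimal sketch of the classic Laplace mechanism for differential privacy. It is an illustrative toy, not code from the workshop: the dataset, query, and epsilon value are all invented for the example. A counting query has sensitivity 1, so adding Laplace noise with scale 1/epsilon to the count satisfies epsilon-differential privacy.

```python
import math
import random

def laplace_noise(scale, rng=random):
    """Sample from Laplace(0, scale) via the inverse-CDF transform."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon, rng=random):
    """Release a counting query under epsilon-differential privacy.

    Adding or removing one record changes the count by at most 1
    (sensitivity 1), so Laplace noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon, rng)

# Toy dataset: ages; query = how many people are over 40 (true answer: 5).
ages = [23, 45, 31, 67, 52, 38, 29, 71, 44, 36]
noisy = private_count(ages, lambda a: a > 40, epsilon=0.5)
```

The released value is unbiased but noisy; a smaller epsilon (stronger privacy) yields larger noise.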
Keywords: differential privacy, encrypted data, homomorphic encryption
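For the cryptographic perspective, the keyword "homomorphic encryption" can be illustrated with a toy implementation of the Paillier cryptosystem, which is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts. This is a self-contained sketch with deliberately tiny primes; real deployments use primes of roughly 1024 bits or more.

```python
from math import gcd
import random

def paillier_keygen(p, q):
    """Toy Paillier key generation from two distinct primes."""
    n = p * q
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)  # lcm(p-1, q-1)
    mu = pow(lam, -1, n)                          # valid because g = n + 1
    return (n, n + 1), (lam, mu, n)               # (public, private)

def encrypt(pub, m, rng=random):
    """Encrypt m: c = g^m * r^n mod n^2, with random r coprime to n."""
    n, g = pub
    n2 = n * n
    r = rng.randrange(1, n)
    while gcd(r, n) != 1:
        r = rng.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(priv, c):
    """Decrypt c: m = L(c^lam mod n^2) * mu mod n, L(x) = (x - 1) // n."""
    lam, mu, n = priv
    n2 = n * n
    L = (pow(c, lam, n2) - 1) // n
    return (L * mu) % n

pub, priv = paillier_keygen(47, 59)
c1, c2 = encrypt(pub, 15), encrypt(pub, 27)
c_sum = (c1 * c2) % (pub[0] ** 2)  # homomorphic addition of plaintexts
decrypt(priv, c_sum)  # → 42
```

This additive property is what lets, for example, encrypted gradient contributions be aggregated by an untrusted server without revealing any individual contribution. (`pow(lam, -1, n)` requires Python 3.8+.)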
Mailing list for Private and Secure ML community
As discussed at the end of the workshop, the Google group Private Multi-Party Machine Learning serves as a communication channel for the community.
PSML 2017 is a one-day workshop on August 11, 2017. The workshop includes two invited talks and several contributed talks. See our CFP for details about abstract submission for the contributed talks. The workshop accepts both unpublished and published work, and submission of unpublished work does not constitute publication.
Important dates
- Abstract submission deadline: 26 May 2017
- Notification of acceptance: 19 June 2017
- Registration ends: see ICML site
- Workshop date: 11 August 2017
- Antti Honkela (University of Helsinki)
- Kana Shimizu (Waseda University)
- Samuel Kaski (Aalto University)
- Erman Ayday (Bilkent University)
- Borja Balle (Amazon)
- Emiliano De Cristofaro (University College London)
- Onur Dikmen (University of Helsinki)
- Christos Dimitrakakis (Chalmers University / University of Lille / Harvard University)
- Marco Gaboardi (University at Buffalo, SUNY)
- Koji Nuida (National Institute of Advanced Industrial Science and Technology)
- Jun Sakuma (University of Tsukuba)