Workshop date: 25 June 2014
Many machine learning settings include privacy and security requirements which are not well-addressed by traditional methods in security or machine learning. Privacy issues arise with the collection, analysis and sharing of personal data by corporations and governments. Further security considerations arise in applications such as biometric authentication, intrusion detection and response, malware classification and crowdsourcing. This has led to growing interest in the interplay between machine learning, game theory, cryptography, privacy, security, and statistics. Recent years have seen encouraging progress with the development of adversarial learning algorithms, low regret learning, differentially private learning methods, and analyses of possible attacks on machine learning algorithms.
Although significant progress has been made, numerous theoretical and practical challenges remain. First, several emerging research areas in data analysis, such as stream mining, mobility data mining, and social network analysis, require new theoretical and applied techniques to ensure privacy or security. Second, there is an urgent need for learning and mining methods with privacy and security guarantees for biomedical, financial, and other critical applications. Third, there is an emerging demand for security applications such as biometric authentication, malware detection, intrusion detection, and spam filtering. Finally, large scale systems require integrating large volumes of information and decentralised decision making in a secure and privacy-preserving manner over a network. In all cases, the strong interconnections between data mining and machine learning, cryptography, and game theory create the need for the development of multidisciplinary approaches on secure and private learning theory and practice.
The aim of this workshop is to bring together scientists and practitioners who conduct cutting edge research on privacy and security issues in machine learning to discuss the most recent advances in these research areas, identify open problem domains and research directions, and propose possible solutions. In particular, we invite interdisciplinary research on cryptography, data mining, game theory, machine learning, privacy, security and statistics. Moreover, we invite contributions on novel applications of machine learning for problems in which security and privacy issues are crucial. We invite both mature contributions as well as interesting preliminary results and descriptions of open problems on emerging research domains.
The workshop invites submissions, ranging from mature work to open problems, on any of the following core subjects:
1. Statistical approaches for privacy preservation.
2. Private decision making and mechanism design.
3. Metrics and evaluation methods for privacy and security.
4. Robust learning in adversarial environments.
5. Learning in unknown / partially observable stochastic games.
6. Distributed inference and decision making for security.
7. Application-specific privacy preserving machine learning and decision theory.
8. Secure multiparty computation and cryptographic approaches for machine learning.
9. Cryptographic applications of machine learning and decision theory.
10. Security applications: Intrusion detection and response, biometric authentication, fraud detection, spam filtering, captchas.
11. Security analysis of learning algorithms.
12. The economics of learning, security and privacy.
Current challenges in the fields of learning, security, and privacy can only be addressed in a multidisciplinary setting. Hence the goal of this workshop is to foster the exchange of ideas at the juncture of these fields and an open discussion of new challenges and applications.