Secure Computation
Master's Degree in Cybersecurity
Academic Year: 2023/2024
Lecturers: Prof. Fabio De Gaspari and Riccardo Lazzeretti
Overview
Secure multi-party computation allows a network of mutually distrustful players, each holding a secret input, to run an interactive protocol in order to evaluate a function on their joint inputs in a secure way, i.e. without revealing anything more than what the output of the function might reveal. Secure computation is an abstraction of several important applications, including electronic voting, digital auctions, zero knowledge, and more.
Neural networks are powerful function approximators: under mild conditions they can approximate a broad class of functions to arbitrary accuracy. Mathematically, they are compositions of simpler functions (layers) whose parameters are learned through gradient-based optimization. Neural networks are vulnerable to many attacks, with consequences ranging from poor learning or distorted classification results to breaches of privacy and disclosure of sensitive training data. Research on addressing these vulnerabilities is ongoing, and secure multi-party computation techniques are being applied in some settings.
The course is intended as an introduction to secure computation and covers both its theoretical foundations and its applications to practical settings such as distributed ledgers and cryptographic currencies. It also introduces neural networks from both theoretical and practical standpoints and analyzes the security of the functions learned by these networks through the study of different attack techniques and countermeasures.
Syllabus
Computing over encrypted data: Equivalence between semantic security and CPA security. Fully-homomorphic encryption (FHE) and somewhat-homomorphic encryption (SHE). The Learning with Errors (LWE) assumption. Constructing SHE from LWE. Bootstrapping theorems.
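To make the LWE-to-SHE connection concrete, here is a minimal sketch of symmetric LWE encryption of single bits with one homomorphic addition. The parameters (modulus, dimension, error bound) are purely illustrative and far too small to be secure; it only shows the structure b = ⟨a, s⟩ + e + m·⌊q/2⌋ and why adding ciphertexts adds (XORs) plaintexts.

```python
import random

Q = 2 ** 15          # ciphertext modulus (toy-sized, insecure)
N = 16               # dimension of the secret vector
E_BOUND = 4          # bound on the error magnitude

def keygen(rng):
    return [rng.randrange(Q) for _ in range(N)]

def encrypt(s, m, rng):
    """Encrypt bit m as (a, b) with b = <a, s> + e + m * Q/2 (mod Q)."""
    a = [rng.randrange(Q) for _ in range(N)]
    e = rng.randint(-E_BOUND, E_BOUND)
    b = (sum(ai * si for ai, si in zip(a, s)) + e + m * (Q // 2)) % Q
    return (a, b)

def decrypt(s, ct):
    a, b = ct
    d = (b - sum(ai * si for ai, si in zip(a, s))) % Q
    # d is near 0 for m = 0 and near Q/2 for m = 1; round accordingly.
    return 1 if Q // 4 < d < 3 * Q // 4 else 0

def add(ct1, ct2):
    """Homomorphic addition: decrypts to the XOR of the plaintext bits."""
    a1, b1 = ct1
    a2, b2 = ct2
    return ([(x + y) % Q for x, y in zip(a1, a2)], (b1 + b2) % Q)

rng = random.Random(0)
s = keygen(rng)
c0, c1 = encrypt(s, 0, rng), encrypt(s, 1, rng)
print(decrypt(s, add(c0, c1)))  # 0 XOR 1 = 1, computed without decrypting
```

Note how the errors of the two ciphertexts add up in `add`; once the accumulated error approaches Q/4 decryption fails, which is exactly the limitation that bootstrapping addresses.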
Secure two-party computation (2PC): Two-party protocols and definitions for passive and active security. Sequential composition theorem. Zero knowledge for NP. Coin tossing by telephone. Oblivious transfer. Yao's garbled circuits protocol and its optimizations.
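The core idea of Yao's protocol can be illustrated with a single garbled AND gate. The sketch below uses the simple try-all-rows technique with a zero tag to mark the correct row; real garbling schemes use the optimizations covered in the course (e.g. point-and-permute), and all names here are illustrative.

```python
import hashlib
import itertools
import os
import random

def H(ka, kb):
    """Row key derived by hashing the two input-wire labels."""
    return hashlib.sha256(ka + kb).digest()

def xor(x, y):
    return bytes(a ^ b for a, b in zip(x, y))

def garble_and():
    """Garble one AND gate: two random labels per wire, shuffled table."""
    labels = {w: (os.urandom(16), os.urandom(16)) for w in ("a", "b", "out")}
    table = []
    for va, vb in itertools.product((0, 1), repeat=2):
        kout = labels["out"][va & vb]
        # Encrypt the output label plus a 16-byte zero tag so the
        # evaluator can recognize which row decrypted correctly.
        table.append(xor(H(labels["a"][va], labels["b"][vb]), kout + bytes(16)))
    random.shuffle(table)  # hide which row encodes which input combination
    return labels, table

def evaluate(table, ka, kb):
    """Holding one label per input wire, recover only the output label."""
    for row in table:
        pt = xor(H(ka, kb), row)
        if pt[16:] == bytes(16):
            return pt[:16]
    raise ValueError("no row decrypted")

labels, table = garble_and()
out = evaluate(table, labels["a"][1], labels["b"][1])
print(out == labels["out"][1])  # True: 1 AND 1 = 1
```

In the full protocol, the evaluator obtains the label for its own input bit via oblivious transfer, so the garbler never learns which row was decrypted.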
Secure multi-party computation (MPC): Multi-party protocols and definitions for passive and active security. Secret sharing schemes. MPC with honest majority. The SPDZ protocol.
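As a warm-up for secret-sharing-based MPC, the following sketch shows additive secret sharing over Z_P and a secure sum: each party splits its input into random shares, the parties add shares locally, and only shares of the sum are ever recombined. The prime and inputs are illustrative.

```python
import random

P = 2 ** 61 - 1  # a prime modulus; shares are elements of Z_P

def share(secret, n, rng):
    """Split secret into n additive shares that sum to it mod P."""
    shares = [rng.randrange(P) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

rng = random.Random(42)
inputs = [10, 20, 12]                      # each party's private input
all_shares = [share(x, 3, rng) for x in inputs]
# Party j locally adds the j-th share of every input...
local_sums = [sum(col) % P for col in zip(*all_shares)]
# ...so only shares of the *sum* are ever combined.
print(reconstruct(local_sums))  # 42, with no single input revealed
```

Addition is "free" in this model; secure multiplication needs extra machinery (e.g. the preprocessed multiplication triples used by SPDZ).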
Neural Networks Basics: Security issues in neural networks. Introduction to machine learning (ML). Gradient-based optimization. Overfitting, regularization, and hyperparameters.
Deep Neural Networks (DNN): DNNs as function approximators. Logistic regression and gradient descent. Activation functions. Backpropagation. Convolutions.
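The interplay of logistic regression and gradient descent can be shown in a few lines: a 1-D model trained by descending the gradient of the cross-entropy loss. The toy data, learning rate, and step count are illustrative choices.

```python
import math

# Toy 1-D dataset: negative inputs labeled 0, positive inputs labeled 1.
xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
ys = [0, 0, 0, 1, 1, 1]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w, b, lr = 0.0, 0.0, 0.5
for _ in range(200):
    # Gradient of the mean cross-entropy loss: dL/dz = p - y.
    gw = sum((sigmoid(w * x + b) - y) * x for x, y in zip(xs, ys)) / len(xs)
    gb = sum((sigmoid(w * x + b) - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * gw
    b -= lr * gb

preds = [1 if sigmoid(w * x + b) > 0.5 else 0 for x in xs]
print(preds)  # [0, 0, 0, 1, 1, 1]: the toy data is classified correctly
```

Backpropagation generalizes exactly this gradient computation to compositions of many layers via the chain rule.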
Secure DNNs: Adversarial examples. Neural network poisoning and trojaning. Data privacy.
Federated Learning: Learning in a distributed way. MPC in federated learning (secure aggregation).
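A common secure-aggregation idea is pairwise masking: each pair of clients derives a shared mask, one adds it and the other subtracts it, so the masks cancel in the server's sum and only the aggregate survives. The sketch below uses seeded PRGs in place of a real pairwise key agreement; all names and parameters are illustrative.

```python
import random

P = 2 ** 31 - 1
updates = [5, 11, 7, 19]                  # each client's private update
n = len(updates)

# Shared seed per client pair (in practice derived via key agreement).
pair_rng = random.Random(7)
seeds = {(i, j): pair_rng.randrange(P)
         for i in range(n) for j in range(i + 1, n)}

def masked_update(i):
    """Client i's update plus (+/-) one mask per other client."""
    x = updates[i]
    for j in range(n):
        if j == i:
            continue
        key = (min(i, j), max(i, j))
        mask = random.Random(seeds[key]).randrange(P)
        x = (x + mask) % P if i < j else (x - mask) % P
    return x

# The server sums the masked updates; every mask appears once with +
# and once with -, so only the true sum remains.
total = sum(masked_update(i) for i in range(n)) % P
print(total)  # 42: the sum, with no individual update revealed
```

Practical schemes add dropout recovery (secret-sharing the seeds) so the sum is still recoverable when clients disconnect mid-round.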
Teaching Material
The slides for the course can be downloaded from here.
The following books are also suggested as further references:
[1] Yehuda Lindell (Editor). Tutorials on the Foundations of Cryptography (dedicated to Oded Goldreich). Springer, 2017.
[2] Carmit Hazay and Yehuda Lindell. Efficient Secure Two-Party Protocols. Springer, 2010.
[3] Ronald Cramer, Ivan Bjerre Damgård, Jesper Buus Nielsen. Secure Multiparty Computation and Secret Sharing. Cambridge University Press, 2015.
[4] Ian Goodfellow, Yoshua Bengio, Aaron Courville. Deep Learning. MIT Press, 2016. Available Online
[5] Michael Nielsen. Neural Networks and Deep Learning. 2019. Available Online
Logistics
Lecture times:
Monday 14:00 - 17:00 Aula G50, Viale Regina Elena 295, 00198 Rome.
Wednesday 13:00 - 15:00 Aula T1 Building E, 00198 Rome.
Classroom (class code: b4mll6r): https://classroom.google.com/c/NjY1MjQ2NTA4Nzc4?cjc=b4mll6r
Grading
Oral exam and project
Announcements
23/02/2024: The course will start on February 26, 2024.