Privacy-Preserving Machine Learning
Large language models such as ChatGPT are highly useful, but they raise privacy concerns that cannot be overlooked. In particular, because user input is transmitted to and processed on a remote server, many companies have been hesitant to adopt these models. In this context, Privacy-Preserving Machine Learning (PPML) is gaining attention as a field that leverages cryptographic techniques such as homomorphic encryption and secure multi-party computation to protect data. PPML plays a crucial role when handling sensitive data, such as medical records and financial information: it preserves data confidentiality while enabling machine learning models to be trained and used safely.
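The core idea behind homomorphic encryption is that a server can compute on ciphertexts without ever seeing the plaintexts. A minimal sketch of this property uses textbook RSA, which happens to be multiplicatively homomorphic; the tiny primes and unpadded encryption here are purely illustrative and not secure, and real PPML systems use dedicated schemes instead:

```python
# Toy illustration (NOT secure): textbook RSA is multiplicatively
# homomorphic -- multiplying two ciphertexts yields a valid ciphertext
# of the product of the two plaintexts.

p, q = 61, 53                       # toy primes, far too small for real use
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+)

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

a, b = 7, 6
c = (encrypt(a) * encrypt(b)) % n   # server multiplies ciphertexts only
assert decrypt(c) == a * b          # owner of d recovers 42
```

The server performing the multiplication never learns 7 or 6, only their encryptions; this is the pattern PPML generalizes to full model training and inference.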
Advanced Cryptography
Traditional encryption is primarily used to transmit data securely over public networks. In recent years, however, advanced cryptographic techniques have been actively researched that enable fine-grained control over data access and computation on encrypted data, all while maintaining strong security. Our goal is to introduce these new functionalities while fully meeting rigorous mathematical security standards. Representative technologies in this field include Homomorphic Encryption (HE), secure Multi-Party Computation (MPC), and Functional Encryption (FE).
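Among these, MPC can be illustrated with its simplest building block, additive secret sharing: each party splits its private value into random shares so that no individual share reveals anything, yet the shares can be combined to compute a joint result. The party roles and modulus below are illustrative assumptions, not a specific protocol:

```python
# Minimal sketch of additive secret sharing over a prime field,
# the building block of many MPC protocols.
import random

P = 2**61 - 1  # public prime modulus (illustrative choice)

def share(value: int, n_parties: int) -> list[int]:
    """Split value into n_parties shares that sum to value mod P."""
    shares = [random.randrange(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

# Three parties (e.g. hospitals) each secret-share a private count.
inputs = [120, 75, 230]
all_shares = [share(v, 3) for v in inputs]

# Each party locally adds the one share it received from every input.
partial_sums = [sum(col) % P for col in zip(*all_shares)]

# Combining the partial sums reveals only the total, never the inputs.
total = sum(partial_sums) % P
assert total == sum(inputs)   # 425
```

No single party ever holds more than one random-looking share of each input, yet the sum is computed exactly; richer MPC protocols extend this idea to multiplication and arbitrary functions.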
Post-Quantum Cryptography
Post-Quantum Cryptography (PQC) is an area of cryptography developed to address the risk that existing cryptographic methods will become vulnerable as quantum computers advance. Current public-key cryptosystems, such as RSA and ECC (Elliptic Curve Cryptography), could be broken efficiently by a sufficiently powerful quantum computer running Shor's algorithm. Post-Quantum Cryptography aims to create encryption and signature algorithms that remain secure even against quantum attacks. Notable approaches include lattice-based encryption, hash-based signatures, and multivariate cryptography, all designed to offer strong security in the face of quantum computing threats.
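The lattice-based approach can be sketched with a toy version of Regev-style encryption built on the Learning With Errors (LWE) problem, the hardness assumption behind standardized schemes such as ML-KEM (Kyber). The parameters below (n=8, q=3329, noise in {-1,0,1}) are toy assumptions chosen so the example runs quickly; they offer no real security:

```python
# Toy Regev-style LWE encryption of a single bit (NOT secure parameters).
import random

n, q, m = 8, 3329, 16              # dimension, modulus, number of samples

def keygen():
    s = [random.randrange(q) for _ in range(n)]            # secret vector
    A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
    e = [random.choice([-1, 0, 1]) for _ in range(m)]      # small noise
    b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q
         for i in range(m)]                                # b = A*s + e
    return s, (A, b)

def encrypt(pk, bit):
    A, b = pk
    subset = [i for i in range(m) if random.random() < 0.5]
    u = [sum(A[i][j] for i in subset) % q for j in range(n)]
    v = (sum(b[i] for i in subset) + bit * (q // 2)) % q   # bit in high half
    return u, v

def decrypt(s, ct):
    u, v = ct
    d = (v - sum(u[j] * s[j] for j in range(n))) % q       # bit*(q//2) + noise
    return 1 if q // 4 < d < 3 * q // 4 else 0

s, pk = keygen()
assert all(decrypt(s, encrypt(pk, b)) == b for b in (0, 1))
```

Decryption works because the accumulated noise (at most m in magnitude here) stays far below q/4, so the result lands near 0 for bit 0 and near q/2 for bit 1; recovering the secret from (A, b) alone is the LWE problem, which is believed hard even for quantum computers.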