Berivan Isik
I am a PhD student in the Electrical Engineering Department at Stanford University, advised by Tsachy Weissman and Sanmi Koyejo, and affiliated with the SAIL and StatsML groups. My research is supported by the Stanford Graduate Fellowship (2019-2023), a Google PhD Fellowship (2023-2026), and a Meta research grant. I received my MS degree from Stanford University in June 2021 and my BS degree from Middle East Technical University in June 2019, both in Electrical Engineering.
Previously, I visited Stanford in 2018 as an undergraduate researcher under the supervision of Ayfer Ozgur and Peter Kairouz. In the summer of 2021, I worked at Google as a research intern hosted by Philip Chou and George Toderici. From February to September 2022, I did a second research internship at Google under the supervision of Philip Chou, Onur Guleryuz, and Danhang Tang. From October 2022 to February 2023, I worked at Amazon as an applied scientist intern hosted by Dmitri Pavlichin. In the summer of 2023, I was a visiting researcher in Nicolas Papernot's group at the Vector Institute for Artificial Intelligence.
Since October 2023, I have been working at Google Research as a student researcher under the supervision of Natalia Ponomareva and Sergei Vassilvitskii.
Collaborators (past & present): Nicolas Papernot (UofT & Vector Institute), Ce Zhang (ETH Zurich), Peter Kairouz (Google), Ahmad Beirami (Google), Peter Richtarik (KAUST), Ayfer Ozgur (Stanford), Arindam Banerjee (UIUC), Deniz Gunduz (Imperial College London), Philip Chou (Google), George Toderici (Google).
berivan.isik [at] stanford [dot] edu
Interests
My research interests include scalable and trustworthy machine learning, privacy, fairness, robustness, compression, and information theory. Recently, I have been working on differential privacy, model compression, federated learning, transfer learning, data validation for foundation models, efficient training/finetuning of large language models, and learned compression.
Publications
- Highlights:
Exact Optimality of Communication-Privacy-Utility Tradeoffs in Distributed Mean Estimation [PDF] [code]
Berivan Isik, Wei-Ning Chen, Ayfer Ozgur, Tsachy Weissman, Albert No
Conference on Neural Information Processing Systems (NeurIPS), 2023.
Preliminary version presented at the ICML 2023 Federated Learning Workshop and the Theory and Practice of Differential Privacy (TPDP) Workshop in 2023.
Sparse Random Networks for Communication-Efficient Federated Learning [PDF] [code]
Berivan Isik*, Francesco Pase*, Deniz Gunduz, Tsachy Weissman, Michele Zorzi
International Conference on Learning Representations (ICLR), 2023.
Preliminary version presented at the International Workshop on Federated Learning: Recent Advances and New Challenges, NeurIPS, 2022. [Oral]
Communication-Efficient Federated Learning through Importance Sampling [PDF]
Berivan Isik*, Francesco Pase*, Deniz Gunduz, Sanmi Koyejo, Tsachy Weissman, Michele Zorzi
Preprint, 2023.
Preliminary version presented at the Workshop on Federated Learning and Analytics, ICML, 2023.
An Information-Theoretic Justification for Model Pruning [PDF] [code]
Berivan Isik, Tsachy Weissman, Albert No
International Conference on Artificial Intelligence and Statistics (AISTATS), 2022.
Preliminary version presented at the Sparsity in Neural Networks Workshop, 2021. [Spotlight talk]
- Others:
An Information-Theoretic Understanding of Maximum Manifold Capacity Representations [PDF]
Berivan Isik*, Rylan Schaeffer*, Victor Lecomte*, Mikail Khona, Yann LeCun, Andrey Gromov, Ravid Shwartz-Ziv, Sanmi Koyejo
Preliminary version accepted at the Workshop on Unifying Representations in Neural Models (UniReps), NeurIPS, 2023. [Oral]
Preliminary version accepted at the Workshop on Information-Theoretic Principles in Cognitive Systems (InfoCog), NeurIPS, 2023. [Spotlight talk]
Preliminary version accepted at two other NeurIPS 2023 Workshops: Self-Supervised Learning - Theory and Practice (SSL), Symmetry and Geometry in Neural Representations (NeurReps).
GPT-Zip: Deep Compression of Finetuned Large Language Models [PDF]
Berivan Isik*, Hermann Kumbong*, Wanyi Ning*, Xiaozhe Yao*, Sanmi Koyejo, Ce Zhang
Preliminary version presented at the Workshop on Efficient Systems for Foundation Models, ICML, 2023.
Lossy Compression of Noisy Data for Private and Data-Efficient Learning [PDF]
Berivan Isik, Tsachy Weissman
IEEE Journal on Selected Areas in Information Theory (JSAIT), 2023.
Sandwiched Video Compression: Efficiently Extending the Reach of Standard Codecs with Neural Wrappers [PDF]
Berivan Isik, Onur Guleryuz, Danhang Tang, Jonathan Taylor, Philip A. Chou
IEEE International Conference on Image Processing (ICIP), 2023.
Neural Network Compression for Noisy Storage Devices [PDF]
Berivan Isik, Kristy Choi, Xin Zheng, Tsachy Weissman, Stefano Ermon, H.-S. Philip Wong, Armin Alaghi
ACM Transactions on Embedded Computing Systems (TECS), 2023.
Preliminary version presented at the Deep Learning through Information Geometry Workshop, NeurIPS, 2020.
Learning under Storage and Privacy Constraints [PDF1] [PDF2 (longer)]
Berivan Isik, Tsachy Weissman
IEEE International Symposium on Information Theory (ISIT), 2022.
LVAC: Learned Volumetric Attribute Compression for Point Clouds using Coordinate Based Networks [PDF] [code]
Berivan Isik, Philip A. Chou, Sung Jin Hwang, Nick Johnston, George Toderici
Frontiers in Signal Processing (FSP), 2022.
Upper Bounds on the Rate of Uniformly-Random Codes for the Deletion Channel [PDF]
Berivan Isik, Francisco Pernice, Tsachy Weissman
Submitted to IEEE Transactions on Information Theory (TIT), 2023.
Neural 3D Compression via Model Compression [PDF]
Berivan Isik
WiCV, CVPR, 2021.
rTop-k: A Statistical Estimation Approach to Distributed SGD [PDF]
Berivan Isik*, Leighton P. Barnes*, Huseyin A. Inan*, Ayfer Ozgur
IEEE Journal on Selected Areas in Information Theory (JSAIT), 2020.
Preliminary version presented at the Federated Learning for User Privacy and Data Confidentiality, ICML, 2020. [Long Oral]
Patents
"Learned Volumetric Attribute Compression Using Coordinate-Based Networks", US Patent 17/708,628.
Philip A. Chou, Berivan Isik, Sung Jin Hwang, Nick Johnston, George Toderici
Invited Talks
Columbia University - NYC Privacy Day. "Exact Optimality of Communication-Privacy-Utility Tradeoffs in Distributed Mean Estimation", Dec. 2023.
Google - Federated Learning Seminar. "Exact Optimality of Communication-Privacy-Utility Tradeoffs in Distributed Mean Estimation", Oct. 2023.
Vector Institute for Artificial Intelligence. "Sparse Random Networks for Communication-Efficient Federated Learning", Sep. 2023.
Video Quality Expert Group (VQEG). "Sandwiched Video Compression: Efficiently Extending the Reach of Standard Codecs with Neural Wrappers", June 2023.
Flower Monthly. "Sparse Random Networks for Communication-Efficient Federated Learning", June 2023.
Federated Learning One World (FLOW) Seminar. "Sparse Random Networks for Communication-Efficient Federated Learning", March 2023. [Slides]
Google - Federated Learning Seminar. "Sparse Random Networks for Communication-Efficient Federated Learning", Jan. 2023. [Slides]
Bilkent University - Electrical Engineering Seminars. "Sparsity in Neural Networks", Jan. 2023. [Slides]
Middle East Technical University - Electrical Engineering Seminars. "Sparsity in Neural Networks", Dec. 2022. [Slides]
Google - Sparsity Reading Group. "An Information-Theoretic Justification for Model Pruning", Nov. 2022.
Google - Perception. "Learned Sandwiched Video Compression", Oct. 2022.
Google - Information Theory Reading Group. "An Information-Theoretic Justification for Model Pruning", Sep. 2022.
Google - Information Theory Reading Group. "LCoN: Learning Under Storage and Privacy Constraints", July 2022.
Google - Neural Compression Team. "LVAC: Learned Volumetric Attribute Compression for Point Clouds using Coordinate Based Networks", Sep. 2021.
Stanford Compression Workshop. "An Information-Theoretic Approach to Neural Network Compression", Feb. 2021. [Video] [Slides]
NeurIPS Meetup Turkey. "Noisy Neural Network Compression", Dec. 2020. [Video] (Turkish)
Facebook Reality Labs - Research. "Robust Neural Network Compression", Nov. 2020.
Workshops
ICML Workshop on Efficient Systems for Foundation Models. "GPT-Zip: Deep Compression of Finetuned Large Language Models", July 2023.
ICML Workshop on Federated Learning. "Exact Optimality in Communication-Privacy-Utility Tradeoffs", July 2023. [PDF]
ICML Workshop on Federated Learning. "Leveraging Side Information for Communication-Efficient Federated Learning", July 2023. [PDF]
Vector Institute Machine Learning Security and Privacy Workshop. "Exact Optimality of Communication-Privacy-Utility Tradeoffs in Distributed Mean Estimation", July 2023. [PDF]
Simons Institute Workshop on Information Theoretic Methods for Trustworthy Machine Learning. "Exact Optimality of Communication-Privacy-Utility Tradeoffs in Distributed Mean Estimation", May 2023. [PDF]
Information Theory and Applications Workshop (ITA). "An Information-Theoretic Justification for Model Pruning", Feb. 2023. [PDF]
NeurIPS Workshop on Federated Learning. "Efficient Federated Random Subnetwork Training", Dec. 2022. [PDF]
Stanford Center for Image Systems Engineering (SCIEN) Workshop. "Neural 3D Compression", Dec. 2021. [PDF]
Stanford SystemX Conference. "Neural Network Compression for Noisy Storage Devices", Nov. 2021. Invited. [PDF]
Sparsity in Neural Networks Workshop. "Neural Network Pruning via Rate-Distortion Theory", July 2021. [PDF]
CVPR WiCV Workshop. "Neural 3D Compression via Model Compression", June 2021. [PDF]
NeurIPS Deep Learning through Information Geometry Workshop. "Noisy Neural Network Compression for Analog Storage Devices", Dec. 2020. [PDF]
NeurIPS WiML Workshop. "Noisy Neural Network Compression", Dec. 2020. [PDF]
Stanford SystemX Conference. "Noisy Neural Network Compression for Analog Storage Devices", Nov. 2020. Invited. [PDF]
ICML Federated Learning for User Privacy and Data Confidentiality Workshop. "rTop-k: A Statistical Estimation Approach to Distributed SGD", July 2020. [PDF]
Involvements
Organizer of the ICML 2023 Workshop on Neural Compression: From Information Theory to Applications. [Link]
Organizer of the Women in Machine Learning (WiML) Workshop at ICML 2021.
Organizer of the ICML 2021 Workshop on Information-Theoretic Methods for Rigorous, Responsible, and Reliable Machine Learning (ITR3@ICML-21). [Link]
Member of the Stanford Faculty Search Committee, 2024.
Reviewer for:
International Conference on Machine Learning (ICML) 2021, 2022 (outstanding reviewer award), 2023, 2024
Conference on Neural Information Processing Systems (NeurIPS) 2021, 2022, 2023 (top reviewer)
International Conference on Learning Representations (ICLR) 2022, 2023, 2024
International Conference on Artificial Intelligence and Statistics (AISTATS) 2022, 2023, 2024
Journal of Machine Learning Research (JMLR)
IEEE Conference on Secure and Trustworthy Machine Learning (SaTML) 2024
European Conference on Computer Vision (ECCV) 2022
IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2023-2024
IEEE/CVF International Conference on Computer Vision (ICCV) 2023
Data Compression Conference (DCC) 2024
SIAM Journal on Mathematics of Data Science (SIMODS)
IEEE Journal on Selected Areas in Information Theory (JSAIT)
IEEE International Symposium on Information Theory (ISIT) 2022-2023
IEEE Information Theory Workshop (ITW) 2022
IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)
Reviewer for the ICLR Blogposts track, 2023-2024.
Program committee member for:
NeurIPS 2023 Workshop, Self-Supervised Learning - Theory and Practice
NeurIPS 2023 Workshop, UniReps: Unifying Representations in Neural Models
NeurIPS 2023 Workshop, R0-FoMo: Robustness of Few-shot and Zero-shot Learning in Foundation Models
NeurIPS 2023 Workshop on Federated Learning in the Age of Foundation Models
ICML 2023 Workshop on New Frontiers in Adversarial Machine Learning
ICML 2023 Workshop on Knowledge and Logical Reasoning in the Era of Data-Driven Learning
ICML 2023 Workshop on Federated Learning and Analytics in Practice
NeurIPS 2022 Workshop on Federated Learning: Recent Advances and New Challenges
Grader for the course EE 382A: Parallel Processors Beyond Multicore Processing in Spring 2021.
Mentor for STEM to SHTEM summer internship program for high school students in 2020.
Hobbies
Photography: my amateur photographs