Berivan Isik

I am a third-year PhD student in the Electrical Engineering Department at Stanford University, advised by Tsachy Weissman and supported by the Stanford Graduate Fellowship (2019-2022). I received my MS degree from the Electrical Engineering Department of Stanford University in June 2021 and my BS degree from the Electrical and Electronics Engineering Department of Middle East Technical University in June 2019. I previously interned at Stanford in 2018 under the supervision of Ayfer Ozgur.

I have also been a student researcher at Google since February 2022. Previously, I spent summer 2021 at Google as a research intern hosted by Philip Chou.

Google Scholar

LinkedIn

Github

Twitter

berivan.isik [at] stanford [dot] edu

Interests

My research interests are information theory, coding theory, data compression, and machine learning. Recently, I have been working on learned data compression, model compression, 3D compression, federated learning, and compression for privacy, robustness, and fairness in machine learning.

Publications

  1. Berivan Isik, Tsachy Weissman, "Learning under Storage and Privacy Constraints" [PDF]

IEEE International Symposium on Information Theory (ISIT), 2022.


  2. Berivan Isik, Tsachy Weissman, Albert No, "An Information-Theoretic Justification for Model Pruning" [PDF] [code]

International Conference on Artificial Intelligence and Statistics (AISTATS), 2022.

Preliminary version presented at the Sparsity in Neural Networks Workshop, 2021. [Spotlight]


  3. Berivan Isik, Philip A. Chou, Sung Jin Hwang, Nick Johnston, George Toderici, "LVAC: Learned Volumetric Attribute Compression for Point Clouds using Coordinate Based Networks" [PDF]

Preprint, 2021.


  4. Berivan Isik, Kristy Choi, Xin Zheng, Tsachy Weissman, Stefano Ermon, H.-S. Philip Wong, Armin Alaghi, "Neural Network Compression for Noisy Storage Devices" [PDF]

Submitted to ACM TECS, 2022.

Preliminary version presented at the Deep Learning through Information Geometry Workshop, NeurIPS, 2020.


  1. Berivan Isik, "Neural 3D Compression via Model Compression" [PDF]

WiCV, CVPR, 2021.


  6. Leighton P. Barnes*, Huseyin A. Inan*, Berivan Isik*, Ayfer Ozgur (*equal contribution), "rTop-k: A Statistical Estimation Approach to Distributed SGD" [PDF]

IEEE Journal on Selected Areas in Information Theory, 2020.

Preliminary version presented at the Federated Learning for User Privacy and Data Confidentiality Workshop, ICML, 2020.

Workshops

  1. Stanford Center for Image Systems Engineering (SCIEN) Workshop. "Neural 3D Compression", Dec. 2021. [PDF]

  2. Stanford SystemX Conference. "Neural Network Compression for Noisy Storage Devices", Nov. 2021. Invited. [PDF]

  3. Sparsity in Neural Networks Workshop. "Neural Network Pruning via Rate-Distortion Theory", July 2021. [PDF]

  4. CVPR WiCV Workshop. "Neural 3D Compression via Model Compression", June 2021. [PDF]

  5. NeurIPS Deep Learning through Information Geometry Workshop. "Noisy Neural Network Compression for Analog Storage Devices", Dec. 2020. [PDF]

  6. NeurIPS WiML Workshop. "Noisy Neural Network Compression", Dec. 2020. [Poster]

  7. Stanford SystemX Conference. "Noisy Neural Network Compression for Analog Storage Devices", Nov. 2020. Invited.

  8. ICML Federated Learning for User Privacy and Data Confidentiality Workshop. "rTop-k: A Statistical Estimation Approach to Distributed SGD", July 2020. [PDF]

Invited Talks

  1. Google - Neural Compression Team. "LVAC: Learned Volumetric Attribute Compression for Point Clouds using Coordinate Based Networks", Sep. 2021.

  2. Stanford Compression Workshop. "An Information-Theoretic Approach to Neural Network Compression", Feb. 2021. [Video] [Slides]

  3. NeurIPS Meetup Turkey. "Noisy Neural Network Compression", Dec. 2020. [Video] (Turkish)

  4. Facebook Reality Labs - Research. "Robust Neural Network Compression", Nov. 2020.

Involvements

  • Co-organizer of the Women in Machine Learning (WiML) Workshop at ICML 2021.

  • Co-organizer of the ICML 2021 Workshop on Information-Theoretic Methods for Rigorous, Responsible, and Reliable Machine Learning (ITR3@ICML-21). [Link]

  • Reviewer for JSAIT, ISIT-22, ICML (2021-2022), NeurIPS (2021-2022), ICLR-22, AISTATS-22, ECCV-22, and ITW-22.

  • Grader for the course EE 382A: Parallel Processors Beyond Multicore Processing in Spring 2021.

  • Mentor for the STEM to SHTEM summer internship program for high school students in 2020.

Hobbies

I like photography (I hope you will enjoy my amateur photographs), literature, and Rock 'n' Roll dancing.