Kangwook Lee

Postdoctoral Researcher

Room 911, N1 building

291 Daehak-ro, Yuseong-gu

Daejeon, South Korea, 34141

+82-42-350-7529

I am a postdoctoral researcher at the Information and Electronics Research Institute at KAIST, working with Prof. Changho Suh. I obtained my PhD in May 2016 from the EECS department at UC Berkeley under the supervision of Prof. Kannan Ramchandran, my MS in EECS from UC Berkeley in 2012, and my BS in EE from KAIST in 2010.

Email: kw1jjang at kaist dot ac dot kr

Homepage: https://sites.google.com/site/kw1jjang/

Old one: http://people.eecs.berkeley.edu/~kw1jjang/

Google Scholar: http://scholar.google.com/citations?user=sCEl8r-n5VEC&hl=en


Updates

  • (Sep 2018) Our paper is accepted for NIPS 2018
    • K. Ahn, K. Lee, H. Cha, and C. Suh, "Binary Rating Estimation with Graph Side Information" (acceptance rate: 20.81%)
  • (Aug 2018) Our papers are accepted for The 56th Annual Allerton Conference on Communication, Control, and Computing
    • K. Ahn, K. Lee, H. Cha and C. Suh, "Binary Rating Estimation with Graph Side Information" (invited paper)
    • J. Yoon, K. Lee, and C. Suh, "On the Joint Recovery of Community Structure and Community Features"
  • (June 2018) Gave an invited talk
    • "Coding techniques for distributed computing and machine learning" @ 2018 IEEK Summer Conference
  • (May 2018) Our paper is accepted for IEEE J-STSP
    • K. Ahn, K. Lee, and C. Suh, "Hypergraph Spectral Clustering in the Weighted Stochastic Block Model"
  • (Mar 2018) Our new papers on Coded Computation are accepted for IEEE ISIT 2018
    • H. Park, K. Lee, J. Sohn, C. Suh, and J. Moon, "Hierarchical Coding for Distributed Computing"
    • T. Baharav, K. Lee, O. Ocal, and K. Ramchandran, "Straggler-proofing massive-scale distributed matrix multiplication with d-dimensional product codes"
  • (Mar 2018) Our new paper is accepted for ICLR 2018 Workshop
    • K. Lee, K. Lee, H. Kim, C. Suh, and K. Ramchandran, "SGD on Random Mixtures: Private Machine Learning under Data Breach Threats"
  • (Jan 2018) Our paper is accepted for ICLR 2018
    • K. Lee*, H. Kim*, and C. Suh, "Simulated+Unsupervised Learning With Adaptive Data Generation and Bidirectional Mappings"
  • (Jan 2018) Two papers will be presented at SysML 2018
    • K. Lee, K. Lee, H. Kim, C. Suh, and K. Ramchandran, "SGD on Random Mixtures: Private Machine Learning under Data-breach Threats"
    • J. Chung, K. Lee, R. Pedarsani, D. Papailiopoulos, and K. Ramchandran, "UberShuffle: Communication-efficient Data Shuffling for SGD via Coding Theory"
  • (Jan 2018) Our paper is on arXiv
    • H. Park, K. Lee, J. Sohn, C. Suh, and J. Moon, "Hierarchical Coding for Distributed Computing", under review. preprint.
  • (Dec 2017, Jan 2018) Gave three invited talks
    • "Speeding Up Distributed Machine Learning Using Codes" @ National Information Society Agency, Daegu, Korea, Jan 2018
    • "Speeding Up Distributed Machine Learning Using Codes" @ Daegu Gyeongbuk Institute of Science and Technology (DGIST), Daegu, Korea, Jan 2018
    • "Speeding Up Distributed Machine Learning Using Codes" @ Institute of New Media and Communications, Seoul National University, Seoul, Korea, Dec 2017
  • (Nov 2017) Our work is accepted for The Workshop on Machine Learning Systems at NIPS 2017
    • J. Chung, K. Lee, R. Pedarsani, D. Papailiopoulos, and K. Ramchandran, "UberShuffle: Communication-efficient Data Shuffling for SGD via Coding Theory"
  • (Oct 2017) Our work is presented at The 51st Asilomar Conference on Signals, Systems and Computers as an invited paper
    • J. Chung, K. Lee, R. Pedarsani, D. Papailiopoulos, and K. Ramchandran, "Coded Shuffling for Distributed Machine Learning: Theory and Practice"
  • (Sep 2017) Our work on "Large-scale and Interpretable Collaborative Filtering for Educational Data" is featured as a Research Highlight in the KAIST Annual R&D Report 2017: [Link]
  • (Sep 2017) Our paper is submitted to IEEE Transactions on Information Theory
    • K. Ahn*, K. Lee*, and C. Suh, "Community Recovery in Hypergraphs". preprint.
  • (Sep 2017) Our work on "Large-scale and Interpretable Collaborative Filtering for Educational Data" is featured in KAIST Breakthroughs, 2017 Fall issue: [Link]
  • (Aug 2017) Gave two workshop talks, one at a KDD 2017 workshop and one at an ICML 2017 workshop
  • (Aug 2017) Our paper is accepted for The 55th Annual Allerton Conference on Communication, Control, and Computing
    • G. Suh, K. Lee, and C. Suh, "Matrix Sparsification for Coded Matrix Multiplication"
  • (July 2017) Our paper is accepted for publication in IEEE Transactions on Information Theory
    • K. Lee, M. Lam, R. Pedarsani, D. Papailiopoulos, and K. Ramchandran, “Speeding Up Distributed Machine Learning Using Codes”
  • (June 2017) Our paper is accepted for The KDD 2017 Workshop on Advancing Education with Data
    • K. Lee, J. Chung, and C. Suh, "Large-scale and Interpretable Collaborative Filtering for Educational Data"
  • (June 2017) Gave two talks at IEEE International Symposium on Information Theory 2017
  • (June 2017) Gave a talk at Korea Institute of Communications and Information Sciences Summer Workshop 2017
  • (June 2017) Our work is presented by Prof. Kannan Ramchandran at the 2017 IEEE Communication Theory Workshop
    • "On coded computation for speeding up parallel multi-core machine learning"
  • (May 2017) Our paper is presented at IEEE International Conference on Communications 2017
    • K. Chandrasekher, K. Lee, P. Kairouz, R. Pedarsani, and K. Ramchandran, "Asynchronous and Noncoherent Neighbor Discovery for the IoT Using Sparse-Graph Codes"
  • (May 2017) Gave a talk on "Speeding Up Distributed Machine Learning Using Codes" at Naver
  • (Apr 2017) Our papers are accepted for IEEE International Symposium on Information Theory 2017
    • K. Lee, C. Suh, and K. Ramchandran, "High-Dimensional Coded Matrix Multiplication"
    • K. Lee, R. Pedarsani, D. Papailiopoulos, and K. Ramchandran, "Coded Computation for Multicore Setups"
    • K. Ahn, K. Lee, and C. Suh, "Information-theoretic Limits of Subspace Clustering"
  • (Mar 2017) Our paper is accepted for publication in IEEE Transactions on Information Theory
    • R. Pedarsani, D. Yin, K. Lee, and K. Ramchandran, “PhaseCode: Fast and Efficient Compressive Phase Retrieval based on Sparse-Graph-Codes”, to appear in IEEE Transactions on Information Theory.
  • (Feb 2017) Our paper is presented at Information Theory and Applications Workshop 2017
    • "Subspace clustering via hypergraph community recovery"
  • (Feb 2017) Our paper is accepted for publication in IEEE Transactions on Information Theory
    • K. Lee, N. Shah, L. Huang, and K. Ramchandran, “The MDS Queue: Analysing the Latency Performance of Codes”, to appear in IEEE Transactions on Information Theory.
  • (Jan 2017) Our paper is accepted for IEEE International Conference on Communications 2017
    • K. Chandrasekher, K. Lee, P. Kairouz, R. Pedarsani, and K. Ramchandran, "Asynchronous and Noncoherent Neighbor Discovery for the IoT Using Sparse-Graph Codes"
  • (Dec 2016) Gave a talk on "Speeding Up Distributed Computing Systems Using Codes" at KAIST Information Theory and Machine Learning Workshop
  • (Dec 2016) Our work was presented by Prof. Changho Suh at The 2016 Shannon Workshop
    • "High-dimensional Codes for Distributed Computing"
  • (Dec 2016) Gave a talk on "Learning Analytics: Collaborative Filtering or Regression With Experts?" at The NIPS 2016 Workshop on Machine Learning For Education
  • (Nov 2016) Our paper is published in Trends in Neuroscience and Education
    • H. Han, K. Lee, and F. Soylu, "Predicting long-term outcomes of educational interventions using evolutionary causal matrices and Markov chain"
  • (Nov 2016) Gave an invited talk on "Introduction to Machine Learning and Deep Learning" at National Information Society Agency, Daegu, Korea, Nov 2016.
  • (Nov 2016) Our paper is accepted for the Moral Development and Education Special Interest Group at the American Educational Research Association (AERA) 2017 Annual Meeting
    • H. Han, S. Thoma, F. Soylu, and K. Lee, "How to Make Moral Education More Effective?: From a Brain Study to Policy Making"
  • (Nov 2016) Our work was presented by Ramtin Pedarsani as an invited talk at The 50th anniversary of the Asilomar Conference on Signals, Systems, and Computers
    • K. Lee, M. Lam, R. Pedarsani, D. Papailiopoulos, and K. Ramchandran, "Codes Can Speed Up Large-Scale Distributed Computing"
  • (Oct 2016) Our paper is accepted for publication in IEEE/ACM Transactions on Networking
    • K. Lee, R. Pedarsani, and K. Ramchandran, “On Scheduling Redundant Requests with Cancellation Overheads”
  • (Oct 2016) Our paper is submitted to IEEE Transactions on Information Theory
    • K. Lee, M. Lam, R. Pedarsani, D. Papailiopoulos, and K. Ramchandran, “Speeding Up Distributed Machine Learning Using Codes”
  • (Oct 2016) Our paper is accepted as an oral presentation for The NIPS 2016 Workshop on Machine Learning For Education
    • K. Lee, J. Chung, Y. Cha, and C. Suh, "Learning Analytics: Collaborative Filtering or Regression With Experts?"
  • (Sep 2016) Gave a talk on "Community Recovery in Hypergraphs" at Allerton
    • K. Ahn, K. Lee, and C. Suh, "Community Recovery in Hypergraphs"


On Research

Active Research Projects:

  • Coding Theory for Distributed Machine Learning: Coded Computation
  • Machine Learning Algorithms and Information Theoretic Limits: Community Recovery, Recommendation Algorithms, and Deep Learning
  • Applications: Machine Learning for Education and Machine Learning for Autonomous Driving

Journal Publications (in preparation or under review):

  1. R. Pedarsani, K. Lee, and K. Ramchandran, “Sparse Covariance Estimation Based on Sparse-Graph Codes”, in preparation.
  2. K. Lee, K. Chandrasekher, R. Pedarsani, and K. Ramchandran, “SAFFRON: Sparse-Graph Code Framework for Group Testing”, submitted. arXiv preprint.
  3. K. Ahn*, K. Lee*, and C. Suh, "Community Recovery in Hypergraphs", submitted to IEEE Transactions on Information Theory. preprint.

Journal Publications

  1. K. Ahn, K. Lee, and C. Suh, "Hypergraph Spectral Clustering in the Weighted Stochastic Block Model", to appear in IEEE Journal of Selected Topics in Signal Processing.
  2. K. Lee, M. Lam, R. Pedarsani, D. Papailiopoulos, and K. Ramchandran, “Speeding Up Distributed Machine Learning Using Codes”, in IEEE Transactions on Information Theory, vol. 64, no. 3, pp. 1-16, March 2018.
  3. R. Pedarsani, D. Yin, K. Lee and K. Ramchandran, “PhaseCode: Fast and Efficient Compressive Phase Retrieval based on Sparse-Graph-Codes”, in IEEE Transactions on Information Theory, vol. 63, no. 6, pp. 3663-3691, June 2017.
  4. K. Lee, N. B. Shah, L. Huang and K. Ramchandran, "The MDS Queue: Analysing the Latency Performance of Erasure Codes," in IEEE Transactions on Information Theory, vol. 63, no. 5, pp. 2822-2842, May 2017.
  5. K. Lee, R. Pedarsani and K. Ramchandran, "On Scheduling Redundant Requests With Cancellation Overheads," in IEEE/ACM Transactions on Networking, vol. 25, no. 2, pp. 1279-1290, April 2017.
  6. N. B. Shah, K. Lee and K. Ramchandran, "When Do Redundant Requests Reduce Latency?," in IEEE Transactions on Communications, vol. 64, no. 2, pp. 715-722, Feb. 2016.
  7. H. Han, K. Lee, and F. Soylu, "Simulating outcomes of interventions using a multipurpose simulation program based on the Evolutionary Causal Matrices and Markov Chain", to appear in the Knowledge and Information Systems (KAIS) journal, 2017.
  8. H. Han, K. Lee, and F. Soylu, "Predicting long-term outcomes of educational interventions using evolutionary causal matrices and Markov chain", in Trends in Neuroscience and Education, vol. 5, no. 4, pp. 157-165, December 2016.

Peer-reviewed Conference & Workshop Proceedings:

  1. K. Ahn, K. Lee, H. Cha, and C. Suh, "Binary Rating Estimation with Graph Side Information", Neural Information Processing Systems (NIPS), Montreal, Canada, December, 2018
  2. K. Ahn, K. Lee, H. Cha and C. Suh, "Binary Rating Estimation with Graph Side Information", The 56th Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, USA, October, 2018 (invited paper)
  3. J. Yoon, K. Lee, and C. Suh, "On the Joint Recovery of Community Structure and Community Features", The 56th Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, USA, October, 2018
  4. H. Park, K. Lee, J. Sohn, C. Suh, and J. Moon, "Hierarchical Coding for Distributed Computing", IEEE International Symposium on Information Theory, Vail, Colorado, USA, June, 2018
  5. T. Baharav, K. Lee, O. Ocal, and K. Ramchandran, "Straggler-proofing massive-scale distributed matrix multiplication with d-dimensional product codes", IEEE International Symposium on Information Theory, Vail, Colorado, USA, June, 2018
  6. K. Lee, K. Lee, H. Kim, C. Suh, and K. Ramchandran, "SGD on Random Mixtures: Private Machine Learning under Data-breach Threats", International Conference on Learning Representations (ICLR) Workshop, Vancouver, BC, Canada, April, 2018
  7. K. Lee*, H. Kim*, and C. Suh, "Simulated+Unsupervised Learning With Adaptive Data Generation and Bidirectional Mappings", International Conference on Learning Representations (ICLR), Vancouver, BC, Canada, April, 2018
  8. K. Lee, K. Lee, H. Kim, C. Suh, and K. Ramchandran, "SGD on Random Mixtures: Private Machine Learning under Data-breach Threats", SysML 2018, Stanford, CA, February, 2018
  9. J. Chung, K. Lee, R. Pedarsani, D. Papailiopoulos, and K. Ramchandran, "UberShuffle: Communication-efficient Data Shuffling for SGD via Coding Theory", SysML 2018, Stanford, CA, February, 2018
  10. J. Chung, K. Lee, R. Pedarsani, D. Papailiopoulos, and K. Ramchandran, "UberShuffle: Communication-efficient Data Shuffling for SGD via Coding Theory", Neural Information Processing Systems (NIPS): Workshop on Machine Learning Systems, Long Beach, California, USA, December, 2017
  11. G. Suh, K. Lee, and C. Suh, "Matrix Sparsification for Coded Matrix Multiplication", The 55th Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, USA, October, 2017
  12. K. Lee, H. Kim, and C. Suh, "Crash to not crash: Playing video games to predict vehicle collisions," ICML Workshop on Machine Learning for Autonomous Vehicles, Sydney, Australia, August, 2017
  13. K. Lee, J. Chung, and C. Suh, "Large-scale and Interpretable Collaborative Filtering for Educational Data", KDD Workshop on Advancing Education with Data, Halifax, Canada, August, 2017. [Press: KAIST Breakthrough 2017 Fall] [Video]
  14. K. Lee, C. Suh, and K. Ramchandran, "High-Dimensional Coded Matrix Multiplication", IEEE International Symposium on Information Theory, Aachen, Germany, June, 2017
  15. K. Lee, R. Pedarsani, D. Papailiopoulos, and K. Ramchandran, "Coded Computation for Multicore Setups", IEEE International Symposium on Information Theory, Aachen, Germany, June, 2017
  16. K. Ahn, K. Lee, and C. Suh, "Information-theoretic Limits of Subspace Clustering", IEEE International Symposium on Information Theory, Aachen, Germany, June, 2017
  17. K. Chandrasekher, K. Lee, P. Kairouz, R. Pedarsani, and K. Ramchandran, "Asynchronous and Noncoherent Neighbor Discovery for the IoT Using Sparse-Graph Codes", IEEE International Conference on Communications (ICC), Paris, France, May, 2017.
  18. H. Han, S. Thoma, F. Soylu, and K. Lee, "How to Make Moral Education More Effective?: From a Brain Study to Policy Making", the Moral Development and Education Special Interest Group at the American Educational Research Association (AERA) Annual Meeting, San Antonio, TX, USA, April, 2017.
  19. K. Lee, J. Chung, Y. Cha, and C. Suh, "Learning Analytics: Collaborative Filtering or Regression With Experts?", Neural Information Processing Systems (NIPS): Workshop on Machine Learning for Education, Barcelona, Spain, December, 2016.
  20. K. Ahn, K. Lee, and C. Suh, "Community Recovery in Hypergraphs", The 54th Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, USA, September, 2016.
  21. K. Lee, M. Lam, R. Pedarsani, D. Papailiopoulos, and K. Ramchandran, “Speeding Up Distributed Machine Learning Using Codes”, IEEE International Symposium on Information Theory (ISIT), Barcelona, Spain, July, 2016.
  22. K. Lee, R. Pedarsani, and K. Ramchandran, “SAFFRON: Sparse-Graph Code Framework for Group Testing”, IEEE International Symposium on Information Theory (ISIT), Barcelona, Spain, July, 2016.
  23. K. Lee, M. Lam, R. Pedarsani, D. Papailiopoulos, and K. Ramchandran, “Speeding-up Distributed Machine Learning Using Codes”, Neural Information Processing Systems (NIPS): Workshop on Machine Learning Systems, Montreal, Canada, December, 2015.
  24. K. Lee, R. Pedarsani, and K. Ramchandran, “On Scheduling Redundant Requests with Cancellation Overheads”, The 53rd Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, USA, October, 2015.
  25. R. Pedarsani, K. Lee, and K. Ramchandran, “Sparse Covariance Estimation Based on Sparse-Graph Codes”, The 53rd Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, USA, October, 2015.
  26. D. Yin, K. Lee and K. Ramchandran, “Fast and Robust Compressive Phase Retrieval with Sparse-Graph Codes”, IEEE International Symposium on Information Theory (ISIT), Hong Kong, June, 2015.
  27. R. Pedarsani, K. Lee and K. Ramchandran, “Capacity-Approaching PhaseCode for Low-Complexity Compressive Phase Retrieval”, IEEE International Symposium on Information Theory (ISIT), Hong Kong, June, 2015.
  28. R. Pedarsani, K. Lee and K. Ramchandran, “PhaseCode: Fast and Efficient Compressive Phase Retrieval based on Sparse-Graph-Codes”, The 52nd Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, USA, October, 2014.
  29. N. Shah, K. Lee and K. Ramchandran, “The MDS Queue: Analysing the Latency Performance of Codes”, IEEE International Symposium on Information Theory (ISIT), Honolulu, HI, USA, July, 2014.
  30. N. Shah, K. Lee and K. Ramchandran, “When Do Redundant Requests Reduce Latency?”, The 51st Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, USA, October, 2013.
  31. K. Lee, L. Yan, A. Parekh and K. Ramchandran, “A VoD System for Massively Scaled, Heterogeneous Environments: Design and Implementation”, IEEE 21st International Symposium on Modeling, Analysis and Simulation of Computer and Telecommunication Systems, San Francisco, CA, USA, August, 2013. Best Paper Finalist. [code]
  32. K. Lee, H. Zhang, Z. Shao, M. Chen, A. Parekh and K. Ramchandran, “An Optimized Distributed Video-on-Demand Streaming System: Theory and Design”, The 50th Allerton Conference on Communication, Control and Computing, Monticello, IL, USA, October, 2012. (invited paper)
  33. S. Pawar, S. Rouayheb, H. Zhang, K. Lee and K. Ramchandran, “Codes for a Distributed Caching based Video-On-Demand System”, Asilomar Conference on Signals, Systems, and Computers, Pacific Grove, CA, USA, November, 2011.
  34. B. Nardelli, J. Lee, K. Lee, Y. Yi, S. Chong, E. Knightly and M. Chiang, “Experimental evaluation of optimal CSMA”, IEEE INFOCOM, Shanghai, China, April, 2011.

Technical Talks:

  • "Speeding Up Distributed Machine Learning Using Codes” @ National Information Society Agency, Daegu, Korea, Jan 2018
  • "Speeding Up Distributed Machine Learning Using Codes” @ Daegu Gyeongbuk Institute of Science and Technology (DGIST), Daegu, Korea, Jan 2018
  • "Speeding Up Distributed Machine Learning Using Codes” @ Institute of New Media and Communications, Seoul National University, Seoul, Korea, Dec 2017
  • "Matrix Completion with Graphs" @ UC Berkeley BASiCS seminar, Berkeley, November, 2017
  • "Large-scale and Interpretable Collaborative Filtering for Educational Data" @ KDD Workshop on Advancing Education with Data, Halifax, Canada, August, 2017
  • "Crash To Not Crash: Playing Video Games To Predict Vehicle Collisions" @ ICML Workshop on Machine Learning for Autonomous Vehicles 2017, Sydney, Australia, August, 2017
  • "High-Dimensional Coded Matrix Multiplication" @ IEEE International Symposium on Information Theory (ISIT), Aachen, Germany, June 2017
  • "Coded Computation for Multicore Setups" @ IEEE International Symposium on Information Theory (ISIT), Aachen, Germany, June 2017
  • "High-Dimensional Coded Matrix Multiplication" @ Korea Institute of Communications and Information Sciences Summer Workshop, Korea, June 2017
  • "Speeding Up Distributed Machine Learning Using Codes” @ Naver, May 2017. [video]
  • "Speeding Up Distributed Computing Systems Using Codes" @ KAIST Information Theory and Machine Learning Workshop, Daejeon, Korea, December 2016
  • "Learning Analytics: Collaborative Filtering or Regression With Experts?" @ Neural Information Processing Systems (NIPS): Workshop on Machine Learning for Education, Barcelona, Spain, December 2016
  • "Introduction to Machine Learning", @ National Information Society Agency, Daegu, Korea, November 2016
  • "Community Recovery in Hypergraphs" @ The 53rd Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, USA, September 2016
  • "Speeding Up Distributed Machine Learning Using Codes” @ IEEE International Symposium on Information Theory (ISIT), Barcelona, Spain, July 2016
  • "SAFFRON: Sparse-Graph Code Framework for Group Testing” @ IEEE International Symposium on Information Theory (ISIT), Barcelona, Spain, July 2016
  • "Speeding Up Distributed Machine Learning Using Codes” @ Samsung Electronics DMC R&D Center, June 2016
  • “Speeding-up Distributed Machine Learning Using Codes”, Graduation Day Poster, 2016 Information Theory and Applications Workshop, La Jolla, CA, Feb 2016.
  • “Sub-linear time algorithms for sparse signal recovery based on sparse-graph codes”, Institute of New Media and Communications @ Seoul National University (SNU), Seoul, Korea, Jan 2016.
  • “Speeding-up Distributed Machine Learning Using Codes”, Spotlight presentation, Neural Information Processing Systems (NIPS): Workshop on Machine Learning Systems, Montreal, Canada, Dec. 2015.
  • “On Scheduling Redundant Requests with Cancellation Overheads”, The 53rd Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, October 2015.
  • “Fast and Robust Compressive Phase Retrieval with Sparse-Graph Codes”, IEEE International Symposium on Information Theory (ISIT), Hong Kong, June 2015.
  • “Capacity-Approaching PhaseCode for Low-Complexity Compressive Phase Retrieval”, IEEE International Symposium on Information Theory (ISIT), Hong Kong, June 2015.
  • “A robust scalable framework for content distribution: caching by the people for the people”, University of Seoul, May 2015.
  • “A robust scalable framework for content distribution: caching by the people for the people”, IEEE Communication Theory Workshop (CTW 2015), Dana Point, CA, May 2015.
  • “The MDS Queue: Analysing the Latency Performance of Codes”, IEEE International Symposium on Information Theory (ISIT), July 2014.
  • “Latency Study of Distributed Storage Systems: Role of Redundancy and Coding”, Laboratory of Network Architecture Design and Analysis @ KAIST, Daejeon, Korea, May 2014.
  • “When Do Redundant Requests Reduce Latency?”, DIMACS Workshop on Algorithms for Green Data Storage, Rutgers University, NJ, December 2013.
  • “The MDS Queue: Analysing the Latency Performance of Codes”, IEEE International Conference on Big Data, Santa Clara, CA, October 2013.
  • “A VoD System for Massively Scaled, Heterogeneous Environments: Design and Implementation”, IEEE 21st International Symposium on Modeling, Analysis and Simulation of Computer and Telecommunication Systems, San Francisco, CA, August 2013. Best Paper Finalist.


On My Background

Selected Awards and Honors

  • The Outstanding Graduate Student Instructor Award, 2016, http://gsi.berkeley.edu/programs-services/award-programs/ogsi/ogsi-2016/
  • Best Paper Award Finalist, IEEE MASCOTS 2013
  • KFAS Fellowship, 2010 - 2015, Korea Foundation for Advanced Studies (KFAS)
  • (Presidential Award) Korea Talent Award (대한민국 인재상), 2009, Korea Foundation for Advancement of Science & Creativity (KOFAC)

Work Experience

  • Postdoctoral Researcher, Information and Electronics Research Institute at KAIST, 2016.06 - present

Education

  • Doctor of Philosophy (Ph.D.), University of California, Berkeley, 2010.08 - 2016.05
    • Major : Electrical Engineering and Computer Science
    • Advisor : Prof. Kannan Ramchandran
    • Thesis : Speeding Up Distributed Storage And Computing Systems Using Codes
  • Master of Science (MS), University of California, Berkeley, 2010.08 - 2012.12
    • Major : Electrical Engineering and Computer Science
    • Advisor : Prof. Kannan Ramchandran
    • Thesis : An Optimized Distributed Video-on-Demand Streaming System: Theory and Design
    • [code]
  • Bachelor of Science (BS), Korea Advanced Institute of Science and Technology (KAIST), 2006.03 - 2010.05
    • Major : Electrical Engineering
    • Advisor : Prof. Sae-Young Chung, Prof. Yung Yi
    • GPA : Overall 4.19/4.30, Major 4.27/4.30
  • Seoul Science High School, 2004.03 - 2006.02
    • Early graduation for academic excellence (completed in 2 years)

Teaching Experience:

  • Head GSI, EECS 126 Probability and Random Processes, Fall 2015.
  • Head GSI, EECS 126 Probability and Random Processes, Fall 2014.

Industrial Work Experience

  • Software Engineer, Lytmus Inc., 2013.06 - 2013.09
  • R&D Intern, Samsung Electronics, 2009.07
  • R&D Intern, LG Display, 2008.06 - 2008.08


On My Personal Life

Miscellaneous Talks

  • Invited Speaker, Career Seminar hosted by YEHS (Young Engineers Honor Society), May 2017
  • Invited Speaker, Freshman Seminar @ KAIST, October 2016
  • Invited Speaker, Leadership Forum hosted by SisaIN, 2015
  • Invited Speaker, Leadership Forum hosted by SisaIN, 2014

Hackathons and startup weekends

  • 3rd Prize Winner, Bay BitHack (Bitcoin Hackathon), Analysis of Bitcoin transaction networks (CreBit)
  • 3rd Prize Winner, Springboard 2012 (Startup Plan Competition), TIDE Institute
  • 2nd Prize Winner, Springboard 2011 (Startup Plan Competition), TIDE Institute
  • I built PopTube, a free YouTube aggregator for K-pop lovers, with Jayoung Choi

e-Sports

  • I was once a professional StarCraft 2 player [link]
  • For several years, I played StarCraft 1 and 2 for KAIST and UC Berkeley
    • Ranked 2nd in the North America SC2 league
  • I also won a few amateur StarCraft leagues, including this one, which was broadcast on OnGameNet