## Yeonjong Shin

**Assistant Professor of Mathematical Sciences at KAIST**

**Contact**

Office: Rm 1405, E6

Phone: +82-42-350-2748

Email: yeonjong_shin AT kaist DOT ac DOT kr

**Employment**

• Assistant Professor (tenure-track), July 2022 - Present

Department of Mathematical Sciences,

KAIST, Daejeon, South Korea

• Prager Assistant Professor, July 2018 - June 2022

Division of Applied Mathematics,

Brown University, Providence, RI, USA

**Education**

• Ph.D. Mathematics, May 2018

The Ohio State University, Columbus, OH, USA

Advisor: Dongbin Xiu

• B.S. Mathematics and B.A. Economics, August 2013*

Yonsei University, Seoul, South Korea

*Military Service as a KATUSA, 2009-2011

**Research Interests**

• Mathematics of Machine Learning and Approximation Theory

• Scientific Computing, Stochastic Optimization, Uncertainty Quantification, and Data Science

**Publications** (Google Scholar)

[21] In preparation

[20] J. T. Lauzon, S-W. Cheung, **Y. Shin**, Y. Choi, D. M. Copeland, and K. Huynh, S-OPT: A points selection algorithm for hyper-reduction in reduced order models

*Submitted for publication.*

[19] M. Ainsworth and **Y. Shin**, Active Neuron Least Squares: A training method for multivariate rectified neural networks

*Submitted for publication.*

[18] **Y. Shin**, J. Darbon, G. E. Karniadakis, A Caputo fractional derivative-based algorithm for optimization

*Submitted for publication.* https://arxiv.org/abs/2104.02259

[17] B. Deng, **Y. Shin**, L. Lu, Z. Zhang, and G. E. Karniadakis, Convergence rate of DeepONets for learning operators arising from advection-diffusion equations,

*Submitted for publication.* https://arxiv.org/pdf/2102.10621

[16] **Y. Shin**, Z. Zhang, G. E. Karniadakis, Error estimates of residual minimization using neural networks for linear PDEs,

*Submitted for publication.* https://arxiv.org/abs/2010.08019

[15] Z. Zhang, **Y. Shin**, G. E. Karniadakis, GFINNs: GENERIC Formalism Informed Neural Networks for Deterministic and Stochastic Dynamical Systems,

*Accepted at Philos. Trans. R. Soc. A.* http://arxiv.org/abs/2109.00092

[14] **Y. Shin**, Effects of Depth, Width, and Initialization: A Convergence Analysis of Layer-wise Training for Deep Linear Networks,

*Anal. Appl., 20(1), pp. 73-119 (2022).* https://arxiv.org/abs/1910.05874

[13] A. D. Jagtap, **Y. Shin**, K. Kawaguchi, and G. E. Karniadakis, Deep Kronecker neural networks: A general framework for neural networks with adaptive activation functions,

*Neurocomputing, 468, pp. 165-180 (2022).* https://arxiv.org/pdf/2105.09513

[12] M. Ainsworth and **Y. Shin**, Plateau Phenomenon in Gradient Descent Training of ReLU networks: Explanation, Quantification, and Avoidance,

*SIAM J. Sci. Comput., 43(5), A3438-A3468 (2021).* https://arxiv.org/abs/2007.07213

[11] J. Hou, **Y. Shin**, and D. Xiu, Identification of Corrupted Data via k-means Clustering for Function Approximation,

*CSIAM Trans. Appl. Math., 2, pp. 81-107 (2021).*

[10] **Y. Shin**, J. Darbon, G. E. Karniadakis, On the convergence of physics informed neural networks for linear second-order elliptic and parabolic type PDEs,

*Commun. Comput. Phys., 28, pp. 2042-2074 (2020).* https://arxiv.org/abs/2004.01806

[9] **Y. Shin** and G. E. Karniadakis, Trainability of ReLU Networks and Data-dependent Initialization,

*Journal of Machine Learning for Modeling and Computing, 1(1), 39-74 (2020).* https://arxiv.org/abs/1907.09696

[8] L. Lu, **Y. Shin**, Y. Su, and G. E. Karniadakis, Dying ReLU and Initialization: Theory and Numerical Examples,

*Commun. Comput. Phys., 28, pp. 1671-1706 (2020).* https://arxiv.org/abs/1903.06733

[7] **Y. Shin**, K. Wu and D. Xiu, Sequential function approximation using randomized samples,

*J. Comput. Phys., 371, 363-381 (2018).*

[6] K. Wu, **Y. Shin** and D. Xiu, A randomized tensor quadrature method for high dimensional polynomial approximation,

*SIAM J. Sci. Comput., 39(5), A1811-A1833 (2017).*

[5] **Y. Shin** and D. Xiu, A randomized algorithm for multivariate function approximation,

*SIAM J. Sci. Comput., 39(3), A983-A1002 (2017).*

[4] L. Yan, **Y. Shin** and D. Xiu, Sparse approximation using l1-l2 minimization and its applications to stochastic collocation,

*SIAM J. Sci. Comput., 39(1), A229-A254 (2017).*

[3] **Y. Shin** and D. Xiu, Correcting data corruption errors for multivariate function approximation,

*SIAM J. Sci. Comput., 38(4), A2492-A2511 (2016).*

[2] **Y. Shin** and D. Xiu, On a near optimal sampling strategy for least squares polynomial regression,

*J. Comput. Phys., 326, 931-946 (2016).*

[1] **Y. Shin** and D. Xiu, Nonadaptive quasi-optimal points selection for least squares linear regression,

*SIAM J. Sci. Comput., 38(1), A385-A411 (2016).*

**Conference Presentations**

[18] (Virtual) SIAM Mathematics of Data Science, Sept 2022

[17] (Virtual) SIAM UQ, Atlanta, GA, April 2022

[16] (Virtual) SIAM Analysis of PDEs, Berlin, Germany, March 2022

[15] (Virtual) KSIAM Annual Meeting, Busan, South Korea, Dec 2021

[14] (Virtual) The 6th Annual Meeting of SIAM Central States Section, Oct 2021

[13] (Virtual) SIAM Southeastern Atlantic Section Conference, Auburn, AL, Sept 2021

[12] (Virtual) SIAM CSE 2021, March 2021

[11] (Virtual) Mathematical and Scientific Machine Learning, Princeton, July 2020

[10] (Virtual) SIAM Mathematics of Data Science, June 2020

[9] ICERM: Scientific Machine Learning, Jan 2019, Providence, RI, USA

[8] SIAM UQ 2018, April 2018, Garden Grove, CA, USA

[7] SIAM CSE 2017, Feb 2017, Atlanta, GA, USA

[6] 12th WCCM - 6th APCOM 2016, July 2016, Seoul, Korea

[5] 15th International Conference on Approximation Theory, May 2016, San Antonio, TX, USA

[4] SIAM UQ 2016, April 2016, Lausanne, Switzerland

[3] 14th Copper Mountain Conference on Iterative Methods, March 2016, CO, USA

[2] ICIAM 2015, August 2015, Beijing, China

[1] SIAM CSE 2015, March 2015, Salt Lake City, UT, USA

**Invited talks/seminars**

[22] (Virtual) CCMA Seminar, Penn State University, State College, PA, USA, May 2022.

[21] (Virtual) Department of Mathematics, FSU, Tallahassee, FL, USA, Jan 2022.

[20] (Virtual) Department of Mathematics, UCLA, Los Angeles, CA, USA, Jan 2022.

[19] Department of Mathematics, Portland State University, Portland, OR, USA, Jan 2022.

[18] Department of Mathematics, Lehigh University, Bethlehem, PA, USA, Dec 2021.

[17] Department of Scientific Computing, FSU, Tallahassee, FL, USA, Nov 2021.

[16] (Virtual) Comput. and Applied Math Seminar, Tufts, MA, USA, Oct 2021.

[15] (Virtual) DDPS Seminar, LLNL, July 2021. (Youtube)

[14] (Virtual) University of California, Riverside, CA, USA, May 2021.

[13] (Virtual) University of Texas at El Paso, Texas, USA, Apr 2021.

[12] (Virtual) RWTH Aachen University, Germany, Mar 2021.

[11] (Virtual) University of Iowa, Iowa, USA, Mar 2021.

[10] (Virtual) KAIST, Daejeon, Korea, Feb 2021.

[9] (Virtual) Helmholtz-Zentrum Dresden-Rossendorf, Germany, Sep 2020.

[8] (Virtual) Physics-Informed Learning Machines Webinar, PNNL, May 2020.

[7] Department of Computational Science & Engineering Seminar, Yonsei University, Aug 2019, Seoul, Korea.

[6] Department of Mathematical Sciences Seminar, Seoul National University, Aug 2019, Seoul, Korea.

[5] Department of Mathematics Seminar, Yonsei University, Jan 2019, Seoul, Korea.

[4] Applied Algebra and Optimization Research Center Seminar, Sungkyunkwan University, Jan 2019, Suwon, Korea.

[3] Applied Mathematics Colloquium, Brown University, Sep 2018, Providence, RI, USA.

[2] Spring School, University of South Carolina, Feb 2018, Columbia, SC, USA.

[1] Department of Mathematics Seminar, Sungkyunkwan University, July 2016, Suwon, Korea.

**Teaching**

**Brown University**

APMA 1650, APMA 1655 (Honors) Statistical Inference I [Lecture Notes]

APMA 1160 An Introduction to Numerical Optimization

APMA 1170 Introduction to Computational Linear Algebra

APMA 1210 Operations Research - Deterministic Models

APMA 1360 Applied Dynamical Systems