**The Homepage of Jiange Li**

I am a faculty member at the Institute for Advanced Study in Mathematics, Harbin Institute of Technology. Previously, I was a postdoctoral fellow hosted by Dr. Ori Gurel-Gurevich and Dr. Ohad Noy Feldheim at the Einstein Institute of Mathematics, the Hebrew University of Jerusalem. Before that, I worked with Prof. Muriel Médard at the Research Laboratory of Electronics, MIT. I obtained my Ph.D. in mathematics from the University of Delaware under the supervision of Prof. Wenbo Li [Memorial, Obituary] and Prof. Mokshay Madiman. I can be reached at jiange.li@hit.edu.cn.

My research interests lie at the interface of analysis and probability, including information-theoretic inequalities, high-dimensional phenomena in probability theory and convex geometry, geometric functional inequalities, and the analysis of Boolean functions.

**Preprints:**

- Long-term balanced allocation via thinning (with O. N. Feldheim and O. Gurel-Gurevich)

**Journal publications:**

- Boolean functions: Noise stability, non-interactive correlation distillation, and mutual information (with M. Médard), *IEEE Trans. Inform. Theory*, 67(2): 778-789, February 2021.
- Concentration of information content for convex measures (with M. Fradelizi and M. Madiman), *Electron. J. Probab.*, 25(20): 1-22, 2020.
- Load balancing under d-thinning (with O. N. Feldheim), *Electron. Commun. Probab.*
- Further investigations of Rényi entropy power inequalities and an entropic characterization of s-concave densities (with A. Marsiglietti and J. Melbourne), *Geometric Aspects of Functional Analysis* – Israel Seminar (GAFA) 2017-2019, Volume II, pages 95-123, Lecture Notes in Mathematics 2266.
- Large deviations for conditional guesswork, *Stat. Probab. Lett.*, 153: 7-14, 2019.
- Capacity-achieving guessing random additive noise decoding (GRAND) (with K. Duffy and M. Médard), *IEEE Trans. Inform. Theory*, 65(7): 4023-4040, July 2019.
- A combinatorial approach to small ball inequalities for sums and differences (with M. Madiman), *Comb. Probab. Comput.*, 28(1): 100-129, 2019.
- Rényi entropy power inequality and a reverse, *Studia Math.*, 242(3): 303-319, 2018.
- Entropies of weighted sums in cyclic groups and applications to polar codes (with E. Abbe and M. Madiman), *Entropy*, Special Issue on "Entropy and Information Inequalities", 19(9), September 2017.
- A note on distribution-free symmetrization inequalities (with Z. Dong and W. V. Li), *J. Theor. Probab.*, 28(3): 958-967, 2015.

**Conference proceedings:**

- Usable deviation bounds for the information content of convex measures (with M. Fradelizi and M. Madiman), in *Proc. IEEE Intl. Symp. Inform. Theory*, pp. 2258-2263, Los Angeles, USA, July 2020.
- Entropic central limit theorem for Rényi entropy (with A. Marsiglietti and J. Melbourne), in *Proc. IEEE Intl. Symp. Inform. Theory*, pp. 1137-1141, Paris, France, July 2019.
- Rényi entropy power inequalities for s-concave densities (with A. Marsiglietti and J. Melbourne), in *Proc. IEEE Intl. Symp. Inform. Theory*, pp. 2224-2228, Paris, France, July 2019.
- Further investigations of the maximum entropy of the sum of two dependent random variables (with J. Melbourne), in *Proc. IEEE Intl. Symp. Inform. Theory*, pp. 1969-1972, Vail, USA, July 2018.
- Boolean functions: noise stability, non-interactive correlation, and mutual information (with M. Médard), in *Proc. IEEE Intl. Symp. Inform. Theory*, pp. 266-270, Vail, USA, July 2018.
- Guessing noise, not code-words (with K. Duffy and M. Médard), in *Proc. IEEE Intl. Symp. Inform. Theory*, pp. 671-675, Vail, USA, July 2018.
- Information concentration for convex measures (with M. Fradelizi and M. Madiman), in *Proc. IEEE Intl. Symp. Inform. Theory*, pp. 1128-1132, Barcelona, Spain, July 2016.