Research
Peer-Reviewed
Chang, J., He, J., Kang, J. & Wu, M. (2023+). Statistical inferences for complex dependence of multimodal imaging data, Journal of the American Statistical Association, in press.
Chang, J., Hu, Q., Kolaczyk, E. D., Yao, Q. & Yi, F. (2024). Edge differentially private estimation in the $\beta$-model via jittering and method of moments, The Annals of Statistics, 52, 708–728.
Chang, J., Chen, C., Qiao, X. & Yao, Q. (2024). An autocovariance-based learning framework for high-dimensional functional time series, Journal of Econometrics, 239, 105385.
Chang, J., Hu, Q., Liu, C. & Tang, C. Y. (2024). Optimal covariance matrix estimation for high-dimensional noise in high-frequency data, Journal of Econometrics, 239, 105329.
Chang, J., Chen, X. & Wu, M. (2024). Central limit theorem for high dimensional dependent data, Bernoulli, 30, 712–742.
Chang, J., Jiang, Q. & Shao, X. (2023). Testing the martingale difference hypothesis in high dimension, Journal of Econometrics, 235, 972–1000.
Chang, J., Shi, Z. & Zhang, J. (2023). Culling the herd of moments with penalized empirical likelihood, Journal of Business & Economic Statistics, 41, 791–805.
Chang, J., He, J., Yang, L. & Yao, Q. (2023). Modelling matrix time series via a tensor CP-decomposition, Journal of the Royal Statistical Society Series B, 85, 127–148.
Chang, J., Cheng, G. & Yao, Q. (2022). Testing for unit roots based on sample autocovariances, Biometrika, 109, 543–550.
Chang, J., Kolaczyk, E. D. & Yao, Q. (2022). Estimation of subgraph densities in noisy networks, Journal of the American Statistical Association, 117, 361–374.
Chang, J., Chen, S. X., Tang, C. Y. & Wu, T. T. (2021). High-dimensional empirical likelihood inference, Biometrika, 108, 127–147.
Chang, J., Tang, C. Y. & Wu, T. T. (2018). A new scope of penalized empirical likelihood with high-dimensional estimating equations, The Annals of Statistics, 46, 3185–3216.
Chang, J., Guo, B. & Yao, Q. (2018). Principal component analysis for second-order stationary vector time series, The Annals of Statistics, 46, 2094–2124.
Chang, J., Qiu, Y., Yao, Q. & Zou, T. (2018). Confidence regions for entries of a large precision matrix, Journal of Econometrics, 206, 57–82.
Chang, J., Delaigle, A., Hall, P. & Tang, C. Y. (2018). A frequency domain analysis of the error distribution from noisy high-frequency data, Biometrika, 105, 353–369.
Chang, J., Zheng, C., Zhou, W.-X. & Zhou, W. (2017). Simulation-based hypothesis testing of high dimensional means under covariance heterogeneity, Biometrics, 73, 1300–1310.
Chang, J., Zhou, W., Zhou, W.-X. & Wang, L. (2017). Comparing large covariance matrices under weak conditions on the dependence structure and its application to gene clustering, Biometrics, 73, 31–41.
Chang, J., Yao, Q. & Zhou, W. (2017). Testing for high-dimensional white noise using maximum cross-correlations, Biometrika, 104, 111–127.
Chang, J., Shao, Q.-M. & Zhou, W.-X. (2016). Cramér-type moderate deviations for Studentized two-sample U-statistics with applications, The Annals of Statistics, 44, 1931–1956.
Chang, J., Tang, C. Y. & Wu, Y. (2016). Local independence feature screening for nonparametric and semiparametric models by marginal empirical likelihood, The Annals of Statistics, 44, 515–539.
Chang, J., Guo, B. & Yao, Q. (2015). High dimensional stochastic regression with latent factors, endogeneity and nonlinearity, Journal of Econometrics, 189, 297–312.
Chang, J. & Hall, P. (2015). Double-bootstrap methods that use a single double-bootstrap simulation, Biometrika, 102, 203–214.
Chang, J., Chen, S. X. & Chen, X. (2015). High dimensional generalized empirical likelihood for moment restrictions with dependent data, Journal of Econometrics, 185, 283–304.
Chang, J., Tang, C. Y. & Wu, Y. (2013). Marginal empirical likelihood and sure independence feature screening, The Annals of Statistics, 41, 2123–2148.
Chang, J. & Chen, S. X. (2011). On the approximate maximum likelihood estimation for diffusion processes, The Annals of Statistics, 39, 2820–2851.
Review Papers & Invited Discussion
Chang, J., Kolaczyk, E. D. & Yao, Q. (2020). Discussion of 'Network cross-validation by edge sampling', Biometrika, 107, 277–280.
Chang, J., Guo, J. & Tang, C. Y. (2018). Peter Hall's contribution to empirical likelihood, Statistica Sinica, 28, 2375–2387.
Manuscripts
Chang, J., Fang, Q., Kolaczyk, E., MacDonald, P. & Yao, Q. (2024). Autoregressive networks with dependent edges.
Chang, J., Ding, Z., Jiao, Y., Li, R. & Yang, J. Z. (2024). Deep conditional generative learning: Model and error analysis.
Chang, J., Tang, C. Y. & Zhu, Y. (2023). Exploring excellence: Bayesian penalized empirical likelihood and MCMC sampling.
Chang, J., Fang, Q., Qiao, X. & Yao, Q. (2023). On the modelling and prediction of high-dimensional functional time series.
Chang, J., Tang, C. Y. & Zhu, Y. (2023). Efficiently handling constraints with Metropolis-adjusted Langevin algorithms.
Chang, J., Jiang, Q., McElroy, T. & Shao, X. (2022). Statistical inference for high-dimensional spectral density matrix.
R Packages
HDTSA (High-Dimensional Time Series Analysis): This package provides R implementations of high-dimensional factor models (Lam & Yao, 2012; Chang, Guo & Yao, 2015), the high-dimensional white noise test (Chang, Yao & Zhou, 2017), PCA for vector time series (Chang, Guo & Yao, 2018), identification of cointegration in high-dimensional time series (Zhang, Robinson & Yao, 2019), the unit-root test (Chang, Cheng & Yao, 2022), and the high-dimensional martingale difference test (Chang, Jiang & Shao, 2023).