Papers under review/Preprints
- Sparse factor models of high dimension, Code, with Yoshikazu Terada
- Factor multivariate stochastic volatility models of high dimension, Code, with Manabu Asai (R&R)
- Break recovery in graphical networks with D-trace loss, Code, with Ying Lin, TK Pong and Akiko Takeda
- Variable selection with sparse mRMR, Code, with Peter Naylor, Héctor Climente and Makoto Yamada
Publications
13. Model-based vs. agnostic methods for the prediction of time-varying covariance matrices, with Jean-David Fermanian and Panos Xidonas, Annals of Operations Research. Code
12. Sparse M-estimators in semi-parametric copula models, with Jean-David Fermanian, Bernoulli. Code
11. Estimation of high-dimensional Vector Autoregression via sparse precision matrix (2023), with Manabu Asai, The Econometrics Journal. Code
10. High-dimensional nonlinear feature selection with Hilbert-Schmidt Independence Criterion LASSO (2023), with Makoto Yamada, Hiroaki Yamada and Tobias Freidling, Journal of the Japan Statistical Society.
9. High-dimensional stochastic volatility models (2023), with Manabu Asai, Journal of Time Series Analysis. arXiv. Code
8. Feature screening with kernel knockoff (2022), with Peter Naylor, Héctor Climente and Makoto Yamada, International Conference on Artificial Intelligence and Statistics (AISTATS2022), Proceedings of Machine Learning Research. Code
7. The finite sample properties of sparse M-estimators with pseudo-observations (2022), with Jean-David Fermanian, Annals of the Institute of Statistical Mathematics.
6. Post-selection inference with HSIC-Lasso (2021), with Tobias Freidling, Makoto Yamada and Héctor Climente, International Conference on Machine Learning (ICML2021), Proceedings of Machine Learning Research. arXiv. Code
5. High-dimensional penalized ARCH processes (2021), with Jean-David Fermanian, Econometric Reviews.
4. Statistical analysis of sparse approximate factor models (2020), with Yoshikazu Terada, Electronic Journal of Statistics.
3. Sparse Hilbert-Schmidt Independence Criterion regression (2020), with Makoto Yamada, International Conference on Artificial Intelligence and Statistics (AISTATS2020), Proceedings of Machine Learning Research. Appendix
2. Asymptotic theory of the adaptive Sparse-Group Lasso (2020), Annals of the Institute of Statistical Mathematics.
1. Dynamic asset correlations based on vines (2019), with Jean-David Fermanian, Econometric Theory. Code
This paper merges two working papers: Dynamic asset correlations based on vines, which presents the vine-GARCH specification, establishes the weak consistency and asymptotic normality of the estimators, and provides empirical results; and Vine-GARCH process stationarity and asymptotic properties, which gives conditions for the existence and uniqueness of stationary solutions of vine-GARCH models.
Work in progress
- Rotated stochastic volatility models, with Manabu Asai
- Adaptive structural break detection in the variance-covariance matrix, with Ying Lin
- Estimation of dynamic models via the MMD criterion, with Jean-David Fermanian and Pierre Alquier
- Linear approximation of generalized autoregressive time series models of high dimension, with Paul Doukhan
- Sparse matrix decomposition factor analysis of high dimension, with Yoshikazu Terada
- Large variance-covariance estimation by Bures-Wasserstein, with Ying Lin and TK Pong
- Structural breaks in Hawkes processes, with Yoann Potiron
Journal Reviewing
Journal of Econometrics, Electronic Journal of Statistics, Journal of Machine Learning Research, Econometric Theory, Journal of Multivariate Analysis, Scandinavian Journal of Statistics, Annals of the Institute of Statistical Mathematics, Computational Statistics & Data Analysis, Statistics & Probability Letters, Quantitative Finance, Advances in Data Analysis and Classification, Applied Probability Journals, AISTATS, ICML, NeurIPS.