Home
I am an Associate Professor in the Business Analytics discipline at the University of Sydney Business School. I'm a Chief Investigator in the ARC Centre for Data Analytics for Resources and Environments (DARE) and was an investigator in the Australian Research Council's Centre of Excellence for Mathematical and Statistical Frontiers (ACEMS). I'm also an executive member of the Sydney Vietnam Institute.
I received a PhD in Statistics in 2012 from the National University of Singapore, and a Master's and a Bachelor's degree in Mathematics from the Vietnam National University, Hanoi. Before joining the University of Sydney, I worked as a postdoctoral research fellow at the University of New South Wales.
My name Minh-Ngoc is pronounced miŋ-ŋɔːk.
Strictly speaking, I work part-time as a researcher and full-time as a babysitter. Here are my two most important publications EVER!
Address: Rm 446, Business School Building (H69), University of Sydney, NSW 2006, Australia
T +61 2 8627 4752 | F +61 2 9351 6409 | E <given name minh-ngoc> dot <surname>@sydney.edu.au
Want to learn about Variational Bayes with hands-on experience? Check out our accessible tutorial here.
News
Dec 2023 (updated version April 2024): Natural gradient works efficiently in learning (Amari, 1998), but computing it remains a challenging problem. Our new work Natural Gradient Variational Bayes without Fisher Matrix Analytic Calculation and Its Inversion completely addresses this challenge. Accepted to the Journal of the American Statistical Association.
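For readers new to the idea: the classical natural-gradient update that this line of work improves on is λ ← λ − ρ F(λ)⁻¹∇λKL, which requires the Fisher matrix F and its inverse. Below is a minimal, purely illustrative sketch (not the paper's Fisher-free method) for a univariate Gaussian variational family q = N(μ, σ²) with λ = (μ, log σ), where F = diag(1/σ², 2) happens to be available analytically; the Gaussian target N(2, 1.5²), step size and iteration count are invented for the demo.

```python
import math

# Illustrative target p = N(mu_p, sigma_p^2); minimise KL(q || p) over
# the variational family q = N(mu, sigma^2), parameterised as (mu, s = log sigma).
mu_p, sigma_p = 2.0, 1.5
mu, s = 0.0, 0.0   # initial variational parameters
rho = 0.1          # step size

for _ in range(200):
    sigma = math.exp(s)
    # Closed-form gradient of KL(q || p) w.r.t. (mu, s)
    g_mu = (mu - mu_p) / sigma_p**2
    g_s = -1.0 + sigma**2 / sigma_p**2
    # Natural gradient = F(lambda)^{-1} * gradient, with F = diag(1/sigma^2, 2)
    mu -= rho * sigma**2 * g_mu
    s -= rho * g_s / 2.0

print(mu, math.exp(s))  # converges towards (mu_p, sigma_p) = (2.0, 1.5)
```

In this toy case the analytic F makes the update cheap; the point of the paper above is precisely that, for realistic models, neither F nor its inverse can be computed this way.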
Feb 2023: New work Particle Mean Field Variational Bayes extends the scope of VB by leveraging recent advances in Optimal Transport theory and SDEs. This paves the way for scalable and accurate Bayesian inference in big-model, big-data settings, in which the posterior is approximated by a set of optimisation-based particles with theoretical guarantees. [Code]
Feb 2023: The work "Quantum Variational Bayes on manifolds" with Anna Lopatnikova is accepted into an invited session at ICASSP 2023. The work Quantum natural gradient for Variational Bayes is under second-round revision at Quantum.
Feb 2023: New work Bayesian Inference for Evidence Accumulation Models with Regressors develops scalable Bayesian inference methods for EAMs in psychology/cognitive science.
Feb 2023: New work Realized recurrent conditional heteroskedasticity model for volatility modelling, led by PhD student Chen Liu, creates new ways to incorporate deep learning and high-frequency trading data into financial risk forecasting.
Apr 2022: A new version of An Introduction to Quantum Computing for Statisticians and Data Scientists is available: fixing typos and adding two sections on "Bird's-eye view of quantum theory" and "Programming quantum computers", with a review of cloud-based access to real quantum computers and the current state-of-the-art programming software for implementing quantum algorithms. Accepted to Foundations of Data Science.
Dec 2021: New work (with Anna Lopatnikova), An Introduction to Quantum Computing for Statisticians, provides an accessible introduction to quantum computing with a focus on applications in statistics and data analysis, and highlights the challenges and opportunities in applying QC to statistical problems.
Dec 2021: New publication (with N. Nguyen, D. Gunawan and R. Kohn): A long short-term memory stochastic volatility model is to be published in the Journal of Business & Economic Statistics.
Nov 2021: I was named "Australia's top researcher in Probability and Statistics with Applications in 2021" by The Australian Research Magazine.
Oct 2021: New publications: Variational Bayes on manifolds (Statistics and Computing - published), Efficient selection between hierarchical cognitive models: cross-validation with Variational Bayes (Psychological Methods - accepted), Time-evolving psychological processes over repeated decisions (Psychological Review - accepted) and Recurrent conditional heteroskedasticity (Journal of Applied Econometrics - accepted).
Jun 2021: I gave an interview to the Early Career and Student Statisticians network about research and ... coffee. Nothing serious, just for fun.
Jun 2021: Our psychological work "Time-evolving psychological processes over repeated decisions" has been substantially revised. Using recent advances in Bayesian estimation, together with cognitive modelling, the work re-analysed three famous cognitive datasets and found significant evidence that human psychological processes and decision-making behaviour are not static but tend to change over time in interesting ways.
Jun 2021: New work "Quantum natural gradient for Variational Bayes" uses quantum computation to speed up Variational Bayes. Joint work with quantum physicist Anna Lopatnikova.
Mar 2021: New publications "The block-Poisson estimator for optimally tuned exact subsampling MCMC" (with Quiroz, Villani, Kohn and Dang), "Assessment and adjustment of approximate inference algorithms using the law of total variance" (with Yu, Nott and Klein) and "Manifold optimisation assisted Gaussian variational approximation" (with Zhou, Gao and Gerlach). All to appear in the Journal of Computational and Graphical Statistics.
I have been invited to give a keynote address at the Australian Early Career & Student Statisticians Conference, July 2021.
An accessible tutorial on Variational Bayes with hands-on experience can be found here - written with two PhD students of mine.
Nov 2020: Our research on cognitive modelling has been topped up with $378,500 through a successful ARC DP21 grant, joint with Robert Kohn, Scott Brown and David Gunawan.
Sep 2020: The R package for the Particle Metropolis within Gibbs sampler is available on CRAN. This package implements the efficient Bayesian inference methods introduced in our paper, making it handy for psychologists who want to apply recent advances in Bayesian computation to cognitive modelling.
Aug 2020: Our paper (with D. Gunawan, K.-D. Dang, M. Quiroz and R. Kohn), recently accepted to Statistics and Computing, shows how subsampling can be used to speed up Sequential Monte Carlo.
I'm hiring a postdoctoral research fellow; please click here for the details. Application deadline: 2 July.
June 2020: Our paper (with Robert Salomone, Matias Quiroz, Robert Kohn and Mattias Villani) “Spectral Subsampling MCMC for Stationary Time Series” has been accepted to the 37th International Conference on Machine Learning (ICML 2020).
Mar 2020: Our works on cognitive modelling have been published: "Robustly estimating the marginal likelihood for cognitive models via importance sampling" and "Identifying relationships between cognitive processes across tasks, contexts, and time" were accepted by Behavior Research Methods and "New Estimation Approaches for the Linear Ballistic Accumulator Model" is to appear in Journal of Mathematical Psychology.
Dec 2019: Fantastic news! Our ARC DP2020 grant proposal (with Junbin Gao and Richard Gerlach) has been successful! This project will develop efficient Deep Learning-based time series models for financial data. Exciting research results to come.
Conference travel plans: BoB@Gold Coast, Nov 2019; CFE@London, Dec 2019; EcoSta@Seoul, June 2020; ISBA@Kunming, June 2020; Neurips@Vancouver, Dec 2020.
Aug 2019: My new paper Variational Bayes on Manifolds (with D Nguyen and D Nguyen) develops an efficient manifold VB method that exploits both the geometric structure of the constrained variational parameter space and the information geometry of the approximating family. Manifold VB is more stable and less sensitive to initialization. The paper also establishes a sharp convergence rate.
July 2019: Our subsampling MCMC paper is featured among the most-read articles in JASA.
June 2019: Our paper Hamiltonian Monte Carlo with energy conserving subsampling (with Dang, Quiroz, Kohn and Villani) has been accepted by Journal of Machine Learning Research. The paper develops an HMC methodology that works with large data.
June 2019: A full-text version of our paper "A long short-term memory stochastic volatility model" (with Nguyen, Gunawan and Kohn) is available here. The paper combines the state-of-the-art LSTM technique in Deep Learning with Stochastic Volatility modelling in financial econometrics, introducing the so-called LSTM-SV model for volatility modelling. A range of examples shows that the LSTM-SV model is able to capture interesting underlying patterns and has an impressive predictive performance.
June 2018: Our first paper (with D Gunawan, S Brown, R Kohn) on experimental psychology is available here. More to come.
June 2018: Our first paper from the Bayesian deep learning project, Bayesian Deep Net GLM and GLMM, is available. More to come. Software packages are available here. Accepted by JCGS in Jan 2019.
May 2018: Our new paper (with D Gunawan, R Kohn, M Quiroz, K Dang) describes how to do Bayesian inference for complex models with big data, through a non-trivial combination of sophisticated techniques: MCMC, annealed SMC, HMC and subsampling.
May 2018: I was invited to give a talk at the conference Bayesian Statistics in the Big Data Era organised by Kerrie Mengersen, Pierre Pudlo and Christian Robert, 26-30 Nov, Marseille, France.
May 2018: Our new paper (with D Gunawan, C Carter and R Kohn) "Flexible Density Tempering Approaches for State Space Models with an Application to Factor Stochastic Volatility Models" is now on arXiv.
Mar 2018: A new version of our paper "The block-Poisson estimator for exact subsampling MCMC" is available on arXiv. The paper shows that it is possible to obtain exact Bayesian inference with MCMC in big data, even when only data subsets are used within MCMC iterations.
Jan 2018: Our paper (with Matias Quiroz, Mattias Villani, Robert Kohn) "Speeding Up MCMC by Efficient Data Subsampling" has been accepted by Journal of the American Statistical Association.
I will be giving a short course on "Bayesian Computation for Big Models Big Data" at IRTG Summer Camp, Humboldt-Universität zu Berlin, 11-14 July 2018.
I will be on sabbatical leave from June 2018, and will be visiting and giving talks at several universities in Europe, Hong Kong and Singapore.
Fantastic news! Our research is funded by an ARC DP grant 2018-2020, $348,912. This is a joint project with Robert Kohn (UNSW) and Scott Brown (UoN).
I gave a plenary invited talk on Deep Learning at the 2nd Vietnam International Conference on Applied Mathematics, HCM city, 15-18 Dec, 2017.
I gave an invited talk on "Bayesian Computation for Big Models Big Data" at the Statistical Challenges in Astronomy workshop, UNSW, 7-8 Dec 2017.