High-dimensional inference reading group

Schedule

2016-17 Term 1

  • Wed 9/11, 2-3pm: David Rossell. Room 20.233.
  • Wed 23/11 & Thurs 24/11, 3-5pm: David Rossell. Introductory MSc data science classes on Bayesian variable selection
  • Wed 30/11, 2-3pm. Omiros. Scalable variable selection under block-diagonal designs

2016-17 Term 2

  • Wed 11/01, 2-3pm - Geert Mesters. Searching multiregression dynamic models of resting-state fMRI networks using integer programming. Slides
  • Wed 25/01, 12-1pm - Piotr Zwiernik. A Bayesian information criterion for singular models. Slides; see also this video.
  • Wed 15/02, 12-1pm - Miquel Torrens. Branch-and-bound algorithms for computing best-subset regression models. Slides
  • Wed 01/03, 12-1pm - Davide Viviano. Statistical consistency and asymptotic normality for high-dimensional robust M-estimators (Loh, arXiv:1501.00312). Slides
  • Wed 15/03, 12-1pm - Tim Stumpf-Fétizon.

2016-17 Term 3

  • Wed 07/06, 4-5pm - Stephen Hansen (Oxford). Double LASSO for high-dimensional covariate adjustment and instrumental variable regression. Slides
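As rough background for the double LASSO talk, here is a minimal sketch of the post-double-selection idea: lasso the outcome on the controls, lasso the treatment on the controls, then run OLS of the outcome on the treatment plus the union of the selected controls. The simulated data, variable names, and penalty level are illustrative assumptions, not taken from the talk.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 200, 50

# Simulated data: a few controls affect both the treatment d and the outcome y.
X = rng.standard_normal((n, p))
d = X[:, 0] + 0.5 * X[:, 1] + rng.standard_normal(n)
y = 1.0 * d + X[:, 0] - X[:, 2] + rng.standard_normal(n)

# Step 1: lasso of y on X. Step 2: lasso of d on X. (alpha chosen ad hoc.)
sel_y = np.flatnonzero(Lasso(alpha=0.1).fit(X, y).coef_)
sel_d = np.flatnonzero(Lasso(alpha=0.1).fit(X, d).coef_)

# Step 3: OLS of y on d plus the union of the selected controls.
keep = np.union1d(sel_y, sel_d)
Z = np.column_stack([d, X[:, keep], np.ones(n)])
beta = np.linalg.lstsq(Z, y, rcond=None)[0]
print(beta[0])  # estimated treatment effect; the true value here is 1
```

Selecting on both equations (rather than only the outcome lasso) is what protects the treatment-effect estimate from omitted-variable bias due to moderately relevant controls.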

Reading papers (red indicates papers that were discussed already)

On asymptotics

  • Castillo, I., Schmidt-Hieber, J., & Van der Vaart, A. (2015). Bayesian linear regression with sparse priors. The Annals of Statistics, 43(5), 1986-2018.
  • Gao, C., van der Vaart, A. W., & Zhou, H. H. (2015). A general framework for Bayes structured linear models. arXiv preprint arXiv:1506.02174.
  • Narisetty, N. N., & He, X. (2014). Bayesian variable selection with shrinking and diffusing priors. The Annals of Statistics, 42(2), 789-817.
  • Shin, M., Bhattacharya, A., & Johnson, V. E. (2015). Scalable Bayesian variable selection using nonlocal prior densities in ultrahigh-dimensional settings. arXiv preprint arXiv:1507.07106.
  • Loh, P. L. (2015). Statistical consistency and asymptotic normality for high-dimensional robust M-estimators. arXiv preprint arXiv:1501.00312.
  • Drton, M., & Plummer, M. (2017). A Bayesian information criterion for singular models. Journal of the Royal Statistical Society, Series B, 79, 1-38 (with discussion).

Applications to time series / graphical models / etc

  • Chandrasekaran, V., Parrilo, P. A., & Willsky, A. S. (2010). Latent variable graphical model selection via convex optimization. In Communication, Control, and Computing (Allerton), 2010 48th Annual Allerton Conference on (pp. 1610-1613). IEEE.
  • Costa, L., Smith, J., Nichols, T., Cussens, J., Duff, E. P., & Makin, T. R. (2015). Searching multiregression dynamic models of resting-state fMRI networks using integer programming. Bayesian Analysis, 10(2), 441-478.
  • Peterson, C., Stingo, F. C., & Vannucci, M. (2015). Bayesian inference of multiple Gaussian graphical models. Journal of the American Statistical Association, 110(509), 159-174.
  • Zou, T., Lan, W., Wang, H., & Tsai, C. L. (2016). Covariance Regression Analysis. Journal of the American Statistical Association, (just-accepted), 1-44.

On computation

  • Bertsimas, D., King, A., & Mazumder, R. (2016). Best subset selection via a modern optimization lens. The Annals of Statistics, 44(2), 813-852.
  • Clyde, M. A., Ghosh, J., & Littman, M. L. (2012). Bayesian adaptive sampling for variable selection and model averaging. Journal of Computational and Graphical Statistics.
  • Foster, D. P., Karloff, H. J., & Thaler, J. (2015, July). Variable Selection is Hard. In COLT (pp. 696-709).
  • Furnival, G. M., & Wilson, R. W. (2000). Regressions by leaps and bounds. Technometrics, 42(1), 69-79.
  • Gatu, C., & Kontoghiorghes, E. J. (2012). Branch-and-bound algorithms for computing the best-subset regression models. Journal of Computational and Graphical Statistics.
  • Womack, A. J., Fuentes, C., & Taylor-Rodriguez, D. (2015). Model Space Priors for Objective Sparse Bayesian Regression. arXiv preprint arXiv:1511.04745.
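Several of the computation papers above (Furnival & Wilson; Gatu & Kontoghiorghes; Bertsimas et al.) concern exact best-subset selection. As a reference point for what those algorithms speed up, here is a brute-force sketch that minimises the residual sum of squares over all size-k subsets; the simulated data and names are illustrative.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
n, p, k = 100, 8, 2

# Simulated data: only the first two predictors matter.
X = rng.standard_normal((n, p))
y = 2 * X[:, 0] - 3 * X[:, 1] + 0.1 * rng.standard_normal(n)

def rss(cols):
    """Residual sum of squares of OLS on the given column subset."""
    beta = np.linalg.lstsq(X[:, cols], y, rcond=None)[0]
    r = y - X[:, cols] @ beta
    return r @ r

# Exhaustive search over all C(p, k) subsets; feasible only for small p.
best = min(itertools.combinations(range(p), k), key=rss)
print(best)  # recovers the true support (0, 1)
```

Enumeration costs C(p, k) least-squares fits, which is why branch-and-bound (pruning subsets whose RSS bound is already worse than the incumbent) and modern mixed-integer optimisation are needed for realistic p.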


David Rossell,
Nov 10, 2016, 12:29 AM