Publications

Journal Articles

  1. Discrete optimization methods for group model selection in compressed sensing; accepted in Mathematical Programming (2020), https://doi.org/10.1007/s10107-020-01529-7 (with J. Kurtz and O. Schaudt).

  2. Outcome Prediction with Serial Neuron-Specific Enolase and Machine Learning in Anoxic-Ischaemic Disorders of Consciousness; Computers in Biology and Medicine, 107 (2019), 145–152 (with E. Muller, J. Shock, A. Bender, J. Kleeberger, T. Högen, M. Rosenfelder, and A. Lopez-Rolon).

  3. On the construction of sparse matrices from expander graphs, B. Bah and J. Tanner; Frontiers in Applied Mathematics and Statistics: Mathematics of Computation and Data Science, Vol. 4, Article 39 (2018).

  4. The sample complexity of weighted sparse approximation, B. Bah and R. Ward; IEEE Transactions on Signal Processing, Vol. 64(12) (2016) 3145-3155.

  5. Bounds of restricted isometry constants in extreme asymptotics: formulae for Gaussian matrices, B. Bah and J. Tanner; Linear Algebra and its Applications, Vol. 441(1) (2014) 88-109.

  6. Vanishingly sparse matrices and expander graphs, with application to compressed sensing, B. Bah and J. Tanner; IEEE Transactions on Information Theory, Vol. 59(11) (2013) 7491-7508.

  7. Improved bounds on restricted isometry constants for Gaussian matrices, B. Bah and J. Tanner; SIAM Journal on Matrix Analysis and Applications, Vol. 31(5) (2010) 2882-2898.


Conference Proceedings

  1. Improving the Reliability of Pooled Testing with Combinatorial Decoding and Compressed Sensing; accepted at the 55th Annual Conference on Information Sciences and Systems (CISS 2021), held online, 2021 (with H. Petersen, S. Agarwal, and P. Jung).

  2. Towards the Localisation of Lesions in Diabetic Retinopathy; accepted at Computing Conference 2021, London (with S. Mensah and W. Brink).

  3. An Integer Programming Approach to Deep Neural Networks with Binary Activation Functions; ICML 2020 workshop on “Beyond First Order Methods in Machine Learning” (with J. Kurtz).

  4. On Error Correction Neural Networks for Economic Forecasting; 23rd International Conference on Information Fusion (FUSION 2020), Pretoria, South Africa (with M. Mvubu, E. Kabuga, C. Plitz, R. Becker, and H-G. Zimmermann).

  5. Using neural networks to identify individual animals from photographs (extended abstract); South African Forum for Artificial Intelligence Research (FAIR 2019), Cape Town, South Africa (with E. Kabuga, I. Durbach, and A. Clark).

  6. Weighted sparse recovery with expanders, B. Bah; 5th International Workshop on Compressed Sensing applied to Radar, Multimodal Sensing and Imaging (CoSeRa), Siegen, Germany, 10-13 September 2018.

  7. Convex block-sparse linear regression with expanders, provably, A. Kyrillidis, B. Bah, R. Hasheminezhad, Q. Tran-Dinh, L. Baldassarre, and V. Cevher; 19th International Conference on Artificial Intelligence and Statistics (AISTATS 2016), Cadiz, Spain, 2016.

  8. Metric Learning with Rank and Sparsity Constraints, B. Bah, V. Cevher, S. Becker and B. Gözcü; IEEE International Conference on Acoustics, Speech and Signal Processing, Florence, Italy, 2014.

  9. Model-based Sketching and Recovery with Expanders, B. Bah, L. Baldassarre and V. Cevher; ACM-SIAM Symposium on Discrete Algorithms, Portland, Oregon, USA, 2014.

  10. Construction and analysis of sparse random matrices and expander graphs with applications to compressed sensing, B. Bah and J. Tanner; 10th International Conference on Sampling Theory and Applications (SampTA 2013), Bremen, Germany, 2013.

  11. Energy-aware adaptive bi-Lipschitz embeddings, A. Sadeghian, B. Bah and V. Cevher; 10th International Conference on Sampling Theory and Applications (SampTA 2013), Bremen, Germany, 2013.


Book Chapters

  1. Designing Data-driven Learning Algorithms: a necessity to ensure effective post-genomic medicine and biomedical research; Artificial Intelligence–Applications in Medicine and Biology, IntechOpen (2019) (with G. Mazandu, I. Kyomugisha, M. Seuneu, and E. Chimusa).

  2. Editorial: Recent Developments in Signal Approximation and Reconstruction; Frontiers in Applied Mathematics and Statistics (2020) (with J. L. Bouchot).


Preprints

  • Learning deep linear neural networks: Riemannian gradient flows and convergence to global minimizers; accepted in Information and Inference: A Journal of the IMA (with H. Rauhut, U. Terstiege, and M. Westdickenberg).

  • Efficient Tuning-Free l1-Regression of Nonnegative Compressible Signals; submitted to a journal and available on arXiv (with H. Petersen and P. Jung).

  • Practical High-Throughput, Non-Adaptive and Noise-Robust SARS-CoV-2 Testing; submitted to a conference and available on arXiv (with H. B. Petersen and P. Jung).


PhD Thesis

  1. Restricted Isometry Constants in Compressed Sensing (supervised by Jared Tanner and Coralia Cartis).


MSc Dissertation

  1. Diffusion Maps: Analysis and Applications (supervised by Radek Erban).