Fan J, Yang Z, Yu M. Understanding implicit regularization in over-parameterized single index model. Journal of the American Statistical Association. 2023 Oct 2;118(544):2315-28. link
Machidon AL, Pejović V. Deep learning for compressive sensing: a ubiquitous systems perspective. Artificial Intelligence Review. 2023 Apr;56(4):3619-58. link
Gao W, Ge D, Sun C, Ye Y. Solving linear programs with fast online learning algorithms. InInternational Conference on Machine Learning 2023 Jul 3 (pp. 10649-10675). PMLR. link
Gao S, Zhang Z, Ma J, Li Z, Zhang S. Correlation-aware mutual learning for semi-supervised medical image segmentation. InInternational Conference on Medical Image Computing and Computer-Assisted Intervention 2023 Oct 1 (pp. 98-108). Cham: Springer Nature Switzerland. link
McRae AD, Karnik S, Davenport M, Muthukumar VK. Harmless interpolation in regression and classification with structured features. InInternational Conference on Artificial Intelligence and Statistics 2022 May 3 (pp. 5853-5875). PMLR. link
Mukkamala MC, Fadili J, Ochs P. Global convergence of model function based Bregman proximal minimization algorithms. Journal of Global Optimization. 2022 Aug 1:1-29. link
Liu D, Cevher V, Tran-Dinh Q. A Newton Frank–Wolfe method for constrained self-concordant minimization. Journal of Global Optimization. 2022 Jun 1:1-27. link
Lee Y, Lee S, Won JH. Statistical inference with implicit SGD: proximal Robbins-Monro vs. Polyak-Ruppert. InInternational Conference on Machine Learning 2022 Jun 28 (pp. 12423-12454). PMLR. link
Kacham P, Woodruff D. Sketching algorithms and lower bounds for ridge regression. InInternational Conference on Machine Learning 2022 Jun 28 (pp. 10539-10556). PMLR. link
Jin Q, Koppel A, Rajawat K, Mokhtari A. Sharpened quasi-Newton methods: Faster superlinear rate and larger local convergence neighborhood. InInternational Conference on Machine Learning 2022 Jun 28 (pp. 10228-10250). PMLR. link
Heckel R. Provable continual learning via sketched Jacobian approximations. InInternational Conference on Artificial Intelligence and Statistics 2022 May 3 (pp. 10448-10470). PMLR. link
Croce F, Hein M. Adversarial Robustness against Multiple and Single $l_p$-Threat Models via Quick Fine-Tuning of Robust Classifiers. InInternational Conference on Machine Learning 2022 Jun 28 (pp. 4436-4454). PMLR. link
Pathak R, Ma C, Wainwright M. A new similarity measure for covariate shift with applications to nonparametric regression. InInternational Conference on Machine Learning 2022 Jun 28 (pp. 17517-17530). PMLR. link
Hurault S, Leclaire A, Papadakis N. Proximal denoiser for convergent plug-and-play optimization with nonconvex regularization. InInternational Conference on Machine Learning 2022 Jun 28 (pp. 9483-9505). PMLR. link
Axiotis K, Sviridenko M. Iterative hard thresholding with adaptive regularization: Sparser solutions without sacrificing runtime. InInternational Conference on Machine Learning 2022 Jun 28 (pp. 1175-1197). PMLR. link
Nouiehed M, Razaviyayn M. Learning deep models: Critical points and local openness. INFORMS Journal on Optimization. 2022 Apr;4(2):148-73. link
Cai TT, Zhang AR, Zhou Y. Sparse group lasso: Optimal sample complexity, convergence rate, and statistical inference. IEEE Transactions on Information Theory. 2022 May 16;68(9):5975-6002. link
Duchi JC, Namkoong H. Learning models with uniform performance via distributionally robust optimization. The Annals of Statistics. 2021 Jun;49(3):1378-406. link
Chen Y, Ma C, Poor HV, Chen Y. Learning mixtures of low-rank models. IEEE Transactions on Information Theory. 2021 Mar 12;67(7):4613-36. link
Zhu Z, Li Q, Tang G, Wakin MB. The global optimization geometry of low-rank matrix optimization. IEEE Transactions on Information Theory. 2021 Jan 5;67(2):1308-31. link
Amir T, Basri R, Nadler B. The trimmed lasso: Sparse recovery guarantees and practical optimization by the generalized soft-min penalty. SIAM Journal on Mathematics of Data Science. 2021;3(3):900-29. link
Charoenphakdee N, Cui Z, Zhang Y, Sugiyama M. Classification with rejection based on cost-sensitive classification. InInternational Conference on Machine Learning 2021 Jul 1 (pp. 1507-1517). PMLR. link
Eftekhari A, Tanner J, Thompson A, Toader B, Tyagi H. Sparse non-negative super-resolution—simplified and stabilised. Applied and Computational Harmonic Analysis. 2021 Jan 1;50:216-80. link
Tan SS, Varvitsiotis A, Tan VY. Analysis of optimization algorithms via sum-of-squares. Journal of Optimization Theory and Applications. 2021 Jul;190(1):56-81. link
Pandit MK, Naaz R, Chishti MA. Learning sparse neural networks using non-convex regularization. IEEE Transactions on Emerging Topics in Computational Intelligence. 2021 Mar 8;6(2):287-99. link
Nguyen LM, Tran-Dinh Q, Phan DT, Nguyen PH, Van Dijk M. A unified convergence analysis for shuffling-type gradient methods. Journal of Machine Learning Research. 2021;22(207):1-44. link
Wang JK, Lin CH, Abernethy JD. A modular analysis of provable acceleration via Polyak's momentum: Training a wide ReLU network and a deep linear network. InInternational Conference on Machine Learning 2021 Jul 1 (pp. 10816-10827). PMLR. link
Wang B, Zhang H, Zhang J, Meng Q, Chen W, Liu TY. Optimizing information-theoretical generalization bound via anisotropic noise of SGLD. Advances in Neural Information Processing Systems. 2021 Dec 6;34:26080-90. link
Song Z, Yu Z. Oblivious sketching-based central path method for linear programming. InInternational Conference on Machine Learning 2021 Jul 1 (pp. 9835-9847). PMLR. link
Nguyen Q. On the proof of global convergence of gradient descent for deep relu networks with linear widths. InInternational Conference on Machine Learning 2021 Jul 1 (pp. 8056-8062). PMLR. link
Liu T, Li Y, Wei S, Zhou E, Zhao T. Noisy gradient descent converges to flat minima for nonconvex matrix factorization. InInternational Conference on Artificial Intelligence and Statistics 2021 Mar 18 (pp. 1891-1899). PMLR. link
Jiang Y, Li Y, Sun Y, Wang J, Woodruff D. Single pass entrywise-transformed low rank approximation. InInternational Conference on Machine Learning 2021 Jul 1 (pp. 4982-4991). PMLR. link
Asi H, Feldman V, Koren T, Talwar K. Private stochastic convex optimization: Optimal rates in l1 geometry. InInternational Conference on Machine Learning 2021 Jul 1 (pp. 393-403). PMLR. link
Boffi N, Tu S, Slotine JJ. The role of optimization geometry in single neuron learning. InInternational Conference on Artificial Intelligence and Statistics 2022 May 3 (pp. 11528-11549). PMLR. link
Genzel M, Kutyniok G, März M. ℓ1-analysis minimization and generalized (co-) sparsity: when does recovery succeed?. Applied and Computational Harmonic Analysis. 2021 May 1;52:82-140. link
Zhang X, Liu Y, Cui W. Spectrally sparse signal recovery via Hankel matrix completion with prior information. IEEE Transactions on Signal Processing. 2021 Mar 29;69:2174-87. link
Ahuja K, Shanmugam K, Dhurandhar A. Linear regression games: Convergence guarantees to approximate out-of-distribution solutions. InInternational Conference on Artificial Intelligence and Statistics 2021 Mar 18 (pp. 1270-1278). PMLR. link
Zhao YB, Luo ZQ. Analysis of optimal thresholding algorithms for compressed sensing. Signal Processing. 2021 Oct 1;187:108148. link
Fridovich-Keil S, Recht B. Approximately exact line search. arXiv preprint arXiv:2011.04721. 2020 Nov 9. link
Pedregosa F, Negiar G, Askari A, Jaggi M. Linearly convergent Frank-Wolfe with backtracking line-search. InInternational Conference on Artificial Intelligence and Statistics 2020 Jun 3 (pp. 1-10). PMLR. link
Garber D. On the convergence of stochastic gradient descent with low-rank projections for convex low-rank matrix problems. InConference on Learning Theory 2020 Jul 15 (pp. 1666-1681). PMLR. link
Carmon Y, Duchi JC. First-order methods for nonconvex quadratic minimization. SIAM Review. 2020;62(2):395-436. link
Burke JV, Curtis FE, Wang H, Wang J. Inexact sequential quadratic optimization with penalty parameter updates within the QP solver. SIAM Journal on Optimization. 2020;30(3):1822-49. link
Bonettini S, Prato M, Rebegoldi S. Convergence of Inexact Forward--Backward Algorithms Using the Forward--Backward Envelope. SIAM Journal on Optimization. 2020;30(4):3069-97. link
Banert S, Ringh A, Adler J, Karlsson J, Oktem O. Data-driven nonsmooth optimization. SIAM Journal on Optimization. 2020;30(1):102-31. link
Aybat NS, Fallah A, Gurbuzbalaban M, Ozdaglar A. Robust accelerated gradient methods for smooth strongly convex functions. SIAM Journal on Optimization. 2020;30(1):717-51. link
Amari SI, Ba J, Grosse R, Li X, Nitanda A, Suzuki T, Wu D, Xu J. When does preconditioning help or hurt generalization?. arXiv preprint arXiv:2006.10732. 2020 Jun 18. link
Mokhtari A, Koppel A. High-dimensional nonconvex stochastic optimization by doubly stochastic successive convex approximation. IEEE Transactions on Signal Processing. 2020 Oct 26;68:6287-302. link
Luke DR, Martins AL. Convergence Analysis of the Relaxed Douglas--Rachford Algorithm. SIAM Journal on Optimization. 2020;30(1):542-84. link
Liu H, Foygel Barber R. Between hard and soft thresholding: optimal iterative thresholding algorithms. Information and Inference: A Journal of the IMA. 2020 Dec;9(4):899-933. link
Heaton H, Fung SW, Lin AT, Osher S, Yin W. Projecting to manifolds via unsupervised learning. arXiv preprint arXiv:2008.02200. 2020. link
Ghosh A, Kannan R. Alternating minimization converges super-linearly for mixed linear regression. InInternational Conference on Artificial Intelligence and Statistics 2020 Jun 3 (pp. 1093-1103). PMLR. link
Fu H, Chi Y, Liang Y. Guaranteed recovery of one-hidden-layer neural networks via cross entropy. IEEE Transactions on Signal Processing. 2020 May 7;68:3225-35. link
Li X, Sun D, Toh KC. An asymptotically superlinearly convergent semismooth Newton augmented Lagrangian method for linear programming. SIAM Journal on Optimization. 2020;30(3):2410-40. link
Feng X, Yan S, Wu C. The ℓ2, q regularized group sparse optimization: lower bound theory, recovery bound and algorithms. Applied and Computational Harmonic Analysis. 2020 Sep 1;49(2):381-414. link
Farnia F, Zhang JM, David NT. A Fourier-based approach to generalization and optimization in deep learning. IEEE Journal on Selected Areas in Information Theory. 2020 Mar 26;1(1):145-56. link
Dwivedi R, Ho N, Khamaru K, Wainwright MJ, Jordan MI, Yu B. Singularity, misspecification and the convergence rate of EM. The Annals of Statistics. 2020 Dec 1;48(6):3161-82. link
Duchi JC, Hinder O, Naber A, Ye Y. Conic descent and its application to memory-efficient optimization over positive semidefinite matrices. Advances in Neural Information Processing Systems. 2020;33:8308-17. link
del Álamo M, Munk A. Total variation multiscale estimators for linear inverse problems. Information and Inference: A Journal of the IMA. 2020 Dec;9(4):961-86. link
Sarao Mannelli S, Vanden-Eijnden E, Zdeborová L. Optimization and generalization of shallow neural networks with quadratic activation functions. Advances in Neural Information Processing Systems. 2020;33:13445-55. link
Wan A. Uniform RIP Conditions for Recovery of Sparse Signals by $\ell_p\,(0 < p \leq 1)$ Minimization. IEEE Transactions on Signal Processing. 2020 Sep 8;68:5379-94. link
Chen Y, Chi Y, Fan J, Ma C, Yan Y. Noisy matrix completion: Understanding statistical guarantees for convex relaxation via nonconvex optimization. SIAM Journal on Optimization. 2020;30(4):3098-121. link
Tirer T, Giryes R. Generalizing CoSaMP to signals from a union of low dimensional linear subspaces. Applied and Computational Harmonic Analysis. 2020 Jul 1;49(1):99-122. link
Moscoso M, Novikov A, Papanicolaou G, Tsogka C. The noise collector for sparse recovery in high dimensions. Proceedings of the National Academy of Sciences. 2020 May 26;117(21):11226-32. link
Meng N, Zhao YB. Newton-step-based hard thresholding algorithms for sparse signal recovery. IEEE Transactions on Signal Processing. 2020 Nov 13;68:6594-606. link
Zhang J, Zhao C, Gao W. Optimization-inspired compact deep compressive sensing. IEEE Journal of Selected Topics in Signal Processing. 2020 Mar 2;14(4):765-74. link
Pooladian AA, Finlay C, Hoheisel T, Oberman A. A principled approach for generating adversarial images under non-smooth dissimilarity metrics. InInternational Conference on Artificial Intelligence and Statistics 2020 Jun 3 (pp. 1442-1452). PMLR. link
McKinney SM, Sieniek M, Godbole V, Godwin J, Antropova N, Ashrafian H, Back T, Chesus M, Corrado GS, Darzi A, Etemadi M. International evaluation of an AI system for breast cancer screening. Nature. 2020 Jan 2;577(7788):89-94. link
Mutny M, Derezinski M, Krause A. Convergence analysis of block coordinate algorithms with determinantal sampling. InInternational Conference on Artificial Intelligence and Statistics 2020 Jun 3 (pp. 3110-3120). PMLR. link
Zhang L, Ma R, Cai TT, Li H. Estimation, confidence intervals, and large-scale hypotheses testing for high-dimensional mixed linear regression. arXiv preprint arXiv:2011.03598. 2020 Nov 6. link
Yang C, Gu Y, Chen B, Ma H, So HC. Learning proximal operator methods for nonconvex sparse recovery with theoretical guarantee. IEEE Transactions on Signal Processing. 2020 Mar 5;68:5244-59. link
Xu J, Tian Y, Sun Y, Scutari G. Accelerated primal-dual algorithms for distributed smooth convex optimization over networks. InInternational Conference on Artificial Intelligence and Statistics 2020 Jun 3 (pp. 2381-2391). PMLR. link
Van Luong H, Joukovsky B, Deligiannis N. Interpretable Deep Recurrent Neural Networks via Unfolding Reweighted $\ell_1$-$\ell_1$ Minimization: Architecture Design and Generalization Analysis. arXiv preprint arXiv:2003.08334. 2020 Mar 18. link
Themelis A, Patrinos P. Douglas--Rachford splitting and ADMM for nonconvex optimization: Tight convergence results. SIAM Journal on Optimization. 2020;30(1):149-81. link
Teboulle M, Vaisbourd Y. Novel proximal gradient methods for nonnegative matrix factorization with sparsity constraints. SIAM Journal on Imaging Sciences. 2020;13(1):381-421. link
Takabe S, Wadayama T. Theoretical interpretation of learned step size in deep-unfolded gradient descent. arXiv preprint arXiv:2001.05142. 2020 Jan 15. link
Song Y, Cao Z, Wu K, Yan Z, Zhang C. Learning fast approximations of sparse nonlinear regression. arXiv preprint arXiv:2010.13490. 2020 Oct 26. link
Shah V, Basu S, Kyrillidis A, Sanghavi S. On generalization of adaptive methods for over-parameterized linear regression. arXiv preprint arXiv:2011.14066. 2020 Nov 28. link
Oymak S, Soltanolkotabi M. Toward moderate overparameterization: Global convergence guarantees for training shallow neural networks. IEEE Journal on Selected Areas in Information Theory. 2020 Apr 29;1(1):84-105. link
Sahiner A, Ergen T, Pauly J, Pilanci M. Vector-output relu neural network problems are copositive programs: Convex analysis of two layer networks and polynomial-time algorithms. arXiv preprint arXiv:2012.13329. 2020 Dec 24. link
Qi J, Du J, Siniscalchi SM, Ma X, Lee CH. Analyzing upper bounds on mean absolute errors for deep neural network-based vector-to-vector regression. IEEE Transactions on Signal Processing. 2020 May 7;68:3411-22. link
Li H, Zhang Q, Cui A, Peng J. Minimization of fraction function penalty in compressed sensing. IEEE Transactions on Neural Networks and Learning Systems. 2019 Jul 15;31(5):1626-37. link
Strohmer T, Wei K. Painless breakups—Efficient demixing of low rank matrices. Journal of Fourier Analysis and Applications. 2019 Feb 15;25:1-31. link
Yang T. Advancing non-convex and constrained learning: Challenges and opportunities. AI Matters. 2019 Dec 6;5(3):29-39. link
Sulam J, Aberdam A, Beck A, Elad M. On multi-layer basis pursuit, efficient algorithms and convolutional neural networks. IEEE Transactions on Pattern Analysis and Machine Intelligence. 2019 Mar 11;42(8):1968-80. link
Li X, Lu J, Arora R, Haupt J, Liu H, Wang Z, Zhao T. Symmetry, saddle points, and global optimization landscape of nonconvex matrix factorization. IEEE Transactions on Information Theory. 2019 Mar 27;65(6):3489-514. link
Jalali S, Yuan X. Snapshot compressed sensing: Performance bounds and algorithms. IEEE Transactions on Information Theory. 2019 Sep 10;65(12):8005-24. link
Alaoui AE, Ramdas A, Krzakala F, Zdeborová L, Jordan MI. Decoding from pooled data: Sharp information-theoretic bounds. SIAM Journal on Mathematics of Data Science. 2019;1(1):161-88. link
Lanza A, Morigi S, Selesnick IW, Sgallari F. Sparsity-inducing nonconvex nonseparable regularization for convex image processing. SIAM Journal on Imaging Sciences. 2019;12(2):1099-134. link
Kwon J, Qian W, Caramanis C, Chen Y, Davis D. Global convergence of the EM algorithm for mixtures of two component linear regression. InConference on Learning Theory 2019 Jun 25 (pp. 2055-2110). PMLR. link
Gamarnik D, Kızıldağ EC, Zadik I. Stationary points of shallow neural networks with quadratic activation function. arXiv preprint arXiv:1912.01599. 2019 Dec 3. link
Allen-Zhu Z, Li Y, Song Z. On the convergence rate of training recurrent neural networks. Advances in Neural Information Processing Systems. 2019;32. link
Du S, Lee J, Li H, Wang L, Zhai X. Gradient descent finds global minima of deep neural networks. InInternational Conference on Machine Learning 2019 May 24 (pp. 1675-1685). PMLR. link
Fercoq O, Alacaoglu A, Necoara I, Cevher V. Almost surely constrained convex optimization. InInternational Conference on Machine Learning 2019 May 24 (pp. 1910-1919). PMLR. link
Catala P, Duval V, Peyré G. A low-rank approach to off-the-grid sparse superresolution. SIAM Journal on Imaging Sciences. 2019;12(3):1464-500. link
Wang Y, Yin W, Zeng J. Global convergence of ADMM in nonconvex nonsmooth optimization. Journal of Scientific Computing. 2019 Jan 15;78:29-63. link
Beygi S, Jalali S, Maleki A, Mitra U. An efficient algorithm for compression-based compressed sensing. Information and Inference: A Journal of the IMA. 2019 Jun;8(2):343-75. link
Sagawa S, Koh PW, Hashimoto TB, Liang P. Distributionally robust neural networks for group shifts: On the importance of regularization for worst-case generalization. arXiv preprint arXiv:1911.08731. 2019 Nov 20. link
Shi Y, Huang J, Jiao Y, Yang Q. A semismooth Newton algorithm for high-dimensional nonconvex sparse learning. IEEE Transactions on Neural Networks and Learning Systems. 2019 Sep 12;31(8):2993-3006. link
Raj A, Li Y, Bresler Y. GAN-based projector for faster recovery with convergence guarantees in linear inverse problems. InProceedings of the IEEE/CVF International Conference on Computer Vision 2019 (pp. 5602-5611). link
Zhang G, Martens J, Grosse RB. Fast convergence of natural gradient descent for over-parameterized neural networks. Advances in Neural Information Processing Systems. 2019;32. link
Chen H, Wu HC, Chan SC, Lam WH. A stochastic quasi-Newton method for large-scale nonconvex optimization with applications. IEEE Transactions on Neural Networks and Learning Systems. 2019 Dec 31;31(11):4776-90. link
Wang R, Xiu N, Zhang C. Greedy projected gradient-Newton method for sparse logistic regression. IEEE Transactions on Neural Networks and Learning Systems. 2019 Apr 11;31(2):527-38. link
Zarka J, Thiry L, Angles T, Mallat S. Deep network classification by scattering and homotopy dictionary learning. arXiv preprint arXiv:1910.03561. 2019 Oct 8. link
Mohammadi M. A projection neural network for the generalized lasso. IEEE Transactions on Neural Networks and Learning Systems. 2019 Aug 7;31(6):2217-21. link
Chi Y, Lu YM, Chen Y. Nonconvex optimization meets low-rank matrix factorization: An overview. IEEE Transactions on Signal Processing. 2019 Aug 23;67(20):5239-69. link
Wang D, Zhang Z. KKT condition-based smoothing recurrent neural network for nonsmooth nonconvex optimization in compressed sensing. Neural Computing and Applications. 2019 Jul 1;31:2905-20. link
Joseph G, Murthy CR. On the convergence of a Bayesian algorithm for joint dictionary learning and sparse recovery. IEEE Transactions on Signal Processing. 2019 Nov 20;68:343-58. link
Lyu K, Li J. Gradient descent maximizes the margin of homogeneous neural networks. arXiv preprint arXiv:1906.05890. 2019 Jun 13. link
Adams Q, Hopfensperger KM, Kim Y, Wu X, Xu W, Shukla H, McGee J, Caster JM, Flynn RT. Effectiveness of rotating shield brachytherapy for prostate cancer dose escalation and urethral sparing. International Journal of Radiation Oncology* Biology* Physics. 2018 Dec 1;102(5):1543-50. link
Li P, Chen W. Matrix Recovery from Rank-One Projection Measurements via Nonconvex Minimization. arXiv preprint arXiv:1806.10803. 2018 Jun 28. link
Mei S, Bai Y, Montanari A. The landscape of empirical risk for nonconvex losses. The Annals of Statistics. 2018 Dec 1;46(6A):2747-74. link
Bartlett P, Helmbold D, Long P. Gradient descent with identity initialization efficiently learns positive definite linear transformations by deep residual networks. InInternational Conference on Machine Learning 2018 Jul 3 (pp. 521-530). PMLR. link
Arora S, Cohen N, Golowich N, Hu W. A convergence analysis of gradient descent for deep linear neural networks. arXiv preprint arXiv:1810.02281. 2018 Oct 4. link
Tran-Dinh Q, Fercoq O, Cevher V. A smooth primal-dual optimization framework for nonsmooth composite convex minimization. SIAM Journal on Optimization. 2018;28(1):96-134. link
Xu T, Zhou Y, Ji K, Liang Y. Convergence of SGD in learning ReLU models with separable data. CoRR. 2018 Jan 1. link
Aggarwal HK, Mani MP, Jacob M. MoDL: Model-based deep learning architecture for inverse problems. IEEE transactions on medical imaging. 2018 Aug 13;38(2):394-405. link
Zou D, Cao Y, Zhou D, Gu Q. Stochastic gradient descent optimizes over-parameterized deep relu networks. arXiv preprint arXiv:1811.08888. 2018 Nov. link
Zhou P, Feng J. Understanding generalization and optimization performance of deep CNNs. InInternational Conference on Machine Learning 2018 Jul 3 (pp. 5960-5969). PMLR. link
Hsieh YP, Kao YC, Mahabadi RK, Yurtsever A, Kyrillidis A, Cevher V. A non-Euclidean gradient descent framework for non-convex matrix factorization. IEEE Transactions on Signal Processing. 2018 Sep 14;66(22):5917-26. link
Kümmerle C, Sigl J. Harmonic mean iteratively reweighted least squares for low-rank matrix recovery. Journal of Machine Learning Research. 2018;19(47):1-49. link
Zhu Z, Li G, Ding J, Li Q, He X. On collaborative compressive sensing systems: The framework, design, and algorithm. SIAM Journal on Imaging Sciences. 2018;11(2):1717-58. link
Zhang X, Wang L, Gu Q. A unified framework for nonconvex low-rank plus sparse matrix recovery. InInternational Conference on Artificial Intelligence and Statistics 2018 Mar 31 (pp. 1097-1107). PMLR. link
Fu H, Chi Y, Liang Y. Local geometry of one-hidden-layer neural networks for logistic regression. arXiv preprint arXiv:1802.06463. 2018 Feb. link
Cao Y, Ding GW, Lui KY, Huang R. Improving GAN training via binarized representation entropy (BRE) regularization. arXiv preprint arXiv:1805.03644. 2018 May 9. link
Schiebinger G, Robeva E, Recht B. Superresolution without separation. Information and Inference: A Journal of the IMA. 2018 Mar 15;7(1):1-30. link
Boyd N, Schiebinger G, Recht B. The alternating descent conditional gradient method for sparse inverse problems. SIAM Journal on Optimization. 2017;27(2):616-39. link
Van Nguyen Q, Fercoq O, Cevher V. Smoothing technique for nonsmooth composite minimization with linear operator. arXiv preprint arXiv:1706.05837. 2017 Jun 19. link
Selesnick I. Sparse regularization via convex analysis. IEEE Transactions on Signal Processing. 2017 Jun 2;65(17):4481-94. link
Obermeier R, Martinez-Lorenzo JA. Sensing matrix design via mutual coherence minimization for electromagnetic compressive imaging applications. IEEE Transactions on Computational Imaging. 2017 Feb 17;3(2):217-29. link
Dadkhah H, Hopfensperger KM, Kim Y, Wu X, Flynn RT. Multisource rotating shield brachytherapy apparatus for prostate cancer. International Journal of Radiation Oncology* Biology* Physics. 2017 Nov 1;99(3):719-28. link
Ghayem F, Sadeghi M, Babaie-Zadeh M, Chatterjee S, Skoglund M, Jutten C. Sparse signal recovery using iterative proximal projection. IEEE Transactions on Signal Processing. 2017 Nov 29;66(4):879-94. link
Wang L, Zhang X, Gu Q. A universal variance reduction-based catalyst for nonconvex low-rank matrix recovery. arXiv preprint arXiv:1701.02301. 2017 Jan 9. link
Jin C, Ge R, Netrapalli P, Kakade SM, Jordan MI. How to escape saddle points efficiently. InInternational Conference on Machine Learning 2017 Jul 17 (pp. 1724-1732). PMLR. link
Balcan MF, Liang Y, Woodruff DP, Zhang H. Matrix completion and related problems via strong duality. arXiv preprint arXiv:1704.08683. 2017 Apr 27. link
Ongie G, Willett R, Nowak RD, Balzano L. Algebraic variety models for high-rank matrix completion. InInternational Conference on Machine Learning 2017 Jul 17 (pp. 2691-2700). PMLR. link
Ciliberto C, Stamos D, Pontil M. Reexamining low rank matrix factorization for trace norm regularization. arXiv preprint arXiv:1706.08934. 2017 Jun 27. link
Ge R, Lee JD, Ma T. Learning one-hidden-layer neural networks with landscape design. arXiv preprint arXiv:1711.00501. 2017 Nov 1. link
Jain P, Kar P. Non-convex optimization for machine learning. Foundations and Trends® in Machine Learning. 2017 Dec 3;10(3-4):142-363. link
Zellinger W, Grubinger T, Lughofer E, Natschläger T, Saminger-Platz S. Central moment discrepancy (CMD) for domain-invariant representation learning. arXiv preprint arXiv:1702.08811. 2017 Feb 28. link
Curtó JD, Zarza IC, De La Torre F, King I, Lyu MR. High-resolution deep convolutional generative adversarial networks. arXiv preprint arXiv:1711.06491. 2017 Nov 17. link
Hu Y, Li C, Meng K, Qin J, Yang X. Group sparse optimization via lp, q regularization. Journal of Machine Learning Research. 2017;18(30):1-52. link
Shi Q, Hong M, Fu X, Chang TH. Penalty dual decomposition method for nonsmooth nonconvex optimization. arXiv preprint arXiv:1712.04767. 2017 Dec 13. link
Zhao R, Tan VY. A unified convergence analysis of the multiplicative update algorithm for regularized nonnegative matrix factorization. IEEE Transactions on Signal Processing. 2017 Sep 28;66(1):129-38. link
Zhao R, Haskell WB, Tan VY. Stochastic L-BFGS: Improved convergence rates and practical acceleration strategies. IEEE Transactions on Signal Processing. 2017 Dec 18;66(5):1155-69. link
Sigl J. Nonlinear residual minimization by iteratively reweighted least squares. Computational Optimization and Applications. 2016 Jul;64(3):755-92. link
Chartrand R, Yin W. Nonconvex sparse regularization and splitting algorithms. Splitting methods in communication, imaging, science, and engineering. 2016:237-49. link
Tanner J, Wei K. Low rank matrix completion by alternating steepest descent methods. Applied and Computational Harmonic Analysis. 2016 Mar 1;40(2):417-29. link
Ge R, Lee JD, Ma T. Matrix completion has no spurious local minimum. Advances in Neural Information Processing Systems. 2016;29. link
Yi X, Park D, Chen Y, Caramanis C. Fast algorithms for robust PCA via gradient descent. Advances in Neural Information Processing Systems. 2016;29. link
Condat L. Fast projection onto the simplex and the l1 ball. Mathematical Programming. 2016 Jul;158:575-85.
Hardt M, Recht B, Singer Y. Train faster, generalize better: Stability of stochastic gradient descent. InInternational Conference on Machine Learning 2016 Jun 11 (pp. 1225-1234). PMLR. link
Tu S, Boczar R, Simchowitz M, Soltanolkotabi M, Recht B. Low-rank solutions of linear matrix equations via procrustes flow. InInternational Conference on Machine Learning 2016 Jun 11 (pp. 964-973). PMLR. link
Nishihara R, Lessard L, Recht B, Packard A, Jordan M. A general analysis of the convergence of ADMM. InInternational Conference on Machine Learning 2015 Jun 1 (pp. 343-352). PMLR. link
Gill PE, Wong E. Methods for convex and general quadratic programming. Mathematical Programming Computation. 2015 Mar;7(1):71-112. link
Liu Q, Wang J. $L_1$-minimization algorithms for sparse signal reconstruction based on a projection neural network. IEEE Transactions on Neural Networks and Learning Systems. 2015 Oct 26;27(3):698-707. link
Bai H, Li G, Li S, Li Q, Jiang Q, Chang L. Alternating optimization of sensing matrix and sparsifying dictionary for compressed sensing. IEEE Transactions on Signal Processing. 2015 Feb 3;63(6):1581-94. link
De Sa C, Re C, Olukotun K. Global convergence of stochastic gradient descent for some non-convex matrix problems. InInternational Conference on Machine Learning 2015 Jun 1 (pp. 2332-2341). PMLR. link
Chen W, Li Y. Stable recovery of low-rank matrix via nonconvex Schatten p-minimization. Science China Mathematics. 2015 Dec;58(12):2643-54. link
Jain P, Netrapalli P. Fast exact matrix completion with finite samples. InConference on Learning Theory 2015 Jun 26 (pp. 1007-1034). PMLR. link
Choromanska A, Henaff M, Mathieu M, Arous GB, LeCun Y. The loss surfaces of multilayer networks. InArtificial Intelligence and Statistics 2015 Feb 21 (pp. 192-204). PMLR. link
Haeffele BD, Vidal R. Global optimality in tensor factorization, deep learning, and beyond. arXiv preprint arXiv:1506.07540. 2015 Jun 24. link
Janzamin M, Sedghi H, Anandkumar A. Beating the perils of non-convexity: Guaranteed training of neural networks using tensor methods. arXiv preprint arXiv:1506.08473. 2015 Jun 28. link
Souiai M, Oswald MR, Kee Y, Kim J, Pollefeys M, Cremers D. Entropy minimization for convex relaxation approaches. InProceedings of the IEEE International Conference on Computer Vision 2015 (pp. 1778-1786). link
Lu C, Tang J, Yan S, Lin Z. Generalized nonconvex nonsmooth low-rank minimization. InProceedings of the IEEE Conference on Computer Vision and Pattern Recognition 2014 (pp. 4130-4137). link
Hardt M. Understanding alternating minimization for matrix completion. In2014 IEEE 55th Annual Symposium on Foundations of Computer Science 2014 Oct 18 (pp. 651-660). IEEE. link
Artacho FJ, Borwein JM, Tam MK. Douglas–Rachford feasibility methods for matrix completion problems. The ANZIAM Journal. 2014 Apr;55(4):299-326. link
Netrapalli P, Niranjan UN, Sanghavi S, Anandkumar A, Jain P. Non-convex robust PCA. Advances in Neural Information Processing Systems. 2014;27. link
Liu Y, Flynn RT, Kim Y, Wu X. Asymmetric dose–volume optimization with smoothness control for rotating‐shield brachytherapy. Medical physics. 2014 Nov;41(11):111709. link
Iwen MA. Compressed sensing with sparse binary matrices: Instance optimal error guarantees in near-optimal time. Journal of Complexity. 2014 Feb 1;30(1):1-5. link
Yang AY, Zhou Z, Balasubramanian AG, Sastry SS, Ma Y. Fast ℓ1-minimization algorithms for robust face recognition. IEEE Transactions on Image Processing. 2013 May 13;22(8):3234-46. link
Candes E, Recht B. Simple bounds for recovering low-complexity models. Mathematical Programming. 2013 Oct;141(1):577-89. link
Tang G, Bhaskar BN, Shah P, Recht B. Compressed sensing off the grid. IEEE transactions on information theory. 2013 Aug 7;59(11):7465-90. link
Donoho DL, Johnstone I, Montanari A. Accurate prediction of phase transitions in compressed sensing via a connection to minimax denoising. IEEE transactions on information theory. 2013 Jan 10;59(6):3396-433. link
Jain P, Netrapalli P, Sanghavi S. Low-rank matrix completion using alternating minimization. InProceedings of the forty-fifth annual ACM symposium on Theory of computing 2013 Jun 1 (pp. 665-674). link
Kyrillidis A, Baldassarre L, Halabi ME, Tran-Dinh Q, Cevher V. Structured sparsity: Discrete and convex approaches. InCompressed Sensing and its Applications: MATHEON Workshop 2013 2015 (pp. 341-387). Springer International Publishing. link
Xu Y, Yin W, Wen Z, Zhang Y. An alternating direction algorithm for matrix completion with nonnegative factors. Frontiers of Mathematics in China. 2012 Apr;7:365-84. link
Chandrasekaran V, Recht B, Parrilo PA, Willsky AS. The convex geometry of linear inverse problems. Foundations of Computational mathematics. 2012 Dec;12(6):805-49. link
Recht B. A simpler approach to matrix completion. Journal of Machine Learning Research. 2011 Dec 1;12(12). link
Juditsky A, Nemirovski A. On verifiable sufficient conditions for sparse signal recovery via ℓ1 minimization. Mathematical programming. 2011 Mar;127:57-88. link
Foygel R, Srebro N. Concentration-based guarantees for low-rank matrix reconstruction. InProceedings of the 24th Annual Conference on Learning Theory 2011 Dec 21 (pp. 315-340). JMLR Workshop and Conference Proceedings. link
Chi Y, Scharf LL, Pezeshki A, Calderbank AR. Sensitivity to basis mismatch in compressed sensing. IEEE Transactions on Signal Processing. 2011 Feb 10;59(5):2182-95. link
Becker SR, Candès EJ, Grant MC. Templates for convex cone problems with applications to sparse signal recovery. Mathematical programming computation. 2011 Sep;3:165-218. link
Loh PL, Wainwright MJ. High-dimensional regression with noisy and missing data: Provable guarantees with non-convexity. Advances in neural information processing systems. 2011;24. link
Friedlander MP, Mansour H, Saab R, Yilmaz Ö. Recovering compressively sampled signals using partial support information. IEEE Transactions on Information Theory. 2011 Sep 5;58(2):1122-34. link
Yun S, Toh KC. A coordinate gradient descent method for ℓ1-regularized convex minimization. Computational Optimization and Applications. 2011 Mar;48:273-307. link
Yang J, Zhang Y. Alternating direction algorithms for ℓ1-problems in compressive sensing. SIAM journal on scientific computing. 2011;33(1):250-78. link
Calderbank R, Howard S, Jafarpour S. Construction of a large class of deterministic sensing matrices that satisfy a statistical isometry property. IEEE journal of selected topics in signal processing. 2010 Feb 22;4(2):358-74. link
Shalev-Shwartz S, Tewari A. Stochastic methods for ℓ1 regularized loss minimization. InProceedings of the 26th Annual International Conference on Machine Learning 2009 Jun 14 (pp. 929-936). link
Beck A, Teboulle M. A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM journal on imaging sciences. 2009 Jan 1;2(1):183-202. link
Goldstein T, Osher S. The split Bregman method for L1-regularized problems. SIAM journal on imaging sciences. 2009;2(2):323-43. link
Xu W, Hassibi B. Compressed sensing over the Grassmann manifold: A unified analytical framework. In2008 46th Annual Allerton Conference on Communication, Control, and Computing 2008 Sep 23 (pp. 562-567). IEEE. link
Duchi J, Shalev-Shwartz S, Singer Y, Chandra T. Efficient projections onto the ℓ1-ball for learning in high dimensions. InProceedings of the 25th international conference on Machine learning 2008 Jul 5 (pp. 272-279). link
Candes EJ, Wakin MB, Boyd SP. Enhancing sparsity by reweighted ℓ1 minimization. Journal of Fourier analysis and applications. 2008 Dec;14:877-905. link
Chartrand R, Yin W. Iteratively reweighted algorithms for compressive sensing. In2008 IEEE international conference on acoustics, speech and signal processing 2008 Mar 31 (pp. 3869-3872). IEEE. link
Donoho DL. High-dimensional centrally symmetric polytopes with neighborliness proportional to dimension. Discrete & Computational Geometry. 2006 May;35:617-52. link
Donoho DL, Tanner J. Neighborliness of randomly projected simplices in high dimensions. Proceedings of the National Academy of Sciences. 2005 Jul 5;102(27):9452-7. link
Hill SI, Williamson RC. Convergence of exponentiated gradient algorithms. IEEE Transactions on Signal Processing. 2002 Aug 7;49(6):1208-15. link
Gubin LG, Polyak BT, Raik EV. The method of projections for finding the common point of convex sets. USSR Computational Mathematics and Mathematical Physics. 1967 Jan 1;7(6):1-24. link
https://github.com/ngcthuong/Reproducible-Deep-Compressive-Sensing
Hollmann N, Müller S, Purucker L, Krishnakumar A, Körfer M, Hoo SB, Schirrmeister RT, Hutter F. Accurate predictions on small data with a tabular foundation model. Nature. 2025 Jan 9;637(8045):319-26. link
Yi J, Gao J, Wang T, Wu X, Xu W. Outlier Detection Using Generative Models with Theoretical Performance Guarantees. IEEE transactions on information theory. 2024 Dec 11. link
Chand JR, Jacob M. Multi-scale energy (muse) framework for inverse problems in imaging. IEEE transactions on computational imaging. 2024 Aug 23. link
Shen M, Gan H, Ma C, Ning C, Li H, Liu F. MTC-CSNet: Marrying transformer and convolution for image compressed sensing. IEEE Transactions on Cybernetics. 2024 Feb 26. link
Gan H, Shen M, Hua Y, Ma C, Zhang T. From patch to pixel: A transformer-based hierarchical framework for compressive image sensing. IEEE Transactions on Computational Imaging. 2023 Feb 22;9:133-46. link
Zhong Y, Zhang C, Li J. Image compressed sensing reconstruction via deep image prior with structure-texture decomposition. IEEE Signal Processing Letters. 2023 Feb 2;30:85-9. link
Fei B, Lyu Z, Pan L, Zhang J, Yang W, Luo T, Zhang B, Dai B. Generative diffusion prior for unified image restoration and enhancement. InProceedings of the IEEE/CVF conference on computer vision and pattern recognition 2023 (pp. 9935-9946). link
Zhang Z, Chen M, Wang M, Liao W, Zhao T. Effective minkowski dimension of deep nonparametric regression: function approximation and statistical theories. InInternational Conference on Machine Learning 2023 Jul 3 (pp. 40911-40931). PMLR. link
Zhang H, Li Y, Lu W, Lin Q. On the optimality of misspecified kernel ridge regression. InInternational Conference on Machine Learning 2023 Jul 3 (pp. 41331-41353). PMLR. link
Mao Y, Jiang L, Chen X, Li C. Disc-diff: Disentangled conditional diffusion model for multi-contrast mri super-resolution. InInternational Conference on Medical Image Computing and Computer-Assisted Intervention 2023 Oct 1 (pp. 387-397). Cham: Springer Nature Switzerland. link
Luo H, Bao J, Wu Y, He X, Li T. Segclip: Patch aggregation with learnable centers for open-vocabulary semantic segmentation. InInternational Conference on Machine Learning 2023 Jul 3 (pp. 23033-23044). PMLR. link
Lin B, Ye Y, Zhu B, Cui J, Ning M, Jin P, Yuan L. Video-llava: Learning united visual representation by alignment before projection. arXiv preprint arXiv:2311.10122. 2023 Nov 16. link
Kotelnikov A, Baranchuk D, Rubachev I, Babenko A. Tabddpm: Modelling tabular data with diffusion models. InInternational Conference on Machine Learning 2023 Jul 3 (pp. 17564-17579). PMLR. link
Laousy O, Araujo A, Chassagnon G, Paragios N, Revel MP, Vakalopoulou M. Certification of deep learning models for medical image segmentation. InInternational Conference on Medical Image Computing and Computer-Assisted Intervention 2023 Oct 1 (pp. 611-621). Cham: Springer Nature Switzerland. link
Lin Z, Liu C, Zhang R, Gao P, Qiu L, Xiao H, Qiu H, Lin C, Shao W, Chen K, Han J. Sphinx: The joint mixing of weights, tasks, and visual embeddings for multi-modal large language models. arXiv preprint arXiv:2311.07575. 2023 Nov 13. link
Choi J, Park Y, Kang M. Restoration based generative models. arXiv preprint arXiv:2303.05456. 2023 Feb 20. link
Daras G, Dagan Y, Dimakis AG, Daskalakis C. Score-guided intermediate layer optimization: Fast Langevin mixing for inverse problems. arXiv preprint arXiv:2206.09104. 2022 Jun 18. link
Li B, Liu X, Hu P, Wu Z, Lv J, Peng X. All-in-one image restoration for unknown corruption. InProceedings of the IEEE/CVF conference on computer vision and pattern recognition 2022 (pp. 17452-17462). link
Lu R, Ye K. Tree-structured dilated convolutional networks for image compressed sensing. IEEE Access. 2022 Sep 14;10:98374-83. link
Chen B, Zhang J. Content-aware scalable deep compressed sensing. IEEE Transactions on Image Processing. 2022 Aug 10;31:5412-26. link
Jia M, Yu L, Bai W, Zhang P, Zhang L, Wang W, Gao F. Single pixel imaging via unsupervised deep compressive sensing with collaborative sparsity in discretized feature space. Journal of Biophotonics. 2022 Jul;15(7):e202200045. link
Zhang K, Hua Z, Li Y, Chen Y, Zhou Y. Ams-net: Adaptive multi-scale network for image compressive sensing. IEEE Transactions on Multimedia. 2022 Aug 15;25:5676-89. link
Shen M, Gan H, Ning C, Hua Y, Zhang T. TransCS: A transformer-based hybrid architecture for image compressed sensing. IEEE Transactions on Image Processing. 2022 Nov 1;31:6991-7005. link
Lorenzana MB, Engstrom C, Chandra SS. Transformer compressed sensing via global image tokens. In2022 IEEE International Conference on Image Processing (ICIP) 2022 Oct 16 (pp. 3011-3015). IEEE. link
Lee B, Ko K, Hong J, Ku B, Ko H. Information bottleneck measurement for compressed sensing image reconstruction. IEEE Signal Processing Letters. 2022 Sep 8;29:1943-7. link
Wang X, Li Y, Zhang H, Shan Y. Towards real-world blind face restoration with generative facial prior. InProceedings of the IEEE/CVF conference on computer vision and pattern recognition 2021 (pp. 9168-9178). link
Huang W, Hand P, Heckel R, Voroninski V. A provably convergent scheme for compressive sensing under random generative priors. Journal of Fourier Analysis and Applications. 2021 Apr;27:1-34. link
Jalal A, Arvinte M, Daras G, Price E, Dimakis AG, Tamir J. Robust compressed sensing mri with deep generative priors. Advances in Neural Information Processing Systems. 2021 Dec 6;34:14938-54. link
Chen X, Derezinski M. Query complexity of least absolute deviation regression via robust uniform convergence. InConference on Learning Theory 2021 Jul 21 (pp. 1144-1179). PMLR. link
Charisopoulos V, Chen Y, Davis D, Díaz M, Ding L, Drusvyatskiy D. Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence. Foundations of Computational Mathematics. 2021 Dec;21(6):1505-93. link
Kim KS, Lee JH, Yang E. Compressed sensing via measurement-conditional generative models. IEEE Access. 2021 Nov 17;9:155335-52. link
Cui W, Liu S, Jiang F, Zhao D. Image compressed sensing using non-local neural network. IEEE Transactions on multimedia. 2021 Dec 3;25:816-30. link
Song J, Chen B, Zhang J. Memory-augmented deep unfolding network for compressive sensing. InProceedings of the 29th ACM international conference on multimedia 2021 Oct 17 (pp. 4249-4258). link
Zheng R, Zhang Y, Huang D, Chen Q. Sequential convolution and runge-kutta residual architecture for image compressed sensing. InComputer Vision–ECCV 2020: 16th European Conference, Glasgow, UK, August 23–28, 2020, Proceedings, Part IX 16 2020 (pp. 232-248). Springer International Publishing. link
Zhang Z, Liu Y, Liu J, Wen F, Zhu C. AMP-Net: Denoising-based deep unfolding for compressive image sensing. IEEE Transactions on Image Processing. 2020 Dec 18;30:1487-500. link
Kasem HM, Selim MM, Mohamed EM, Hussein AH. DRCS-SR: Deep robust compressed sensing for single image super-resolution. IEEE Access. 2020 Sep 15;8:170618-34. link
Zhang J, Zhao C, Gao W. Optimization-inspired compact deep compressive sensing. IEEE Journal of Selected Topics in Signal Processing. 2020 Mar 2;14(4):765-74. link
Zhang J, Li Y, Yu ZL, Gu Z, Cheng Y, Gong H. Deep Unfolding With Weighted ℓ₂ Minimization for Compressive Sensing. IEEE Internet of Things Journal. 2020 Sep 4;8(4):3027-41. link
Qiao M, Meng Z, Ma J, Yuan X. Deep learning for video compressive sensing. Apl Photonics. 2020 Mar 1;5(3). link
Ran M, Xia W, Huang Y, Lu Z, Bao P, Liu Y, Sun H, Zhou J, Zhang Y. MD-Recon-Net: a parallel dual-domain convolutional neural network for compressed sensing MRI. IEEE Transactions on Radiation and Plasma Medical Sciences. 2020 May 1;5(1):120-35. link
Huang B, Zhou J, Yan X, Jing ME, Wan R, Fan Y. Cs-mcnet: A video compressive sensing reconstruction network with interpretable motion compensation. InProceedings of the Asian Conference on Computer Vision 2020. link
Deora P, Vasudeva B, Bhattacharya S, Pradhan PM. Structure preserving compressive sensing MRI reconstruction using generative adversarial networks. InProceedings of the IEEE/CVF conference on computer vision and pattern recognition workshops 2020 (pp. 522-523). link
Popilka B, Setzer S, Steidl G. Signal recovery from incomplete measurements in the presence of outliers. Inverse Problems and Imaging. 2007 Nov 1;1(4):661. link
Polykovskiy D, Vetrov D. Deterministic decoding for discrete data in variational autoencoders. InInternational conference on artificial intelligence and statistics 2020 Jun 3 (pp. 3046-3056). PMLR. link
Liu L, Shen Y, Li T, Caramanis C. High dimensional robust sparse regression. InInternational Conference on Artificial Intelligence and Statistics 2020 Jun 3 (pp. 411-421). PMLR. link
Daskalakis C, Rohatgi D, Zampetakis E. Constant-expansion suffices for compressed sensing with generative priors. Advances in Neural Information Processing Systems. 2020;33:13917-26. link
Wang J, Huang J, Zhang F, Wang W. Group sparse recovery in impulsive noise via alternating direction method of multipliers. Applied and Computational Harmonic Analysis. 2020 Nov 1;49(3):831-62. link
Calderon D, Juba B, Li S, Li Z, Ruan L. Conditional linear regression. InInternational Conference on Artificial Intelligence and Statistics 2020 Jun 3 (pp. 2164-2173). PMLR. link
Macris N, Rush C. All-or-nothing statistical and computational phase transitions in sparse spiked matrix estimation. Advances in Neural Information Processing Systems. 2020;33:14915-26. link
Aberdam A, Simon D, Elad M. When and how can deep generative models be inverted?. arXiv preprint arXiv:2006.15555. 2020 Jun 28. link
Asim M, Daniels M, Leong O, Ahmed A, Hand P. Invertible generative models for inverse problems: mitigating representation error and dataset bias. InInternational conference on machine learning 2020 Nov 21 (pp. 399-409). PMLR. link
Cocola J, Hand P, Voroninski V. Nonasymptotic guarantees for spiked matrix recovery with generative priors. Advances in Neural Information Processing Systems. 2020;33:15185-97. link
Cherapanamjeri Y, Aras E, Tripuraneni N, Jordan MI, Flammarion N, Bartlett PL. Optimal robust linear regression in nearly linear time. arXiv preprint arXiv:2007.08137. 2020 Jul 16. link
Pandit P, Sahraee-Ardakan M, Rangan S, Schniter P, Fletcher AK. Inference with deep generative priors in high dimensions. IEEE Journal on Selected Areas in Information Theory. 2020 Apr 8;1(1):336-47. link
Koh PW, Nguyen T, Tang YS, Mussmann S, Pierson E, Kim B, Liang P. Concept bottleneck models. InInternational conference on machine learning 2020 Nov 21 (pp. 5338-5348). PMLR. link
Zeng C, Wu C, Jia R. Non-Lipschitz models for image restoration with impulse noise removal. SIAM Journal on Imaging Sciences. 2019;12(1):420-58. link
Shen Y, Sanghavi S. Learning with bad training data via iterative trimmed loss minimization. InInternational conference on machine learning 2019 May 24 (pp. 5739-5748). PMLR. link
Heckel R, Soltanolkotabi M. Denoising and regularization via exploiting the structural bias of convolutional generators. arXiv preprint arXiv:1910.14634. 2019 Oct 31. link
Latorre F, Cevher V. Fast and provable ADMM for learning with generative priors. Advances in Neural Information Processing Systems. 2019;32. link
Ma J, Liu XY, Shou Z, Yuan X. Deep tensor admm-net for snapshot compressive imaging. InProceedings of the IEEE/CVF International Conference on Computer Vision 2019 (pp. 10223-10232). link
Liu R, Zhang Y, Cheng S, Fan X, Luo Z. A theoretically guaranteed deep optimization framework for robust compressive sensing mri. InProceedings of the AAAI Conference on Artificial Intelligence 2019 Jul 17 (Vol. 33, No. 01, pp. 4368-4375). link
Shi W, Jiang F, Liu S, Zhao D. Image compressed sensing using convolutional neural network. IEEE Transactions on Image Processing. 2019 Jul 17;29:375-88. link
Shi W, Jiang F, Liu S, Zhao D. Scalable convolutional neural network for image compressed sensing. InProceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2019 (pp. 12290-12299). link
Canh TN, Jeon B. Difference of convolution for deep compressive sensing. In2019 IEEE International Conference on Image Processing (ICIP) 2019 Sep 22 (pp. 2105-2109). IEEE. link
Yang Y, Sun J, Li H, Xu Z. ADMM-CSNet: A deep learning approach for image compressive sensing. IEEE transactions on pattern analysis and machine intelligence. 2018 Nov 28;42(3):521-38. link
Hammernik K, Klatzer T, Kobler E, Recht MP, Sodickson DK, Pock T, Knoll F. Learning a variational network for reconstruction of accelerated MRI data. Magnetic resonance in medicine. 2018 Jun;79(6):3055-71. link
Kabkab M, Samangouei P, Chellappa R. Task-aware compressed sensing with generative adversarial networks. InProceedings of the AAAI conference on artificial intelligence 2018 Apr 26 (Vol. 32, No. 1). link
Seitzer M, Yang G, Schlemper J, Oktay O, Würfl T, Christlein V, Wong T, Mohiaddin R, Firmin D, Keegan J, Rueckert D. Adversarial and perceptual refinement for compressed sensing MRI reconstruction. InInternational conference on medical image computing and computer-assisted intervention 2018 Sep 16 (pp. 232-240). Cham: Springer International Publishing. link
Iliadis M, Spinoulas L, Katsaggelos AK. Deep fully-connected networks for video compressive sensing. Digital Signal Processing. 2018 Jan 1;72:9-18. link
Cui W, Xu H, Gao X, Zhang S, Jiang F, Zhao D. An efficient deep convolutional Laplacian pyramid architecture for CS reconstruction at low sampling ratios. In2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) 2018 Apr 15 (pp. 1748-1752). IEEE. link
Du J, Xie X, Wang C, Shi G. Perceptual compressive sensing. InPattern Recognition and Computer Vision: First Chinese Conference, PRCV 2018, Guangzhou, China, November 23-26, 2018, Proceedings, Part III 1 2018 (pp. 268-279). Springer International Publishing. link
Zhang J, Ghanem B. ISTA-Net: Interpretable optimization-inspired deep network for image compressive sensing. InProceedings of the IEEE conference on computer vision and pattern recognition 2018 (pp. 1828-1837). link
Dhar M, Grover A, Ermon S. Modeling sparse deviations for compressed sensing using generative models. InInternational Conference on Machine Learning 2018 Jul 3 (pp. 1214-1223). PMLR. link
Shah V, Hegde C. Solving linear inverse problems using gan priors: An algorithm with provable guarantees. In2018 IEEE international conference on acoustics, speech and signal processing (ICASSP) 2018 Apr 15 (pp. 4609-4613). IEEE. link
Vaswani N, Narayanamurthy P. Static and dynamic robust PCA and matrix completion: A review. Proceedings of the IEEE. 2018 Aug 6;106(8):1359-79. link
Hand P, Voroninski V. Global guarantees for enforcing deep generative priors by empirical risk. InConference On Learning Theory 2018 Jul 3 (pp. 970-978). PMLR. link
Zhang X, Wang L, Gu Q. A nonconvex free lunch for low-rank plus sparse matrix recovery. arXiv preprint arXiv:1702.06525. 2017 Feb. link
Cherapanamjeri Y, Gupta K, Jain P. Nearly optimal robust matrix completion. InInternational Conference on Machine Learning 2017 Jul 17 (pp. 797-805). PMLR. link
Wan Q, Duan H, Fang J, Li H, Xing Z. Robust Bayesian compressed sensing with outliers. Signal Processing. 2017 Nov 1;140:104-9. link
Bora A, Jalal A, Price E, Dimakis AG. Compressed sensing using generative models. InInternational conference on machine learning 2017 Jul 17 (pp. 537-546). PMLR. link
Yang G, Yu S, Dong H, Slabaugh G, Dragotti PL, Ye X, Liu F, Arridge S, Keegan J, Guo Y, Firmin D. DAGAN: deep de-aliasing generative adversarial networks for fast compressed sensing MRI reconstruction. IEEE transactions on medical imaging. 2017 Dec 21;37(6):1310-21. link
Liu Q, Leung H. Synthesis-analysis deconvolutional network for compressed sensing. In2017 IEEE International Conference on Image Processing (ICIP) 2017 Sep 17 (pp. 1940-1944). IEEE. link
Metzler C, Mousavi A, Baraniuk R. Learned D-AMP: Principled neural network based compressive image recovery. Advances in neural information processing systems. 2017;30. link
Perdios D, Besson A, Arditi M, Thiran JP. A deep learning approach to ultrasound image recovery. In2017 IEEE international ultrasonics symposium (IUS) 2017 Sep 6 (pp. 1-4). Ieee. link
Mousavi A, Baraniuk RG. Learning to invert: Signal recovery via deep convolutional networks. In2017 IEEE international conference on acoustics, speech and signal processing (ICASSP) 2017 Mar 5 (pp. 2272-2276). IEEE. link
Kulkarni K, Lohit S, Turaga P, Kerviche R, Ashok A. Reconnet: Non-iterative reconstruction of images from compressively sensed measurements. InProceedings of the IEEE conference on computer vision and pattern recognition 2016 (pp. 449-458). link
Palangi H, Ward R, Deng L. Distributed compressive sensing: A deep learning approach. IEEE Transactions on Signal Processing. 2016 Apr 21;64(17):4504-18. link
Xu K, Ren F. CSVideoNet: A recurrent convolutional neural network for compressive sensing video reconstruction. arXiv preprint arXiv:1612.05203. 2016 Dec. link
Mu C, Zhang Y, Wright J, Goldfarb D. Scalable robust matrix recovery: Frank–Wolfe meets proximal methods. SIAM Journal on Scientific Computing. 2016;38(5):A3291-317. link
Wipf D. Non-convex rank minimization via an empirical Bayesian approach. arXiv preprint arXiv:1408.2054. 2014 Aug 9. link
Foygel R, Mackey L. Corrupted sensing: Novel guarantees for separating structured signals. IEEE Transactions on Information Theory. 2014 Jan 14;60(2):1223-47. link
Chen Y, Jalali A, Sanghavi S, Caramanis C. Low-rank matrix recovery from errors and erasures. IEEE Transactions on Information Theory. 2013 Mar 7;59(7):4324-37. link
Mitra K, Veeraraghavan A, Chellappa R. Analysis of sparse regularization based robust regression approaches. IEEE Transactions on Signal Processing. 2012 Nov 27;61(5):1249-57. link
Studer C, Kuppinger P, Pope G, Bolcskei H. Recovery of sparsely corrupted signals. IEEE Transactions on Information Theory. 2011 Dec 14;58(5):3115-30. link
Mahoney MW. Randomized algorithms for matrices and data. Foundations and Trends® in Machine Learning. 2011 Nov 21;3(2):123-224. link
Candès EJ, Li X, Ma Y, Wright J. Robust principal component analysis?. Journal of the ACM (JACM). 2011 Jun 9;58(3):1-37. link
Xu H, Caramanis C, Sanghavi S. Robust PCA via outlier pursuit. Advances in neural information processing systems. 2010;23. link
Wright J, Ma Y. Dense error correction via ℓ1-minimization. IEEE Transactions on Information Theory. 2010 Jun 14;56(7):3540-60. link
Lin Z, Chen M, Ma Y. The augmented lagrange multiplier method for exact recovery of corrupted low-rank matrices. arXiv preprint arXiv:1009.5055. 2010 Sep 26. link
Carrillo RE, Barner KE, Aysal TC. Robust sampling and reconstruction methods for sparse signals in the presence of impulsive noise. IEEE Journal of Selected Topics in Signal Processing. 2010 Feb 22;4(2):392-408. link
Wright J, Ganesh A, Rao S, Peng Y, Ma Y. Robust principal component analysis: Exact recovery of corrupted low-rank matrices via convex optimization. Advances in neural information processing systems. 2009;22. link
Lin Z, Ganesh A, Wright J, Wu L, Chen M, Ma Y. Fast convex optimization algorithms for exact recovery of a corrupted low-rank matrix. Coordinated Science Laboratory Report no. UILU-ENG-09-2214, DC-246. 2009. link
Phan DN, Gillis N. An inertial block majorization minimization framework for nonsmooth nonconvex optimization. Journal of Machine Learning Research. 2023;24(18):1-41. link
Cai JF, Choi JK, Li J, Wei K. Image restoration: structured low rank matrix framework for piecewise smooth functions and beyond. Applied and Computational Harmonic Analysis. 2022 Jan 1;56:26-60. link
Liu P, Zhang H. A theory of computational resolution limit for line spectral estimation. IEEE Transactions on Information Theory. 2021 Apr 22;67(7):4812-27. link
Chen Y, Chi Y, Fan J, Ma C. Spectral methods for data science: A statistical perspective. Foundations and Trends® in Machine Learning. 2021 Oct 20;14(5):566-806. link
Gribonval R, Nikolova M. On Bayesian estimation and proximity operators. Applied and Computational Harmonic Analysis. 2021 Jan 1;50:49-72. link
Chen J, Gao W, Wei K. Exact matrix completion based on low rank Hankel structure in the Fourier domain. Applied and Computational Harmonic Analysis. 2021 Nov 1;55:149-84. link
Li Y, Chi Y, Zhang H, Liang Y. Non-convex low-rank matrix recovery with arbitrary outliers via median-truncated gradient descent. Information and Inference: A Journal of the IMA. 2020 Jun;9(2):289-325. link
Cuyt A, Lee WS. How to get high resolution results from sparse and coarsely sampled data. Applied and Computational Harmonic Analysis. 2020 May 1;48(3):1066-87. link
Zhang C, Chen X. A smoothing active set method for linearly constrained non-Lipschitz nonconvex optimization. SIAM Journal on Optimization. 2020;30(1):1-30. link
Tang J, Egiazarian K, Golbabaee M, Davies M. The practicality of stochastic optimization in imaging inverse problems. IEEE Transactions on Computational Imaging. 2020 Oct 21;6:1471-85. link
Li Q, Tang G. Approximate support recovery of atomic line spectral estimation: A tale of resolution and precision. Applied and Computational Harmonic Analysis. 2020 May 1;48(3):891-948. link
Cai JF, Wang T, Wei K. Fast and provable algorithms for spectrally sparse signal reconstruction via low-rank Hankel matrix completion. Applied and Computational Harmonic Analysis. 2019 Jan 1;46(1):94-121. link
Polisano K, Condat L, Clausel M, Perrier V. A convex approach to superresolution and regularization of lines in images. SIAM Journal on Imaging Sciences. 2019;12(1):211-58. link
Rabanser S, Neumann L, Haltmeier M. Analysis of the block coordinate descent method for linear ill-posed problems. SIAM Journal on Imaging Sciences. 2019;12(4):1808-32. link
He Y, Wang F, Li Y, Qin J, Chen B. Robust matrix completion via maximum correntropy criterion and half-quadratic optimization. IEEE Transactions on Signal Processing. 2019 Nov 8;68:181-95. link
Poddar S, Mohsin YQ, Ansah D, Thattaliyath B, Ashwath R, Jacob M. Manifold recovery using kernel low-rank regularization: Application to dynamic imaging. IEEE transactions on computational imaging. 2019 Jan 24;5(3):478-91. link
Janzamin M, Ge R, Kossaifi J, Anandkumar A. Spectral learning on matrices and tensors. Foundations and Trends® in Machine Learning. 2019 Nov 27;12(5-6):393-536. link
Fei Y, Chen Y. Hidden integrality of SDP relaxations for sub-Gaussian mixture models. InConference On Learning Theory 2018 Jul 3 (pp. 1931-1965). PMLR. link
Hu Y, Liu X, Jacob M. A generalized structured low-rank matrix completion algorithm for MR image recovery. IEEE transactions on medical imaging. 2018 Dec 11;38(8):1841-51. link
Ulicny M, Krylov VA, Dahyot R. Harmonic networks: Integrating spectral information into CNNs. arXiv preprint arXiv:1812.03205. 2018 Dec 7. link
Zhang J, Lei Q, Dhillon I. Stabilizing gradients for deep neural networks via efficient svd parameterization. InInternational Conference on Machine Learning 2018 Jul 3 (pp. 5806-5814). PMLR. link
Cai JF, Wang T, Wei K. Spectral compressed sensing via projected gradient descent. SIAM Journal on Optimization. 2018;28(3):2625-53. link
Balachandrasekaran A, Magnotta V, Jacob M. Recovery of damped exponentials using structured low rank matrix completion. IEEE transactions on medical imaging. 2017 Jul 14;36(10):2087-98. link
Sokolić J, Giryes R, Sapiro G, Rodrigues MR. Robust large margin deep neural networks. IEEE Transactions on Signal Processing. 2017 May 25;65(16):4265-80. link
Ongie G, Jacob M. A fast algorithm for convolutional structured low-rank matrix recovery. IEEE transactions on computational imaging. 2017 Jun 30;3(4):535-50. link
Jawanpuria P, Mishra B. A Saddle Point Approach to Structured Low-rank Matrix Learning in Large-scale Applications. stat. 2017 Apr 25;1050:24. link
Chi Y. Convex relaxations of spectral sparsity for robust super-resolution and line spectrum estimation. InWavelets and Sparsity XVII 2017 Aug 24 (Vol. 10394, pp. 314-321). SPIE. link
Liu S. Projected Wirtinger gradient descent for spectral compressed sensing. The University of Iowa; 2017. link
Cai JF, Qu X, Xu W, Ye GB. Robust recovery of complex exponential signals from random Gaussian projections via low rank Hankel matrix reconstruction. Applied and computational harmonic analysis. 2016 Sep 1;41(2):470-90. link
Shang F, Liu Y, Cheng J. Tractable and scalable Schatten quasi-norm approximations for rank minimization. InArtificial Intelligence and Statistics 2016 May 2 (pp. 620-629). PMLR. link
Liao W, Fannjiang A. MUSIC for single-snapshot spectral estimation: Stability and super-resolution. Applied and Computational Harmonic Analysis. 2016 Jan 1;40(1):33-67. link
Usevich K, Comon P. Hankel low-rank matrix completion: Performance of the nuclear norm relaxation. IEEE Journal of Selected Topics in Signal Processing. 2016 Feb 26;10(4):637-46. link
Tang G. Resolution limits for atomic decompositions via Markov-Bernstein type inequalities. In2015 International Conference on Sampling Theory and Applications (SampTA) 2015 May 25 (pp. 548-552). IEEE. link
Tang G, Bhaskar BN, Recht B. Near minimax line spectral estimation. IEEE Transactions on Information Theory. 2014 Nov 6;61(1):499-512. link
Chen Y, Chi Y. Spectral compressed sensing via structured matrix completion. InInternational conference on machine learning 2013 May 26 (pp. 414-422). PMLR. link
Duarte MF, Baraniuk RG. Spectral compressive sensing. Applied and Computational Harmonic Analysis. 2013 Jul 1;35(1):111-29. link
Bhaskar BN, Tang G, Recht B. Atomic norm denoising with applications to line spectral estimation. IEEE Transactions on Signal Processing. 2013 Jul 16;61(23):5987-99. link
Hu Y, Lingala SG, Jacob M. A fast majorize–minimize algorithm for the recovery of sparse and low-rank matrices. IEEE Transactions on Image Processing. 2011 Aug 22;21(2):742-53. link
Vershynin R. Spectral norm of products of random and deterministic matrices. Probability Theory and Related Fields. 2011 Aug;150(3):471-509. link
Stoica P, Babu P, Li J. New method of sparse parameter estimation in separable models and its use for spectral analysis of irregularly sampled data. IEEE Transactions on Signal Processing. 2010 Oct 14;59(1):35-47. link
Lewis AS, Malick J. Alternating projections on manifolds. Mathematics of Operations Research. 2008 Feb;33(1):216-34. link
Li Y, Razavilar J, Liu KR. A high-resolution technique for multidimensional NMR spectroscopy. IEEE Transactions on Biomedical Engineering. 1998 Jan;45(1):78-86. link
Engl HW, Hanke M, Neubauer A. Regularization of inverse problems. Springer Science & Business Media; 1996 Jul 31. link
Lingala SG, Hu Y, DiBella E, Jacob M. Accelerated dynamic MRI exploiting sparsity and low-rank structure: k-t SLR. IEEE Transactions on Medical Imaging. 2011 Jan 31;30(5):1042-54. link
Xi Y, Rocke DM. Baseline correction for NMR spectroscopic metabolomics data analysis. BMC Bioinformatics. 2008 Dec;9:1-0. link
Chen GH, Tang J, Leng S. Prior image constrained compressed sensing (PICCS): a method to accurately reconstruct dynamic CT images from highly undersampled projection data sets. Medical physics. 2008 Feb;35(2):660-3. link
Lustig M, Donoho D, Pauly JM. Sparse MRI: The application of compressed sensing for rapid MR imaging. Magnetic Resonance in Medicine: An Official Journal of the International Society for Magnetic Resonance in Medicine. 2007 Dec;58(6):1182-95. link
Bhave S, Lingala SG, Newell Jr JD, Nagle SK, Jacob M. Blind compressed sensing enables 3-dimensional dynamic free breathing magnetic resonance imaging of lung volumes and diaphragm motion. Investigative Radiology. 2016 Jun 1;51(6):387-99. link
Kerdegari H, Phung Tran Huy N, Nguyen VH, Truong TP, Le NM, Le TP, Le TM, Pisani L, Denehy L, Razavi R, Thwaites L. Automatic retrieval of corresponding US views in longitudinal examinations. In International Conference on Medical Image Computing and Computer-Assisted Intervention 2023 Oct 1 (pp. 152-161). Cham: Springer Nature Switzerland. link
Judge T, Bernard O, Cho Kim WJ, Gomez A, Chartsias A, Jodoin PM. Asymmetric contour uncertainty estimation for medical image segmentation. In International Conference on Medical Image Computing and Computer-Assisted Intervention 2023 Oct 1 (pp. 210-220). Cham: Springer Nature Switzerland. link
Wang X, Zhu H. Artificial intelligence in image-based cardiovascular disease analysis: A comprehensive survey and future outlook. arXiv preprint arXiv:2402.03394. 2024 Feb 4. link
Chen Z, Gao Q, Zhang Y, Shan H. ASCON: Anatomy-aware supervised contrastive learning framework for low-dose CT denoising. In International Conference on Medical Image Computing and Computer-Assisted Intervention 2023 Oct 1 (pp. 355-365). Cham: Springer Nature Switzerland. link
Adams J, Elhabian SY. Can point cloud networks learn statistical shape models of anatomies? In International Conference on Medical Image Computing and Computer-Assisted Intervention 2023 Oct 1 (pp. 486-496). Cham: Springer Nature Switzerland. link
Zhang X, Zhang J, Ma L, Xue P, Hu Y, Wu D, Zhan Y, Feng J, Shen D. Progressive deep segmentation of coronary artery via hierarchical topology learning. In International Conference on Medical Image Computing and Computer-Assisted Intervention 2022 Sep 16 (pp. 391-400). Cham: Springer Nature Switzerland. link
Liu J, Desrosiers C, Zhou Y. Semi-supervised medical image segmentation using cross-model pseudo-supervision with shape awareness and local context constraints. In International Conference on Medical Image Computing and Computer-Assisted Intervention 2022 Sep 16 (pp. 140-150). Cham: Springer Nature Switzerland. link
Lin W, Liu H, Gu L, Gao Z. A geometry-constrained deformable attention network for aortic segmentation. In International Conference on Medical Image Computing and Computer-Assisted Intervention 2022 Sep 16 (pp. 287-296). Cham: Springer Nature Switzerland. link
Gong K, Catana C, Qi J, Li Q. Direct reconstruction of linear parametric images from dynamic PET using nonlocal deep image prior. IEEE Transactions on Medical Imaging. 2021 Oct 15;41(3):680-9. link
Yao Z, Xie W, Zhang J, Dong Y, Qiu H, Yuan H, Jia Q, Wang T, Shi Y, Zhuang J, Que L. ImageTBAD: A 3D computed tomography angiography image dataset for automatic segmentation of type-B aortic dissection. Frontiers in Physiology. 2021 Sep 27;12:732711. link
Solomon O, Eldar YC, Mutzafi M, Segev M. SPARCOM: Sparsity based super-resolution correlation microscopy. SIAM Journal on Imaging Sciences. 2019;12(1):392-419. link
Poon C, Peyré G. Multidimensional sparse super-resolution. SIAM Journal on Mathematical Analysis. 2019;51(1):1-44. link
Morgenshtern VI, Candes EJ. Super-resolution of positive sources: The discrete setup. SIAM Journal on Imaging Sciences. 2016;9(1):412-44. link
Huang S, Li J, Mei L, Zhang T, Chen Z, Dong Y, Dong L, Liu S, Lyu M. Accurate multi-contrast MRI super-resolution via a dual cross-attention transformer network. In International Conference on Medical Image Computing and Computer-Assisted Intervention 2023 Oct 1 (pp. 313-322). Cham: Springer Nature Switzerland. link
Huang J, Aviles-Rivero AI, Schönlieb CB, Yang G. CDiffMR: Can we replace the Gaussian noise with k-space undersampling for fast MRI? In International Conference on Medical Image Computing and Computer-Assisted Intervention 2023 Oct 1 (pp. 3-12). Cham: Springer Nature Switzerland. link
He C, Li K, Zhang Y, Tang L, Zhang Y, Guo Z, Li X. Camouflaged object detection with feature decomposition and edge reconstruction. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2023 (pp. 22046-22055). link
Han Z, Wang Y, Zhou L, Wang P, Yan B, Zhou J, Wang Y, Shen D. Contrastive diffusion model with auxiliary guidance for coarse-to-fine PET reconstruction. In International Conference on Medical Image Computing and Computer-Assisted Intervention 2023 Oct 1 (pp. 239-249). Cham: Springer Nature Switzerland. link
Han H, Kim S, Choi HS, Yoon S. On the impact of knowledge distillation for model interpretability. arXiv preprint arXiv:2305.15734. 2023 May 25. link
Gan Z, Zhao S, Kang J, Shang L, Chen H, Li C. Superclass learning with representation enhancement. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2023 (pp. 24060-24069). link
Dheur V, Taieb SB. A large-scale study of probabilistic calibration in neural network regression. In International Conference on Machine Learning 2023 Jul 3 (pp. 7813-7836). PMLR. link
Suliman MA, Dai W. Blind two-dimensional super-resolution and its performance guarantee. IEEE Transactions on Signal Processing. 2022 Apr 27;70:2844-58. link
Suliman MA, Dai W. Exact three-dimensional estimation in blind super-resolution via convex optimization. In 2019 53rd Annual Conference on Information Sciences and Systems (CISS) 2019 Mar 20 (pp. 1-9). IEEE. link
Chi Y, Da Costa MF. Harnessing sparsity over the continuum: Atomic norm minimization for superresolution. IEEE Signal Processing Magazine. 2020 Feb 27;37(2):39-57. link
Chi Y, Chen Y. Compressive two-dimensional harmonic retrieval via atomic norm minimization. IEEE Transactions on Signal Processing. 2014 Dec 24;63(4):1030-42. link
Chandrasekaran V, Recht B, Parrilo PA, Willsky AS. The convex geometry of linear inverse problems. Foundations of Computational Mathematics. 2012 Dec;12(6):805-49. link
Lee K, Bresler Y. ADMiRA: Atomic decomposition for minimum rank approximation. IEEE Transactions on Information Theory. 2010 Aug 16;56(9):4402-16. link
Chen SS, Donoho DL, Saunders MA. Atomic decomposition by basis pursuit. SIAM Review. 2001;43(1):129-59. link
Cheng X, Cao Y, Li X, An B, Feng L. Weakly supervised regression with interval targets. In International Conference on Machine Learning 2023 Jul 3 (pp. 5428-5448). PMLR. link
Chen D, Bai Y, Shen W, Li Q, Yu L, Wang Y. MagicNet: Semi-supervised multi-organ segmentation via magic-cube partition and recovery. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2023 (pp. 23869-23878). link
Cai H, Qi L, Yu Q, Shi Y, Gao Y. 3D medical image segmentation with sparse annotation via cross-teaching between 3D and 2D networks. In International Conference on Medical Image Computing and Computer-Assisted Intervention 2023 Oct 1 (pp. 614-624). Cham: Springer Nature Switzerland. link
Zhou K, Li W, Lu L, Han X, Lu J. Revisiting temporal alignment for video restoration. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2022 (pp. 6053-6062). link
Zhang W, Zhu L, Hallinan J, Zhang S, Makmur A, Cai Q, Ooi BC. BoostMIS: Boosting medical image semi-supervised learning with adaptive pseudo labeling and informative active annotation. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2022 (pp. 20666-20676). link
Zhang K, Zhuang X. CycleMix: A holistic strategy for medical image segmentation from scribble supervision. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2022 (pp. 11656-11665). link
Zeng F, Dong B, Zhang Y, Wang T, Zhang X, Wei Y. MOTR: End-to-end multiple-object tracking with transformer. In European Conference on Computer Vision 2022 Oct 23 (pp. 659-675). Cham: Springer Nature Switzerland. link
Yang J, Lindenbaum O, Kluger Y. Locally sparse neural networks for tabular biomedical data. In International Conference on Machine Learning 2022 Jun 28 (pp. 25123-25153). PMLR. link
Yang J, Wickramasinghe U, Ni B, Fua P. ImplicitAtlas: Learning deformable shape templates in medical imaging. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2022 (pp. 15861-15871). link
Xu S, Kim J, Walter JR, Ghaffari R, Rogers JA. Translational gaps and opportunities for medical wearables in digital health. Science Translational Medicine. 2022 Oct 12;14(666):eabn6036. link
Wang Y, Sun X, Fu Y. Scalable penalized regression for noise detection in learning with noisy labels. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2022 (pp. 346-355). link
Wang J, Lukasiewicz T. Rethinking Bayesian deep learning methods for semi-supervised volumetric medical image segmentation. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2022 (pp. 182-190). link
Thaker D, Giampouras P, Vidal R. Reverse engineering $\ell_p$ attacks: A block-sparse optimization approach with recovery guarantees. In International Conference on Machine Learning 2022 Jun 28 (pp. 21253-21271). PMLR. link
Stadnick B, Witowski J, Rajiv V, Chłędowski J, Shamout FE, Cho K, Geras KJ. Meta-repository of screening mammography classifiers. arXiv preprint arXiv:2108.04800. 2021 Aug 10. link
Ron T, Hazan T. Dual decomposition of convex optimization layers for consistent attention in medical images. In International Conference on Machine Learning 2022 Jun 28 (pp. 18754-18769). PMLR. link
Park N, Kim S. Blurs behave like ensembles: Spatial smoothings to improve accuracy, uncertainty, and robustness. In International Conference on Machine Learning 2022 Jun 28 (pp. 17390-17419). PMLR. link
Oh J, Kim H, Nah S, Hong C, Choi J, Lee KM. Attentive fine-grained structured sparsity for image restoration. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2022 (pp. 17673-17682). link
Mücke N, Reiss E, Rungenhagen J, Klein M. Data-splitting improves statistical performance in overparameterized regimes. In International Conference on Artificial Intelligence and Statistics 2022 May 3 (pp. 10322-10350). PMLR. link
Matsoukas C, Haslum JF, Sorkhei M, Söderberg M, Smith K. What makes transfer learning work for medical images: Feature reuse & other factors. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2022 (pp. 9225-9234). link
Lou A, Guan S, Ko H, Loew MH. CaraNet: Context axial reverse attention network for segmentation of small medical objects. In Medical Imaging 2022: Image Processing 2022 Apr 4 (Vol. 12032, pp. 81-92). SPIE. link
Liu J, Liu Z. Non-iterative recovery from nonlinear observations using generative models. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2022 (pp. 233-243). link
Liu F, Tian Y, Chen Y, Liu Y, Belagiannis V, Carneiro G. ACPL: Anti-curriculum pseudo-labelling for semi-supervised medical image classification. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2022 (pp. 20697-20706). link
Berg WA, Leung J. Diagnostic Imaging: Breast E-Book. Elsevier Health Sciences; 2019 Jun 17. link
Huang X, Yu C, Liu H. Physiological model based deep learning framework for cardiac TMP recovery. In International Conference on Medical Image Computing and Computer-Assisted Intervention 2022 Sep 16 (pp. 433-443). Cham: Springer Nature Switzerland. link
Hong Y, Pan H, Sun W, Yu X, Gao H. Representation separation for semantic segmentation with vision transformers. arXiv preprint arXiv:2212.13764. 2022 Dec 28. link
Haghighi F, Taher MR, Gotway MB, Liang J. DiRA: Discriminative, restorative, and adversarial learning for self-supervised medical image analysis. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2022 (pp. 20824-20834). link
Guo X, Zhou B, Chen X, Liu C, Dvornek NC. MCP-Net: Inter-frame motion correction with Patlak regularization for whole-body dynamic PET. In International Conference on Medical Image Computing and Computer-Assisted Intervention 2022 Sep 16 (pp. 163-172). Cham: Springer Nature Switzerland. link
Gharleghi R, Adikari D, Ellenberger K, Ooi SY, Ellis C, Chen CM, Gao R, He Y, Hussain R, Lee CY, Li J. Automated segmentation of normal and diseased coronary arteries – the ASOCA challenge. Computerized Medical Imaging and Graphics. 2022 Apr 1;97:102049. link
Eriksson M, Destounis S, Czene K, Zeiberg A, Day R, Conant EF, Schilling K, Hall P. A risk model for digital breast tomosynthesis to predict breast cancer and guide clinical care. Science Translational Medicine. 2022 May 11;14(644):eabn3971. link
Darestani MZ, Liu J, Heckel R. Test-time training can close the natural distribution shift performance gap in deep learning based compressed sensing. In International Conference on Machine Learning 2022 Jun 28 (pp. 4754-4776). PMLR. link
Collins L, Mokhtari A, Oh S, Shakkottai S. MAML and ANIL provably learn representations. In International Conference on Machine Learning 2022 Jun 28 (pp. 4238-4310). PMLR. link
Chen Z, Tian Z, Zhu J, Li C, Du S. C-CAM: Causal CAM for weakly supervised semantic segmentation on medical image. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2022 (pp. 11676-11685). link
Chen G, Wang H, Chen K, Li Z, Song Z, Liu Y, Chen W, Knoll A. A survey of the four pillars for small object detection: Multiscale representation, contextual information, super-resolution, and region proposal. IEEE Transactions on Systems, Man, and Cybernetics: Systems. 2020 Jul 17;52(2):936-53. link
Hamlomo S, Atemkeng M, Brima Y, Nunhokee C, Baxter J. A systematic review of low-rank and local low-rank matrix approximation in big data medical imaging. Neural Computing and Applications. 2025 Mar 4:1-56. link
Garber D, Kaplan A. On the efficient implementation of the matrix exponentiated gradient algorithm for low-rank matrix optimization. Mathematics of Operations Research. 2023 Nov;48(4):2094-128. link
Cao S, Liang P, Valiant G. One-sided matrix completion from two observations per row. In International Conference on Machine Learning 2023 Jul 3 (pp. 3599-3624). PMLR. link
Dresdner G, Vladarean ML, Rätsch G, Locatello F, Cevher V, Yurtsever A. Faster one-sample stochastic conditional gradient method for composite convex minimization. In International Conference on Artificial Intelligence and Statistics 2022 May 3 (pp. 8439-8457). PMLR. link
Liu T, Li Y, Zhou E, Zhao T. Noise regularizes over-parameterized rank one matrix recovery, provably. In International Conference on Artificial Intelligence and Statistics 2022 May 3 (pp. 2784-2802). PMLR. link
Wang H, Lu H, Mazumder R. Frank–Wolfe methods with an unbounded feasible region and applications to structured learning. SIAM Journal on Optimization. 2022;32(4):2938-68. link
Zhao S, Frangella Z, Udell M. NysADMM: Faster composite convex optimization via low-rank approximation. In International Conference on Machine Learning 2022 Jun 28 (pp. 26824-26840). PMLR. link
Wang J, Wong RK, Mao X, Chan KC. Matrix completion with model-free weighting. In International Conference on Machine Learning 2021 Jul 1 (pp. 10927-10936). PMLR. link
Huang M, Ma S, Lai L. Robust low-rank matrix completion via an alternating manifold proximal gradient continuation method. IEEE Transactions on Signal Processing. 2021 Apr 16;69:2639-52. link
Kümmerle C, Verdun CM. A scalable second order method for ill-conditioned matrix completion from few samples. In International Conference on Machine Learning 2021 Jul 1 (pp. 5872-5883). PMLR. link
Li W, Liao W. Stable super-resolution limit and smallest singular value of restricted Fourier matrices. Applied and Computational Harmonic Analysis. 2021 Mar 1;51:118-56. link
Bhaskara A, Ruwanpathirana AK, Wijewardena M. Principal component regression with semirandom observations via matrix completion. In International Conference on Artificial Intelligence and Statistics 2021 Mar 18 (pp. 2665-2673). PMLR. link
Tong T, Ma C, Chi Y. Low-rank matrix recovery with scaled subgradient methods: Fast and robust convergence without the condition number. IEEE Transactions on Signal Processing. 2021 Apr 7;69:2396-409. link
Bhaskara A, Ruwanpathirana AK, Wijewardena M. Additive error guarantees for weighted low rank approximation. In International Conference on Machine Learning 2021 Jul 1 (pp. 874-883). PMLR. link
Mukhopadhyay S, Chakraborty M. A two stage generalized block orthogonal matching pursuit (TSGBOMP) algorithm. IEEE Transactions on Signal Processing. 2021 Sep 24;69:5846-58. link
Sagan A, Mitchell JE. Low-rank factorization for rank minimization with nonconvex regularizers. Computational Optimization and Applications. 2021 Jun;79(2):273-300. link
Groß B, Flinth A, Roth I, Eisert J, Wunder G. Hierarchical sparse recovery from hierarchically structured measurements with application to massive random access. In 2021 IEEE Statistical Signal Processing Workshop (SSP) 2021 Jul 11 (pp. 531-535). IEEE. link
Junge M, Lee K. Generalized notions of sparsity and restricted isometry property. Part II: Applications. Journal of Fourier Analysis and Applications. 2021 Apr;27:1-40. link
Waldspurger I. Lecture notes on non-convex algorithms for low-rank matrix recovery. arXiv preprint arXiv:2105.10318. 2021 May 21. link
Zhou Z, Ma Y. Comments on Efficient Singular Value Thresholding Computation. arXiv preprint arXiv:2011.06710. 2020 Nov 13. link
Pramanik A, Aggarwal HK, Jacob M. Deep generalization of structured low-rank algorithms (Deep-SLR). IEEE Transactions on Medical Imaging. 2020 Aug 5;39(12):4186-97. link
Ornhag MV, Olsson C. A unified optimization framework for low-rank inducing penalties. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2020 (pp. 8474-8483). link
Pedregosa F, Negiar G, Askari A, Jaggi M. Linearly convergent Frank-Wolfe with backtracking line-search. In International Conference on Artificial Intelligence and Statistics 2020 Jun 3 (pp. 1-10). PMLR. link
Ndaoud M, Tsybakov AB. Optimal variable selection and adaptive noisy compressed sensing. IEEE Transactions on Information Theory. 2020 Jan 10;66(4):2517-32. link
Liu W, Mao X, Wong RK. Median matrix completion: From embarrassment to optimality. In International Conference on Machine Learning 2020 Nov 21 (pp. 6294-6304). PMLR. link
Geyer K, Kyrillidis A, Kalev A. Low-rank regularization and solution uniqueness in over-parameterized matrix sensing. In International Conference on Artificial Intelligence and Statistics 2020 Jun 3 (pp. 930-940). PMLR. link
Li Y, Chi Y, Zhang H, Liang Y. Non-convex low-rank matrix recovery with arbitrary outliers via median-truncated gradient descent. Information and Inference: A Journal of the IMA. 2020 Jun;9(2):289-325. link
Li X, Zhu Z, Man-Cho So A, Vidal R. Nonconvex robust low-rank matrix recovery. SIAM Journal on Optimization. 2020;30(1):660-86. link
Li S, Li Q, Zhu Z, Tang G, Wakin MB. The global geometry of centralized and distributed low-rank matrix recovery without regularization. IEEE Signal Processing Letters. 2020 Jul 15;27:1400-4. link
Kümmerle C, Verdun CM. Escaping saddle points in ill-conditioned matrix completion with a scalable second order method. arXiv preprint arXiv:2009.02905. 2020 Sep 7. link
Junge M, Lee K. Generalized notions of sparsity and restricted isometry property. Part I: a unified framework. Information and Inference: A Journal of the IMA. 2020 Mar;9(1):157-93. link
Krahmer F, Kümmerle C, Melnyk O. On the robustness of noise-blind low-rank recovery from rank-one measurements. Linear Algebra and its Applications. 2022 Nov 1;652:37-81. link
Zhou D, Cao Y, Gu Q. Accelerated factored gradient descent for low-rank matrix factorization. In International Conference on Artificial Intelligence and Statistics 2020 Jun 3 (pp. 4430-4440). PMLR. link
Tong T, Ma C, Chi Y. Accelerating ill-conditioned low-rank matrix estimation via scaled gradient descent. Journal of Machine Learning Research. 2021;22(150):1-63. link
Zhao Y, Udell M. Matrix completion with quantified uncertainty through low rank Gaussian copula. Advances in Neural Information Processing Systems. 2020;33:20977-88. link
Ji K, Tan J, Xu J, Chi Y. Learning latent features with pairwise penalties in low-rank matrix completion. IEEE Transactions on Signal Processing. 2020 Jul 8;68:4210-25. link
Ding L, Chen Y. Leave-one-out approach for matrix completion: Primal and dual analysis. IEEE Transactions on Information Theory. 2020 May 6;66(11):7274-301. link
Heuer J, Matter F, Pfetsch ME, Theobald T. Block-sparse recovery of semidefinite systems and generalized null space conditions. Linear Algebra and its Applications. 2020 Oct 15;603:470-95. link
Mazumder R, Saldana D, Weng H. Matrix completion with nonconvex regularization: Spectral operators and scalable algorithms. Statistics and Computing. 2020 Jul;30(4):1113-38. link
Bi S, Pan S, Sun D. A multi-stage convex relaxation approach to noisy structured low-rank matrix recovery. Mathematical Programming Computation. 2020 Dec;12(4):569-602. link
Jacob M, Mani MP, Ye JC. Structured low-rank algorithms: Theory, magnetic resonance applications, and links to machine learning. IEEE Signal Processing Magazine. 2020 Jan 17;37(1):54-68. link
Roulet V, Boumal N, d’Aspremont A. Computational complexity versus statistical performance on sparse recovery problems. Information and Inference: A Journal of the IMA. 2020 Mar;9(1):1-32. link
Saligrama V, Olshevsky A, Hendrickx J. Minimax rank-$1$ matrix factorization. In International Conference on Artificial Intelligence and Statistics 2020 Jun 3 (pp. 3426-3436). PMLR. link
You C, Zhu Z, Qu Q, Ma Y. Robust recovery via implicit bias of discrepant learning rates for double over-parameterization. Advances in Neural Information Processing Systems. 2020;33:17733-44. link
Yi J, Xu W. Necessary and sufficient null space condition for nuclear norm minimization in low-rank matrix recovery. IEEE Transactions on Information Theory. 2020 Apr 28;66(10):6597-604. link
Wen J, Zhang R, Yu W. Signal-dependent performance analysis of orthogonal matching pursuit for exact sparse recovery. IEEE Transactions on Signal Processing. 2020 Aug 14;68:5031-46. link
Wang Y, Yao Q, Kwok JT. A Scalable, Adaptive and Sound Nonconvex Regularizer for Low-rank Matrix Completion. arXiv preprint arXiv:2008.06542. 2020 Aug 14. link
Hand P, Leong O, Voroninski V. Optimal sample complexity of subgradient descent for amplitude flow via non-Lipschitz matrix concentration. arXiv preprint arXiv:2011.00288. 2020 Oct 31. link
Gürel NM, Kara K, Stojanov A, Smith T, Lemmin T, Alistarh D, Püschel M, Zhang C. Compressive sensing using iterative hard thresholding with low precision data representation: Theory and applications. IEEE Transactions on Signal Processing. 2020 Jul 20;68:4268-82. link
Zhang H, Qian J, Zhang B, Yang J, Gong C, Wei Y. Low-rank matrix recovery via modified Schatten-$p$ norm minimization with convergence guarantees. IEEE Transactions on Image Processing. 2019 Dec 11;29:3132-42. link
Huang Y, Liao G, Xiang Y, Zhang L, Li J, Nehorai A. Low-rank approximation via generalized reweighted iterative nuclear and Frobenius norms. IEEE Transactions on Image Processing. 2019 Oct 30;29:2244-57. link
Oymak S, Fabian Z, Li M, Soltanolkotabi M. Generalization guarantees for neural networks via harnessing the low-rank structure of the Jacobian. arXiv preprint arXiv:1906.05392. 2019 Jun 12. link
Ma W, Chen GH. Missing not at random in matrix completion: The effectiveness of estimating missingness probabilities under a low nuclear norm assumption. Advances in Neural Information Processing Systems. 2019;32. link
Fan J, Ding L, Chen Y, Udell M. Factor group-sparse regularization for efficient low-rank matrix recovery. Advances in Neural Information Processing Systems. 2019;32. link
Klopp O, Lu Y, Tsybakov AB, Zhou HH. Structured matrix estimation and completion. Bernoulli. 2019 Nov 1;25(4B):3883-911. link
Haeffele BD, Vidal R. Structured low-rank matrix factorization: Global optimality, algorithms, and applications. IEEE Transactions on Pattern Analysis and Machine Intelligence. 2019 Feb 19;42(6):1468-82. link
Chen Y, Chi Y. Harnessing structures in big data via guaranteed low-rank matrix estimation: Recent theory and fast algorithms via convex and nonconvex optimization. IEEE Signal Processing Magazine. 2018 Jun 28;35(4):14-31. link
Grussler C, Rantzer A, Giselsson P. Low-rank optimization with convex constraints. IEEE Transactions on Automatic Control. 2018 Mar 7;63(11):4000-7. link
Eftekhari A, Yang D, Wakin MB. Weighted matrix completion and recovery with prior subspace information. IEEE Transactions on Information Theory. 2018 Mar 16;64(6):4044-71. link
Ma CQ, Ren YS. Proximal iteratively reweighted algorithm for low-rank matrix recovery. Journal of Inequalities and Applications. 2018 Jan 8;2018(1):12. link
Yao Q, Kwok JT. Accelerated and inexact soft-impute for large-scale matrix and tensor completion. IEEE Transactions on Knowledge and Data Engineering. 2018 Aug 28;31(9):1665-79. link
Li Y, Ma T, Zhang H. Algorithmic regularization in over-parameterized matrix sensing and neural networks with quadratic activations. In Conference on Learning Theory 2018 Jul 3 (pp. 2-47). PMLR. link
Oymak S. Learning compact neural networks with regularization. In International Conference on Machine Learning 2018 Jul 3 (pp. 3966-3975). PMLR. link
Mani M, Jacob M, Kelley D, Magnotta V. Multi-shot sensitivity-encoded diffusion data recovery using structured low-rank matrix completion (MUSSELS). Magnetic Resonance in Medicine. 2017 Aug;78(2):494-507. link
Kueng R, Rauhut H, Terstiege U. Low rank matrix recovery from rank one measurements. Applied and Computational Harmonic Analysis. 2017 Jan 1;42(1):88-116. link
Bhattacharya I, Jacob M. Compartmentalized low-rank recovery for high-resolution lipid unsuppressed MRSI. Magnetic Resonance in Medicine. 2017 Oct;78(4):1267-80. link
Niranjan UN, Rajkumar A, Tulabandhula T. Provable inductive robust PCA via iterative hard thresholding. arXiv preprint arXiv:1704.00367. 2017 Apr 2. link
Gamarnik D, Li Q, Zhang H. Matrix completion from $O(n)$ samples in linear time. In Conference on Learning Theory 2017 Jun 18 (pp. 940-947). PMLR. link
Kliesch M, Kueng R, Eisert J, Gross D. Improving compressed sensing with the diamond norm. IEEE Transactions on Information Theory. 2016 Sep 7;62(12):7445-63. link
Wei K, Cai JF, Chan TF, Leung S. Guarantees of Riemannian optimization for low rank matrix recovery. SIAM Journal on Matrix Analysis and Applications. 2016;37(3):1198-222. link
Hu Y, Zhao C, Cai D, He X, Li X. Atom decomposition with adaptive basis selection strategy for matrix completion. ACM Transactions on Multimedia Computing, Communications, and Applications (TOMM). 2016 Jun 15;12(3):1-25. link
Gu Q, Wang ZW, Liu H. Low-rank and sparse structure pursuit via alternating minimization. In Artificial Intelligence and Statistics 2016 May 2 (pp. 600-609). PMLR. link
Gamarnik D, Misra S. A note on alternating minimization algorithm for the matrix completion problem. IEEE Signal Processing Letters. 2016 Jun 6;23(10):1340-3. link
Oymak S, Jalali A, Fazel M, Eldar YC, Hassibi B. Simultaneously structured models with application to sparse and low-rank matrices. IEEE Transactions on Information Theory. 2015 Feb 9;61(5):2886-908. link
Liu Q, Lai Z, Zhou Z, Kuang F, Jin Z. A truncated nuclear norm regularization method based on weighted residual error for matrix completion. IEEE Transactions on Image Processing. 2015 Nov 23;25(1):316-30. link
Malek-Mohammadi M, Babaie-Zadeh M, Skoglund M. Performance guarantees for Schatten-p quasi-norm minimization in recovery of low-rank matrices. Signal Processing. 2015 Sep 1;114:225-30. link
Lin X, Wei G. Accelerated reweighted nuclear norm minimization algorithm for low rank matrix recovery. Signal Processing. 2015 Sep 1;114:24-33. link
Hastie T, Mazumder R, Lee JD, Zadeh R. Matrix completion and low-rank SVD via fast alternating least squares. The Journal of Machine Learning Research. 2015 Jan 1;16(1):3367-402. link
Chen Y. Incoherence-optimal matrix completion. IEEE Transactions on Information Theory. 2015 Mar 20;61(5):2909-23. link
Cao X. A scalable and feasible matrix completion approach using random projection. In Neural Information Processing: 22nd International Conference, ICONIP 2015, Istanbul, Turkey, November 9-12, 2015, Proceedings, Part III 2015 (pp. 550-558). Springer International Publishing. link
Cai TT, Zhang A. ROP: Matrix recovery via rank-one projections. The Annals of Statistics. 2015;43(1):102-38. link
Yi X, Caramanis C. Regularized EM algorithms: A unified framework and statistical guarantees. Advances in Neural Information Processing Systems. 2015;28. link
Wang Z, Lai MJ, Lu Z, Fan W, Davulcu H, Ye J. Orthogonal rank-one matrix pursuit for low rank matrix completion. SIAM Journal on Scientific Computing. 2015;37(1):A488-514. link
Pitaval RA, Dai W, Tirkkonen O. Convergence of gradient descent for low-rank matrix approximation. IEEE Transactions on Information Theory. 2015 Jun 23;61(8):4451-7. link
Lu C, Lin Z, Yan S. Smoothed low rank and sparse matrix recovery by iteratively reweighted least squares minimization. IEEE Transactions on Image Processing. 2014 Dec 12;24(2):646-54. link
Kyrillidis A, Cevher V. Matrix recipes for hard thresholding methods. Journal of mathematical imaging and vision. 2014 Feb;48:235-65. link
Malek-Mohammadi M, Babaie-Zadeh M, Skoglund M. Iterative concave rank approximation for recovering low-rank matrices. IEEE Transactions on Signal Processing. 2014 Jul 17;62(20):5213-26. link
Li YF, Zhang YJ, Huang ZH. A reweighted nuclear norm minimization algorithm for low rank matrix recovery. Journal of Computational and Applied Mathematics. 2014 Jun 1;263:338-50. link
Nesterov Y, Nemirovski A. On first-order algorithms for l1/nuclear norm minimization. Acta Numerica. 2013 May;22:509-75. link
Lai MJ, Yin W. Augmented ℓ1 and nuclear-norm models with a globally linearly convergent algorithm. SIAM Journal on Imaging Sciences. 2013;6(2):1059-91. link
Cai TT, Zhang A. Sharp RIP bound for sparse signal and low-rank matrix recovery. Applied and Computational Harmonic Analysis. 2013 Jul 1;35(1):74-93. link
Malek-Mohammadi M, Babaie-Zadeh M, Amini A, Jutten C. Recovery of low-rank matrices under affine constraints via a smoothed rank function. IEEE Transactions on Signal Processing. 2013 Dec 20;62(4):981-92. link
Donoho DL, Javanmard A, Montanari A. Information-theoretically optimal compressed sensing via spatial coupling and approximate message passing. IEEE Transactions on Information Theory. 2013 Jul 23;59(11):7434-64. link
Vandereycken B. Low-rank matrix completion by Riemannian optimization. SIAM Journal on Optimization. 2013;23(2):1214-36. link
Hu Y, Zhang D, Liu J, Ye J, He X. Accelerated singular value thresholding for matrix completion. In Proceedings of the 18th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining 2012 Aug 12 (pp. 298-306). link
Candès EJ, Recht B. Exact matrix completion via convex optimization. Communications of the ACM. 2012 Jun 1;55(6):111-9. link
Hu Y, Zhang D, Ye J, Li X, He X. Fast and accurate matrix completion via truncated nuclear norm regularization. IEEE Transactions on Pattern Analysis and Machine Intelligence. 2012 Dec 20;35(9):2117-30. link
Wen Z, Yin W, Zhang Y. Solving a low-rank factorization model for matrix completion by a nonlinear successive over-relaxation algorithm. Mathematical Programming Computation. 2012 Dec;4(4):333-61. link
Lu X, Gong T, Yan P, Yuan Y, Li X. Robust alternative minimization for matrix completion. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics). 2012 Feb 15;42(3):939-49. link
Liu YJ, Sun D, Toh KC. An implementable proximal point algorithmic framework for nuclear norm minimization. Mathematical Programming. 2012 Jun;133:399-436. link
Eldar YC, Needell D, Plan Y. Uniqueness conditions for low-rank matrix recovery. Applied and Computational Harmonic Analysis. 2012 Sep 1;33(2):309-14. link
Avron H, Kale S, Kasiviswanathan S, Sindhwani V. Efficient and practical stochastic subgradient descent for nuclear norm regularization. arXiv preprint arXiv:1206.6384. 2012 Jun 27. link
Zhang H, Cai JF, Cheng L, Zhu J. Strongly convex programming for exact matrix completion and robust principal component analysis. arXiv preprint arXiv:1112.3946. 2011 Dec 16. link
Negahban S, Wainwright MJ. Estimation of (near) low-rank matrices with noise and high-dimensional scaling. The Annals of Statistics. 2011;39(2):1069-97. link
Ma S, Goldfarb D, Chen L. Fixed point and Bregman iterative methods for matrix rank minimization. Mathematical Programming. 2011 Jun;128(1):321-53. link
Ma Y, Zhi L. The minimum-rank Gram matrix completion via modified fixed point continuation method. In Proceedings of the 36th International Symposium on Symbolic and Algebraic Computation 2011 Jun 8 (pp. 241-248). link
Goldfarb D, Ma S. Convergence of fixed-point continuation algorithms for matrix rank minimization. Foundations of Computational Mathematics. 2011 Apr;11(2):183-210. link
Khajehnejad A, Oymak S, Hassibi B. Subspace expanders and matrix rank minimization. arXiv preprint arXiv:1102.3947. 2011 Feb 19. link
Recht B, Xu W, Hassibi B. Null space conditions and thresholds for rank minimization. Mathematical Programming. 2011 Mar;127:175-202. link
Oymak S, Hassibi B. New null space results and recovery thresholds for matrix rank minimization. arXiv preprint arXiv:1011.6326. 2010 Nov 29. link
Recht B, Fazel M, Parrilo PA. Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization. SIAM review. 2010;52(3):471-501. link
Cai JF, Candès EJ, Shen Z. A singular value thresholding algorithm for matrix completion. SIAM Journal on Optimization. 2010;20(4):1956-82. link
Toh KC, Yun S. An accelerated proximal gradient algorithm for nuclear norm regularized linear least squares problems. Pacific Journal of Optimization. 2010;6(3):615-40. link
Keshavan RH, Montanari A, Oh S. Matrix completion from a few entries. IEEE Transactions on Information Theory. 2010 May 18;56(6):2980-98. link
Journée M, Bach F, Absil PA, Sepulchre R. Low-rank optimization on the cone of positive semidefinite matrices. SIAM Journal on Optimization. 2010;20(5):2327-51. link
Candès EJ, Tao T. The power of convex relaxation: near-optimal matrix completion. IEEE Transactions on Information Theory. 2010 Apr 19;56(5):2053-80. link
Dvijotham K, Fazel M. A nullspace analysis of the nuclear norm heuristic for rank minimization. In 2010 IEEE International Conference on Acoustics, Speech and Signal Processing 2010 Mar 14 (pp. 3586-3589). IEEE. link
Jaggi M, Sulovský M. A simple algorithm for nuclear norm regularized problems. In Proceedings of the 27th International Conference on Machine Learning (ICML-10) 2010 (pp. 471-478). link
Herman GT. Fundamentals of computerized tomography: image reconstruction from projections. Springer Science & Business Media; 2009 Jul 14. link
Ji S, Ye J. An accelerated gradient method for trace norm minimization. In Proceedings of the 26th Annual International Conference on Machine Learning 2009 Jun 14 (pp. 457-464). link
Meinshausen N, Yu B. Lasso-type recovery of sparse representations for high-dimensional data. The Annals of Statistics. 2009;37(1):246-70. link
Meka R, Jain P, Dhillon I. Matrix completion from power-law distributed samples. Advances in Neural Information Processing Systems. 2009;22. link
Yamashita N. Sparse quasi-Newton updates with positive definite matrix completion. Mathematical Programming. 2008 Sep;115(1):1-30. link
Li M, Soltanolkotabi M, Oymak S. Gradient descent with early stopping is provably robust to label noise for overparameterized neural networks. In International Conference on Artificial Intelligence and Statistics 2020 Jun 3 (pp. 4313-4324). PMLR. link
Lei L, Jordan MI. On the adaptivity of stochastic gradient-based optimization. SIAM Journal on Optimization. 2020;30(2):1473-500. link
Lee J, Xiao L, Schoenholz S, Bahri Y, Novak R, Sohl-Dickstein J, Pennington J. Wide neural networks of any depth evolve as linear models under gradient descent. Advances in Neural Information Processing Systems. 2019;32. link
Ledent A, Lei Y, Kloft M. Improved generalisation bounds for deep learning through L∞ covering numbers. arXiv preprint arXiv:1905.12430. 2019 May 29. link
Latorre F, Cevher V. Fast and provable ADMM for learning with generative priors. Advances in Neural Information Processing Systems. 2019;32. link
Krahmer F, Stöger D. On the convex geometry of blind deconvolution and matrix completion. Communications on Pure and Applied Mathematics. 2021 Apr;74(4):790-832. link
Kolchinsky A, Tracey BD, Wolpert DH. Nonlinear information bottleneck. Entropy. 2019 Nov 30;21(12):1181. link
Kavis A, Levy KY, Bach F, Cevher V. UniXGrad: a universal, adaptive algorithm with optimal guarantees for constrained optimization. Advances in Neural Information Processing Systems. 2019;32. link
Jacob M, Mani MP, Ye JC. Structured low-rank algorithms: theory, MR applications, and links to machine learning. arXiv preprint arXiv:1910.12162. 2019. link
Hazan E. Introduction to online convex optimization. Foundations and Trends® in Optimization. 2016 Aug 29;2(3-4):157-325. link
Gu S, Li W, Gool LV, Timofte R. Fast image restoration with multi-bin trainable linear units. InProceedings of the IEEE/CVF International Conference on Computer Vision 2019 (pp. 4190-4199). link
Foucart S, Gribonval R, Jacques L, Rauhut H. Jointly low-rank and bisparse recovery: Questions and partial answers. Analysis and Applications. 2020 Jan 27;18(01):25-48. link
Al-Subaihi S, Bibi A, Alfadly M, Ghanem B. Probabilistically True and Tight Bounds for Robust Deep Neural Network Training. arXiv preprint arXiv:1905.12418. 2019 May. link
Arora S, Du S, Hu W, Li Z, Wang R. Fine-grained analysis of optimization and generalization for overparameterized two-layer neural networks. In International Conference on Machine Learning 2019 May 24 (pp. 322-332). PMLR. link
Schmitt R, Schlett CL, Sperl JI, Rapaka S, Jacob AJ, Hein M, Hagar MT, Ruile P, Westermann D, Soschynski M, Bamberg F. Fully Automated Assessment of Cardiac Chamber Volumes and Myocardial Mass on Non-Contrast Chest CT with a Deep Learning Model: Validation Against Cardiac MR. Diagnostics. 2024 Dec 21;14(24):2884. link
Schaefferkoetter J, Shah V, Hayden C, Prior JO, Zuehlsdorff S. Deep learning for improving PET/CT attenuation correction by elastic registration of anatomical data. European Journal of Nuclear Medicine and Molecular Imaging. 2023 Jul;50(8):2292-304. link