Publications


Preprints

  • A. Patrascu, Finite convergence of the inexact proximal gradient method to sharp minima, submitted, 2021.

  • A. Patrascu and P. Irofti, Computational complexity of an inexact proximal point algorithm for convex optimization under Hölderian growth, submitted, 2021. (pdf)

  • A. Patrascu and P. Irofti, Truncated convex models for robust learning, preprint, 2022.


Journal papers

    • A. Patrascu and P. Irofti, On finite termination of an inexact proximal point algorithm, Applied Mathematics Letters, 2022.

    • A. Patrascu and P. Irofti, Stochastic proximal splitting algorithm for stochastic composite minimization, Optimization Letters, 2021, https://doi.org/10.1007/s11590-021-01702-7.

    • A. Patrascu, New nonasymptotic convergence rates of stochastic proximal point algorithm for stochastic convex optimization problems, Optimization, 1-29, 2020, DOI:10.1080/02331934.2020.1761364.

    • I. Necoara, P. Richtarik and A. Patrascu, Randomized projection methods for convex feasibility problems: conditioning and convergence rates, SIAM Journal on Optimization, 29(4), 2814–2852, 2019. (pdf)

    • A. Patrascu and I. Necoara, Nonasymptotic convergence of stochastic proximal point algorithms for constrained convex optimization, Journal of Machine Learning Research, 18:1-42, 2018. (pdf)

    • A. Patrascu and I. Necoara, On the convergence of inexact projection first order methods for convex minimization, IEEE Transactions on Automatic Control, 63(10):3317-3329, 2018. (pdf)

    • I. Necoara, A. Patrascu and F. Glineur, Complexity certifications of first order inexact Lagrangian and penalty methods for conic convex programming, Optimization Methods and Software, 34(2):305-335, 2019. (pdf)

    • I. Necoara and A. Patrascu, Iteration complexity analysis of dual first order methods for conic convex programming, Optimization Methods and Software, 31(3):645-678, 2016. (pdf)

    • A. Patrascu, I. Necoara and Q. Tran-Dinh, Adaptive inexact fast augmented Lagrangian methods for constrained convex optimization, Optimization Letters, 11(3):609-626, 2017. (pdf)

    • A. Patrascu and I. Necoara, Random coordinate descent methods for \ell_0 regularized convex optimization, IEEE Transactions on Automatic Control, 60(7):1811-1824, 2015. (pdf)

    • A. Patrascu and I. Necoara, Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization, Journal of Global Optimization, 61(1):19-46, 2015. (Best Paper Award for a paper published in JOGO in 2015.) (pdf)

    • I. Necoara and A. Patrascu, A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints, Computational Optimization and Applications, 57(2):307-337, 2014. (pdf)

Books and book chapters

  • P. Irofti, A. Patrascu and A. Baltoiu, Fraud Detection in Networks, chapter in Enabling AI Applications in Data Science, Editors: A.-E. Hassanien, M. H. N. Taha, N. E. Mahmoud, ISBN:978-3-030-52066-3, Springer, DOI:10.1007/978-3-030-52067-0, 2021.

  • A. Patrascu, C. Paduraru and P. Irofti, Stochastic Proximal Gradient Algorithm with Minibatches. Application to Large Scale Learning Models, chapter in Enabling AI Applications in Data Science, Editors: A.-E. Hassanien, M. H. N. Taha, N. E. Mahmoud, ISBN: 978-3-030-52066-3, Springer, DOI:10.1007/978-3-030-52067-0, 2021.

  • I. Necoara, A. Patrascu and A. Nedich, Computational complexity certifications for inexact dual first-order methods and its application to real-time MPC, in Developments in Model-Based Optimization and Control, Editors: S. Olaru, A. Grancharova, F. Lobo Pereira, Springer. (pdf)

  • I. Necoara, D. Clipici, A. Patrascu, Metode de optimizare numerica. Culegere de probleme (Numerical Optimization Methods: A Collection of Problems), Editura Politehnica Press, 2014.

Selected conference papers

    • P. Irofti, C. Rusu and A. Patrascu, Dictionary Learning with Uniform Sparse Representations for Anomaly Detection, ICASSP, 2022. (pdf)

    • A. Baltoiu, A. Patrascu and P. Irofti, Graph Anomaly Detection Using Dictionary Learning, IFAC World Congress, 2020.

    • I. Necoara and A. Patrascu, A random coordinate descent algorithm for singly linear constrained smooth optimization, In Proceedings of Mathematical Theory of Networks and Systems, 0210, 2012, http://www.mtns2012.conference.net.au/.

    • A. Patrascu and I. Necoara, A random coordinate descent algorithm for large-scale sparse non-convex optimization, Proceedings of the European Control Conference, 2013.

    • I. Necoara and A. Patrascu, A random coordinate descent algorithm for optimization problems with composite objective function: application to SVM problems, The Fourth International Conference on Continuous Optimization, Lisbon, 2013.


    • A. Patrascu, I. Necoara and P. Patrinos, A proximal alternating minimization method for \ell_0-regularized nonlinear optimization problems: application to state estimation, Proceedings of the 53rd Conference on Decision and Control, Los Angeles, 2014.

    • A. Patrascu and I. Necoara, Coordinate descent methods for \ell_0-regularized optimization problems, SIAM Conference on Optimization, San Diego, 2014.

    • I. Necoara and A. Patrascu, Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization, SIAM Conference on Optimization, San Diego, 2014.

    • I. Necoara and A. Patrascu, Unified analysis of primal/dual methods for conic convex optimization, invited paper at the 22nd International Symposium on Mathematical Programming, 2015.

    • I. Necoara and A. Patrascu, On the behavior of first-order penalty methods for convex programming when Lagrange multipliers do not exist, invited paper in session Large scale optimization at Conference on Decision and Control, 2015.

    • I. Necoara, A. Patrascu and R. Findeisen, Computational complexity of a dual fast gradient method for convex optimization: application to embedded MPC, Conference on Decision and Control, 2015.

    • I. Necoara and A. Patrascu, OR-SAGA: Over-relaxed stochastic average gradient mapping algorithms for finite sum minimization, Proceedings of the European Control Conference, 2018.


PhD thesis

  • A. Patrascu, Efficient first order methods for sparse convex optimization, PhD thesis (supervised by I. Necoara), 2015. (pdf)