"What we think, we become" (Buddha)

Last Updated: January 2019

Name Used in Bibliographic Citations: J.Y. Bello Cruz & J.Y.B. Cruz & J.Y. Bello-Cruz

About Me: I am an Assistant Professor of Numerical Analysis and Optimization in the Department of Mathematical Sciences at Northern Illinois University (NIU), USA. Prior to joining NIU, I was an Assistant Professor (with tenure) in the Institute of Mathematics and Statistics at the Federal University of Goias, Brazil. From 2013 to 2015 I worked with Prof. Heinz H. Bauschke as a CNPq Postdoctoral Research Fellow in Mathematics at the Irving K. Barber School of Arts and Sciences, University of British Columbia, Canada. My Ph.D. in Mathematics was supervised by Prof. Alfredo N. Iusem at the Institute of Pure and Applied Mathematics (IMPA), Rio de Janeiro, Brazil. My basic education, a five-year Bachelor's in Mathematics and a two-year M.Sc. in Analysis, was completed at the Faculty of Mathematics and Computer Science, University of Havana, Cuba.

My main research interest is Continuous Optimization, focusing on Nonsmooth and Convex Optimization, Variational Inequalities, Inclusion Problems, Algorithms, and Applications. My research is supported by the National Science Foundation (NSF) through research Grant # DMS-1816449 (September 2018 - August 2021) and by internal grants from NIU. For further details, please see my CV.

Professional Address: Department of Mathematical Sciences, Northern Illinois University, Watson Hall 366, DeKalb, IL 60115, USA.
Phones: +1 815 753 6764 (office) or 815 981 8002 (online)
Email: yunierbello at niu dot edu or yunier.bello at gmail dot com

EXPERIENCE:
EDUCATION:
RESEARCH PROJECTS AND GRANTS, SCHOLARSHIPS AND AWARDS
PUBLICATIONS:
- J.Y. Bello Cruz, G. Li, T.T.A. Nghia. On the Q-linear convergence of forward-backward splitting method and uniqueness of optimal solution to Lasso. Preprint, 35 pp. (2018).
- J.Y. Bello Cruz, G. Bouza Allende. An inexact strategy for the projected gradient algorithm in vector optimization problems on variable ordered spaces. Preprint, 23 pp. (2018).
- Conditional extragradient algorithms for solving constrained variational inequalities. Preprint, 28 pp. (2017).
- An iterative method for split inclusion problems without prior knowledge of operator norms. Journal of Fixed Point Theory and Applications. 19, 2017--2036 (2017).
- A semi-smooth Newton method for projection equations and linear complementarity problems with respect to the second order cone. Linear Algebra and its Applications. 513, 160--181 (2017).
- J.Y. Bello Cruz, T.T.A. Nghia. Optimal rates of linear convergence of relaxed alternating projections and generalized Douglas-Rachford methods for two subspaces. Numerical Algorithms. 73, 33--76 (2016).
- J.Y. Bello Cruz, O.P. Ferreira, L.F. Prudente. On the global convergence of the inexact semi-smooth Newton method for absolute value equation. Computational Optimization and Applications. 65, 93--108 (2016).
- J.Y. Bello Cruz, R. Díaz Millán. A relaxed-projection splitting algorithm for variational inequalities in Hilbert spaces. Journal of Global Optimization. 65, 597--614 (2016).
- A semi-smooth Newton method for a special piecewise linear system with application to positively constrained convex programming. Journal of Computational and Applied Mathematics. 301, 91--100 (2016).
- On weak and strong convergence of the projected gradient method for convex optimization in real Hilbert spaces. Numerical Functional Analysis and Optimization. 37, 129--144 (2016).
- A strongly convergent proximal bundle method for convex minimization in Hilbert spaces. Optimization. 65, 145--167 (2016).
- J.Y. Bello Cruz, A.N. Iusem. Full convergence of an approximate projection method for nonsmooth variational inequalities. Mathematics and Computers in Simulation. 114, 2--13 (2015).
- J.Y. Bello Cruz, R. Díaz Millán. A variant of forward-backward splitting method for the sum of two monotone operators with a new search strategy. Optimization. 64, 1471--1486 (2015).
- J.Y. Bello Cruz, G. Bouza Allende. A steepest descent-like method for variable order vector optimization problems. Journal of Optimization Theory and Applications. 162, 371--391 (2014).
- Subgradient algorithms for solving variable inequalities. Applied Mathematics and Computation. 247, 1052--1063 (2014).
- The rate of linear convergence of the Douglas-Rachford algorithm for subspaces is the cosine of the Friedrichs angle. Journal of Approximation Theory. 185, 63--79 (2014).
- J.Y. Bello Cruz, R. Díaz Millán. A direct splitting method for nonsmooth variational inequalities. Journal of Optimization Theory and Applications. 161, 728--737 (2014).
- J.Y. Bello Cruz, W. de Oliveira. Level bundle-like algorithms for convex optimization. Journal of Global Optimization. 59, 787--809 (2014).
- J.Y. Bello Cruz, L.R. Lucambio Pérez. A subgradient-like algorithm for solving vector convex inequalities. Journal of Optimization Theory and Applications. 162, 392--404 (2014).
- J.Y. Bello Cruz. A subgradient method for vector optimization problems. SIAM Journal on Optimization. 23, 2169--2182 (2013).
- J.Y. Bello Cruz, S. Scheimberg, P.S.M. Santos. A two-phase algorithm for a variational inequality formulation of equilibrium problems. Journal of Optimization Theory and Applications. 159, 562--575 (2013).
- J.Y. Bello Cruz, A.N. Iusem. An explicit algorithm for monotone variational inequalities. Optimization. 61, 855--871 (2012).
- J.Y. Bello Cruz, L.R. Lucambio Pérez, J.G. Melo. Convergence of the projected gradient method for quasiconvex multiobjective optimization. Nonlinear Analysis. 74, 5268--5273 (2011).
- J.Y. Bello Cruz, A.N. Iusem. A strongly convergent method for nonsmooth convex minimization in Hilbert spaces. Numerical Functional Analysis and Optimization. 32, 1009--1018 (2011).
- J.Y. Bello Cruz, H. Pijeira Cabrera, C. Márquez, W. Urbina Romero. Sobolev-Gegenbauer type orthogonality and a hydrodynamical interpretation. Integral Transforms and Special Functions. 22, 711--722 (2011).
- J.Y. Bello Cruz, H. Pijeira Cabrera, W. Urbina Romero. On polar Legendre polynomials. Rocky Mountain Journal of Mathematics. 40, 2025--2036 (2010).
- J.Y. Bello Cruz, A.N. Iusem. Convergence of direct methods for paramonotone variational inequalities. Computational Optimization and Applications. 46, 247--263 (2010).
- J.Y. Bello Cruz, L.R. Lucambio Pérez. Convergence of a projected gradient method variant for quasiconvex objectives. Nonlinear Analysis. 9, 2917--2922 (2010).
- J.Y. Bello Cruz, A.N. Iusem. A strongly convergent direct method for monotone variational inequalities in Hilbert spaces. Numerical Functional Analysis and Optimization. 30, 23--36 (2009).
- Direct Methods for Monotone Variational Inequalities. Doctoral dissertation. Preprint IMPA C095 / 2009. 1--93 (2009).

Research Interests: Calculus of variations and optimal control; Optimization; Fourier analysis; Operations research; Mathematical programming; Operator theory
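Several of the works above study the forward-backward splitting method, whose application to the Lasso problem min 0.5||Ax - b||^2 + lam*||x||_1 is the setting of the first preprint listed. As a minimal illustrative sketch (not the analysis from that preprint), the iteration alternates a gradient ("forward") step on the smooth term with a soft-thresholding ("proximal/backward") step on the l1 term; the constant step size 1/L below is a standard but assumed choice.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1: shrinks each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward_lasso(A, b, lam, steps=500):
    # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by forward-backward splitting.
    L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the gradient of the smooth term
    alpha = 1.0 / L                 # constant step size (illustrative assumption)
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b)                            # forward (gradient) step
        x = soft_threshold(x - alpha * grad, alpha * lam)   # backward (proximal) step
    return x
```

With A the identity the method reduces to one soft-thresholding of b, which makes the sparsity-inducing effect of the l1 term easy to see.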
Co-authors and Collaborators (from MathSciNet and Scopus): Ackooij, W.V. & Barrios, J.G. & Bauschke, H.H. & Behling, R. & Bouza Allende, G. & de Oliveira, W. & Díaz Millán, R. & Ferreira, O.P. & Iusem, A.N. & Lucambio Pérez, L.R. & Márquez, C. & Melo, J.G. & Nemeth, S.Z. & Nghia, T.T.A. & Pang, H.M. & Pijeira-Cabrera, H. & Prudente, L.F. & Scheimberg, S. & Santos, L.R. & Santos, P.S.M. & Shehu, Y. & Urbina, W.O. & Wang, X.

SCIENTIFIC MEETINGS
- On proximal subgradient splitting method for minimizing the sum of two nonsmooth convex functions
- Subgradient algorithms for vector optimization problems
- A subgradient method for vector optimization problems
- A strongly convergent method for nonsmooth convex minimization in Hilbert spaces
- A modified extragradient method for variational inequalities in Hilbert spaces
- A strongly convergent direct method for monotone variational inequalities in Hilbert spaces
- Forcing strong convergence of Korpelevich-type algorithm in Hilbert spaces
- Primitives of Classical Orthogonal Polynomials
- On Polar Legendre Polynomials
"There is no substitute for hard work"
"Genius is an infinite capacity for taking pains"
"I have not failed. I've just found 10,000 ways that won't work"
"Genius is one percent inspiration and ninety-nine percent perspiration"
"Our greatest weakness lies in giving up. The most certain way to succeed is always to try just one more time" (Thomas Edison)
"Curiosity has its own reason for existing" (Albert Einstein)