Minimizing smooth Kurdyka-Łojasiewicz functions via generalized descent methods: Convergence rate and complexity
Author(s): A. Kabgani & M. Ahookhosh
Year: 2025
Title: Minimizing smooth Kurdyka-Łojasiewicz functions via generalized descent methods: Convergence rate and complexity
Journal: arXiv:2511.10414
DOI: https://doi.org/10.48550/arXiv.2511.10414
Cite this article
Kabgani, A., Ahookhosh, M.: Minimizing smooth Kurdyka-Łojasiewicz functions via generalized descent methods: Convergence rate and complexity. arXiv:2511.10414 (2025). https://doi.org/10.48550/arXiv.2511.10414
Abstract
This paper studies a generalized descent algorithm (DEAL) for minimizing smooth functions, analyzed under the Kurdyka-Łojasiewicz (KL) inequality. In particular, the proposed algorithm guarantees sufficient decrease by adapting to the geometry of the cost function. We leverage the KL property to establish global convergence, convergence rates, and complexity bounds. A particular focus is placed on the linear convergence of generalized descent methods. We show that the constant step-size and Armijo line-search strategies along a generalized descent direction satisfy our generalized descent condition. Additionally, for nonsmooth functions, by leveraging smoothing techniques such as the forward-backward and high-order Moreau envelopes, we show that the boosted proximal gradient (BPGA) and boosted high-order proximal-point (BPPA) methods are also special cases of DEAL. Notably, if the order of the high-order proximal term is chosen suitably (depending on the KL exponent), then the sequence generated by BPPA converges linearly for an arbitrary KL exponent. Our preliminary numerical experiments on inverse problems and LASSO demonstrate the efficiency of the proposed methods, validating our theoretical findings.
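
For context, a standard form of the KL inequality (the paper's exact definitions, constants, and generalized decrease condition may differ) states that near a critical point $\bar{x}$ there exists a concave desingularizing function $\varphi$ with $\varphi(0) = 0$ and $\varphi' > 0$ such that

\[
\varphi'\bigl(f(x) - f(\bar{x})\bigr)\,\|\nabla f(x)\| \;\ge\; 1,
\qquad \text{e.g.}\quad \varphi(s) = \frac{c}{1-\theta}\, s^{1-\theta},\ \theta \in [0,1).
\]

For descent methods satisfying a sufficient-decrease condition of the classical form $f(x_k) - f(x_{k+1}) \ge a\,\|\nabla f(x_k)\|^2$, the exponent $\theta = 1/2$ classically yields linear convergence, which is consistent with the linear-rate results highlighted in the abstract.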
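Below is a minimal sketch of the descent-direction-plus-Armijo-line-search template mentioned in the abstract, using steepest descent as the generalized descent direction. It is an illustrative instance under standard smoothness assumptions, not the paper's DEAL method; the function names and parameter values are chosen for illustration only.

```python
import numpy as np

def armijo_descent(f, grad, x0, alpha0=1.0, beta=0.5, sigma=1e-4,
                   tol=1e-8, max_iter=1000):
    """Gradient descent with Armijo backtracking line search.

    A minimal instance of a generalized descent method; DEAL in the
    paper covers more general directions and decrease conditions.
    """
    x = x0.astype(float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        d = -g  # steepest-descent direction (a generalized descent direction)
        t = alpha0
        # Backtrack until the Armijo sufficient-decrease test holds:
        # f(x + t d) <= f(x) + sigma * t * <grad f(x), d>
        while f(x + t * d) > f(x) + sigma * t * g.dot(d):
            t *= beta
        x = x + t * d
    return x

# Example: a smooth strongly convex quadratic (KL exponent 1/2),
# for which linear convergence is expected.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
f = lambda x: 0.5 * x.dot(A.dot(x))
grad = lambda x: A.dot(x)
x_star = armijo_descent(f, grad, np.array([5.0, -3.0]))
print(x_star)  # approaches the minimizer at the origin
```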