First-order majorization-minimization meets high-order majorant: Boosted inexact high-order forward-backward method
Author(s): A. Kabgani & M. Ahookhosh
Year: 2025
Title: First-order majorization-minimization meets high-order majorant: Boosted inexact high-order forward-backward method
Journal: arXiv:2510.22231
Doi: https://doi.org/10.48550/arXiv.2510.22231
Cite this article
Kabgani, A., Ahookhosh, M.: First-order majorization-minimization meets high-order majorant: Boosted inexact high-order forward-backward method. arXiv:2510.22231. https://doi.org/10.48550/arXiv.2510.22231
Abstract
This paper introduces a first-order majorization-minimization framework based on a high-order majorant for continuous functions, incorporating a non-quadratic regularization term of degree p. Notably, this majorant is shown to be valid if and only if the function is p-paraconcave, thus extending beyond Lipschitz and Hölder gradient continuity for p∈(1,2], and implying concavity for p>2. In the smooth setting, the majorant recovers a variant of the classical descent lemma with quadratic regularization. Building on this foundation, we develop a high-order inexact forward-backward algorithm (HiFBA) and its line-search-accelerated variant, named Boosted HiFBA. For the convergence analysis, we introduce a high-order forward-backward envelope (HiFBE), which serves as a Lyapunov function. We establish subsequential convergence under suitable inexactness conditions, and we prove global convergence with linear rates for functions satisfying the Kurdyka-Łojasiewicz inequality. Our preliminary experiments on linear inverse problems and regularized nonnegative matrix factorization highlight the efficiency of HiFBA and its boosted variant, demonstrating their potential for solving challenging nonconvex optimization problems.
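As background for the forward-backward scheme the abstract refers to, the following is a minimal sketch of the classical forward-backward (proximal gradient) iteration, i.e., the quadratic-majorant (p = 2) special case mentioned above. It is not the paper's HiFBA method, which uses a high-order majorant and inexact proximal steps; the problem instance (an ℓ1-regularized least-squares toy problem) and all names here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(A, b, lam, step, iters=500):
    """Classical forward-backward splitting for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                         # forward (gradient) step
        x = soft_threshold(x - step * grad, step * lam)  # backward (proximal) step
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 5))
    x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])
    b = A @ x_true
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz constant of the gradient
    x_hat = forward_backward(A, b, lam=0.01, step=step)
    print(x_hat)
```

With a quadratic majorant, each iteration minimizes the majorant of the smooth part plus the nonsmooth term, which is exactly the proximal step above; HiFBA replaces the quadratic term by a degree-p regularizer and allows the proximal subproblem to be solved inexactly.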