Lyapunov theory offers many different and efficient criteria to analyze the stability of discrete-time switched systems; in particular, the existence of a common Lyapunov function is a necessary and sufficient condition for stability. However, one usually restricts the search to classical templates that are easy to use in practice (such as quadratic functions or sum-of-squares polynomials), but which are not universal. The introduction of multiple Lyapunov functions then makes it possible to enrich the stability criterion while keeping the usual templates. The path-complete Lyapunov framework generalizes and characterizes the whole Lyapunov approach for discrete-time switched systems, and involves two structural components: a graph, which encodes the Lyapunov constraints, and a template, which defines the search space for the Lyapunov functions. This formalism allows one to leverage combinatorial tools and graph/automaton theory to derive properties of the stability criteria, and to compare the different graphs with respect to their level of conservatism.
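As a concrete illustration of the simplest case, the common-Lyapunov-function criterion with a quadratic template V(x) = xᵀPx requires Aᵢᵀ P Aᵢ − γ² P to be negative semidefinite for every mode Aᵢ. The sketch below checks this condition numerically for a given P; the matrices A1 and A2 are hypothetical example modes, not taken from the text.

```python
import numpy as np

def is_common_quadratic_lyapunov(P, modes, gamma=1.0):
    """Check whether V(x) = x^T P x satisfies the decrease condition
    V(A_i x) <= gamma^2 * V(x) for every mode A_i, i.e. whether
    A_i^T P A_i - gamma^2 * P is negative semidefinite for all i."""
    for A in modes:
        M = A.T @ P @ A - gamma**2 * P
        # Negative semidefinite up to a small numerical tolerance.
        if np.max(np.linalg.eigvalsh(M)) > 1e-9:
            return False
    return True

# Hypothetical two-mode switched system; both modes are contractive,
# so the identity P (i.e. V(x) = ||x||^2) is a common Lyapunov function.
A1 = np.array([[0.5, 0.1], [0.0, 0.4]])
A2 = np.array([[0.3, 0.0], [0.2, 0.5]])
print(is_common_quadratic_lyapunov(np.eye(2), [A1, A2]))  # prints True
```

In practice one would search for P with a semidefinite-programming solver rather than guess it; this snippet only verifies a given candidate.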
Example of a path-complete graph which encodes a common Lyapunov function for a switched system with M modes.
Example of the evolution of the JSR approximation provided by a neural-based Lyapunov function for different architectures.
Neural networks are known to be universal approximators of continuous functions. In particular, we use them to synthesize neural common Lyapunov functions for linear switched systems and, consequently, to approximate the joint spectral radius (JSR) of the corresponding set of matrices.
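For context, the JSR of a matrix set can already be bracketed without any Lyapunov function by a classical brute-force bound: for any product length k, the JSR is at most the largest k-th root of the norm of a length-k product. The sketch below implements this elementary upper bound (it is not the neural approach described here, and the example matrices A1 and A2 are hypothetical).

```python
import itertools
import numpy as np

def jsr_upper_bound(modes, depth):
    """Brute-force upper bound on the joint spectral radius:
    rho <= max over products Pi of length k of ||Pi||^(1/k),
    valid for any k and any submultiplicative norm (spectral norm here).
    Taking the min over k = 1..depth tightens the bound."""
    best = np.inf
    for k in range(1, depth + 1):
        worst = max(
            np.linalg.norm(np.linalg.multi_dot(prod) if k > 1 else prod[0], 2)
            for prod in itertools.product(modes, repeat=k)
        )
        best = min(best, worst ** (1.0 / k))
    return best

# Hypothetical two-mode example.
A1 = np.array([[0.5, 0.1], [0.0, 0.4]])
A2 = np.array([[0.3, 0.0], [0.2, 0.5]])
print(jsr_upper_bound([A1, A2], 4))
```

The cost grows exponentially in the product length, which is one motivation for Lyapunov-based (and, here, neural) approximations instead.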
Indeed, in recent years, neural networks have received increasing attention as a means to compute Lyapunov functions. However, several questions remain open, such as the soundness of the training procedure, or how the quality of the resulting Lyapunov functions depends on the network structure. In our work, we fine-tune the loss function so that training provides an approximation of the JSR of the corresponding set of matrices. Moreover, we are able to link the network's approximation capabilities to its structure. Among other things, we consider different approaches to train neural Lyapunov functions for switched systems that come with formal guarantees of correctness. In particular, we leverage the CEGIS (counterexample-guided inductive synthesis) approach, which has already proved useful in various fields of system verification and control.
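To make the CEGIS idea concrete, the toy loop below alternates a "learner", which adjusts a candidate quadratic V(x) = xᵀPx from counterexamples, with a "verifier" that searches for violations of the decrease condition. This is only a structural sketch: the verifier here is a random-sampling falsifier standing in for a formal tool (SMT or SDP based), the learner's update rule is a simple gradient-style heuristic of our own choosing, and the matrices A1 and A2 are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def cegis_quadratic_lyapunov(modes, rounds=20, batch=200):
    """Toy CEGIS loop for a common quadratic Lyapunov function.
    Returns a candidate P if the (surrogate) verifier finds no
    counterexample, or None if the round budget is exhausted."""
    n = modes[0].shape[0]
    P = np.eye(n)
    for _ in range(rounds):
        # Verifier (surrogate): sample points and look for a state x
        # and mode A with V(A x) >= V(x), i.e. a failed decrease.
        cex = None
        for _ in range(batch):
            x = rng.standard_normal(n)
            for A in modes:
                if (A @ x) @ P @ (A @ x) >= x @ P @ x - 1e-9:
                    cex = (x, A)
                    break
            if cex:
                break
        if cex is None:
            return P  # no violation found: candidate accepted
        # Learner: nudge P against the counterexample direction
        # (decrease y^T P y - x^T P x for y = A x), then re-symmetrize.
        x, A = cex
        y = A @ x
        P += 0.1 * (np.outer(x, x) - np.outer(y, y)) / (x @ x)
        P = (P + P.T) / 2
    return None

# Hypothetical two-mode example (both modes contractive).
A1 = np.array([[0.5, 0.1], [0.0, 0.4]])
A2 = np.array([[0.3, 0.0], [0.2, 0.5]])
P = cegis_quadratic_lyapunov([A1, A2])
```

A sampling-based falsifier gives no formal guarantee; the guarantees mentioned above come from replacing it with an exact verifier, which is exactly the role CEGIS assigns to that component.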