Persistence Fisher

Project: Persistence Fisher Kernel: A Riemannian Manifold Kernel

for Persistence Diagrams

Collaborator: Makoto Yamada

(RIKEN AIP, Japan)

Abstract:

Algebraic topology methods have recently played an important role in statistical analysis of complicated, geometrically structured data such as shapes, linked twist maps, and material data. Among these methods, persistent homology is a well-known tool for extracting robust topological features, which it outputs as persistence diagrams (PDs). However, PDs are point multisets and cannot be used directly in machine learning algorithms designed for vector data. An emerging approach to this problem is to use kernel methods, for which an appropriate geometry on PDs is an important factor in measuring their similarity. A popular geometry for PDs is the Wasserstein metric. However, the Wasserstein distance is not negative definite, so it is difficult to build positive definite kernels upon it without approximation. In this work, we instead rely on the Fisher information geometry to propose a positive definite kernel for PDs without approximation, namely the Persistence Fisher (PF) kernel. We then analyze the eigensystem of the integral operator induced by the proposed kernel for kernel machines. Based on that analysis, we derive generalization error bounds via covering numbers and Rademacher averages for kernel machines with the PF kernel. Additionally, we show that the proposed kernel enjoys desirable properties such as stability and infinite divisibility. Furthermore, we propose an approximation of the PF kernel with a bounded error that is computable in time linear in the number of points in the PDs. Through experiments on many different tasks over various benchmark datasets, we illustrate that the PF kernel compares favorably with other baseline kernels for PDs.

MATLAB code is available here [Download/MirrorAtGithub]. (Version 0.1 - October 19th, 2018)

Related publication:

Tam Le, Makoto Yamada, Persistence Fisher Kernel: A Riemannian Manifold Kernel for Persistence Diagrams, to appear at the 32nd Conference on Neural Information Processing Systems (NIPS), Canada, 2018. [Arxiv]