Reconstructive subspace learning seeks a projection that preserves as much of the information in the data as possible in a least-squares sense. Because it is trained on unlabeled data alone, it is an unsupervised technique.
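As a concrete illustration of this least-squares view, the following minimal Python/NumPy sketch fits a k-dimensional PCA subspace to unlabeled data and reports the reconstruction error of the projection. The data, the names X, W, k, and the choice of PCA as the example are assumptions for illustration only, not taken from the works cited below.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))        # unlabeled data, one sample per row (illustrative)

k = 5                                 # target subspace dimension (assumed)
mu = X.mean(axis=0)
Xc = X - mu                           # center the data

# SVD of the centered data: the top-k right singular vectors span the
# subspace that minimizes the least-squares reconstruction error.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
W = Vt[:k].T                          # (20, k) orthonormal basis

Z = Xc @ W                            # low-dimensional representation
X_hat = Z @ W.T + mu                  # reconstruction in the original space

mse = np.mean((X - X_hat) ** 2)       # least-squares reconstruction error
print(f"rank-{k} reconstruction MSE: {mse:.4f}")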
Conventional Reconstructive Subspace Learning [Linear] (a code sketch follows this list)
Principal Components Analysis (PCA) (Pearson, 1901) [Euclidean structure][Global structure]
Independent Components Analysis (ICA) (Comon, 1994) [Blind Source Separation]
Non-negative Matrix Factorization (NMF) (Paatero and Tapper, 1994)
Minimum Trace Factor Analysis (MTFA) (Ledermann, 1940)
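As a rough, non-authoritative sketch of how the conventional linear factorizations above can be applied in practice, the snippet below runs scikit-learn's PCA, FastICA, and NMF on synthetic non-negative data and compares their linear reconstruction errors. The data set, component count, and solver settings are assumptions; MTFA is omitted because scikit-learn provides no estimator for it.

import numpy as np
from sklearn.decomposition import PCA, FastICA, NMF

rng = np.random.default_rng(0)
X = np.abs(rng.normal(size=(300, 10)))    # synthetic non-negative data so NMF applies

models = {
    "PCA": PCA(n_components=3),
    "ICA": FastICA(n_components=3, random_state=0),
    "NMF": NMF(n_components=3, init="random", random_state=0, max_iter=500),
}

for name, model in models.items():
    Z = model.fit_transform(X)            # low-dimensional codes
    X_hat = model.inverse_transform(Z)    # linear reconstruction from the codes
    err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
    print(f"{name}: relative reconstruction error = {err:.3f}")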
Recent Reconstructive Subspace Learning [Manifold] (a code sketch follows this list)
Isometric Feature Mapping (ISOMAP) (Tenenbaum, 1998) [Geometry structure] [Non-linear manifold]
Locally Linear Embedding (LLE) (Roweis and Saul, 2000) [Geometry structure] [Non-linear manifold]
Laplacian Eigenmaps (LE) (Belkin and Niyogi, 2003) [Geometry structure] [Non-linear manifold]
Hessian Eigenmaps (HE) (Donoho and Grimes, 2003) [Geometry structure]
Diffusion Maps (DM) (Coifman and Lafon, 2006) [Geometry structure]
Local Tangent Space Alignment (LTSA) (Zhang and Zha, 2005) [Geometry structure] [Non-linear manifold]
Locality Preserving Projections (LPP) (He and Niyogi, 2003) [Geometry structure] [Out-of-sample] [Non-linear manifold]
Neighborhood Preserving Embedding (NPE) (He et al., 2005) [Geometry structure] [Out-of-sample] [Non-linear manifold]
Stationary Subspace Analysis (SSA) (von Bünau et al., 2009) [Blind Source Separation]
Signal Subspace Matching (SSM) (Wax and Adler, 2021)
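To make the manifold-based methods above concrete, the sketch below embeds a synthetic Swiss-roll data set with the scikit-learn implementations of ISOMAP, LLE, Laplacian Eigenmaps (SpectralEmbedding), Hessian Eigenmaps, and LTSA (the latter two as LocallyLinearEmbedding variants). The data set and neighborhood sizes are illustrative assumptions; Diffusion Maps, LPP, NPE, SSA, and SSM are omitted because scikit-learn does not implement them.

from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap, LocallyLinearEmbedding, SpectralEmbedding

X, _ = make_swiss_roll(n_samples=1000, random_state=0)   # 3-D points on a 2-D manifold

embedders = {
    "ISOMAP": Isomap(n_neighbors=10, n_components=2),
    "LLE": LocallyLinearEmbedding(n_neighbors=10, n_components=2, random_state=0),
    "Laplacian Eigenmaps": SpectralEmbedding(n_components=2, n_neighbors=10, random_state=0),
    "Hessian Eigenmaps": LocallyLinearEmbedding(n_neighbors=12, n_components=2,
                                                method="hessian", random_state=0),
    "LTSA": LocallyLinearEmbedding(n_neighbors=10, n_components=2,
                                   method="ltsa", random_state=0),
}

for name, emb in embedders.items():
    Y = emb.fit_transform(X)       # 2-D embedding preserving local geometric structure
    print(f"{name}: embedded shape = {Y.shape}")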