Least-Squares Gradients for Dimension Reduction (LSGDR)
Least-squares gradients for dimension reduction (LSGDR) is a supervised dimension reduction method.
LSGDR possesses the following features:
The transformation matrix is efficiently estimated in closed form through the eigenvalue decomposition of the expectation of the outer product of the gradient of the logarithmic conditional density.
The gradients of the logarithmic conditional densities are accurately estimated by a proposed estimator, least-squares logarithmic conditional density gradients (LSLCG):
LSLCG directly estimates log-conditional density gradients without conditional density estimation.
The solutions in LSLCG are analytically computed.
A cross-validation method is available for tuning all parameters in LSLCG.
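The eigendecomposition step above can be sketched as follows. This is a minimal illustration, not the authors' MATLAB code: it assumes the log-conditional-density gradients at the samples have already been estimated (e.g., by LSLCG) and are passed in as a matrix, and it approximates the expectation of the outer product by the sample average before taking the leading eigenvectors.

```python
import numpy as np

def lsgdr_subspace(grad_logp, m):
    """Estimate an m-dimensional projection matrix from gradient estimates.

    grad_logp : (n, d) array; row i is the estimated gradient of the
                logarithmic conditional density at the i-th sample
                (assumed supplied by an external estimator such as LSLCG).
    m         : target dimension of the reduced subspace.
    Returns a (d, m) matrix whose columns span the estimated subspace.
    """
    n, d = grad_logp.shape
    # Sample average of the outer products g g^T: a (d, d) symmetric matrix
    # approximating the expectation in the LSGDR eigenvalue problem.
    M = grad_logp.T @ grad_logp / n
    # Symmetric eigendecomposition; eigh returns eigenvalues in ascending order.
    eigvals, eigvecs = np.linalg.eigh(M)
    # Keep the eigenvectors belonging to the m largest eigenvalues.
    return eigvecs[:, ::-1][:, :m]
```

For example, if the estimated gradients vary only along a single coordinate direction, the returned one-dimensional subspace aligns with that direction.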
MATLAB implementation of LSGDR and LSLCG: LSGDR.zip
Currently, this implementation supports regression only; an implementation for classification will be available soon.
References
Hiroaki Sasaki, Voot Tangkaratt, and Masashi Sugiyama, "Sufficient Dimension Reduction via Direct Estimation of the Gradients of Logarithmic Conditional Densities", the 7th Asian Conference on Machine Learning (ACML), JMLR: Workshop and Conference Proceedings, vol.xx, pp.xxx-xxx, 2015.