Software
R packages
MTLRRC: Computes multi-task learning models via robust regularized clustering.
Okazaki, A. and Kawano, S. (2024) Multi-task learning via robust regularized clustering with non-convex group penalties. arXiv:2404.03250.
CSNL: Computes multi-task learning models for compositional data via sparse network lasso.
Okazaki, A. and Kawano, S. (2022) Multi-task learning for compositional data via sparse network lasso. Entropy, 24, 1839 (doi: 10.3390/e24121839).
SVaRu: Computes nonparametric regression models with automatically determined smoothing parameters. The tuning hyperparameters are also optimized by generalized information criteria.
Kim, D., Kawano, S. and Ninomiya, Y. (2023) Smoothly varying regularization. Computational Statistics & Data Analysis, 179, 107644 (doi: 10.1016/j.csda.2022.107644).
MCCA: Computes multilinear common component analysis via Kronecker product representation.
Yoshikawa, K. and Kawano, S. (2021) Multilinear common component analysis via Kronecker product representation. Neural Computation, 33, 2853-2880 (doi: 10.1162/neco_a_01425).
spcr-svd: Computes the sparse principal component regression via a singular value decomposition approach. The regularization parameters are also optimized by cross-validation.
Kawano, S. (2021) Sparse principal component regression via singular value decomposition approach. Advances in Data Analysis and Classification, 15, 795-823 (doi: 10.1007/s11634-020-00435-2). (Online Supplement)
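The computational building block behind the SVD approach — a rank-1 SVD whose loading vector is soft-thresholded at every power-iteration step so that weak loadings become exactly zero — can be sketched as below. This is a generic plain-Python illustration of sparse rank-1 SVD under an assumed fixed threshold lam, not the spcr-svd package's API; all names here are hypothetical.

```python
import math

def soft_threshold(z, t):
    """Soft-thresholding: sign(z) * max(|z| - t, 0)."""
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

def sparse_rank1_svd(X, lam, n_iter=50):
    """Rank-1 SVD by alternating power iterations, soft-thresholding the
    right singular vector (the loading) so small entries become exactly zero."""
    n, p = len(X), len(X[0])
    v = [1.0 / math.sqrt(p)] * p  # initial dense loading
    u = [0.0] * n
    for _ in range(n_iter):
        # u <- X v, normalised to unit length
        u = [sum(X[i][j] * v[j] for j in range(p)) for i in range(n)]
        norm_u = math.sqrt(sum(ui * ui for ui in u)) or 1.0
        u = [ui / norm_u for ui in u]
        # v <- soft-threshold(X^T u, lam): weak loadings are zeroed out
        v = [soft_threshold(sum(X[i][j] * u[i] for i in range(n)), lam)
             for j in range(p)]
    return u, v

# Toy matrix: a rank-1 signal carried almost entirely by the first two columns.
s = [1.0, 2.0, 3.0, 4.0, 5.0]
weights = [1.0, 0.9, 0.05, 0.0]
X = [[si * w for w in weights] for si in s]

u, v = sparse_rank1_svd(X, lam=1.0)
# v keeps the two strong loadings and zeroes out the two weak ones.
```

With the threshold set above the weak columns' singular-vector entries, the recovered loading is sparse while the left singular vector stays unit-length.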
RVSManOpt: Computes the sparse reduced-rank factor regression based on manifold optimization. The package simultaneously estimates the rank of the coefficient matrix, selects the explanatory variables that compose the factors in the regression, and selects the factors that are relevant to the response variables.
Yoshikawa, K. and Kawano, S. (2023) Sparse reduced-rank regression for simultaneous rank and variable selection via manifold optimization. Computational Statistics, 38, 53-75 (doi: 10.1007/s00180-022-01216-5).
neggfl: Computes the Bayesian generalized fused lasso regression based on a normal-exponential-gamma (NEG) prior distribution.
Shimamura, K., Ueki, M., Kawano, S. and Konishi, S. (2019) Bayesian generalized fused lasso modeling via NEG distribution. Communications in Statistics - Theory and Methods, 48, 4132-4153 (doi: 10.1080/03610926.2018.1489056).
spcr: Computes the sparse principal component regression. The regularization parameters are also optimized by cross-validation.
Kawano, S., Fujisawa, H., Takada, T. and Shiroishi, T. (2015) Sparse principal component regression with adaptive loading. Computational Statistics & Data Analysis, 89, 192-203 (doi: 10.1016/j.csda.2015.03.016).
Kawano, S., Fujisawa, H., Takada, T. and Shiroishi, T. (2018) Sparse principal component regression for generalized linear models. Computational Statistics & Data Analysis, 124, 180-196 (doi: 10.1016/j.csda.2018.03.008).
sAIC: Computes the Akaike information criterion for generalized linear models (logistic regression, Poisson regression, and Gaussian graphical models) estimated by the lasso.
Ninomiya, Y. and Kawano, S. (2016) AIC for the Lasso in generalized linear models. Electronic Journal of Statistics, 10, 2537-2560 (doi: 10.1214/16-EJS1179).
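The idea can be sketched in its simplest Gaussian special case: fit the lasso over a grid of regularization values and score each fit by an AIC of the form n log(RSS/n) + 2 df, using the active-set size as the degrees of freedom. This plain-Python toy is an illustration of that criterion, not the sAIC package's interface (which targets GLMs estimated by the lasso); the coordinate-descent solver and the lambda grid are assumptions for the sketch.

```python
import math

def soft_threshold(z, t):
    """Soft-thresholding: sign(z) * max(|z| - t, 0)."""
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

def lasso_cd(X, y, lam, n_sweeps=200):
    """Lasso by cyclic coordinate descent:
    minimize (1/2n) ||y - X beta||^2 + lam ||beta||_1."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    col_sq = [sum(X[i][j] ** 2 for i in range(n)) / n for j in range(p)]
    resid = list(y)  # residuals for beta = 0
    for _ in range(n_sweeps):
        for j in range(p):
            # z_j = (1/n) x_j . (residual with x_j beta_j added back)
            z = sum(X[i][j] * (resid[i] + X[i][j] * beta[j])
                    for i in range(n)) / n
            new_bj = soft_threshold(z, lam) / col_sq[j]
            delta = new_bj - beta[j]
            if delta != 0.0:
                for i in range(n):
                    resid[i] -= X[i][j] * delta
                beta[j] = new_bj
    return beta

def lasso_aic(X, y, lam):
    """Gaussian AIC for a lasso fit, with the active-set size as the
    degrees of freedom: AIC = n log(RSS/n) + 2 df."""
    beta = lasso_cd(X, y, lam)
    n = len(X)
    rss = sum((y[i] - sum(X[i][j] * beta[j] for j in range(len(beta)))) ** 2
              for i in range(n))
    df = sum(1 for b in beta if b != 0.0)
    return n * math.log(rss / n) + 2 * df, beta

# Toy data: y depends only on the first of three orthogonal predictors.
X = [[1, 1, 1], [-1, 1, 1], [1, -1, 1], [-1, -1, 1],
     [1, 1, -1], [-1, 1, -1], [1, -1, -1], [-1, -1, -1]]
noise = [0.2, -0.1, 0.1, -0.2, 0.1, 0.2, -0.2, 0.1]
y = [2 * X[i][0] + noise[i] for i in range(8)]

# Pick the regularization value with the smallest AIC on a small grid.
best = min((lasso_aic(X, y, lam) for lam in (0.01, 0.1, 0.5, 1.0)),
           key=lambda t: t[0])
```

Larger lambda values shrink more coefficients exactly to zero (smaller df, larger RSS), and the criterion trades the two off; the AIC-selected fit retains the truly active first predictor.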