Software

quantes: Convolution Smoothed Quantile and Expected Shortfall Regression

This is the Python version of the conquer package, enhanced with new classes and functions designed for fitting expected shortfall regression under both linear and nonparametric models. 

This package contains two modules: the linear module, which implements convolution-smoothed quantile regression in both low and high dimensions, and the joint module, designed for joint quantile and expected shortfall regression.

The low_dim class in the linear module applies a convolution smoothing approach, referred to as conquer, to fit linear quantile regression models. It also constructs normal-based and (multiplier) bootstrap confidence intervals for all slope coefficients. The high_dim class fits sparse quantile regression models in high dimensions via L1-penalized and iteratively reweighted L1-penalized (IRW-L1) conquer methods. The IRW scheme is inspired by the local linear approximation (LLA) algorithm proposed by Zou & Li (2008) for folded concave penalized estimation, exemplified by the SCAD penalty (Fan & Li, 2001) and the minimax concave penalty (MCP) (Zhang, 2010). Computationally, each weighted L1-penalized conquer estimator is solved by the local adaptive majorize-minimization (LAMM) algorithm. For comparison, the proximal ADMM algorithm (pADMM) is also implemented.

The LR class in the joint module fits joint linear quantile and expected shortfall (ES) regression models (Dimitriadis & Bayer, 2019; Patton, Ziegel & Chen, 2019) using either FZ loss minimization (Fissler & Ziegel, 2016) or two-step procedures (Barendse, 2020; Peng & Wang, 2023; He, Tan & Zhou, 2023). For the second step of ES estimation, setting robust=True uses the Huber loss with an adaptively chosen robustification parameter to gain robustness against heavy-tailed errors; see He, Tan & Zhou (2023) for details. Moreover, a combination of the iteratively reweighted least squares (IRLS) algorithm and quadratic programming is used to compute non-crossing ES estimates, which ensures that the fitted ES does not exceed the fitted quantile at each observation.
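To illustrate the two-step idea, here is a minimal sketch in the spirit of Barendse (2020), not the package's implementation: first obtain fitted conditional quantiles, then regress a surrogate response, whose conditional mean equals the tau-level ES, on the covariates by least squares. The function name es_two_step is hypothetical.

```python
import numpy as np
from scipy.stats import norm

def es_two_step(X, y, tau, q_fit):
    """Two-step ES regression sketch (hypothetical helper): given fitted
    conditional quantiles q_fit, form a surrogate response whose
    conditional mean equals the tau-level expected shortfall, then
    run ordinary least squares on it."""
    # surrogate: Z_i = q_i + (y_i - q_i) * 1{y_i <= q_i} / tau
    Z = q_fit + (y - q_fit) * (y <= q_fit) / tau
    Xd = np.column_stack([np.ones(len(y)), X])  # add an intercept
    beta, *_ = np.linalg.lstsq(Xd, Z, rcond=None)
    return beta  # ES regression coefficients (intercept first)

# toy check: y | x ~ N(x, 1), so ES_tau(y | x) = x - pdf(z_tau)/tau
rng = np.random.default_rng(0)
n, tau = 10000, 0.1
X = rng.normal(size=(n, 1))
y = X[:, 0] + rng.normal(size=n)
q_true = X[:, 0] + norm.ppf(tau)  # plug in the true quantile for illustration
beta = es_two_step(X, y, tau, q_true)
```

In practice the first-step quantile fit replaces q_true, and the robust variant swaps the least-squares step for Huber regression.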
The KRR and ANN classes in the joint module implement two nonparametric methods for joint quantile and expected shortfall regression: kernel ridge regression (Takeuchi et al., 2006) and neural network regression. For fitting nonparametric QR through the qt() method in both KRR and ANN, a smooth option is available; when set to True, the check loss is replaced by its convolution with a Gaussian kernel. For fitting nonparametric ES regression on nonparametrically generated surrogate response variables, the es() method provides two options: the squared loss (robust=False) and the Huber loss (robust=True).
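The Gaussian-kernel-convolved check loss used by the smooth option has a simple closed form. The following sketch shows the standard formula from the conquer literature (it is not code taken from the package):

```python
import numpy as np
from scipy.stats import norm

def check_loss(u, tau):
    """Standard (non-smooth) check loss: rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0))

def smoothed_check_loss(u, tau, h):
    """Check loss convolved with a Gaussian kernel of bandwidth h:
        l_h(u) = h * phi(u/h) + u * (tau - Phi(-u/h)),
    where phi and Phi are the standard normal pdf and cdf.
    Smooth everywhere, and recovers rho_tau as h -> 0."""
    return h * norm.pdf(u / h) + u * (tau - norm.cdf(-u / h))

def smoothed_check_grad(u, tau, h):
    """Derivative tau - Phi(-u/h): a smooth surrogate for tau - 1{u < 0}."""
    return tau - norm.cdf(-u / h)
```

The smooth, strictly increasing gradient is what makes gradient-based solvers applicable where the subgradient of the raw check loss is piecewise constant.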


conquer: Convolution-Type Smoothed Quantile Regression

Estimation and inference for conditional linear quantile regression models using a convolution-smoothed approach, named conquer. In the low-dimensional setting (p < n), this package implements gradient descent with step sizes determined by the Barzilai-Borwein method to compute either a single conquer estimator or a conquer process estimator over a specified quantile range. Normal-based and (multiplier) bootstrap (pointwise) confidence intervals are constructed. In high dimensions (p > n), the conquer method is complemented with various penalties, including the lasso (L1-penalty), elastic net, group lasso, sparse group lasso, SCAD and MCP, to accommodate the corresponding low-dimensional structures. The conquer R package is available on CRAN, and the Python package is available on GitHub.
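The core computation can be sketched as follows: gradient descent on the Gaussian-smoothed check loss with Barzilai-Borwein step sizes. The warm start at the least-squares fit and the bandwidth rate ((p + log n)/n)^(2/5) are assumptions of this sketch, not necessarily the package's exact defaults.

```python
import numpy as np
from scipy.stats import norm

def conquer_fit(X, y, tau=0.5, h=None, max_iter=500, tol=1e-7):
    """Sketch of conquer: minimize the Gaussian-smoothed check loss by
    gradient descent with Barzilai-Borwein (BB) step sizes."""
    n, p = X.shape
    Xd = np.column_stack([np.ones(n), X])
    if h is None:
        h = max(0.05, ((p + np.log(n)) / n) ** 0.4)  # assumed bandwidth rule

    def grad(beta):
        r = y - Xd @ beta
        # gradient of the smoothed loss: -mean over i of (tau - Phi(-r_i/h)) * x_i
        return -Xd.T @ (tau - norm.cdf(-r / h)) / n

    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)  # least-squares warm start
    g = grad(beta)
    step = 1.0
    for _ in range(max_iter):
        beta_new = beta - step * g
        g_new = grad(beta_new)
        if np.linalg.norm(g_new) < tol:
            return beta_new
        s, d = beta_new - beta, g_new - g
        sd = s @ d
        step = abs(s @ s / sd) if sd != 0 else 1.0  # BB1 step size
        beta, g = beta_new, g_new
    return beta
```

The BB step approximates the inverse curvature from successive gradients, so no step-size schedule needs to be tuned.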

Please cite He et al. (2023) or Man et al. (2023) if you use this package. 


adaHuber: Adaptive Huber Regression

This R package implements adaptive-Huber-loss-based methods for mean estimation, covariance matrix estimation, linear regression, and sparse linear regression. In all of these methods, the robustification parameter tau is calibrated by a data-driven procedure introduced in Wang et al. (2021).
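For intuition, here is a sketch of adaptive Huber mean estimation. The calibration equation used below, mean(min(r_i^2, tau^2)) / tau^2 = log(n)/n, is a simplified form of the tuning-free principle of Wang et al. (2021), not the package's exact rule, and huber_mean is a hypothetical name.

```python
import numpy as np

def huber_mean(x, max_iter=50):
    """Adaptive Huber mean sketch: alternate an IRLS update of the Huber
    location estimate with a data-driven update of tau (simplified
    calibration, see lead-in; not adaHuber's exact procedure)."""
    n = len(x)
    mu = np.median(x)
    tau = np.std(x) * np.sqrt(n / np.log(n))  # large starting value
    for _ in range(max_iter):
        r = x - mu
        # fixed-point iteration for the (simplified) calibration equation
        for _ in range(20):
            tau = np.sqrt(np.mean(np.minimum(r**2, tau**2)) * n / np.log(n))
        # IRLS step for the Huber location: weights w_i = min(1, tau/|r_i|)
        w = np.minimum(1.0, tau / np.maximum(np.abs(r), 1e-12))
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < 1e-10:
            return mu_new, tau
        mu = mu_new
    return mu, tau
```

The point of the adaptivity is that tau shrinks automatically when the sample is heavy-tailed, trading a small bias for a large variance reduction.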


FarmTest: Factor-Adjusted Robust Multiple Testing

This package performs robust multiple testing of mean effects in the presence of known and unknown latent factors (Fan et al., 2019). To estimate model parameters and construct test statistics that are robust against heavy-tailed and/or asymmetric error distributions, it relies on a series of adaptive Huber methods combined with the fast, data-driven tuning schemes proposed in Ke et al. (2019). Extensions to two-sample simultaneous mean comparison problems are included. The package also provides functions that compute adaptive Huber mean, covariance and regression estimators.
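Setting aside the factor adjustment, the testing pipeline can be sketched as robust per-feature location statistics followed by a false-discovery-rate step-up rule. The helper names are hypothetical, and the fixed truncation level below stands in for FarmTest's adaptive tuning.

```python
import numpy as np
from scipy.stats import norm

def robust_zstats(X):
    """Per-feature robust z-statistics via Huber-type location estimates
    (simplified: fixed truncation level, normal-theory standard error;
    FarmTest's adaptive tuning and factor adjustment are omitted)."""
    n, p = X.shape
    z = np.empty(p)
    for j in range(p):
        x = X[:, j]
        tau = np.std(x) * np.sqrt(n / np.log(n))  # crude robustification level
        mu = np.median(x)
        for _ in range(20):  # IRLS iterations for the Huber location
            r = x - mu
            w = np.minimum(1.0, tau / np.maximum(np.abs(r), 1e-12))
            mu = np.sum(w * x) / np.sum(w)
        z[j] = mu / (np.std(x) / np.sqrt(n))
    return z

def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up rule: boolean rejection mask at FDR alpha."""
    m = len(pvals)
    order = np.argsort(pvals)
    passed = pvals[order] <= alpha * np.arange(1, m + 1) / m
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    mask = np.zeros(m, dtype=bool)
    mask[order[:k]] = True
    return mask
```

With latent factors present, the z-statistics would first be adjusted by the estimated factor loadings, which is the part this sketch leaves out.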


ILAMM: Nonconvex Regularized Robust Regression

This package employs the I-LAMM (iterative local adaptive majorize-minimization) algorithm to solve regularized Huber regression. The available penalty functions include the L1-norm, the smoothly clipped absolute deviation (SCAD) and the minimax concave penalty (MCP). The two tuning parameters, lambda (the regularization parameter) and tau (the robustification parameter of the Huber loss), are calibrated by cross-validation. As a by-product, the package also produces regularized least squares estimators, including the Lasso, SCAD and MCP.
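The core LAMM step can be sketched for the L1 penalty: majorize the Huber loss at the current iterate by an isotropic quadratic with curvature phi, solve the resulting lasso subproblem in closed form by soft-thresholding, and inflate phi until the majorization holds. This is an illustration of the mechanism, not the package's implementation; I-LAMM's two-stage contraction/tightening scheme and the concave penalties are omitted.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def huber_loss(r, tau):
    a = np.abs(r)
    return np.where(a <= tau, 0.5 * r**2, tau * a - 0.5 * tau**2).mean()

def huber_grad(X, r, tau):
    psi = np.clip(r, -tau, tau)  # Huber score function
    return -X.T @ psi / len(r)

def lamm_l1_huber(X, y, lam, tau, phi0=1.0, gamma=2.0, max_iter=200):
    """L1-penalized Huber regression via a LAMM-style loop (sketch)."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(max_iter):
        r = y - X @ beta
        g = huber_grad(X, r, tau)
        f0 = huber_loss(r, tau)
        phi = phi0
        while True:
            # lasso subproblem for the quadratic majorizer: soft-thresholding
            beta_new = soft_threshold(beta - g / phi, lam / phi)
            d = beta_new - beta
            f_new = huber_loss(y - X @ beta_new, tau)
            # accept once the quadratic actually majorizes the loss here
            if f_new <= f0 + g @ d + 0.5 * phi * d @ d + 1e-12:
                break
            phi *= gamma  # locally adaptive: inflate the curvature and retry
        if np.max(np.abs(beta_new - beta)) < 1e-8:
            return beta_new
        beta = beta_new
    return beta
```

Because phi only grows until the majorization check passes, the objective decreases at every accepted step without needing a global Lipschitz constant up front.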
