API Documentation

sparse_ho:

Models

sparse_ho.models:

ElasticNet([estimator])

sparse-ho ElasticNet model (inner problem).

Lasso([estimator])

Linear Model trained with L1 prior as regularizer (aka the Lasso).

SimplexSVR([estimator])

Simplex support vector regression without bias. The optimization problem is solved in the dual.

SparseLogreg([estimator])

Sparse Logistic Regression classifier.

SVM([estimator])

Support Vector Machine classifier without bias.

SVR([estimator])

The support vector regression without bias.

WeightedLasso([estimator])

Linear Model trained with weighted L1 regularizer (aka weighted Lasso).
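
A minimal sketch of instantiating an inner-problem model, using only the constructors listed above; with no argument the optional estimator is assumed to fall back to the model's default solver:

    from sparse_ho.models import ElasticNet, Lasso

    # Inner-problem models; the optional `estimator` argument is omitted,
    # so each model uses its default internal solver (assumption).
    lasso = Lasso()
    enet = ElasticNet()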

Criterion

sparse_ho.criterion:

CrossVal(criterion[, cv])

Cross-validation loss.

FiniteDiffMonteCarloSure(sigma[, ...])

Smoothed version of the Stein Unbiased Risk Estimator (SURE).

HeldOutMSE(idx_train, idx_val)

Held-out mean squared error (MSE) for a quadratic datafit.

HeldOutSmoothedHinge(idx_train, idx_val)

Smoothed hinge loss on held-out data.

HeldOutLogistic(idx_train, idx_val)

Logistic loss on held-out data.

LogisticMulticlass(idx_train, idx_val, algo)

Multiclass logistic loss.
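
For example, a held-out criterion can be built from explicit train/validation indices, or averaged over folds with CrossVal. The 80/20 split below is illustrative, and passing None indices to the sub-criterion when wrapping it in CrossVal follows the project's examples:

    import numpy as np
    from sklearn.model_selection import KFold
    from sparse_ho.criterion import CrossVal, HeldOutMSE

    # Illustrative 80/20 split of 100 samples into train/validation indices.
    idx = np.arange(100)
    criterion = HeldOutMSE(idx_train=idx[:80], idx_val=idx[80:])

    # Average the held-out MSE over 5 folds instead of a single split.
    cross_val = CrossVal(HeldOutMSE(None, None), cv=KFold(n_splits=5))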

Algorithms

sparse_ho.algo:

Implicit([max_iter, max_iter_lin_sys, ...])

Algorithm to compute the hypergradient using implicit differentiation.

ImplicitForward([tol_jac, max_iter, ...])

Algorithm to compute the hypergradient using implicit forward differentiation.

Forward([use_stop_crit, verbose])

Algorithm to compute the hypergradient using forward differentiation of proximal coordinate descent.

Backward([use_stop_crit, verbose])

Algorithm to compute the hypergradient using backward differentiation.
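
Each algorithm can be instantiated with its defaults, as the bracketed signatures above suggest; for instance:

    from sparse_ho.algo import ImplicitForward

    # Hypergradient via implicit forward differentiation, default settings.
    algo = ImplicitForward()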

Optimizers

sparse_ho.optimizers:

Adam([n_outer, epsilon, lr, beta_1, beta_2, ...])

Adam optimizer for the outer problem.

GradientDescent([n_outer, step_size, ...])

Gradient descent for the outer problem.

LineSearch([n_outer, verbose, ...])

Gradient descent with line search for the outer problem.
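
An outer-problem optimizer is constructed the same way; the number of outer iterations below is illustrative:

    from sparse_ho.optimizers import LineSearch

    # Gradient descent with line search, run for 10 outer iterations.
    optimizer = LineSearch(n_outer=10)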

Functions

Utils

sparse_ho.utils:

Monitor([callback])

Class used to store computed metrics at each iteration of the outer loop.
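
Putting the pieces together, a Monitor collects the metrics of each outer iteration. The end-to-end sketch below assumes, following the project's examples, that the components are combined through sparse_ho.grad_search; check that call and its argument order against the installed version:

    import numpy as np
    from sparse_ho import grad_search
    from sparse_ho.algo import ImplicitForward
    from sparse_ho.criterion import HeldOutMSE
    from sparse_ho.models import Lasso
    from sparse_ho.optimizers import LineSearch
    from sparse_ho.utils import Monitor

    # Toy regression data; real usage would load an actual dataset.
    rng = np.random.RandomState(0)
    X = rng.randn(100, 200)
    y = X @ rng.randn(200) + 0.1 * rng.randn(100)

    idx = np.arange(100)
    monitor = Monitor()
    alpha0 = 0.1 * np.max(np.abs(X.T @ y)) / len(y)  # illustrative start value
    grad_search(
        ImplicitForward(), HeldOutMSE(idx[:80], idx[80:]), Lasso(),
        LineSearch(n_outer=10), X, y, alpha0, monitor)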