sparse_ho.models.SimplexSVR
- class sparse_ho.models.SimplexSVR(estimator=None)
The simplex support vector regression without bias. The optimization problem is solved in the dual.
It solves the SVR with probability-vector (simplex) constraints on the coefficients: sum_i beta_i = 1 and beta_i >= 0.
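For intuition, the sketch below writes out a primal version of this simplex-constrained, epsilon-insensitive SVR and solves it with a generic constrained solver. It is illustrative only: the hyperparameter names (C, epsilon), the objective scaling, and the use of SLSQP on the nonsmooth loss are assumptions; sparse_ho itself solves the problem in the dual.

```python
# Illustrative sketch only: a direct (primal) formulation of a simplex-constrained,
# epsilon-insensitive SVR without intercept, solved with a generic SLSQP solver.
# Hyperparameter names (C, epsilon) and objective scaling are assumptions.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_samples, n_features = 50, 5
X = rng.standard_normal((n_samples, n_features))
y = X @ rng.dirichlet(np.ones(n_features))  # target built from a simplex-valued coefficient vector

C, epsilon = 1.0, 0.1  # assumed hyperparameters of the epsilon-insensitive loss

def objective(beta):
    # ridge term plus epsilon-insensitive loss, no intercept ("without bias")
    residuals = np.abs(y - X @ beta)
    return 0.5 * beta @ beta + C * np.maximum(residuals - epsilon, 0.0).sum()

constraints = [{"type": "eq", "fun": lambda beta: beta.sum() - 1.0}]  # sum_i beta_i = 1
bounds = [(0.0, None)] * n_features                                   # beta_i >= 0

beta0 = np.full(n_features, 1.0 / n_features)
res = minimize(objective, beta0, method="SLSQP", bounds=bounds, constraints=constraints)
print(res.x, res.x.sum())  # coefficients lie (approximately) on the probability simplex
```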
- Parameters
- estimator : sklearn estimator, default=None
An estimator that follows the scikit-learn API.
- __init__(estimator=None)
Methods
| Method | Description |
| --- | --- |
| `__init__([estimator])` | |
| `generalized_supp(X, v, log_hyperparam)` | Generalized support of iterate. |
| `get_L(X)` | Compute Lipschitz constant of datafit. |
| `get_dual_v(mask, dense, X, y, v, log_hyperparam)` | Compute the dual of v. |
| `get_full_jac_v(mask, jac_v, n_features)` | TODO |
| `get_jac_residual_norm(Xs, ys, n_samples, ...)` | |
| `get_jac_v(X, y, mask, dense, jac, v)` | Compute hypergradient. |
| `get_mat_vec(X, y, mask, dense, log_C)` | Returns a LinearOperator computing the matrix-vector product with the Hessian of datafit. |
| `proj_hyperparam(X, y, log_hyperparam)` | Project hyperparameter on an admissible range of values. |
| `reduce_X(X, mask)` | Reduce design matrix to generalized support. |
| `reduce_y(y, mask)` | Reduce observation vector to generalized support. |
| `sign(beta, log_hyperparams)` | Get sign of iterate. |
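Below is a minimal, hypothetical construction sketch that uses only the method signatures listed in the table above. The shape and meaning of the log-hyperparameter vector (assumed here to hold log(C) and log(epsilon)) are assumptions and should be checked against the installed version of sparse_ho.

```python
# Hypothetical sketch: instantiate the model and call two of the methods
# documented above. The two-component log-hyperparameter (log C, log epsilon)
# is an assumption, not confirmed by this page.
import numpy as np
from sparse_ho.models import SimplexSVR

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))
y = rng.standard_normal(50)

model = SimplexSVR()                        # estimator=None by default
lipschitz = model.get_L(X)                  # Lipschitz constant of the datafit
log_hyperparam = np.log(np.array([1.0, 0.1]))                  # assumed (C, epsilon) on a log scale
log_hyperparam = model.proj_hyperparam(X, y, log_hyperparam)   # clip to an admissible range
```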