sparse_ho.optimizers.Adam¶
- class sparse_ho.optimizers.Adam(n_outer=100, epsilon=0.001, lr=0.01, beta_1=0.9, beta_2=0.999, verbose=False, tol=1e-05, t_max=10000)¶
ADAM optimizer for the outer problem.
This implementation of Adam is adapted from https://github.com/sagarvegad/Adam-optimizer/blob/master/Adam.py
- Parameters
- n_outer: int, optional (default=100)
Maximum number of updates of alpha.
- epsilon: float, optional (default=1e-3)
Constant added to the denominator of the Adam update for numerical stability.
- lr: float, optional (default=1e-2)
Learning rate
- beta_1: float, optional (default=0.9)
Exponential decay rate for the first moment (gradient) estimates.
- beta_2: float, optional (default=0.999)
Exponential decay rate for the second moment (squared gradient) estimates.
- verbose: bool, optional (default=False)
Whether to print information about the hyperparameter optimization process.
- tol: float, optional (default=1e-5)
Tolerance for the inner optimization solver.
- t_max: float, optional (default=10_000)
Maximum running time threshold in seconds.
- __init__(n_outer=100, epsilon=0.001, lr=0.01, beta_1=0.9, beta_2=0.999, verbose=False, tol=1e-05, t_max=10000)¶
Methods
__init__([n_outer, epsilon, lr, beta_1, ...])
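For context, the sketch below shows how an Adam instance might be passed to sparse_ho's outer-loop search. Only the Adam constructor and its parameters come from this page; the Lasso, HeldOutMSE, ImplicitForward, Monitor, and grad_search calls are assumptions based on the usual layout of the sparse_ho examples, and their exact module paths and signatures may differ in the installed version.

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.model_selection import train_test_split

    from sparse_ho.optimizers import Adam

    # The imports below are assumptions based on the typical sparse_ho example
    # layout; check the installed version for exact module paths and signatures.
    from sparse_ho import ImplicitForward, grad_search
    from sparse_ho.models import Lasso
    from sparse_ho.criterion import HeldOutMSE
    from sparse_ho.utils import Monitor

    # Toy regression data and a train/validation split for the held-out criterion.
    X, y = make_regression(n_samples=100, n_features=200, noise=1.0, random_state=0)
    idx_train, idx_val = train_test_split(np.arange(len(y)), random_state=0)

    # Adam optimizer for the outer (hyperparameter) problem, using the
    # parameters documented above.
    optimizer = Adam(n_outer=50, lr=0.01, beta_1=0.9, beta_2=0.999, tol=1e-5)

    model = Lasso()                              # assumed inner model
    criterion = HeldOutMSE(idx_train, idx_val)   # assumed held-out criterion
    algo = ImplicitForward()                     # assumed hypergradient algorithm
    monitor = Monitor()                          # assumed monitoring helper

    # Assumed starting value for the regularization parameter, below alpha_max.
    alpha_max = np.max(np.abs(X[idx_train].T @ y[idx_train])) / len(idx_train)
    alpha0 = alpha_max / 10

    grad_search(algo, criterion, model, optimizer, X, y, alpha0, monitor)

Each of the n_outer iterations performs one Adam update of alpha from the hypergradient returned by the chosen algorithm, stopping early if the t_max time budget is exceeded.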