libreasr.lib.optimizer
A collection of optimizers.
Functions
- An `Optimizer` for Adam with `lr`, `mom`, `sqr_mom`, `eps` and `params`.
- An `Optimizer` for RAdam with `lr`, `mom`, `sqr_mom`, `eps` and `params`.
- Step for Adam with `lr` on `p`.
- Step for RAdam with `lr` on `p`.
- Convenience method for Lookahead with RAdam.
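To illustrate what a per-parameter Adam step like the one summarized above does, here is a minimal, hypothetical sketch in pure Python. It operates on a plain list of floats and keeps its running averages in a `state` dict; the names `lr`, `mom`, `sqr_mom` and `eps` mirror the summaries, but the function body is an assumption, not the module's actual implementation.

```python
def adam_step(p, grad, state, lr=1e-3, mom=0.9, sqr_mom=0.99, eps=1e-5):
    """Hypothetical Adam step: update parameters `p` given `grad`.

    `state` holds the step count and the debiased running averages of
    the gradient and its square, as in standard Adam.
    """
    step = state.get("step", 0) + 1
    avg = state.get("avg", [0.0] * len(p))
    sqr_avg = state.get("sqr_avg", [0.0] * len(p))
    new_p = []
    for i, (pi, gi) in enumerate(zip(p, grad)):
        # Exponential moving averages of the gradient and squared gradient.
        avg[i] = mom * avg[i] + (1 - mom) * gi
        sqr_avg[i] = sqr_mom * sqr_avg[i] + (1 - sqr_mom) * gi * gi
        # Debias both averages, then take the update step.
        avg_hat = avg[i] / (1 - mom ** step)
        sqr_hat = sqr_avg[i] / (1 - sqr_mom ** step)
        new_p.append(pi - lr * avg_hat / (sqr_hat ** 0.5 + eps))
    state.update(step=step, avg=avg, sqr_avg=sqr_avg)
    return new_p
```

On the first step the debiased averages equal the raw gradient, so the update is approximately `lr * sign(grad)`, which is the characteristic scale-invariant behavior of Adam.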
Classes
Implements the Atom algorithm.

Parameters:
- `params` (iterable): iterable of parameters to optimize or dicts defining parameter groups
- `lr` (float): learning rate
- `beta` (float, optional): coefficient used for computing running averages of the gradient (default: 0.9)
- `eps` (float, optional): term added to the denominator to improve numerical stability (default: 1e-4)
- `warmup` (int, optional): number of warmup steps (default: 0)
- `init_lr` (float, optional): initial learning rate for warmup (default: 0.01)
- `wd` (float, optional): weight decay coefficient (default: 0)
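The `warmup` and `init_lr` parameters above describe a warmup phase before the optimizer runs at its full learning rate. A common choice is linear warmup; the sketch below assumes that schedule (the source does not specify the exact shape), ramping from `init_lr` to `lr` over `warmup` steps.

```python
def warmup_lr(step, lr, init_lr=0.01, warmup=0):
    """Hypothetical linear warmup schedule.

    Returns the learning rate for `step`: `init_lr` at step 0, rising
    linearly to `lr` at step `warmup`, and `lr` thereafter. With
    `warmup=0` (the documented default), warmup is disabled.
    """
    if warmup <= 0 or step >= warmup:
        return lr
    return init_lr + (lr - init_lr) * step / warmup
```

For example, with `lr=1.0`, `init_lr=0.01` and `warmup=10`, step 0 uses 0.01, step 5 uses roughly 0.5, and steps 10 and beyond use 1.0.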