libreasr.lib.optimizer.adahessian_step

libreasr.lib.optimizer.adahessian_step(p, lr, mom, step, sqr_mom, grad_avg, sqr_avg_diag_hessian, hessian_power, eps, **kwargs)[source]

Performs an AdaHessian update step on parameter p with learning rate lr, using the bias-corrected moving averages grad_avg (first moment of the gradient) and sqr_avg_diag_hessian (second moment of the diagonal-Hessian estimate).
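The update can be sketched as follows. This is a minimal NumPy illustration of the AdaHessian rule (Adam-like, but with the squared diagonal-Hessian estimate in place of squared gradients, raised to `hessian_power`), not the library's actual implementation, which operates in place on tensors; the bias-correction scheme shown here is assumed to follow Adam's.

```python
import numpy as np

def adahessian_step_sketch(p, lr, mom, step, sqr_mom, grad_avg,
                           sqr_avg_diag_hessian, hessian_power, eps):
    """Illustrative AdaHessian update (assumed semantics, not the source).

    grad_avg:              EMA of gradients (first moment)
    sqr_avg_diag_hessian:  EMA of the squared diagonal-Hessian estimate
    Both are bias-corrected by `step` before use, as in Adam.
    """
    debias1 = 1 - mom ** step
    debias2 = 1 - sqr_mom ** step
    # Denominator uses the Hessian-based second moment, raised to
    # hessian_power (hessian_power=1 recovers an Adam-like rule).
    denom = np.sqrt(sqr_avg_diag_hessian / debias2) ** hessian_power + eps
    return p - lr * (grad_avg / debias1) / denom
```

With `hessian_power=1` this reduces to the familiar Adam step with the Hessian-diagonal EMA standing in for the squared-gradient EMA.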