jactorch.optim.accum_grad
Classes

AccumGrad: A wrapper for an optimizer that accumulates gradients over several steps.
Class AccumGrad
- class AccumGrad[source]
Bases:
CustomizedOptimizer
A wrapper for an optimizer that accumulates gradients over several steps.
This wrapper accumulates gradients for nr_acc consecutive calls to step, and only then invokes the base optimizer's step method to perform the actual parameter update.
- __init__(base_optimizer, nr_acc)[source]
Initialize the wrapper.
- Parameters:
base_optimizer – the base optimizer to be wrapped.
nr_acc – the number of steps over which gradients are accumulated before the base optimizer is stepped.
- __new__(**kwargs)
- property param_groups
The parameter groups of the optimizer.
- property state
The state of the optimizer.
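The accumulate-then-step behavior described above can be sketched as follows. This is an illustrative re-implementation rather than the actual jactorch code: the class name GradAccumWrapper is hypothetical, the stub optimizer interface merely mimics the PyTorch convention (param_groups, step, zero_grad), and whether the accumulated gradients are averaged or summed before the real step is an assumption (this sketch averages).

```python
class GradAccumWrapper:
    """Sketch of gradient accumulation: let gradients pile up in the
    parameters' .grad buffers for nr_acc calls to step(), then perform
    one real update with the base optimizer.

    Hypothetical re-implementation for illustration; not jactorch's
    actual AccumGrad code.
    """

    def __init__(self, base_optimizer, nr_acc):
        self.base_optimizer = base_optimizer
        self.nr_acc = nr_acc
        self._count = 0  # number of accumulation steps taken so far

    @property
    def param_groups(self):
        # Forwarded to the wrapped optimizer, mirroring the documented
        # param_groups property.
        return self.base_optimizer.param_groups

    def step(self):
        self._count += 1
        if self._count < self.nr_acc:
            # Not yet time for a real update; gradients keep
            # accumulating in the .grad buffers across backward passes.
            return
        # Assumption: average the accumulated gradients so one call to
        # the base optimizer behaves like a single large-batch step.
        for group in self.base_optimizer.param_groups:
            for p in group['params']:
                if getattr(p, 'grad', None) is not None:
                    p.grad /= self.nr_acc
        self.base_optimizer.step()
        self.base_optimizer.zero_grad()
        self._count = 0
```

In typical use, training code constructs the wrapper once (e.g. AccumGrad(base_optimizer, nr_acc) per the signature above) and then calls step() after every backward pass exactly as it would with a plain optimizer; only every nr_acc-th call touches the parameters.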