jactorch.optim.accum_grad#

Classes

AccumGrad

A wrapper for an optimizer that accumulates gradients over several steps.

Class AccumGrad

class AccumGrad[source]#

Bases: CustomizedOptimizer

A wrapper for an optimizer that accumulates gradients over several steps.

In other words, this wrapper accumulates gradients over several consecutive calls to step and only then invokes the base optimizer’s step method; a usage sketch is given after the constructor parameters below.

__init__(base_optimizer, nr_acc)[source]#

Initialize the wrapper.

Parameters:
  • base_optimizer – the underlying optimizer to wrap.

  • nr_acc – the number of steps over which gradients are accumulated before the base optimizer performs an update.
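For context, here is a minimal usage sketch. The model, loss, data, and nr_acc value are illustrative placeholders; the import path follows this page’s module name, and the sketch assumes the wrapper buffers gradients internally on each step() call (as the class description above states), so zero_grad() is called every iteration.

import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.optim import SGD

from jactorch.optim.accum_grad import AccumGrad

model = nn.Linear(16, 4)
base_optimizer = SGD(model.parameters(), lr=0.1)

# Accumulate gradients over 4 calls to step() before each real update.
optimizer = AccumGrad(base_optimizer, nr_acc=4)

for _ in range(8):
    x = torch.randn(32, 16)
    target = torch.randn(32, 4)

    optimizer.zero_grad()
    loss = F.mse_loss(model(x), target)
    loss.backward()
    optimizer.step()  # the wrapped SGD step fires only on every 4th call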

__new__(**kwargs)#
load_state_dict(state_dict)[source]#

Load the state of the optimizer from a dictionary.

state_dict()[source]#

Return a dictionary that contains the state of the optimizer.
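Because the wrapper exposes the standard state_dict() / load_state_dict() pair, it can be checkpointed like an ordinary optimizer. A short sketch, continuing the usage example above (the file name is illustrative):

import torch

checkpoint = {
    'model': model.state_dict(),
    'optimizer': optimizer.state_dict(),  # includes the wrapper's accumulation state
}
torch.save(checkpoint, 'checkpoint.pth')

# ... later, restore both.
checkpoint = torch.load('checkpoint.pth')
model.load_state_dict(checkpoint['model'])
optimizer.load_state_dict(checkpoint['optimizer'])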

step(closure=None)[source]#

Performs a single optimization step. Gradients are accumulated on every call; the base optimizer’s step is invoked only once every nr_acc calls (see the sketch below).
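To make the accumulation behaviour concrete, the following is a simplified sketch of how such a step() could work. It is not the library’s implementation (see the [source] link above), and details such as whether the accumulated gradient is averaged or summed are assumptions made here for illustration.

import torch

class AccumGradSketch:
    """Illustrative sketch only; not the actual AccumGrad implementation."""

    def __init__(self, base_optimizer, nr_acc):
        self.base_optimizer = base_optimizer
        self.nr_acc = nr_acc
        self._counter = 0
        self._buffers = {}  # parameter -> accumulated gradient tensor

    def step(self, closure=None):
        # Accumulate the current gradients into per-parameter buffers.
        self._counter += 1
        for group in self.base_optimizer.param_groups:
            for p in group['params']:
                if p.grad is None:
                    continue
                buf = self._buffers.setdefault(p, torch.zeros_like(p.grad))
                buf.add_(p.grad)

        # Every nr_acc calls, write the accumulated gradient back (averaged
        # here by assumption) and let the base optimizer perform the update.
        if self._counter >= self.nr_acc:
            for group in self.base_optimizer.param_groups:
                for p in group['params']:
                    if p in self._buffers:
                        p.grad.copy_(self._buffers[p]).div_(self.nr_acc)
                        self._buffers[p].zero_()
            self.base_optimizer.step(closure)
            self._counter = 0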

zero_grad()[source]#

Clear the gradients of all optimized parameters.

property param_groups#

The parameter groups of the optimizer.

property state#

The state of the optimizer.