grad_clip

Gradient norm clipping callback.

Clips gradients on the modules returned by method.get_grad_clip_targets() before the optimizer step. Optionally logs per-module gradient norms to the tracker.

Classes

fastvideo.train.callbacks.grad_clip.GradNormClipCallback

GradNormClipCallback(*, max_grad_norm: float = 1.0, log_grad_norms: bool = True)

Bases: Callback

Clip gradient norms before the optimizer step.

max_grad_norm must be set explicitly in the callback config (callbacks.grad_clip.max_grad_norm).
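A configuration enabling this callback might look like the following. This is a hedged sketch: only the callbacks.grad_clip.max_grad_norm path is taken from the note above, and the surrounding YAML layout and the log_grad_norms key placement are assumptions based on the constructor signature.

```yaml
# Hypothetical training config fragment (layout assumed, not verified):
callbacks:
  grad_clip:
    max_grad_norm: 1.0    # required: global L2 norm ceiling for gradients
    log_grad_norms: true  # optional: log per-module grad norms to the tracker
```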

Source code in fastvideo/train/callbacks/grad_clip.py
def __init__(
    self,
    *,
    max_grad_norm: float = 1.0,
    log_grad_norms: bool = True,
) -> None:
    self._max_grad_norm = float(max_grad_norm)
    self._log_grad_norms = bool(log_grad_norms)
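To illustrate what the callback does before the optimizer step, the following is a minimal, self-contained sketch of gradient-norm clipping on a flat list of gradient values. It is a hypothetical helper, not the fastvideo implementation (which would operate on module parameters, typically via torch.nn.utils.clip_grad_norm_); the function name and the 1e-6 stabilizer are assumptions for illustration.

```python
import math


def clip_grad_norm(grads: list[float], max_norm: float) -> float:
    """Scale ``grads`` in place so their global L2 norm is at most ``max_norm``.

    Returns the pre-clipping norm, mirroring the convention of
    torch.nn.utils.clip_grad_norm_. Hypothetical helper for illustration.
    """
    total_norm = math.sqrt(sum(g * g for g in grads))
    if total_norm > max_norm:
        # Small epsilon guards against division issues near zero norm.
        scale = max_norm / (total_norm + 1e-6)
        grads[:] = [g * scale for g in grads]
    return total_norm
```

After clipping, the optimizer step proceeds with the rescaled gradients; logging the returned pre-clipping norm is what a log_grad_norms-style option would record.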

Functions