jactorch.functional.arith#

Arithmetic operations.

Functions

atanh(x[, eps])

Computes \(\mathrm{arc}\tanh(x)\).

log_sigmoid(x)

Computes \(\log \sigma(x)\).

logit(x[, eps])

Computes \(\mathrm{logit}(x)\).

soft_amax(x, dim[, tau, keepdim])

Computes a soft maximum over the given dimension.

soft_amin(x, dim[, tau, keepdim])

Computes a soft minimum over the given dimension.

tstat(x)

Tensor stats: produces a summary of the tensor, including shape, min, max, mean, and std.

Functions

atanh(x, eps=1e-8)[source]#

Computes \(\mathrm{arc}\tanh(x)\).

Parameters:
  • x (Tensor) – input.

  • eps (float) – a small epsilon for numerical stability.

Returns:

\(\mathrm{arc}\tanh(x)\).

Return type:

Tensor
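A minimal sketch of such an arctanh, using its log form and assuming eps clamps the denominator away from zero (the library's exact clamping scheme may differ; the helper name is hypothetical):

```python
import torch

def atanh_sketch(x: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    # arctanh(x) = 0.5 * log((1 + x) / (1 - x)); clamping the denominator
    # with eps (an assumption) keeps inputs near 1 from dividing by zero.
    return 0.5 * torch.log((1 + x) / (1 - x).clamp(min=eps))

x = torch.tensor([0.0, 0.5, -0.9])
print(atanh_sketch(x))  # matches torch.atanh(x) away from the boundaries
```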

log_sigmoid(x)[source]#

Computes \(\log \sigma(x)\).

Parameters:

x (Tensor) – input.

Returns:

\(\log \sigma(x)\).

Return type:

Tensor
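The identity \(\log \sigma(x) = -\mathrm{softplus}(-x)\) gives a numerically stable way to compute this; a sketch (the helper name is hypothetical):

```python
import torch
import torch.nn.functional as F

def log_sigmoid_sketch(x: torch.Tensor) -> torch.Tensor:
    # log sigma(x) = -softplus(-x): stable for large |x|, unlike the
    # naive torch.log(torch.sigmoid(x)), which underflows to -inf
    # for very negative x.
    return -F.softplus(-x)

x = torch.tensor([-100.0, 0.0, 100.0])
print(log_sigmoid_sketch(x))  # approximately [-100, -0.6931, 0]
```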

logit(x, eps=1e-8)[source]#

Computes \(\mathrm{logit}(x)\).

Parameters:
  • x (Tensor) – input.

  • eps (float) – a small epsilon for numerical stability.

Returns:

\(\mathrm{logit}(x)\).

Return type:

Tensor
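The logit is the inverse of the sigmoid, \(\mathrm{logit}(x) = \log \frac{x}{1-x}\). A sketch, assuming eps clamps the input into \([\epsilon, 1-\epsilon]\) (the library's exact clamping may differ; the helper name is hypothetical):

```python
import torch

def logit_sketch(x: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    # logit(x) = log(x / (1 - x)); clamp x away from 0 and 1 (an
    # assumed scheme) to avoid log(0) and division by zero.
    x = x.clamp(min=eps, max=1 - eps)
    return torch.log(x / (1 - x))

p = torch.sigmoid(torch.tensor([-2.0, 0.0, 3.0]))
print(logit_sketch(p))  # recovers approximately [-2, 0, 3]
```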

soft_amax(x, dim, tau=1.0, keepdim=False)[source]#

Computes a soft maximum over the given dimension. It can be viewed as a differentiable version of torch.amax().

Parameters:
  • x (Tensor) – input tensor.

  • dim (int) – the dimension along which to compute the soft maximum.

  • tau (float) – temperature.

  • keepdim (bool) – whether to keep the reduced dimension.

Returns:

the soft maximum.

Return type:

Tensor
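One common formulation of a soft maximum is a softmax-weighted average, where the temperature tau controls sharpness. This sketch assumes that formulation (the library's actual definition may differ; the helper name is hypothetical):

```python
import torch
import torch.nn.functional as F

def soft_amax_sketch(x, dim, tau=1.0, keepdim=False):
    # Softmax-weighted average: as tau -> 0 the weights concentrate on
    # the largest entry and the result approaches torch.amax(x, dim).
    w = F.softmax(x / tau, dim=dim)
    return (w * x).sum(dim=dim, keepdim=keepdim)

x = torch.tensor([[1.0, 2.0, 3.0]])
print(soft_amax_sketch(x, dim=-1, tau=0.01))  # close to tensor([3.])
```

Unlike torch.amax(), gradients flow to every entry of x, not just the argmax, which is what makes the relaxation useful inside differentiable pipelines.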

soft_amin(x, dim, tau=1.0, keepdim=False)[source]#

Computes a soft minimum over the given dimension. It can be viewed as a differentiable version of torch.amin().

Parameters:
  • x (Tensor) – input tensor.

  • dim (int) – the dimension along which to compute the soft minimum.

  • tau (float) – temperature.

  • keepdim (bool) – whether to keep the reduced dimension.

Returns:

the soft minimum.

Return type:

Tensor

See also

soft_amax()
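Under a softmax-weighted-average formulation (an assumption, as with the soft maximum), the soft minimum simply negates the scores fed to the softmax (the helper name is hypothetical):

```python
import torch
import torch.nn.functional as F

def soft_amin_sketch(x, dim, tau=1.0, keepdim=False):
    # Weight each entry by softmax(-x / tau): as tau -> 0 the weights
    # concentrate on the smallest entry, approaching torch.amin(x, dim).
    w = F.softmax(-x / tau, dim=dim)
    return (w * x).sum(dim=dim, keepdim=keepdim)

x = torch.tensor([[1.0, 2.0, 3.0]])
print(soft_amin_sketch(x, dim=-1, tau=0.01))  # close to tensor([1.])
```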

tstat(x)[source]#

Tensor stats: produces a summary of the tensor, including shape, min, max, mean, and std.

Parameters:

x (Tensor) – input tensor.

Returns:

a dict of stats.

Return type:

Dict[str, Any]
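A sketch of what such a summary might look like; the dict keys mirror the documented fields (shape, min, max, mean, std) but their exact names and types in the library are assumptions, as is the helper name:

```python
import torch

def tstat_sketch(x: torch.Tensor) -> dict:
    # Summarize a tensor as a dict; key names are assumptions based on
    # the documented fields: shape, min, max, mean, and std.
    return {
        'shape': tuple(x.shape),
        'min': x.min().item(),
        'max': x.max().item(),
        'mean': x.float().mean().item(),
        'std': x.float().std().item(),
    }

print(tstat_sketch(torch.arange(6.0).reshape(2, 3)))
```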