jactorch#
Jacinle PyTorch functions and modules.
Contexts
A context manager that serves as a global variable for the forward pass.
A basic environment that wraps around an nn.Module.
Get the current forward context.
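The forward-context pattern is easy to misread from one-line summaries, so here is a minimal sketch of the idea in plain Python. The names (SimpleForwardContext, forward_context, get_current_context) are illustrative assumptions, not the jactorch API: a global slot holds per-forward-pass state that any submodule can read or write.

```python
import contextlib

# Hypothetical illustration of the "forward context" pattern described above:
# a global slot that holds per-forward-pass state (losses, monitors, ...).
_current_context = None


class SimpleForwardContext:
    """A stand-in for a forward-pass context; not the actual jactorch class."""

    def __init__(self):
        self.monitors = {}

    def add_monitor(self, name, value):
        self.monitors[name] = value


@contextlib.contextmanager
def forward_context():
    global _current_context
    _current_context = SimpleForwardContext()
    try:
        yield _current_context
    finally:
        _current_context = None


def get_current_context():
    return _current_context


# Usage: a submodule can record values without threading them through return values.
with forward_context() as ctx:
    get_current_context().add_monitor('acc', 0.9)
    print(ctx.monitors)  # {'acc': 0.9}
```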
IO
Get a state dict representation of the model.
Load a state dict into the model.
Load model weights from a file.
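For orientation, the sketch below shows the plain-PyTorch plumbing that such IO helpers wrap: serializing a state dict to disk and loading it back. The file name and the strict=False choice are illustrative, not jactorch defaults.

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)

# Get a state dict representation of the model and write it to disk.
torch.save(model.state_dict(), 'model.pth')

# Load weights from a file back into a (possibly partially matching) model.
state_dict = torch.load('model.pth', map_location='cpu')
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print(missing, unexpected)  # both empty here, since the architectures match
```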
Parameter Filtering and Grouping
Find parameters in a module matching a pattern.
Filter a list of parameters with a pattern.
Exclude parameters from a list of parameters.
Compose the param_groups argument for torch optimizers.
A helper function for human-friendly declaration of param groups.
Freeze all parameters in a model.
Unfreeze all parameters in a model.
A context manager that temporarily detaches all parameters in the given modules.
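The following sketch shows, in plain PyTorch rather than through the jactorch helpers themselves, the kind of workflow these utilities automate: selecting parameters by name pattern, composing per-group optimizer hyperparameters, and freezing a submodule.

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(8, 8), nn.ReLU(), nn.Linear(8, 2))

# "Find/filter parameters with a pattern": here, everything whose name ends in 'bias'.
bias_params = [p for name, p in model.named_parameters() if name.endswith('bias')]
other_params = [p for name, p in model.named_parameters() if not name.endswith('bias')]

# "Compose the param_groups argument for torch optimizers": per-group hyperparameters.
optimizer = torch.optim.SGD(
    [
        {'params': bias_params, 'lr': 1e-3, 'weight_decay': 0.0},
        {'params': other_params, 'lr': 1e-2},
    ],
    momentum=0.9,
)

# "Freeze all parameters in a model": stop gradients for a whole submodule.
for p in model[0].parameters():
    p.requires_grad_(False)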
Data Structures and Helpful Functions
All of the following functions accept an arbitrary Python data structure as input (e.g., tuples, lists, dictionaries). They recursively traverse the data structure and apply the function to each element (see the sketch after this list).
Copy an object to a specific device asynchronously.
Convert elements in a Python data structure to tensors.
DEPRECATED(Jiayuan Mao): as_variable has been deprecated and will be removed by 10/23/2018; please use as_tensor instead.
Convert elements in a Python data structure to numpy arrays.
Convert elements in a Python data structure to Python floating-point scalars.
Move elements in a Python data structure to the CPU.
Move elements in a Python data structure to the CPU.
Detach elements in a Python data structure.
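A minimal sketch of the recursive-traversal behavior described above. The helper names (recursive_apply, to_numpy) are illustrative, not part of jactorch; the point is that nested dicts, lists, and tuples are walked and the conversion is applied leaf by leaf.

```python
import torch

def recursive_apply(func, value):
    """Apply func to every leaf of a nested dict/list/tuple structure."""
    if isinstance(value, dict):
        return {k: recursive_apply(func, v) for k, v in value.items()}
    if isinstance(value, (list, tuple)):
        return type(value)(recursive_apply(func, v) for v in value)
    return func(value)


def to_numpy(value):
    return value.detach().cpu().numpy() if torch.is_tensor(value) else value


batch = {'image': torch.rand(2, 3), 'meta': [torch.tensor([1, 2]), 'keep-me']}
arrays = recursive_apply(to_numpy, batch)
print(type(arrays['image']), arrays['meta'][1])  # <class 'numpy.ndarray'> keep-me
```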
Arithmetics
Computes \(\mathrm{arctanh}(x)\).
Computes \(\mathrm{logit}(x)\).
Computes \(\log \sigma(x)\).
Tensor stats: produces a summary of the tensor, including shape, min, max, mean, and std.
Compute a soft maximum over the given dimension.
Compute a soft minimum over the given dimension.
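A sketch of one common way to define the soft maximum and minimum: a softmax-weighted average that approaches the hard max as the temperature shrinks. The exact formulation (temperature parameter, default values) is an assumption for illustration, not necessarily what jactorch uses.

```python
import torch
import torch.nn.functional as F

def soft_amax(x, dim, tau=1.0):
    """Differentiable soft maximum: softmax-weighted average along dim."""
    weights = F.softmax(x / tau, dim=dim)
    return (weights * x).sum(dim=dim)


def soft_amin(x, dim, tau=1.0):
    """Soft minimum, obtained by negating the input."""
    return -soft_amax(-x, dim, tau=tau)


x = torch.tensor([[0.1, 2.0, -1.0]])
print(soft_amax(x, dim=-1), x.amax(dim=-1))  # the soft value approaches the hard max as tau -> 0

# The elementwise functions above have the usual closed forms:
#   arctanh(x)   = 0.5 * log((1 + x) / (1 - x))
#   logit(x)     = log(x / (1 - x))
#   log sigma(x) = -softplus(-x), a numerically stable form of log(sigmoid(x))
```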
Clustering
Gradient
Scale the gradient with respect to the input.
Zero out the gradient of the variable.
A decorator that disables gradient computation for a function.
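Illustrative sketches of two of the gradient utilities above, written in plain PyTorch; the names and signatures are assumptions, not the jactorch API. Gradient scaling is the classic custom autograd.Function that is the identity in the forward pass and rescales the incoming gradient in the backward pass.

```python
import functools
import torch

class _ScaleGrad(torch.autograd.Function):
    """Identity in the forward pass; scales the gradient in the backward pass."""

    @staticmethod
    def forward(ctx, x, scale):
        ctx.scale = scale
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output * ctx.scale, None


def scale_gradient(x, scale):
    return _ScaleGrad.apply(x, scale)


def no_grad_func(func):
    """A decorator that disables gradient computation inside func."""
    @functools.wraps(func)
    def wrapped(*args, **kwargs):
        with torch.no_grad():
            return func(*args, **kwargs)
    return wrapped


x = torch.ones(3, requires_grad=True)
scale_gradient(x, 0.1).sum().backward()
print(x.grad)  # tensor([0.1000, 0.1000, 0.1000])
```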
Indexing
Select elements from the tensor according to batched_indices.
Iteratively generate the values of the tensor where the mask is nonzero.
tensor[:, :, index, :]
tensor[:, :, index, ...]
Invert a permutation.
Return the smallest nonzero index along the dim axis.
Convert a list of class labels into a one-hot representation.
Convert a tensor of class labels into a one-hot representation by adding a new dimension indexed at dim.
Convert a tensor of class labels into a one-hot representation.
Reverse a tensor along the given dimension.
Return the smallest nonzero index along the dim axis.
tensor[:, :, index, :, :] = value
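Plain-PyTorch equivalents of a few of the indexing operations above, for illustration only (the jactorch functions have their own names and signatures): one-hot conversion, batched element selection, and inverting a permutation.

```python
import torch

# Convert class labels into a one-hot representation.
labels = torch.tensor([2, 0, 1])
one_hot = torch.nn.functional.one_hot(labels, num_classes=3).float()

# Select elements according to batched indices: out[b] = tensor[b, index[b]].
tensor = torch.arange(12).view(3, 4)
index = torch.tensor([1, 3, 0])
selected = tensor.gather(1, index.unsqueeze(1)).squeeze(1)  # tensor([1, 7, 8])

# Invert a permutation: inv[perm[i]] = i.
perm = torch.tensor([2, 0, 3, 1])
inv = torch.empty_like(perm)
inv[perm] = torch.arange(perm.numel())
print(one_hot.shape, selected, inv)  # torch.Size([3, 3]) tensor([1, 7, 8]) tensor([1, 3, 0, 2])
```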
Kernel
Cosine distance kernel.
Dot-product kernel, essentially a cosine distance kernel without normalization.
Inverse distance kernel.
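A sketch of the cosine and dot-product kernels between two sets of vectors in plain PyTorch. This is an assumption about the shape conventions of the kernels listed above, not the jactorch implementation.

```python
import torch
import torch.nn.functional as F

def cosine_kernel(x, y, eps=1e-8):
    """x: (n, d), y: (m, d) -> (n, m) matrix of cosine similarities."""
    x = F.normalize(x, p=2, dim=-1, eps=eps)
    y = F.normalize(y, p=2, dim=-1, eps=eps)
    return x @ y.t()


def dot_kernel(x, y):
    """The same kernel without normalization: plain dot products."""
    return x @ y.t()


x, y = torch.randn(5, 16), torch.randn(7, 16)
print(cosine_kernel(x, y).shape)  # torch.Size([5, 7])
```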
Linear Algebra
Normalize the input along a specific dimension.
Log-Linear
Functions that compute various quantities in the log domain.
Masking
Convert a length vector to a mask.
Reverse a padded sequence tensor along the given dimension.
Compute the softmax of the tensor while ignoring some masked elements.
Create an N-dimensional meshgrid-like mask.
Compute the average of the tensor while ignoring some masked elements.
Compute the softmax of the tensor while ignoring some masked elements.
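Sketches of two of the masking utilities above in plain PyTorch; the names and exact semantics are assumptions, not the jactorch API. Converting lengths to a mask and masking a softmax are the typical building blocks for variable-length batches.

```python
import torch

def length_to_mask(lengths, max_length=None):
    """Convert a length vector (batch,) into a boolean mask (batch, max_length)."""
    max_length = max_length or int(lengths.max())
    return torch.arange(max_length, device=lengths.device)[None, :] < lengths[:, None]


def masked_softmax(logits, mask, dim=-1):
    """Softmax that ignores positions where mask is False."""
    logits = logits.masked_fill(~mask, float('-inf'))
    return torch.softmax(logits, dim=dim)


lengths = torch.tensor([2, 4])
mask = length_to_mask(lengths, max_length=4)
probs = masked_softmax(torch.randn(2, 4), mask)
print(probs.sum(dim=-1))  # each row sums to 1; padded positions get probability 0
```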
Ranges
Perform np.meshgrid along a given axis.
Exclude self from the grid.
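A small sketch of the meshgrid / exclude-self idea in plain PyTorch: build all (i, j) index pairs and drop the diagonal. Purely illustrative of the pattern, not the jactorch functions.

```python
import torch

n = 4
i, j = torch.meshgrid(torch.arange(n), torch.arange(n), indexing='ij')
pairs = torch.stack([i.reshape(-1), j.reshape(-1)], dim=-1)   # all n*n pairs

# "Exclude self from the grid": keep only pairs with i != j.
pairs = pairs[pairs[:, 0] != pairs[:, 1]]
print(pairs.shape)  # torch.Size([12, 2]) for n = 4
```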
Probability
Check whether the probability is normalized along a specific dimension.
Normalize along the specified dimension using the 1-norm.
Quantization
Quantize a tensor to binary values.
Quantize a tensor to binary values.
Sampling
Sample from a Bernoulli distribution.
Sample from a multinomial distribution.
Shape
Add a dimension at dim with size size.
Add dimensions to the input tensor so that it matches a target shape.
Broadcast a specific dim size times.
Add and expand dimensions for the input tensor so that it matches a target shape.
Concatenate shapes into a tuple.
Flatten the tensor.
Flatten the tensor while keeping the first (batch) dimension.
Do a view with an optional contiguous copy.
Move a specific dimension to a designated position.
Repeat a specific dimension count times.
Repeat each element along a specific dimension repeats times.
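Plain-PyTorch equivalents of a few of the shape helpers above, for illustration only (the jactorch versions also handle additional conveniences and edge cases).

```python
import torch

x = torch.randn(2, 3, 4, 5)

# Flatten the tensor while keeping the first (batch) dimension.
flat = x.reshape(x.size(0), -1)                          # (2, 60)

# Move a specific dimension to a designated position.
moved = torch.movedim(x, 3, 1)                           # (2, 5, 3, 4)

# Broadcast (expand) a size-1 dimension a given number of times.
expanded = x.unsqueeze(1).expand(-1, 7, -1, -1, -1)      # (2, 7, 3, 4, 5)

# Repeat each element along a specific dimension.
repeated = torch.repeat_interleave(x, repeats=2, dim=1)  # (2, 6, 4, 5)

print(flat.shape, moved.shape, expanded.shape, repeated.shape)
```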
Submodules