jactorch.nn.cnn.layers#
Classes
Class Conv1dLayer
- class Conv1dLayer[source]#
Bases:
ConvNDLayerBase
- __add__(other)#
- Return type:
Sequential
- __init__(in_channels, out_channels, kernel_size, stride=1, padding_mode='default', padding=0, border_mode='zeros', dilation=1, groups=1, batch_norm=None, dropout=None, bias=None, activation=None)#
Initialize internal Module state, shared by both nn.Module and ScriptModule.
- append(module)#
Append a given module to the end.
- Parameters:
module (nn.Module) – module to append
- Return type:
Sequential
- extend(sequential)#
- Return type:
Sequential
- forward(input)#
Define the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance itself instead of this method, since the former takes care of running the registered hooks while the latter silently ignores them.
- insert(index, module)#
- Parameters:
index (int) – index at which to insert
module (nn.Module) – module to insert
- Return type:
Sequential
- reset_parameters()#
- property input_dim#
- property output_dim#
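A minimal usage sketch for Conv1dLayer on sequence data, using only the documented arguments. The import path is taken from this page's module name, and the exact output length depends on the padding convention behind padding_mode='default', which is not spelled out above.

```python
import torch
from jactorch.nn.cnn.layers import Conv1dLayer  # import path assumed from this page's module name

# A batch of 4 sequences, 8 channels, length 100.
x = torch.randn(4, 8, 100)

# Documented signature: Conv1dLayer(in_channels, out_channels, kernel_size, stride=1, ...).
layer = Conv1dLayer(8, 16, kernel_size=5)

y = layer(x)
print(y.shape)          # (4, 16, L_out); L_out depends on the padding convention
print(layer.input_dim)  # documented property; expected to report 8
```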
Class Conv2dLayer
- class Conv2dLayer[source]#
Bases:
ConvNDLayerBase
- __add__(other)#
- Return type:
Sequential
- __init__(in_channels, out_channels, kernel_size, stride=1, padding_mode='default', padding=0, border_mode='zeros', dilation=1, groups=1, batch_norm=None, dropout=None, bias=None, activation=None)#
Initialize internal Module state, shared by both nn.Module and ScriptModule.
- append(module)#
Append a given module to the end.
- Parameters:
module (nn.Module) – module to append
- Return type:
Sequential
- extend(sequential)#
- Return type:
Sequential
- forward(input)#
Define the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance itself instead of this method, since the former takes care of running the registered hooks while the latter silently ignores them.
- insert(index, module)#
- Parameters:
index (int) – index at which to insert
module (nn.Module) – module to insert
- Return type:
Sequential
- reset_parameters()#
- property input_dim#
- property output_dim#
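A sketch of Conv2dLayer on image-shaped tensors. The batch_norm, dropout, and activation values below (a boolean, a float probability, and an activation name) are assumptions about the accepted conventions; the reference above only documents that these keywords exist and default to None.

```python
import torch
from jactorch.nn.cnn.layers import Conv2dLayer  # import path assumed

x = torch.randn(2, 3, 64, 64)  # NCHW images

# Positional arguments follow the documented signature; the keyword value types
# (bool / float / str) are assumptions, not documented conventions.
layer = Conv2dLayer(
    3, 32, kernel_size=3, stride=1, padding=1,
    batch_norm=True, dropout=0.1, activation='relu',
)

y = layer(x)
print(y.shape)  # expected (2, 32, 64, 64) if padding=1 preserves spatial size as in a standard conv
```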
Class Conv3dLayer
- class Conv3dLayer[source]#
Bases:
ConvNDLayerBase
- __add__(other)#
- Return type:
Sequential
- __init__(in_channels, out_channels, kernel_size, stride=1, padding_mode='default', padding=0, border_mode='zeros', dilation=1, groups=1, batch_norm=None, dropout=None, bias=None, activation=None)#
Initialize internal Module state, shared by both nn.Module and ScriptModule.
- append(module)#
Append a given module to the end.
- Parameters:
module (nn.Module) – module to append
- Return type:
Sequential
- extend(sequential)#
- Return type:
Sequential
- forward(input)#
Define the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance itself instead of this method, since the former takes care of running the registered hooks while the latter silently ignores them.
- insert(index, module)#
- Parameters:
index (int) – index at which to insert
module (nn.Module) – module to insert
- Return type:
Sequential
- reset_parameters()#
- property input_dim#
- property output_dim#
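The same pattern extends to volumetric data via Conv3dLayer; a brief sketch using only the documented arguments.

```python
import torch
from jactorch.nn.cnn.layers import Conv3dLayer  # import path assumed

vol = torch.randn(1, 1, 16, 64, 64)                # N, C, D, H, W
layer = Conv3dLayer(1, 8, kernel_size=3, padding=1)
print(layer(vol).shape)                             # expected (1, 8, 16, 64, 64) with size-preserving padding
```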
Class ConvNDLayerBase
- class ConvNDLayerBase[source]#
Bases:
Sequential
- __add__(other)#
- Return type:
Sequential
- __init__(in_channels, out_channels, kernel_size, stride=1, padding_mode='default', padding=0, border_mode='zeros', dilation=1, groups=1, batch_norm=None, dropout=None, bias=None, activation=None)[source]#
Initialize internal Module state, shared by both nn.Module and ScriptModule.
- append(module)#
Append a given module to the end.
- Parameters:
module (nn.Module) – module to append
- Return type:
Sequential
- extend(sequential)#
- Return type:
Sequential
- forward(input)#
Define the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance itself instead of this method, since the former takes care of running the registered hooks while the latter silently ignores them.
- insert(index, module)#
- Parameters:
index (int) – index at which to insert
module (nn.Module) – module to insert
- Return type:
Sequential
- property input_dim#
- property output_dim#
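Because ConvNDLayerBase derives from Sequential, the inherited container methods (append, extend, insert, __add__) can be used to chain extra modules after the convolution block. A sketch, assuming the concrete subclasses run appended modules as part of the ordinary Sequential forward pass:

```python
import torch
import torch.nn as nn
from jactorch.nn.cnn.layers import Conv2dLayer  # import path assumed

block = Conv2dLayer(3, 16, kernel_size=3, padding=1)

# append() is documented above; per the Sequential API it returns the container itself.
block.append(nn.ReLU())
block.append(nn.MaxPool2d(2))

x = torch.randn(2, 3, 32, 32)
print(block(x).shape)                     # expected (2, 16, 16, 16) if the appended pooling runs
print(block.input_dim, block.output_dim)  # documented properties; expected 3 and 16
```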
Class Deconv1dLayer
- class Deconv1dLayer[source]#
Bases:
_DeconvLayerBase
- __init__(in_channels, out_channels, kernel_size, stride=None, padding_mode='same', padding=0, border_mode=None, dilation=1, groups=1, output_size=None, scale_factor=None, resize_mode='nearest', batch_norm=None, dropout=None, bias=None, activation=None, algo='resizeconv')#
Initialize internal Module state, shared by both nn.Module and ScriptModule.
- forward(input, output_size=None)#
Define the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance itself instead of this method, since the former takes care of running the registered hooks while the latter silently ignores them.
- reset_parameters()#
- property input_dim#
- property output_dim#
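A sketch of Deconv1dLayer in its default resize-convolution mode. scale_factor, resize_mode, and algo are documented parameters; the concrete values are illustrative.

```python
import torch
from jactorch.nn.cnn.layers import Deconv1dLayer  # import path assumed

x = torch.randn(4, 16, 50)

# Default algo='resizeconv': resize (here 2x, nearest-neighbor) followed by a convolution.
up = Deconv1dLayer(16, 8, kernel_size=3, scale_factor=2, resize_mode='nearest')

y = up(x)
print(y.shape)  # expected (4, 8, 100) if the conv preserves length after the 2x resize
```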
Class Deconv2dLayer
- class Deconv2dLayer[source]#
Bases:
_DeconvLayerBase
- __init__(in_channels, out_channels, kernel_size, stride=None, padding_mode='same', padding=0, border_mode=None, dilation=1, groups=1, output_size=None, scale_factor=None, resize_mode='nearest', batch_norm=None, dropout=None, bias=None, activation=None, algo='resizeconv')#
Initialize internal Module state, shared by both nn.Module and ScriptModule.
- forward(input, output_size=None)#
Define the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance itself instead of this method, since the former takes care of running the registered hooks while the latter silently ignores them.
- reset_parameters()#
- property input_dim#
- property output_dim#
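Deconv2dLayer also accepts algo='convtranspose' (see the DeconvAlgo choices below), and forward takes an optional output_size, which pins down the target resolution of a strided transposed convolution. The concrete values here are illustrative:

```python
import torch
from jactorch.nn.cnn.layers import Deconv2dLayer  # import path assumed

x = torch.randn(2, 64, 16, 16)

# Transposed-convolution variant; kernel_size and stride chosen for a 2x upsample.
up = Deconv2dLayer(64, 32, kernel_size=4, stride=2, algo='convtranspose')

# forward(input, output_size=None) is documented; passing output_size resolves the
# usual one-pixel ambiguity of strided transposed convolutions.
y = up(x, output_size=(32, 32))
print(y.shape)  # expected (2, 32, 32, 32)
```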
Class Deconv3dLayer
- class Deconv3dLayer[source]#
Bases:
_DeconvLayerBase
- __init__(in_channels, out_channels, kernel_size, stride=None, padding_mode='same', padding=0, border_mode=None, dilation=1, groups=1, output_size=None, scale_factor=None, resize_mode='nearest', batch_norm=None, dropout=None, bias=None, activation=None, algo='resizeconv')#
Initialize internal Module state, shared by both nn.Module and ScriptModule.
- forward(input, output_size=None)#
Define the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance itself instead of this method, since the former takes care of running the registered hooks while the latter silently ignores them.
- reset_parameters()#
- property input_dim#
- property output_dim#
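The 3D variant follows the same interface; a minimal sketch with the default resize-convolution algorithm.

```python
import torch
from jactorch.nn.cnn.layers import Deconv3dLayer  # import path assumed

vol = torch.randn(1, 8, 8, 16, 16)                      # N, C, D, H, W
up = Deconv3dLayer(8, 4, kernel_size=3, scale_factor=2)
print(up(vol).shape)                                     # expected (1, 4, 16, 32, 32) if each spatial dim doubles
```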
Class DeconvAlgo
- class DeconvAlgo[source]#
Bases:
JacEnum
- __new__(value)#
- classmethod assert_valid(value)#
Assert that the value is a valid choice.
- classmethod choice_names()#
Return the list of names of all possible choices.
- classmethod choice_objs()#
Return the list of objects of all possible choices.
- classmethod choice_values()#
Return the list of values of all possible choices.
- classmethod is_valid(value)#
Check if the value is a valid choice.
- classmethod type_name()#
Return the type name of the enum.
- CONVTRANSPOSE = 'convtranspose'#
- RESIZECONV = 'resizeconv'#
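DeconvAlgo is the enum behind the algo= argument of the deconvolution layers above. A sketch that exercises only the documented class methods and members:

```python
from jactorch.nn.cnn.layers import DeconvAlgo  # import path assumed

print(DeconvAlgo.choice_values())         # expected ['convtranspose', 'resizeconv']
print(DeconvAlgo.is_valid('resizeconv'))  # True for a documented value
DeconvAlgo.assert_valid('convtranspose')  # should pass silently for a valid choice

algo = DeconvAlgo('resizeconv')           # __new__(value) constructs the member from its value
print(algo is DeconvAlgo.RESIZECONV)      # expected True for a value-based enum
```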
Class LinearLayer
- class LinearLayer[source]#
Bases:
Sequential
- __add__(other)#
- Return type:
Sequential
- __init__(in_features, out_features, batch_norm=None, dropout=None, bias=None, activation=None)[source]#
Initialize internal Module state, shared by both nn.Module and ScriptModule.
- append(module)#
Append a given module to the end.
- Parameters:
module (nn.Module) – module to append
- Return type:
Sequential
- extend(sequential)#
- Return type:
Sequential
- forward(input)#
Define the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance itself instead of this method, since the former takes care of running the registered hooks while the latter silently ignores them.
- insert(index, module)#
- Parameters:
index (int) – index at which to insert
module (nn.Module) – module to insert
- Return type:
Sequential
- property input_dim#
- property output_dim#
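LinearLayer wraps a fully connected layer in the same Sequential-style block. The batch_norm and activation values below are assumed conventions; the reference above only documents the keywords and their None defaults.

```python
import torch
from jactorch.nn.cnn.layers import LinearLayer  # import path assumed

x = torch.randn(32, 128)

# Documented signature: LinearLayer(in_features, out_features, batch_norm=None,
# dropout=None, bias=None, activation=None); the keyword values are assumptions.
fc = LinearLayer(128, 64, batch_norm=True, activation='relu')

print(fc(x).shape)    # (32, 64)
print(fc.output_dim)  # documented property; expected 64
```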
Class MLPLayer
- class MLPLayer[source]#
Bases:
Module
- __init__(input_dim, output_dim, hidden_dims, batch_norm=None, dropout=None, activation='relu', flatten=True, last_activation=False)[source]#
Initialize internal Module state, shared by both nn.Module and ScriptModule.
- forward(input)[source]#
Define the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance itself instead of this method, since the former takes care of running the registered hooks while the latter silently ignores them.
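MLPLayer stacks fully connected blocks according to hidden_dims. A sketch, assuming the default flatten=True flattens all trailing dimensions before the first linear layer and that activation names follow the 'relu' default shown in the signature:

```python
import torch
from jactorch.nn.cnn.layers import MLPLayer  # import path assumed

# Image-shaped input; with flatten=True the trailing dims are assumed to be
# flattened to 3 * 8 * 8 = 192 features before the MLP is applied.
x = torch.randn(16, 3, 8, 8)

mlp = MLPLayer(
    input_dim=192,
    output_dim=10,
    hidden_dims=[256, 128],   # two hidden layers, per the documented hidden_dims argument
    activation='relu',        # documented default
    last_activation=False,    # documented default: no activation after the output layer
)

print(mlp(x).shape)  # expected (16, 10)
```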