Utilities

Helper Tools

synthtorch.util.helper

helper functions for defining neural networks in PyTorch

Author: Jacob Reinhold (jacob.reinhold@jhu.edu)

Created on: Nov 2, 2018

synthtorch.util.helper.get_act(name: str, inplace: bool = True, params: Optional[dict] = None) → Union[nn.ReLU, nn.LeakyReLU, nn.Tanh, nn.Sigmoid]

get an activation module from PyTorch; name must be one of: relu, lrelu, linear, tanh, sigmoid

Parameters:
  • name (str) – name of activation function desired
  • inplace (bool) – if True, perform the activation in-place (when the option is available)
  • params (dict) – dictionary of additional parameters for the activation, as per the PyTorch documentation
Returns:
  instance of activation class
Return type:
  act (activation)

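For illustration, a minimal usage sketch; the activation names are those listed above, and it is assumed (not stated here) that the params dict is forwarded as keyword arguments to the underlying PyTorch constructor:

    from synthtorch.util.helper import get_act

    relu = get_act('relu')                                    # nn.ReLU(inplace=True)
    tanh = get_act('tanh')                                    # nn.Tanh()
    lrelu = get_act('lrelu', params={'negative_slope': 0.1})  # assumes kwargs pass-through to nn.LeakyReLU
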
synthtorch.util.helper.get_loss(name: str)

get a loss function by name

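The accepted loss names are not documented here, so the 'mse' below is only an assumption; a minimal sketch:

    import torch
    from synthtorch.util.helper import get_loss

    criterion = get_loss('mse')                # 'mse' is an assumed name
    x, y = torch.rand(4, 1), torch.rand(4, 1)
    loss = criterion(x, y)                     # assumes a ready-to-call criterion instance is returned
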
synthtorch.util.helper.get_norm2d(name: str, num_features: int, params: Optional[dict] = None) → Union[nn.InstanceNorm2d, nn.BatchNorm2d]

get a 2d normalization module from PyTorch; name must be one of: instance, batch, none

Parameters:
  • name (str) – name of normalization function desired
  • num_features (int) – number of channels in the normalization layer
  • params (dict) – dictionary of other optional parameters for the normalization layer, as specified by the PyTorch documentation
Returns:
  instance of normalization layer
Return type:
  norm

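A usage sketch; num_features must match the channel dimension of the 4-D (N, C, H, W) input, and the params dict is assumed to be forwarded to the PyTorch constructor:

    from synthtorch.util.helper import get_norm2d

    bn = get_norm2d('batch', num_features=64)                                  # nn.BatchNorm2d(64)
    inorm = get_norm2d('instance', num_features=64, params={'affine': True})   # assumes kwargs pass-through
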
synthtorch.util.helper.get_norm3d(name: str, num_features: int, params: Optional[dict] = None) → Union[nn.InstanceNorm3d, nn.BatchNorm3d]

get a 3d normalization module from PyTorch; name must be one of: instance, batch, none

Parameters:
  • name (str) – name of normalization function desired
  • num_features (int) – number of channels in the normalization layer
  • params (dict) – dictionary of other optional parameters for the normalization layer, as specified by the PyTorch documentation
Returns:
  instance of normalization layer
Return type:
  norm

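The 3d variant is used the same way, but for 5-D (N, C, D, H, W) volumes; a minimal sketch:

    from synthtorch.util.helper import get_norm3d

    norm = get_norm3d('instance', num_features=32)   # nn.InstanceNorm3d(32)
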
synthtorch.util.helper.get_optim(name: str)

get an optimizer by name

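The accepted optimizer names are not documented here, so 'adam' below is an assumption; because the helper takes only a name, it presumably returns the optimizer class, which is then constructed with the model parameters:

    import torch.nn as nn
    from synthtorch.util.helper import get_optim

    model = nn.Conv2d(1, 8, 3)
    Optimizer = get_optim('adam')                       # 'adam' is an assumed name
    optimizer = Optimizer(model.parameters(), lr=1e-3)  # assumes the class (not an instance) is returned
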
synthtorch.util.helper.init_weights(net, init_type='kaiming', init_gain=0.02)

Initialize network weights (inspired by https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix/)

Parameters:
  • net (nn.Module) – network to be initialized
  • init_type (str) – the name of an initialization method: normal, xavier, kaiming, or orthogonal
  • init_gain (float) – scaling factor for the normal, xavier, and orthogonal methods
Returns:
  None
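
A short usage sketch with the documented defaults and one alternative:

    import torch.nn as nn
    from synthtorch.util.helper import init_weights

    net = nn.Sequential(nn.Conv2d(1, 16, 3), nn.ReLU(), nn.Conv2d(16, 1, 3))
    init_weights(net)                                           # default: kaiming initialization
    init_weights(net, init_type='orthogonal', init_gain=0.02)   # init_gain scales normal/xavier/orthogonal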