Utility functions for using FastAI optimizers with PyTorch models.

Some utility functions for training PyTorch models with the FastAI fine-tuning method. The code is from this repository.

params[source]

params(m)

Return all parameters of m (a PyTorch model).
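
As a rough sketch of the behavior, the assumed one-liner below mirrors what the description says (it is not necessarily the library's exact source):

import torch.nn as nn

def params(m):
    # Assumed behavior: collect every parameter of m into a plain list,
    # so it can serve as a single layer-group for the optimizer.
    return [p for p in m.parameters()]

lin = nn.Linear(4, 2)
print(len(params(lin)))  # 2 -> weight and bias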

convert_params[source]

convert_params(o:list)

Converts o into PyTorch-compatible param groups. o should be a set of layer-groups that should be split in the optimizer. Example:

def splitter(m): return convert_params([[m.a], [m.b]])

Where m is a model defined as:

import torch
from torch import nn
from fastai.torch_core import Module  # fastai's Module calls super().__init__() automatically
class RegModel(Module):
  def __init__(self): self.a,self.b = nn.Parameter(torch.randn(1)),nn.Parameter(torch.randn(1))
  def forward(self, x): return x*self.a + self.b

The FastAI library makes heavy use of transfer learning with layer-group freezing and discriminative learning rates. The convert_params function returns parameter groups for specific layers of a model, so that discriminative learning rates can be applied per group.
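
To illustrate the expected output format, here is a minimal sketch that assumes convert_params produces standard PyTorch param-group dictionaries (the body of convert_params below is an assumption, not the library's source):

import torch
from torch import nn

def convert_params(o):
    # Assumed behavior: wrap each layer-group in a {'params': ...} dict,
    # the format accepted by torch.optim optimizers.
    return [{'params': list(group)} for group in o]

a, b = nn.Parameter(torch.randn(1)), nn.Parameter(torch.randn(1))
groups = convert_params([[a], [b]])
# groups -> [{'params': [a]}, {'params': [b]}]

opt = torch.optim.SGD(groups, lr=0.1)
opt.param_groups[0]['lr'] = 0.01  # discriminative LR: slower updates for the first group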

smp_splitter[source]

smp_splitter(model)

opt_func[source]

opt_func(params, torch_opt, *args, **kwargs)

A PyTorch optimizer wrapped for use with a fastai Learner.
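
A hedged usage sketch: the body below simply instantiates the given torch optimizer on the supplied params (the real opt_func may additionally wrap the optimizer for fastai; dls and model in the commented Learner call are placeholders):

from functools import partial
import torch
from torch import nn

def opt_func(params, torch_opt, *args, **kwargs):
    # Assumed behavior: build the raw torch.optim optimizer from the given
    # parameter groups; fastai's Learner then drives it during training.
    return torch_opt(params, *args, **kwargs)

model = nn.Linear(4, 2)
opt = opt_func(model.parameters(), torch.optim.AdamW, lr=1e-3)

# Typical fastai usage: fix the optimizer class up front with partial, e.g.
# learn = Learner(dls, model, opt_func=partial(opt_func, torch_opt=torch.optim.AdamW))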