Optimizer

Utility functions for using fastai optimizers with PyTorch models.

Some utility functions for training PyTorch models with the fastai fine-tuning method. The code is from this repository.


source

params

 params (m)

Return all parameters of `m` (a PyTorch model).
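
A minimal sketch of what `params` is assumed to do, namely collect every parameter of a module into a flat list (the body shown is an assumption based on the docstring, not the repository's code):

```python
import torch.nn as nn

def params(m):
    # Return all parameters of the module `m` as a plain list.
    return list(m.parameters())

model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))
print(len(params(model)))  # 4: two weight tensors and two bias tensors
```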


source

convert_params

 convert_params (o:list)

Converts `o` into PyTorch-compatible param groups. `o` should be a set of layer-groups that should be split in the optimizer. Example:

def splitter(m): return convert_params([[m.a], [m.b]])

Where `m` is a model defined as:

```python
class RegModel(Module):
    def __init__(self):
        self.a, self.b = nn.Parameter(torch.randn(1)), nn.Parameter(torch.randn(1))
    def forward(self, x):
        return x*self.a + self.b
```

The fastai library makes heavy use of transfer learning with layer-group freezing and discriminative learning rates. The `convert_params` function turns a list of layer groups into PyTorch-compatible parameter groups, so that each group of layers in a model can be trained with its own learning rate.
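
As an illustration, the sketch below builds a splitter for the `RegModel` above and shows how it could be passed to a fastai `Learner` for discriminative learning rates. It assumes `convert_params` returns standard PyTorch param-group dictionaries (`[{'params': [...]}, ...]`); `dls` is a placeholder `DataLoaders` object, and the `convert_params` body shown is only an assumed stand-in for the documented function:

```python
import torch
import torch.nn as nn

# Assumed behaviour of convert_params: wrap each layer group in a
# standard PyTorch param-group dict (a sketch, not the library's code).
def convert_params(o: list) -> list:
    return [{'params': p} for p in o]

class RegModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.a = nn.Parameter(torch.randn(1))
        self.b = nn.Parameter(torch.randn(1))
    def forward(self, x):
        return x * self.a + self.b

def splitter(m):
    # Two layer groups: one for `a`, one for `b`.
    return convert_params([[m.a], [m.b]])

print(splitter(RegModel()))
# -> [{'params': [Parameter ...]}, {'params': [Parameter ...]}]

# With a fastai Learner, the splitter enables discriminative learning rates,
# e.g. (commented out because `dls` is not defined here):
# learn = Learner(dls, RegModel(), splitter=splitter, loss_func=nn.MSELoss())
# learn.fit_one_cycle(5, lr_max=slice(1e-4, 1e-2))
```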


source

smp_splitter

 smp_splitter (model)
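
No docstring accompanies `smp_splitter`; judging from the name and the surrounding helpers, it presumably groups the parameters of a segmentation_models_pytorch-style model (for example encoder vs. decoder) so it can serve as the `splitter` of a fastai `Learner`. The sketch below is purely illustrative and rests on that assumption, including the assumed `encoder`/`decoder` attributes:

```python
# Illustrative only: one way such a splitter could be written, reusing the
# `params` and `convert_params` helpers documented above. The encoder/decoder
# grouping is an assumption, not the repository's documented behaviour.
def smp_splitter(model):
    return convert_params([params(model.encoder), params(model.decoder)])

# Assumed usage with a fastai Learner (`dls` and `model` are placeholders):
# learn = Learner(dls, model, splitter=smp_splitter)
```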

source

opt_func

 opt_func (params, torch_opt, *args, **kwargs)

PyTorch optimizer for a fastai Learner.
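
A plausible usage sketch, assuming `opt_func` wraps a plain torch optimizer class with fastai's `OptimWrapper` so that it can drive a fastai `Learner` (the exact wrapping lives in the linked source; `dls` and `model` are placeholders, and the `OptimWrapper` signature differs between fastai versions):

```python
from functools import partial
import torch
from fastai.optimizer import OptimWrapper

# Assumed implementation: hand the (already split) parameters and the torch
# optimizer class to fastai's OptimWrapper, forwarding any hyperparameters.
def opt_func(params, torch_opt, *args, **kwargs):
    return OptimWrapper(params, torch_opt, *args, **kwargs)

# Assumed usage: bind the torch optimizer with partial and pass the result to
# the Learner; fastai will call it with the model's parameters and learning rate.
# learn = Learner(dls, model, opt_func=partial(opt_func, torch_opt=torch.optim.AdamW))
# learn.fine_tune(3)
```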