Optimizer
Some utility functions for training PyTorch models with the fastai fine-tuning method. The code is from this repository.
params

    params (m)

Return all parameters of `m` (a PyTorch model).
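The actual implementation lives in the linked repository; a minimal sketch, assuming it simply collects the model's parameters into a list:

```python
import torch.nn as nn

def params(m):
    # Collect all parameters of a PyTorch model into a list
    # (assumed behaviour, not the repository's exact code)
    return list(m.parameters())

model = nn.Linear(2, 3)
ps = params(model)
print(len(ps))  # a Linear layer has a weight and a bias -> 2
```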
convert_params

    convert_params (o:list)

Converts `o` into PyTorch-compatible param groups. `o` should be a set of layer-groups that should be split in the optimizer. Example:

```python
def splitter(m): return convert_params([[m.a], [m.b]])
```

where `m` is a model defined as:

```python
class RegModel(Module):
    def __init__(self): self.a, self.b = nn.Parameter(torch.randn(1)), nn.Parameter(torch.randn(1))
    def forward(self, x): return x*self.a + self.b
```
The fastai library relies heavily on transfer learning with layer-group freezing. The `convert_params` function returns a list of parameter groups for specific layers in a model, which enables discriminative learning rates.
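A sketch of the conversion, assuming each layer-group becomes a dict with a `params` key, which is the format `torch.optim` optimizers accept (the helper's internals here are an assumption, not the repository's exact code):

```python
import torch
import torch.nn as nn

def convert_params(o):
    # Turn a list of layer-groups into torch.optim-style param groups:
    # each group becomes {'params': [...]}. Bare nn.Parameters are kept
    # as-is; modules contribute all of their parameters.
    groups = []
    for group in o:
        ps = []
        for item in group:
            if isinstance(item, nn.Parameter):
                ps.append(item)
            else:
                ps.extend(item.parameters())
        groups.append({'params': ps})
    return groups

class RegModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.a, self.b = nn.Parameter(torch.randn(1)), nn.Parameter(torch.randn(1))
    def forward(self, x): return x*self.a + self.b

m = RegModel()
groups = convert_params([[m.a], [m.b]])
# Two param groups -> two learning rates, one per layer-group:
opt = torch.optim.SGD([dict(g, lr=lr) for g, lr in zip(groups, (1e-3, 1e-2))])
```

Because each group is a separate dict, each can carry its own `lr`, which is exactly what discriminative learning rates need.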
smp_splitter

    smp_splitter (model)
opt_func

    opt_func (params, torch_opt, *args, **kwargs)

Wraps a PyTorch optimizer for use with a fastai Learner.
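Given the signature above, a plausible minimal sketch is a factory that instantiates the given PyTorch optimizer class on `params` (the fastai-side wrapping that the real version presumably performs is omitted here):

```python
import torch

def opt_func(params, torch_opt, *args, **kwargs):
    # Instantiate the given PyTorch optimizer class on `params`.
    # (Sketch only: any fastai-specific wrapping is omitted.)
    return torch_opt(params, *args, **kwargs)

model = torch.nn.Linear(4, 2)
opt = opt_func(model.parameters(), torch.optim.AdamW, lr=1e-3)
```

In a fastai training script this would typically be fixed to one optimizer with `functools.partial`, e.g. `partial(opt_func, torch_opt=torch.optim.AdamW)`, and passed to the Learner as its `opt_func` (hypothetical usage).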