
PyTorch: How to change the learning rate of an optimizer at any given moment (no LR schedule)


So the learning rate is stored in optim.param_groups[i]['lr']. optim.param_groups is a list of the different parameter groups, each of which can have its own learning rate. Thus, simply doing:

for g in optim.param_groups:
    g['lr'] = 0.001

will do the trick.
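For context, here is a minimal sketch of dropping the learning rate partway through training; the model, the epoch at which the rate changes, the new rate, and the set_lr helper are all placeholders for illustration:

import torch

model = torch.nn.Linear(10, 2)                       # placeholder model
optim = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

def set_lr(optimizer, lr):
    # Update every parameter group so the new rate applies to all of them.
    for g in optimizer.param_groups:
        g['lr'] = lr

for epoch in range(20):
    # ... the usual training step over batches would go here ...
    if epoch == 10:
        set_lr(optim, 0.001)                         # drop the learning rate mid-training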


Alternatively, as mentioned in the comments, if your learning rate only depends on the epoch number, you can use a learning rate scheduler.

For example (modified example from the doc):

from torch.optim.lr_scheduler import LambdaLR

optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

# Assuming the optimizer has two parameter groups.
lambda_group1 = lambda epoch: epoch // 30
lambda_group2 = lambda epoch: 0.95 ** epoch
scheduler = LambdaLR(optimizer, lr_lambda=[lambda_group1, lambda_group2])

for epoch in range(100):
    train(...)
    validate(...)
    scheduler.step()

Also, there is a prebuilt scheduler, torch.optim.lr_scheduler.ReduceLROnPlateau, that reduces the learning rate when a monitored metric plateaus.
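For instance, a minimal sketch of ReduceLROnPlateau; the model, the factor/patience values, and the synthetic validation loss below are placeholders:

import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

model = torch.nn.Linear(10, 2)                        # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

# Halve the learning rate once the monitored metric has stopped
# improving for 5 consecutive epochs.
scheduler = ReduceLROnPlateau(optimizer, mode='min', factor=0.5, patience=5)

for epoch in range(100):
    val_loss = max(0.1, 1.0 - 0.1 * epoch)            # stand-in for a real validation loss
    scheduler.step(val_loss)                          # this scheduler needs the metric
    print(epoch, optimizer.param_groups[0]['lr'])

Note that, unlike LambdaLR, this scheduler requires the monitored metric to be passed to step().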


Instead of the loop in @patapouf_ai's answer, you can set it directly (note that this updates only the first parameter group):

optim.param_groups[0]['lr'] = 0.001

Cheers