Adding L1/L2 regularization in PyTorch?
The following should help for L2 regularization:
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4, weight_decay=1e-5)
This is covered in the PyTorch documentation: the optimizer's weight_decay
parameter adds an L2 penalty on the weights.
To add an L2-style penalty to the loss manually instead:

l2_lambda = 0.01
l2_reg = torch.tensor(0.)
for param in model.parameters():
    l2_reg += torch.norm(param)
loss += l2_lambda * l2_reg

Note that torch.norm(param) returns the unsquared L2 norm; for the squared penalty that weight_decay corresponds to, use param.pow(2).sum() instead.
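Since the question also asks about L1, here is a minimal sketch of adding an L1 penalty the same way (the tiny nn.Linear model, the data, and the lambda value are placeholders for illustration):

```python
import torch
import torch.nn as nn

# Hypothetical small model and data, just for illustration
model = nn.Linear(10, 1)
criterion = nn.MSELoss()
x = torch.randn(4, 10)
y = torch.randn(4, 1)

loss = criterion(model(x), y)

# L1 penalty: sum of absolute values of all parameters
l1_lambda = 0.01
l1_norm = sum(p.abs().sum() for p in model.parameters())
loss = loss + l1_lambda * l1_norm

loss.backward()  # gradients now include the L1 term
```

Unlike L2, there is no built-in optimizer argument for L1 in torch.optim, so adding it to the loss like this is the usual approach.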