
Adding L1/L2 regularization in PyTorch?


The following should work for L2 regularization:

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4, weight_decay=1e-5)


This is shown in the PyTorch documentation: the weight_decay parameter of the optimizer adds an L2 penalty on the parameters during the update.
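Since the question also asks about L1, note that PyTorch optimizers have no built-in L1 option, so the usual approach is to add the L1 norm of the parameters to the loss by hand. Here is a minimal sketch; the model, data, and l1_lambda value are illustrative placeholders, not from the original answer:

```python
import torch
import torch.nn as nn

# Illustrative model and data -- substitute your own
model = nn.Linear(10, 1)
criterion = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

l1_lambda = 1e-5  # assumed regularization strength
x = torch.randn(32, 10)
y = torch.randn(32, 1)

optimizer.zero_grad()
loss = criterion(model(x), y)

# L1 penalty: sum of absolute values of all trainable parameters
l1_penalty = sum(p.abs().sum() for p in model.parameters())
loss = loss + l1_lambda * l1_penalty

loss.backward()
optimizer.step()
```

Because the penalty is part of the loss, its gradient (the sign of each weight, scaled by l1_lambda) flows through backward() like any other term. You can combine this with weight_decay if you want both penalties at once.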