Should I use softmax as output when using cross entropy loss in pytorch? [python]



As stated in the torch.nn.CrossEntropyLoss() documentation:

This criterion combines nn.LogSoftmax() and nn.NLLLoss() in one single class.

Therefore, you should not apply softmax (or log-softmax) to your model's output yourself; pass the raw logits directly to the loss. Applying softmax first would make the loss compute log-softmax of already-softmaxed values, which gives wrong gradients and slows training.
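A minimal sketch illustrating the point, with made-up logits and targets: passing raw logits to nn.CrossEntropyLoss gives the same result as the explicit nn.LogSoftmax + nn.NLLLoss decomposition the docs describe.

```python
import torch
import torch.nn as nn

# Made-up example data: batch of 4 samples, 3 classes.
logits = torch.randn(4, 3)            # raw model outputs (no softmax!)
targets = torch.tensor([0, 2, 1, 0])  # class indices

# Correct usage: feed raw logits straight into CrossEntropyLoss.
ce_loss = nn.CrossEntropyLoss()(logits, targets)

# Equivalent decomposition, as stated in the docs:
log_probs = nn.LogSoftmax(dim=1)(logits)
nll_loss = nn.NLLLoss()(log_probs, targets)

assert torch.allclose(ce_loss, nll_loss)
```

If you need probabilities at inference time (e.g. to report confidences), apply torch.softmax to the logits then, outside the loss computation.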