Should I use softmax as output when using cross entropy loss in pytorch?
As stated in the torch.nn.CrossEntropyLoss() docs:

> This criterion combines nn.LogSoftmax() and nn.NLLLoss() in one single class.

Therefore, you should not apply softmax to your model's output before passing it to this loss. CrossEntropyLoss expects raw, unnormalized logits; applying softmax first would effectively apply the normalization twice and give incorrect loss values.
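A minimal sketch illustrating the point, using made-up logits and targets: the loss on raw logits matches the manual log_softmax + NLLLoss computation, while feeding softmax outputs into CrossEntropyLoss yields a different (wrong) value.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Raw, unnormalized logits for a batch of 3 samples over 5 classes
logits = torch.randn(3, 5)
targets = torch.tensor([1, 0, 4])

criterion = nn.CrossEntropyLoss()

# Correct: pass raw logits; the loss applies log-softmax internally
loss = criterion(logits, targets)

# Equivalent manual computation: log_softmax followed by NLLLoss
manual = nn.NLLLoss()(torch.log_softmax(logits, dim=1), targets)
print(torch.allclose(loss, manual))  # the two match

# Incorrect: applying softmax first normalizes twice
wrong = criterion(torch.softmax(logits, dim=1), targets)
print(torch.allclose(loss, wrong))  # the values differ
```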