Describe the bug
Currently, the last layer of a ClassificationNeuralNetwork module uses softmax as its activation function. This is unnecessary and can hurt learning, because PyTorch's CrossEntropyLoss already applies a softmax internally, as described in its documentation:
"Note that this case is equivalent to applying LogSoftmax on an input, followed by NLLLoss."
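A minimal numeric sketch of why the extra softmax is harmful (plain Python, with the example logits and loss computed by hand; the helper `softmax` below is an illustration, not the library code): applying softmax before the loss means the criterion effectively sees softmax(softmax(x)), which flattens the distribution and shrinks the gradients.

```python
import math

def softmax(z):
    # Numerically stable softmax over a list of floats.
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [e / s for e in exps]

logits = [2.0, 1.0, -1.0]
target = 0

# What CrossEntropyLoss effectively computes: log-softmax of the raw
# logits, then negative log-likelihood of the target class.
loss_correct = -math.log(softmax(logits)[target])

# With the extra softmax in the model, the loss is computed on
# softmax(softmax(logits)): the logits are squashed into [0, 1] first,
# so the resulting distribution is flatter and the loss saturates.
loss_buggy = -math.log(softmax(softmax(logits))[target])

print(loss_correct, loss_buggy)
```

For these logits the double-softmax loss comes out noticeably larger and less sensitive to the logit values, which is the learning problem described above.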
To Reproduce
/
Expected behavior
Don't apply an extra softmax in the final layer; the module should return raw logits and let CrossEntropyLoss handle the normalization.
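A sketch of the expected shape of the fix, assuming the module is (or wraps) a simple Sequential stack (the layer sizes here are placeholders, not the actual module): the stack ends in a Linear layer with no Softmax, and probabilities are obtained explicitly only when needed for prediction.

```python
import torch
from torch import nn

# Hypothetical layer stack: the final Linear produces raw logits,
# with no nn.Softmax() appended.
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),
    nn.Linear(8, 3),
)
loss_fn = nn.CrossEntropyLoss()  # applies log-softmax internally

x = torch.randn(5, 4)
y = torch.tensor([0, 2, 1, 0, 2])
loss = loss_fn(model(x), y)  # loss on raw logits
loss.backward()

# Probabilities, when needed at prediction time, come from an
# explicit softmax outside the training loss:
probs = torch.softmax(model(x), dim=1)
```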
Screenshots (optional)
No response
Additional Context (optional)
No response