[Loss] Bug fix for loss
author Parichay Kapoor <pk.kapoor@samsung.com>
Thu, 16 Jul 2020 05:12:43 +0000 (14:12 +0900)
committer Jijoong Moon <jijoong.moon@samsung.com>
Thu, 16 Jul 2020 07:12:47 +0000 (16:12 +0900)
commit 47d04d077c58610652a92aa22de24dfa9eb4f77a
tree 132795827383351712d35400260f2dacf4755c21
parent ece55603206f0afb359f99b75690cc23600cd675
[Loss] Bug fix for loss

Added bug fixes for loss forwarding (a sketch of the corrected
computations follows the list):
- For sigmoid with cross entropy, the formula was correct but the
  implementation was wrong; the sign of the output was also inverted.
- For MSE, the average over elements is needed rather than the sum.
- For softmax with cross entropy, the division by input width is not
  needed, though the output still mismatches the reference values.
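
A minimal sketch of the corrected forward computations, assuming the
semantics described above; this is not the nntrainer loss_layer code,
and all function names are hypothetical, written for plain double
vectors rather than nntrainer Tensors:

  #include <algorithm>
  #include <cmath>
  #include <cstddef>
  #include <vector>

  // Sigmoid with cross entropy: numerically stable form of
  //   -(y * log(sigmoid(x)) + (1 - y) * log(1 - sigmoid(x)))
  // The overall sign must stay positive (the inverted-sign bug above).
  double sigmoid_cross_entropy(const std::vector<double> &x,
                               const std::vector<double> &y) {
    double sum = 0.0;
    for (std::size_t i = 0; i < x.size(); ++i)
      sum += std::max(x[i], 0.0) - x[i] * y[i] +
             std::log1p(std::exp(-std::fabs(x[i])));
    return sum / x.size();
  }

  // MSE: average over elements rather than the raw sum.
  double mse(const std::vector<double> &pred,
             const std::vector<double> &label) {
    double sum = 0.0;
    for (std::size_t i = 0; i < pred.size(); ++i) {
      double d = pred[i] - label[i];
      sum += d * d;
    }
    return sum / pred.size();
  }

  // Softmax with cross entropy: -sum(y * log(softmax(x))), with no
  // extra division by the input width.
  double softmax_cross_entropy(const std::vector<double> &x,
                               const std::vector<double> &y) {
    double mx = *std::max_element(x.begin(), x.end());
    double denom = 0.0;
    for (double v : x)
      denom += std::exp(v - mx);
    double loss = 0.0;
    for (std::size_t i = 0; i < x.size(); ++i)
      loss -= y[i] * ((x[i] - mx) - std::log(denom));
    return loss;
  }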

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
nntrainer/include/lazy_tensor.h
nntrainer/include/tensor.h
nntrainer/include/tensor_dim.h
nntrainer/src/lazy_tensor.cpp
nntrainer/src/loss_layer.cpp
nntrainer/src/optimizer.cpp
nntrainer/src/tensor.cpp
packaging/unittest_layers.tar.gz