[activation] Derivative for activation
author	Parichay Kapoor <pk.kapoor@samsung.com>
	Wed, 24 Jun 2020 05:56:59 +0000 (14:56 +0900)
committer	Jijoong Moon <jijoong.moon@samsung.com>
	Wed, 24 Jun 2020 10:51:21 +0000 (19:51 +0900)
commit	2b550556d71c9fa7061f2b02dfeb53681cef13a3
tree	8d8570a94dc8552c9b69b429cc937f85ba54a393
parent	90d12fea94141bc2c4927dd9a54b1dcb1b8cf394

The derivative of softmax is hand-crafted because it differs from the other activations: softmax requires softmax(x) as the input to its derivative, while the other activations take x directly.
Refer to https://github.com/nnstreamer/nntrainer/blob/2a650512813db6ce3bba828b5790066fbc655f14/nntrainer/src/fc_layer.cpp#L265 for the original implementation.
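
As a minimal sketch of why the interfaces differ (hypothetical names below, not the actual nntrainer API): a pointwise activation can evaluate its derivative directly from x, whereas the softmax backward pass couples all outputs, so it is computed from s = softmax(x) together with the incoming derivative d as dx_i = s_i * (d_i - sum_j d_j * s_j).

    #include <cmath>
    #include <cstddef>
    #include <vector>

    // Pointwise activation: derivative evaluated directly from x,
    // e.g. d/dx tanh(x) = 1 - tanh(x)^2.
    float tanhPrime(float x) { return 1.0f - tanhf(x) * tanhf(x); }

    // Softmax: sketch of the hand-crafted backward pass. Takes
    // s = softmax(x) (not x) and the incoming derivative d, and
    // returns dx_i = s_i * (d_i - sum_j d_j * s_j).
    std::vector<float> softmaxPrime(const std::vector<float> &s,
                                    const std::vector<float> &d) {
      float dot = 0.0f;
      for (std::size_t j = 0; j < s.size(); ++j)
        dot += d[j] * s[j];

      std::vector<float> dx(s.size());
      for (std::size_t i = 0; i < s.size(); ++i)
        dx[i] = s[i] * (d[i] - dot);
      return dx;
    }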

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
nntrainer/src/activation_layer.cpp