[WeightSharing] Remove zero grad
authorJihoon Lee <jhoon.it.lee@samsung.com>
Wed, 6 Oct 2021 12:18:04 +0000 (21:18 +0900)
committerJijoong Moon <jijoong.moon@samsung.com>
Thu, 7 Oct 2021 10:20:47 +0000 (19:20 +0900)
commitf2b0d35ed292bdd3239c4ff10bd7d9500f953dab
tree8e3f854a33e177261cc6cb1356de114242ef7292
parent8f4dac4595d7b4f048a38b8b1551daf9cc62da09
[WeightSharing] Remove zero grad

This patch removes the zero-grad function; the cost is that each layer must now handle the gradient-initialization scenarios itself.
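
Since the message is terse, here is a minimal standalone sketch of the pattern that makes an explicit zero-grad pass unnecessary. This is not the actual nntrainer API; `GradBuffer`, `apply`, and `first_access` are hypothetical names. The idea: the first writer of a gradient overwrites the stale values from the previous iteration, and later writers (e.g. layers sharing the same weight) accumulate, so no separate zeroing step is needed.

```cpp
// Hypothetical sketch, not nntrainer code: first touch assigns,
// later touches accumulate, removing the need for a zero-grad pass.
#include <cstddef>
#include <iostream>
#include <vector>

struct GradBuffer {
  std::vector<float> data;
  bool first_access = true; // would be reset once per training iteration

  void apply(const std::vector<float> &g) {
    if (first_access) {
      data = g; // overwrite stale values; no prior zeroing required
      first_access = false;
    } else {
      for (std::size_t i = 0; i < g.size(); ++i)
        data[i] += g[i]; // shared weight: accumulate contributions
    }
  }
};

int main() {
  GradBuffer grad;
  grad.apply({1.0f, 2.0f}); // first layer touching the shared weight
  grad.apply({0.5f, 0.5f}); // second layer sharing the same weight
  std::cout << grad.data[0] << ", " << grad.data[1] << '\n'; // 1.5, 2.5
}
```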

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
nntrainer/graph/network_graph.cpp
nntrainer/graph/network_graph.h
nntrainer/layers/fc_layer.cpp
nntrainer/layers/layer_context.cpp
nntrainer/layers/layer_context.h
nntrainer/models/neuralnet.cpp
nntrainer/tensor/manager.cpp
nntrainer/tensor/tensor_pool.cpp
nntrainer/tensor/var_grad.cpp
nntrainer/tensor/var_grad.h
nntrainer/tensor/weight.h