[Weights] Split weight variable init and alloc
author     Parichay Kapoor <pk.kapoor@samsung.com>
           Wed, 10 Mar 2021 14:03:56 +0000 (23:03 +0900)
committer  Jijoong Moon <jijoong.moon@samsung.com>
           Fri, 19 Mar 2021 02:30:21 +0000 (11:30 +0900)
commit     67d86cdc99ce022eb17377671d1142be153539dd
tree       c1fa719ee9795df60d6b8942608cc20d3d83bd76
parent     70fed4af4b51f285c1cdc03a1db94af4e5155412
[Weights] Split weight variable init and alloc

Split the initialization and the memory allocation for weights
(a brief sketch of the split follows the bug list below).
Three bugs exposed by this change have been resolved:
- the manager no longer allows tracking var_grads once it has been initialized
- resetGradient() confirms that allocation is done before accessing the memory
- allocateVariable() now calls the correct initializer
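
As an illustration only, here is a minimal C++ sketch of the intended
init/alloc split; the names (WeightSketch, the Tensor alias, var, grad)
are hypothetical stand-ins, not the real nntrainer Weight/Var_Grad API:

    #include <algorithm>
    #include <functional>
    #include <memory>
    #include <stdexcept>
    #include <vector>

    using Tensor = std::vector<float>;  // stand-in for nntrainer's Tensor

    class WeightSketch {
    public:
      // initialize(): record the shape (here just a length) and the
      // initializer; no memory is allocated at this point.
      void initialize(size_t length, std::function<void(Tensor &)> init) {
        len = length;
        initializer = std::move(init);
        initialized = true;
      }

      // allocateVariable(): allocate memory and apply the recorded
      // initializer (one of the fixed bugs was calling the wrong one here).
      void allocateVariable() {
        if (!initialized)
          throw std::runtime_error("initialize() must run before allocation");
        var = std::make_unique<Tensor>(len, 0.0f);
        grad = std::make_unique<Tensor>(len, 0.0f); // simplification: grad shown alongside var
        if (initializer)
          initializer(*var);
      }

      // resetGradient(): confirm allocation is done before touching memory.
      void resetGradient() {
        if (!grad)
          throw std::runtime_error("resetGradient() called before allocation");
        std::fill(grad->begin(), grad->end(), 0.0f);
      }

    private:
      size_t len = 0;
      std::function<void(Tensor &)> initializer;
      std::unique_ptr<Tensor> var, grad;
      bool initialized = false;
    };

In this sketch a layer would call initialize() while the network is being
composed, and allocateVariable() would run only when memory is actually
committed, matching the split described above.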

Further, the reinitialize logic in the layers unittest has been split
into two parts, initialize and reinitialize, where reinitialize resets
the layer and the manager and then calls initialize.
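
A rough sketch of that two-step test flow, using hypothetical stand-ins
(ManagerStub, LayerFixtureSketch) rather than the real fixture in
unittest_nntrainer_layers.cpp:

    #include <vector>

    struct ManagerStub {
      std::vector<int> tracked;          // stands in for tracked weights/var_grads
      void reset() { tracked.clear(); }  // drop everything tracked so far
    };

    struct LayerFixtureSketch {
      ManagerStub manager;
      bool layer_ready = false;

      // initialize(): register the layer's weights with the manager.
      void initialize() {
        manager.tracked.push_back(1);    // pretend the layer registers one weight
        layer_ready = true;
      }

      // reinitialize(): reset the layer and the manager, then call
      // initialize(), mirroring the split described in the message above.
      void reinitialize() {
        layer_ready = false;
        manager.reset();
        initialize();
      }
    };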

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
nntrainer/layers/bn_layer.cpp
nntrainer/layers/conv2d_layer.cpp
nntrainer/layers/fc_layer.cpp
nntrainer/models/neuralnet.cpp
nntrainer/tensor/manager.cpp
nntrainer/tensor/var_grad.h
nntrainer/tensor/weight.h
test/unittest/unittest_nntrainer_layers.cpp