[weight/var_grad] Make internal variable as shared_ptr
authorParichay Kapoor <pk.kapoor@samsung.com>
Tue, 1 Dec 2020 10:46:38 +0000 (19:46 +0900)
committerJijoong Moon <jijoong.moon@samsung.com>
Mon, 7 Dec 2020 00:32:10 +0000 (09:32 +0900)
commit49a4b29554ab39031f349faf2d57c703fe7f21f2
tree6c961ea74088d9d8f42e16e662385b796d0f1783
parent53327dfe3f2c9771a90c4eff81b6b2d16c7fd602
[weight/var_grad] Make internal variable as shared_ptr

The internal variables in weight/var_grad, namely the variable and the gradient
themselves, are changed to shared_ptr so that weights can be shared without
worrying about shallow copies.

Also changed the copy constructor to not create a new Tensor, as the copy
constructor of weight gets called implicitly and allocating new memory there is
unnecessary and unintentional overhead. As weight is just a wrapper over Tensor,
its copy constructor should follow the same behavior as Tensor's, which is to
not allocate new memory. Added clone() as an explicit alternative for creating a
new copy of a given weight.

See also #774 #766

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
nntrainer/layers/layer.cpp
nntrainer/tensor/var_grad.cpp
nntrainer/tensor/var_grad.h
nntrainer/tensor/weight.cpp
nntrainer/tensor/weight.h
test/unittest/unittest_nntrainer_models.cpp