[optimizer] Simplify optimizer initialize
author Parichay Kapoor <pk.kapoor@samsung.com>
Thu, 3 Dec 2020 06:19:02 +0000 (15:19 +0900)
committer Jijoong Moon <jijoong.moon@samsung.com>
Thu, 10 Dec 2020 10:20:41 +0000 (19:20 +0900)
commit 739b909234fd3353913f05f15f1bbd80579bc55f
tree 8e9e21ec19a4379f51794b9f8276b16a9e870242
parent a023b3ace5a999c4841351bdf717b8bf8f7943cc
[optimizer] Simplify optimizer initialize

As there is only one optimizer, shared by all layers, it must be initialized just once, by the neural network.
Also, addOptimizerVariables() has been moved out of initialize(), as initialize() should operate only
on the optimizer's own parameters and should not need the list of weights.

Also, remove the set_tensor argument, which was redundant. A sketch of the resulting interface split follows below.
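
A minimal sketch of how the split interface might look after this change, based only on the description
above; the exact signatures, the Weight type, and the namespace are assumptions, not copied from the patch:

```cpp
#include <vector>

namespace nntrainer {

class Weight; // nntrainer weight wrapper (declaration only, for this sketch)

class Optimizer {
public:
  virtual ~Optimizer() = default;

  /// initialize() now works only on the optimizer's own parameters
  /// (learning rate, decay, ...); it no longer takes the weight list
  /// or the removed set_tensor flag, so the network can call it once
  /// for the single shared optimizer.
  virtual int initialize() = 0;

  /// Per-weight state (e.g. Adam's moment tensors) is requested
  /// separately, once the full list of weights is known.
  virtual void addOptimizerVariables(std::vector<Weight> &weight_list) = 0;
};

} // namespace nntrainer
```

With this split, the neural network initializes the shared optimizer exactly once and then registers
weights through addOptimizerVariables(), instead of each layer re-running initialize() with its own
weight list.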

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
nntrainer/layers/layer.cpp
nntrainer/models/neuralnet.cpp
nntrainer/optimizers/adam.cpp
nntrainer/optimizers/adam.h
nntrainer/optimizers/optimizer.cpp
nntrainer/optimizers/optimizer_internal.h
nntrainer/optimizers/sgd.cpp
nntrainer/optimizers/sgd.h