[optimizer] Bug fix
author Parichay Kapoor <pk.kapoor@samsung.com>
Fri, 26 Jun 2020 09:51:43 +0000 (18:51 +0900)
committer Jijoong Moon <jijoong.moon@samsung.com>
Tue, 30 Jun 2020 03:44:15 +0000 (12:44 +0900)
The Tensor copy constructor and copy assignment operator create a copy of the underlying vector.
This led to a bug in the optimizer, which updated a copy of the weight rather than the weight itself.
Fixed by taking a reference to the weight in the optimizer.

Resolves #241

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
nntrainer/src/optimizer.cpp

index e9d686c..ff57a84 100644
@@ -103,7 +103,7 @@ void Optimizer::apply_gradients(
   std::vector<std::reference_wrapper<Tensor>>::iterator w_iter, g_iter;
   for (w_iter = weights.begin(), g_iter = gradients.begin();
       w_iter != weights.end(); ++w_iter, ++g_iter) {
-    Tensor x = *w_iter;
+    Tensor &x = *w_iter;
     Tensor x_grad = *g_iter;
 
     x_grad = x_grad.average();
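
Below is a minimal standalone sketch (using a simplified stand-in for the nntrainer Tensor class, not the real one) illustrating why the one-character change matters: assigning a std::reference_wrapper<Tensor> element to a Tensor value invokes the copy constructor, so the update lands on a throwaway copy, while binding a Tensor& aliases the stored weight.

#include <cassert>
#include <functional>
#include <vector>

// Simplified stand-in for nntrainer's Tensor (assumption for illustration only).
struct Tensor {
  std::vector<float> data;
};

int main() {
  Tensor weight{{1.0f, 2.0f}};
  std::vector<std::reference_wrapper<Tensor>> weights{std::ref(weight)};

  // Before the fix: the reference_wrapper converts to Tensor, invoking the
  // copy constructor; the update is applied to a temporary copy.
  Tensor copy = *weights.begin();
  copy.data[0] -= 1.0f;
  assert(weight.data[0] == 1.0f); // original weight untouched -- the bug

  // After the fix: binding a Tensor& aliases the stored weight, so the
  // update reaches the actual parameter.
  Tensor &ref = *weights.begin();
  ref.data[0] -= 1.0f;
  assert(weight.data[0] == 0.0f); // original weight updated

  return 0;
}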