[bnlayer] Batch norm layer bug fix
Author:    Parichay Kapoor <pk.kapoor@samsung.com>
           Wed, 8 Sep 2021 11:14:43 +0000 (20:14 +0900)
Committer: Jijoong Moon <jijoong.moon@samsung.com>
           Fri, 10 Sep 2021 03:04:25 +0000 (12:04 +0900)
This patch fixes the default initialization of the batch normalization
layer parameters: the moving variance now defaults to ONES instead of
ZEROS, beta defaults to ZEROS instead of ONES, and the initializers
list uses moving_mean_initializer for the moving mean instead of
mistakenly repeating moving_variance_initializer.
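
As a rough illustration (a standalone sketch, not nntrainer code), the
inference-time batch norm formula shows why these defaults matter: with
moving_mean = 0, moving_variance = 1, gamma = 1 and beta = 0, an
untrained layer reduces to the identity transform, whereas the old
defaults (variance = 0, beta = 1) would scale the input by
1/sqrt(epsilon) and shift it by 1.

    #include <cmath>
    #include <cstdio>

    int main() {
      const float epsilon = 0.001f;
      // Fixed defaults: mean 0, variance 1, gamma 1, beta 0.
      const float moving_mean = 0.0f, moving_variance = 1.0f;
      const float gamma = 1.0f, beta = 0.0f;
      const float x = 0.5f;
      // y = gamma * (x - mean) / sqrt(variance + epsilon) + beta
      const float y =
        gamma * (x - moving_mean) / std::sqrt(moving_variance + epsilon) + beta;
      std::printf("y = %f, approximately x = %f\n", y, x);
      return 0;
    }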

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
nntrainer/layers/bn_layer.h

index 6728770..d5592d5 100644
@@ -44,15 +44,14 @@ public:
   BatchNormalizationLayer(
     int axis = -1, float momentum = 0.99, float epsilon = 0.001,
     Tensor::Initializer moving_mean_initializer = Tensor::Initializer::ZEROS,
-    Tensor::Initializer moving_variance_initializer =
-      Tensor::Initializer::ZEROS,
+    Tensor::Initializer moving_variance_initializer = Tensor::Initializer::ONES,
     Tensor::Initializer gamma_initializer = Tensor::Initializer::ONES,
-    Tensor::Initializer beta_initializer = Tensor::Initializer::ONES) :
+    Tensor::Initializer beta_initializer = Tensor::Initializer::ZEROS) :
     Layer(),
     epsilon(epsilon),
     momentum(momentum),
     axis(axis),
-    initializers{moving_variance_initializer, moving_variance_initializer,
+    initializers{moving_mean_initializer, moving_variance_initializer,
                  gamma_initializer, beta_initializer},
     wt_idx({0}) {}