[batchnorm] Bug fix for batch norm layer
authorParichay Kapoor <pk.kapoor@samsung.com>
Wed, 8 Sep 2021 04:47:17 +0000 (13:47 +0900)
committerJijoong Moon <jijoong.moon@samsung.com>
Fri, 10 Sep 2021 03:03:26 +0000 (12:03 +0900)
The moving variance weight of the batch norm layer was requested with the
hardcoded name "BN::moving_variance" instead of being prefixed with the
layer's own name via context.getName(), unlike the layer's other weights.
This patch makes the naming consistent so that each batch norm instance
registers a uniquely named moving variance weight.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
nntrainer/layers/bn_layer.cpp

index 91e10a0..3cbe8c6 100644 (file)
@@ -65,7 +65,7 @@ void BatchNormalizationLayer::finalize(InitLayerContext &context) {
     context.getName() + ":moving_mean", false);
   wt_idx[BNParams::var] = context.requestWeight(
     dim, initializers[BNParams::var], WeightRegularizer::NONE, 1.0f,
-    "BN::moving_variance", false);
+    context.getName() + ":moving_variance", false);
   wt_idx[BNParams::gamma] = context.requestWeight(
     dim, initializers[BNParams::gamma], WeightRegularizer::NONE, 1.0f,
     context.getName() + ":gamma", true);
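
For reference, a minimal sketch of the corrected weight-request pattern in
BatchNormalizationLayer::finalize, reconstructed only from the calls visible
in this hunk. The BNParams::mu index name is an assumption (it is not shown
in this diff), and the full signature of InitLayerContext::requestWeight is
taken as it appears above:

    // Sketch of the per-layer weight naming convention used above.
    // Every weight name is prefixed with context.getName() so that two
    // batch norm layers in the same model never collide on a weight name.
    wt_idx[BNParams::mu] = context.requestWeight(       // index name assumed
      dim, initializers[BNParams::mu], WeightRegularizer::NONE, 1.0f,
      context.getName() + ":moving_mean", false);       // running statistic
    wt_idx[BNParams::var] = context.requestWeight(
      dim, initializers[BNParams::var], WeightRegularizer::NONE, 1.0f,
      context.getName() + ":moving_variance", false);   // fixed by this patch
    wt_idx[BNParams::gamma] = context.requestWeight(
      dim, initializers[BNParams::gamma], WeightRegularizer::NONE, 1.0f,
      context.getName() + ":gamma", true);              // trainable scale

Before this patch, the second call passed the literal "BN::moving_variance",
which ignored the layer's configured name.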