nntrainer.git / commitdiff
(parent: 00be0f5)
[batchnorm] Bug fix for batch norm layer

author    Parichay Kapoor <pk.kapoor@samsung.com>
          Wed, 8 Sep 2021 04:47:17 +0000 (13:47 +0900)
committer Jijoong Moon <jijoong.moon@samsung.com>
          Fri, 10 Sep 2021 03:03:26 +0000 (12:03 +0900)

Bug fix for requested variable for batch norm layer.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
nntrainer/layers/bn_layer.cpp
diff --git a/nntrainer/layers/bn_layer.cpp b/nntrainer/layers/bn_layer.cpp
index 91e10a0..3cbe8c6 100644 (file)
--- a/nntrainer/layers/bn_layer.cpp
+++ b/nntrainer/layers/bn_layer.cpp
@@ -65,7 +65,7 @@ void BatchNormalizationLayer::finalize(InitLayerContext &context) {
     context.getName() + ":moving_mean", false);
   wt_idx[BNParams::var] = context.requestWeight(
     dim, initializers[BNParams::var], WeightRegularizer::NONE, 1.0f,
-    "BN::moving_variance", false);
+    context.getName() + ":moving_variance", false);
   wt_idx[BNParams::gamma] = context.requestWeight(
     dim, initializers[BNParams::gamma], WeightRegularizer::NONE, 1.0f,
     context.getName() + ":gamma", true);