[batchnorm] Optimize batch norm backward pass
author Parichay Kapoor <pk.kapoor@samsung.com>
Tue, 28 Sep 2021 10:59:50 +0000 (19:59 +0900)
committer Jijoong Moon <jijoong.moon@samsung.com>
Wed, 6 Oct 2021 08:27:10 +0000 (17:27 +0900)
commit e471a771aabec083dc8e37df3faebc1bc107f0d6
tree 812e4cb8fc5256f239800e5605eac7ae6b15db50
parent 62a3695116767aac050031a5a918b09c740f814e
[batchnorm] Optimize batch norm backward pass

Remove the extra full-size memory requirement of the backward pass,
reducing the layer's memory cost. The difference in memory requirement
can be significant: earlier the extra requirement was b*c*h*w, whereas
now it is just c, under the assumption that batch norm normalizes along
axis=channel.
This is achieved by reordering the operations.
Note: this change has no performance impact.
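
As a rough illustration of the reordering (a minimal sketch, not the
actual bn_layer.cpp code; all names, signatures, and shapes here are
hypothetical): if the elementwise product dy * x_hat is fused directly
into the per-channel reduction, the backward pass only ever holds
c-sized accumulators instead of a full b*c*h*w temporary holding
dy * x_hat.

#include <algorithm>
#include <cstddef>
#include <vector>

// Sketch of batch norm backward, normalizing along axis=channel.
// Tensors are (b, c, h, w) flattened row-major; dgamma/dbeta are size c.
void bn_backward(const std::vector<float> &x_hat, const std::vector<float> &dy,
                 const std::vector<float> &gamma,
                 const std::vector<float> &inv_std, std::vector<float> &dx,
                 std::vector<float> &dgamma, std::vector<float> &dbeta,
                 size_t b, size_t c, size_t h, size_t w) {
  const size_t spatial = h * w;
  const float n = static_cast<float>(b * spatial);

  // Pass 1: fuse dy * x_hat into the channel reduction, so only two
  // c-sized accumulators are needed rather than a b*c*h*w temporary.
  std::fill(dgamma.begin(), dgamma.end(), 0.0f);
  std::fill(dbeta.begin(), dbeta.end(), 0.0f);
  for (size_t bi = 0; bi < b; ++bi)
    for (size_t ci = 0; ci < c; ++ci)
      for (size_t s = 0; s < spatial; ++s) {
        const size_t idx = (bi * c + ci) * spatial + s;
        dbeta[ci] += dy[idx];
        dgamma[ci] += dy[idx] * x_hat[idx];
      }

  // Pass 2: standard batch norm input gradient, using only the c-sized
  // reductions: dx = gamma * inv_std * (dy - dbeta/n - x_hat * dgamma/n)
  for (size_t bi = 0; bi < b; ++bi)
    for (size_t ci = 0; ci < c; ++ci) {
      const float scale = gamma[ci] * inv_std[ci];
      const float mean_dy = dbeta[ci] / n;
      const float mean_dy_xhat = dgamma[ci] / n;
      for (size_t s = 0; s < spatial; ++s) {
        const size_t idx = (bi * c + ci) * spatial + s;
        dx[idx] = scale * (dy[idx] - mean_dy - x_hat[idx] * mean_dy_xhat);
      }
    }
}

The same arithmetic is performed either way, which is consistent with
the note above that the change has no performance impact.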

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
nntrainer/layers/bn_layer.cpp
nntrainer/layers/bn_layer.h