[weight/layer] Move weight regularization out of layers
author Parichay Kapoor <pk.kapoor@samsung.com>
Fri, 29 Jan 2021 11:03:51 +0000 (20:03 +0900)
committer Jijoong Moon <jijoong.moon@samsung.com>
Wed, 3 Mar 2021 00:46:41 +0000 (09:46 +0900)
commit 412f8a0832d10c194ba58d8cc6d44a150ff25fcf
tree 0c620972ded0ce26d9a36c8fdb5a5d8a0bef63d9
parent 6bbb3b794eca0f8e2b8dbc193b87b84b313a5c8c
[weight/layer] Move weight regularization out of layers

Move weight regularization out of layers to weights
and remove the duplicated code from all the layers.
The loss and gradients from weight regularization are now
computed by the weight itself.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
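
A minimal sketch of the idea, using hypothetical names (Weight,
WeightRegularizer, getRegularizationLoss, calcRegularizationGradient)
rather than the actual nntrainer API: the weight object owns its
regularizer type and constant, reports its own regularization loss, and
adds the regularization term to its gradient, so individual layers no
longer need to carry this code.

    #include <cstddef>
    #include <numeric>
    #include <vector>

    // Hypothetical regularizer setting owned by the weight itself.
    enum class WeightRegularizer { NONE, L2NORM };

    class Weight {
    public:
      Weight(std::vector<float> var, WeightRegularizer reg, float reg_const) :
        variable(std::move(var)),
        grad(variable.size(), 0.0f),
        regularizer(reg),
        regularizer_constant(reg_const) {}

      // Regularization loss contributed by this weight alone (0.5 * c * ||W||^2).
      float getRegularizationLoss() const {
        if (regularizer == WeightRegularizer::L2NORM) {
          float sq = std::inner_product(variable.begin(), variable.end(),
                                        variable.begin(), 0.0f);
          return 0.5f * regularizer_constant * sq;
        }
        return 0.0f;
      }

      // Add the regularization term (c * W) on top of the data gradient.
      void calcRegularizationGradient() {
        if (regularizer == WeightRegularizer::L2NORM) {
          for (std::size_t i = 0; i < variable.size(); ++i)
            grad[i] += regularizer_constant * variable[i];
        }
      }

      std::vector<float> variable;
      std::vector<float> grad;
      WeightRegularizer regularizer;
      float regularizer_constant;
    };

In this sketch the model-level loss pass would simply sum
getRegularizationLoss() over all weights, and the backward pass would
call calcRegularizationGradient() on each weight before the optimizer
step, instead of each layer reimplementing the regularization math.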
nntrainer/layers/bn_layer.cpp
nntrainer/layers/conv2d_layer.cpp
nntrainer/layers/fc_layer.cpp
nntrainer/layers/layer.cpp
nntrainer/layers/layer_internal.h
nntrainer/models/neuralnet.cpp
nntrainer/tensor/weight.cpp
nntrainer/tensor/weight.h
nntrainer/utils/parse_util.cpp