projects / platform / core / ml / nntrainer.git / commitdiff
(parent: 19b72d1)
[weight] weight regularization and decay with clip
author    Parichay Kapoor <pk.kapoor@samsung.com>
          Fri, 18 Feb 2022 11:02:51 +0000 (20:02 +0900)
committer Jijoong Moon <jijoong.moon@samsung.com>
          Mon, 21 Feb 2022 13:44:22 +0000 (22:44 +0900)
Enable weight regularization and decay with weight clipping.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
nntrainer/models/neuralnet.cpp
diff --git a/nntrainer/models/neuralnet.cpp b/nntrainer/models/neuralnet.cpp
index ac54e38..8eaba4d 100644 (file)
--- a/nntrainer/models/neuralnet.cpp
+++ b/nntrainer/models/neuralnet.cpp
@@ -308,6 +308,8 @@
 void NeuralNetwork::backwarding(int iteration) {
   std::function<void(Weight &, int)> apply_grad_clip_op =
     [opt_ = opt.get()](Weight &w, int iteration) -> void {
+      w.calcRegularizationGradient();
+      w.calcWeightDecayGradient();
       RunOptimizerContext opt_context(&w, iteration);
       opt_->applyGradient(opt_context);
     };