[layer] perform layer context check on first forward
author Parichay Kapoor <pk.kapoor@samsung.com>
Thu, 2 Dec 2021 08:37:50 +0000 (17:37 +0900)
committer Jijoong Moon <jijoong.moon@samsung.com>
Fri, 3 Dec 2021 05:46:00 +0000 (14:46 +0900)
commit 769c1f7b99aa10033171ef1f5e0c69eebb9b3c30
tree 87253486c8bd8c6c053e73d51e4d1908bc50142f
parent e080f6525d5b27ecacba35c8363df7ffeb9164d4
[layer] perform layer context check on first forward

This patch enables the layer context check on the first forward pass itself.
This revealed a bug in the forward of the mol attention layer which was
previously only surfacing during calcGradient.
The corresponding fix for the mol attention layer is added.
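
Moving the check to the first forward means any mismatch between what a layer
registered at finalize time and what is actually bound in its run context is
caught immediately, rather than later in calcGradient. A minimal sketch of the
idea follows; the names (RunContextView, validateContext, forwardImpl) are
hypothetical placeholders, not the actual nntrainer layer_context API.

  // sketch: validate the run context once, on the first forward call
  #include <stdexcept>
  #include <string>
  #include <utility>
  #include <vector>

  struct RunContextView {
    std::vector<std::string> inputs;  // tensor names bound at run time
    std::vector<std::string> outputs;
  };

  class CheckedLayer {
  public:
    CheckedLayer(std::vector<std::string> expected_in,
                 std::vector<std::string> expected_out) :
      expected_in_(std::move(expected_in)),
      expected_out_(std::move(expected_out)) {}

    void forward(const RunContextView &ctx) {
      if (!context_checked_) {
        validateContext(ctx); // performed only on the first forward
        context_checked_ = true;
      }
      forwardImpl(ctx);
    }

  private:
    void validateContext(const RunContextView &ctx) const {
      // fail fast if the bound tensors differ from the finalized spec
      if (ctx.inputs != expected_in_ || ctx.outputs != expected_out_)
        throw std::invalid_argument(
          "layer context does not match the finalized layer spec");
    }

    void forwardImpl(const RunContextView &) { /* actual computation */ }

    std::vector<std::string> expected_in_, expected_out_;
    bool context_checked_ = false;
  };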

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
nntrainer/layers/attention_layer.cpp
nntrainer/layers/layer_context.cpp
nntrainer/layers/mol_attention_layer.cpp
nntrainer/layers/mol_attention_layer.h
nntrainer/layers/time_dist.cpp
test/input_gen/genModelTests_v2.py
test/input_gen/genModelsRecurrent_v2.py
test/unittest/layers/layers_golden_tests.cpp
test/unittest/layers/unittest_layer_node.cpp
test/unittest/layers/unittest_layers_mol_attention.cpp