[test] Add unittest for attention layer
author Parichay Kapoor <pk.kapoor@samsung.com>
Thu, 7 Oct 2021 08:34:31 +0000 (17:34 +0900)
committer Jijoong Moon <jijoong.moon@samsung.com>
Fri, 8 Oct 2021 06:30:06 +0000 (15:30 +0900)
commit b3d18482b375d285dbb2af5fc6894bbb05dd93d2
tree a4c02c116c7d41dde19526b1d78b3b3b5491a318
parent 65c7a2e7bf78a0fca1d17241e9a2cb6a53dd5f2c
[test] Add unittest for attention layer

This patch adds unit tests for the attention layer.
- The backwarding implementation of the attention layer is fixed
- Unit tests with wider coverage are added
- The layer golden test is updated to generate float input data, which is
needed for the attention layer
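As a hedged illustration of what backwarding for a simple dot-product attention layer involves (this is a self-contained NumPy sketch; the function and variable names are illustrative and are not nntrainer's actual API, and the real attention_layer.cpp implementation may differ in details such as masking or scaling):

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention_forward(query, value):
    # query: (batch, Tq, D), value: (batch, Tv, D)
    # scores: similarity of each query step against each value step
    scores = query @ value.transpose(0, 2, 1)        # (batch, Tq, Tv)
    weights = softmax(scores)                        # attention distribution
    output = weights @ value                         # weighted sum of values
    return output, weights

def attention_backward(query, value, weights, d_output):
    # gradient w.r.t. value from the weighted-sum path
    d_value = weights.transpose(0, 2, 1) @ d_output  # (batch, Tv, D)
    # gradient w.r.t. the attention weights
    d_weights = d_output @ value.transpose(0, 2, 1)  # (batch, Tq, Tv)
    # softmax backward: dS = W * (dW - sum(dW * W, axis=-1))
    d_scores = weights * (
        d_weights - (d_weights * weights).sum(axis=-1, keepdims=True))
    # gradients through the score computation (value feeds both paths)
    d_query = d_scores @ value
    d_value += d_scores.transpose(0, 2, 1) @ query
    return d_query, d_value
```

A reference like this is also how the golden-test data for such a layer is typically produced: random float32 inputs are fed through the forward pass, and the recorded inputs, outputs, and gradients are compared against the C++ implementation in the unit test.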

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
nntrainer/layers/attention_layer.cpp
packaging/unittest_layers_v2.tar.gz
test/input_gen/genLayerTests.py
test/input_gen/recorder.py
test/unittest/layers/unittest_layers_attention.cpp