[layer] revise attention layers to apply softmax as inplace
author hyeonseok lee <hs89.lee@samsung.com>
Wed, 7 Sep 2022 04:31:24 +0000 (13:31 +0900)
committer Jijoong Moon <jijoong.moon@samsung.com>
Thu, 9 Feb 2023 08:11:05 +0000 (17:11 +0900)
commit cb2a96894aa99285b569cf3573fee5ceae51a47a
tree 866fc89d7540e0cd5d473cda18cdd0a05e864ffe
parent 71057a2d72817be8511121cc8103e3ad909d0cb5
[layer] revise attention layers to apply softmax as inplace

 - Remove the attention_score tensor in the attention/multi_head_attention layers so that softmax can be applied in place
 - Change the tensor lifespan of fc_out to FORWARD_FUNC_LIFESPAN
 - Remove the unused enum updated_state

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
nntrainer/layers/attention_layer.cpp
nntrainer/layers/attention_layer.h
nntrainer/layers/mol_attention_layer.cpp
nntrainer/layers/multi_head_attention_layer.cpp
nntrainer/layers/multi_head_attention_layer.h