[ Recurrent ] Implement Dropout for Recurrent Net
authorjijoong.moon <jijoong.moon@samsung.com>
Tue, 22 Jun 2021 00:53:12 +0000 (09:53 +0900)
committerJijoong Moon <jijoong.moon@samsung.com>
Fri, 2 Jul 2021 04:45:09 +0000 (13:45 +0900)
commit369eb18771229c118ecb7e71fd82fcfccd5270e3
tree8c216fa60346dd5f4d0997b4c37da9a553c48f4a
parentf4fabfeb65ad61b3faf0d38c43b6d474ddef5deb
[ Recurrent ] Implement Dropout for Recurrent Net

This commit introduces dropout for recurrent networks. A dropout property is
added; during training, any element whose corresponding value in a random
tensor is smaller than the dropout rate is set to zero. Each element that is
not zeroed is scaled by 1.0/(1.0-dropout), so the expected magnitude of the
activations is preserved.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
api/ccapi/include/layer.h
nntrainer/layers/gru.cpp
nntrainer/layers/gru.h
nntrainer/layers/layer_internal.h
nntrainer/layers/lstm.cpp
nntrainer/layers/lstm.h
nntrainer/layers/rnn.cpp
nntrainer/layers/rnn.h
nntrainer/tensor/tensor.cpp
nntrainer/tensor/tensor.h
nntrainer/utils/parse_util.cpp