[Layer] add tanh-based approximate GELU activation function
author     Seungbaek Hong <sb92.hong@samsung.com>
           Mon, 1 Jul 2024 11:41:52 +0000 (20:41 +0900)
committer  MyungJoo Ham <myungjoo.ham@samsung.com>
           Thu, 11 Jul 2024 05:51:30 +0000 (14:51 +0900)
commit     64bd12e479bc038a1bdf875e0d4bd9e735210858
tree       73e02a84fdec792a41566a0f4012f2f4d58a4702
parent     147dbe1a564a6c190c459e260e1677079361177d
[Layer] add tanh-based approximate GELU activation function

- add a tanh-based approximate GELU (tanh GELU) for Vision Transformer.
- rename quick GELU to sigmoid GELU, since it is a sigmoid-based approximate GELU (reference formulas for both are sketched below).
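
For reference, a minimal standalone sketch of the two approximations, independent of the nntrainer code (function names here are illustrative, not the identifiers used in acti_func.h):

```cpp
#include <cmath>
#include <cstdio>

// Tanh-based approximate GELU (the variant added in this commit):
//   0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
static float tanh_gelu(float x) {
  const float k = 0.7978845608028654f; // sqrt(2 / pi)
  return 0.5f * x * (1.0f + std::tanh(k * (x + 0.044715f * x * x * x)));
}

// Sigmoid-based approximate GELU (formerly called "quick gelu"):
//   x * sigmoid(1.702 * x)
static float sigmoid_gelu(float x) {
  return x / (1.0f + std::exp(-1.702f * x));
}

int main() {
  // Compare the two approximations on a few sample inputs.
  for (float x : {-2.0f, -1.0f, 0.0f, 1.0f, 2.0f}) {
    std::printf("x=%+.1f  tanh_gelu=%+.6f  sigmoid_gelu=%+.6f\n", x,
                tanh_gelu(x), sigmoid_gelu(x));
  }
  return 0;
}
```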

**Self evaluation:**
1. Build test: [X] Passed [ ] Failed [ ] Skipped
2. Run test: [X] Passed [ ] Failed [ ] Skipped

Signed-off-by: Seungbaek Hong <sb92.hong@samsung.com>
nntrainer/layers/acti_func.h
nntrainer/layers/common_properties.h