[Layer] Fix logic: SwiGLU Layer Training Incompatibility
author    Donghyeon Jeong <dhyeon.jeong@samsung.com>
Thu, 27 Jun 2024 04:24:48 +0000 (13:24 +0900)
committer MyungJoo Ham <myungjoo.ham@samsung.com>
Fri, 28 Jun 2024 06:36:14 +0000 (15:36 +0900)
Currently, the SwiGLU layer implemented with OpenCL operations does not support training/backpropagation.
Consequently, supportBackwarding() is updated to return false so the layer is reported as non-trainable.
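
For context, a minimal C++ sketch of why reporting false matters. This is not nntrainer's actual trainer code; Layer, SwiGLULayerCl, and checkTrainable below are illustrative stand-ins, with only supportBackwarding() taken from the diff:

```cpp
#include <iostream>
#include <stdexcept>

// Hypothetical minimal layer interface mirroring the virtual in the diff.
struct Layer {
  virtual ~Layer() = default;
  virtual bool supportBackwarding() const = 0;
};

// Stand-in for the OpenCL SwiGLU layer: backwarding is not implemented,
// so it must report false rather than true.
struct SwiGLULayerCl : Layer {
  bool supportBackwarding() const override { return false; }
};

// Hypothetical trainer-side guard: reject layers that cannot backpropagate
// instead of silently producing wrong gradients during training.
void checkTrainable(const Layer &layer) {
  if (!layer.supportBackwarding())
    throw std::invalid_argument("layer does not support backwarding");
}

int main() {
  SwiGLULayerCl swiglu;
  try {
    checkTrainable(swiglu); // throws: the OpenCL SwiGLU layer is inference-only
  } catch (const std::invalid_argument &e) {
    std::cout << "training rejected: " << e.what() << '\n';
  }
}
```

With the old return value of true, such a guard would pass and training would proceed through a layer that has no backward path; returning false lets the framework fail fast instead.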

**Self-evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test:   [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Donghyeon Jeong <dhyeon.jeong@samsung.com>
nntrainer/layers/cl_layers/swiglu_cl.h

index 3001c527ff13023da288e4668bc339ac02758411..a96211ee3ee40e808210d286d6214066c0a6d610 100644 (file)
@@ -69,7 +69,7 @@ public:
   /**
    * @copydoc bool supportBackwarding() const
    */
-  bool supportBackwarding() const override { return true; };
+  bool supportBackwarding() const override { return false; };
 
   /**
    * @copydoc Layer::exportTo(Exporter &exporter, ExportMethods method)