This PR fixes an incorrect array size for `lora_idx` in the fully connected layer.
Specifically, the array is now four elements long, holding one index each for loraA, loraB, loraTmp, and loraOut.
**Self-evaluation:**
1. Build test: [X] Passed [ ] Failed [ ] Skipped
2. Run test: [X] Passed [ ] Failed [ ] Skipped
Signed-off-by: Donghyeon Jeong <dhyeon.jeong@samsung.com>
lora_scaling - scaling factor of LoRA apply, i.e.,
lora_scaling = alpha / lora_rank */
std::array<unsigned int, 2> weight_idx; /**< indices of the weights */
- std::array<unsigned int, 2> lora_idx; /**< indices of the lora weights */
+ std::array<unsigned int, 4> lora_idx; /**< indices of the lora weights */
};
} // namespace nntrainer