[ Bug ] Fix bug in reading the weights of the batch normalization layer
author jijoong.moon <jijoong.moon@samsung.com>
Wed, 21 Jun 2023 06:53:52 +0000 (15:53 +0900)
committer Jijoong Moon <jijoong.moon@samsung.com>
Fri, 23 Jun 2023 02:22:12 +0000 (11:22 +0900)
commit ec3cfc3d298a7116000318fe5c0d9842495fbcb4
tree 8aeca203431ad696b55ac5f686669c3435c45535
parent 649db287e0469197fb49678339ed41b3485ead74
[ Bug ] Fix bug in reading the weights of the batch normalization layer

There is a bug when the model loads the data for the batch normalization
layer.

During requestWeights setup in the manager, the maximum execution order
is added to the gradient for gradient clipping, but it was also added to
the variable weight. This PR fixes it.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
nntrainer/tensor/manager.cpp