[Fix/TFlite] Fix tflite allocation
authorJihoon Lee <jhoon.it.lee@samsung.com>
Mon, 18 Jan 2021 06:58:39 +0000 (15:58 +0900)
committerJijoong Moon <jijoong.moon@samsung.com>
Mon, 25 Jan 2021 03:15:10 +0000 (12:15 +0900)
commitfb0611f105b093657941e1b984902d79e3ef8f65
tree8656cc478104fbaa2a9890c75c784421e3513599
parent076bdf8ba5d5cde693161ca659a43e2a35713667
[Fix/TFlite] Fix tflite allocation

Now, memory allocation is handled outside of each layer.
Accordingly, the output tensor should not be allocated inside a layer.

For the same reason, loss layer backwarding needs a fix; for now it is
commented out and will be handled soon.

This patch handles the issue for the tflite layer.

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
nntrainer/layers/loss_layer.cpp
nntrainer/layers/tflite_layer.cpp