[pooling] Do not allocate memory in initialize
author     Parichay Kapoor <pk.kapoor@samsung.com>
           Mon, 4 Jan 2021 10:13:59 +0000 (19:13 +0900)
committer  Jijoong Moon <jijoong.moon@samsung.com>
           Mon, 25 Jan 2021 02:59:21 +0000 (11:59 +0900)
commit     687190c5d4b38b3bf53012661464ca8621cf45c2
tree       db87cc810996ad4d0c043a8e21ff1852588d8ffa
parent     a5f0c6d2aba3d5d9e2042606e9f0de085cf6a345
[pooling] Do not allocate memory in initialize

Setting the batch size in initialize for the pooling layer allocates memory.
However, the final batch size is allowed to change for inference/training,
which unnecessarily increases the peak memory requirement.
For now, this memory is allocated during forwarding.
Later, this will be handled as a tensor by the manager once the int data type is supported.
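
A minimal sketch of the intended pattern follows, with hypothetical names (Pooling2D, initialize, forwarding, max_idx_) that only loosely mirror nntrainer's pooling2d_layer and are not its actual API: initialize records shapes only, and the index buffer is sized lazily in forwarding once the effective batch size is known.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical pooling layer skeleton illustrating deferred allocation.
class Pooling2D {
public:
  // initialize() only records shape information; it no longer sizes the
  // index buffer, so a later change of batch size adds no peak memory.
  void initialize(size_t channels, size_t out_h, size_t out_w) {
    channels_ = channels;
    out_h_ = out_h;
    out_w_ = out_w;
  }

  // The buffer of pooling argmax indices is (re)allocated in forwarding,
  // once the batch size actually used for this run is known.
  void forwarding(size_t batch) {
    size_t needed = batch * channels_ * out_h_ * out_w_;
    if (max_idx_.size() != needed)
      max_idx_.resize(needed); // allocate with the actual batch size
    // ... compute pooling and record argmax positions in max_idx_ ...
  }

private:
  size_t channels_ = 0, out_h_ = 0, out_w_ = 0;
  std::vector<size_t> max_idx_; // later to become a manager-owned int tensor
};

int main() {
  Pooling2D pool;
  pool.initialize(/*channels=*/3, /*out_h=*/8, /*out_w=*/8);
  pool.forwarding(/*batch=*/4); // training batch size
  pool.forwarding(/*batch=*/1); // inference batch size; buffer resized to fit
  return 0;
}
```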

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
nntrainer/layers/pooling2d_layer.cpp
nntrainer/layers/pooling2d_layer.h
nntrainer/models/neuralnet.cpp
nntrainer/models/neuralnet.h
nntrainer/tensor/manager.h