[layer] refactor num of inputs/outputs
author    Parichay Kapoor <pk.kapoor@samsung.com>
          Thu, 11 Mar 2021 11:50:14 +0000 (20:50 +0900)
committer Jijoong Moon <jijoong.moon@samsung.com>
          Mon, 15 Mar 2021 02:13:48 +0000 (11:13 +0900)
commit    f4a500e17ebfb895005b73eb77b654f6f8ae5769
tree      86caf759156b5c23fa1b15752de33388151c79e0
parent    6b8f921ec985da1a9414c2b4d1480b9ea76bee5e
[layer] refactor num of inputs/outputs

Refactor the layer to properly handle the number of inputs and outputs,
so that the count stays consistent with the input/output dimensions and
the input/output data.
This is not yet maintained for the input/output layers, as these
properties will move out of layers in the next commit.
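
A minimal sketch of the invariant this refactor enforces, assuming
hypothetical member names (setNumInputs, input_dim, net_input, etc.)
that need not match the actual nntrainer API:

    #include <vector>

    struct TensorDim {};  // stand-in for nntrainer's tensor dimension type
    struct NetBuffer {};  // stand-in for the in/out data buffers

    class LayerSketch {
    public:
      void setNumInputs(unsigned int n) {
        num_inputs = n;
        input_dim.resize(n); // dimensions always match the declared count
        net_input.resize(n); // in data always matches the declared count
      }
      void setNumOutputs(unsigned int n) {
        num_outputs = n;
        output_dim.resize(n);
        net_hidden.resize(n);
      }
    private:
      unsigned int num_inputs = 1;
      unsigned int num_outputs = 1;
      std::vector<TensorDim> input_dim, output_dim;
      std::vector<NetBuffer> net_input, net_hidden;
    };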

Also fixed a bug in output_layer, which used to clear the output
dimensions, leaving the number of outputs at 0 for some time.
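
Illustrative only (not the exact diff): the kind of clear-then-refill
pattern the fix replaces, using a hypothetical setter over the output
dimension vector:

    #include <vector>

    struct TensorDim {};  // stand-in for nntrainer's tensor dimension type

    // Before (buggy pattern): clear() drops the output count to 0 until
    // the loop re-fills the vector, so num outputs reads as 0 meanwhile.
    void setOutputDimBuggy(std::vector<TensorDim> &output_dim,
                           const TensorDim &in, unsigned int num_outputs) {
      output_dim.clear();
      for (unsigned int i = 0; i < num_outputs; ++i)
        output_dim.push_back(in);
    }

    // After (fixed pattern): the vector size stays consistent with
    // num_outputs throughout.
    void setOutputDimFixed(std::vector<TensorDim> &output_dim,
                           const TensorDim &in, unsigned int num_outputs) {
      output_dim.resize(num_outputs);
      for (auto &d : output_dim)
        d = in;
    }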

NetworkGraph::setNumNetBufferSize is also removed, as the above
refactoring handles this at the layer level.

See Also #986

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
17 files changed:
nntrainer/graph/network_graph.cpp
nntrainer/graph/network_graph.h
nntrainer/layers/addition_layer.cpp
nntrainer/layers/addition_layer.h
nntrainer/layers/bn_layer.cpp
nntrainer/layers/concat_layer.cpp
nntrainer/layers/concat_layer.h
nntrainer/layers/conv2d_layer.cpp
nntrainer/layers/embedding.cpp
nntrainer/layers/fc_layer.cpp
nntrainer/layers/flatten_layer.cpp
nntrainer/layers/layer.cpp
nntrainer/layers/layer_internal.h
nntrainer/layers/output_layer.cpp
nntrainer/layers/output_layer.h
nntrainer/models/neuralnet.cpp
test/unittest/unittest_nntrainer_layers.cpp