platform/core/ml/nntrainer.git
3 years ago[loss] BugFix for regularization loss
Parichay Kapoor [Mon, 2 Aug 2021 08:55:05 +0000 (17:55 +0900)]
[loss] BugFix for regularization loss

This patch fixes the regularization loss calculation.
Regularization loss must be computed before the weight is updated during
backwarding, but this was not being done.
This patch provides the corresponding fix.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
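A minimal sketch of the ordering this fix restores (illustrative only, not the actual nntrainer code; the helper names are hypothetical): the regularization term is read from the current weights before the optimizer step mutates them.

```cpp
#include <cstddef>
#include <vector>

// L2 weight-decay term computed from the current weights
float l2RegLoss(const std::vector<float> &w, float decay) {
  float sum = 0.0f;
  for (float x : w)
    sum += x * x;
  return 0.5f * decay * sum;
}

void backwardStep(std::vector<float> &w, const std::vector<float> &grad,
                  float lr, float decay, float &reg_loss_out) {
  reg_loss_out = l2RegLoss(w, decay); // 1) record regularization loss first
  for (std::size_t i = 0; i < w.size(); ++i)
    w[i] -= lr * (grad[i] + decay * w[i]); // 2) only then update the weight
}
```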
3 years ago[test] Enable c/cc disabled tests
Parichay Kapoor [Fri, 30 Jul 2021 07:11:08 +0000 (16:11 +0900)]
[test] Enable c/cc disabled tests

This patch enables the disabled c/cc tests.
The corresponding required updates to the tests are also added.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[dataset] Add file producer
Jihoon Lee [Mon, 12 Jul 2021 12:28:12 +0000 (21:28 +0900)]
[dataset] Add file producer

This patch adds a file producer which abstracts reading a raw file. This
component also makes sure that various kinds of input shapes are
accepted.

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
3 years ago[CAPI] Implement ml_train_model_get_input|output_dims
hyeonseok lee [Wed, 21 Jul 2021 11:22:30 +0000 (20:22 +0900)]
[CAPI] Implement ml_train_model_get_input|output_dims

 - Implement ml_train_model_get_input_dimensions
 - Implement ml_train_model_get_output_dimensions
 - Add a unittest for the implemented APIs

Self evaluation:

Build test: [X]Passed [ ]Failed [ ]Skipped
Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
3 years ago[tensor_dim] package tensor_dim.h with ccapi
hyeonseok lee [Thu, 29 Jul 2021 10:47:44 +0000 (19:47 +0900)]
[tensor_dim] package tensor_dim.h with ccapi

 - Package tensor_dim.h with ccapi
 - MAXDIM now belongs to the TensorDim class

Self evaluation:

Build test: [X]Passed [ ]Failed [ ]Skipped
Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
3 years ago[CAPI] Expose layer enums
Jihoon Lee [Tue, 27 Jul 2021 02:41:07 +0000 (11:41 +0900)]
[CAPI] Expose layer enums

This patch adds the layer enums to be exposed via the C API.

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
3 years ago[CAPI] Change tensor dim -> information
Jihoon Lee [Wed, 28 Jul 2021 03:11:26 +0000 (12:11 +0900)]
[CAPI] Change tensor dim -> information

This patch updates the description of tensors_info, while explicitly
stating that the information object is newly created.

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Cc: Hyeonseok Lee<hs89.lee@samsung.com>
Cc: Parichay Kapoor<pk.kapoor@samsung.com>
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
3 years ago[ Layer ] Change OutputLayer to MultiOutLayer
jijoong.moon [Wed, 28 Jul 2021 02:41:10 +0000 (11:41 +0900)]
[ Layer ] Change OutputLayer to MultiOutLayer

Currently the feature of OutputLayer does not match its name; it is more
like a multi-out layer. The type name and others are already multiout.

This commit includes,
  . Class name changed : OutputLayer --> MultiOutLayer
  . File name changed : output_layer.x --> multiout_layer.x

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
3 years ago[CAPI] Add ml_train_model_get_layer to internal
Jihoon Lee [Mon, 26 Jul 2021 09:08:08 +0000 (18:08 +0900)]
[CAPI] Add ml_train_model_get_layer to internal

This patch adds ml_train_model_get_layer to the internal API.

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
3 years ago[CAPI] Deprecate old dataset apis
Jihoon Lee [Mon, 26 Jul 2021 08:35:32 +0000 (17:35 +0900)]
[CAPI] Deprecate old dataset apis

This patch deprecates the old dataset APIs.

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
3 years ago[Dataset] Add callback generator
Jihoon Lee [Mon, 12 Jul 2021 09:36:49 +0000 (18:36 +0900)]
[Dataset] Add callback generator

This patch adds a callback producer to abstract the generator.

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
3 years ago[ GRU ] Add dropout mask
jijoong.moon [Fri, 23 Jul 2021 00:02:14 +0000 (09:02 +0900)]
[ GRU ] Add dropout mask

This commit includes:
. Implementation of the dropout mask

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
3 years ago[Dataset] Add test for the random dataproducer
Jihoon Lee [Mon, 12 Jul 2021 06:47:44 +0000 (15:47 +0900)]
[Dataset] Add test for the random dataproducer

**Changes proposed in this PR:**
- Add random data producer tests
- Add abstract test for more data producers

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
3 years ago[ RNNT LOSS ] Add skeleton code for rnnt loss
jijoong.moon [Thu, 22 Jul 2021 06:35:30 +0000 (15:35 +0900)]
[ RNNT LOSS ] Add skeleton code for rnnt loss

This commit includes:
  . Skeleton Code for RNNT Loss as a Custom Loss Layer
  . Semantic Unit Test

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
3 years ago[Android] Fix undefined definition
Jihoon Lee [Mon, 26 Jul 2021 10:47:28 +0000 (19:47 +0900)]
[Android] Fix undefined definition

This patch defines ML_API_COMMON_ROOT, which was previously undefined.

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
3 years ago[ LSTM ] Add dropout mask
jijoong.moon [Thu, 22 Jul 2021 23:55:44 +0000 (08:55 +0900)]
[ LSTM ] Add dropout mask

In this commit,
. dropout mask is added.
. calculation of dropout is implemented.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
3 years ago[ RNN ] Add dropout mask
jijoong.moon [Thu, 22 Jul 2021 23:39:49 +0000 (08:39 +0900)]
[ RNN ] Add dropout mask

This commit includes :
  . implementation of mask for dropout

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
3 years ago[Android] Add missing dependency
Jihoon Lee [Mon, 26 Jul 2021 05:45:57 +0000 (14:45 +0900)]
[Android] Add missing dependency

This patch adds a missing dependency to the Android application build.

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
3 years ago[ Fix ] Fix android build error with meson
jijoong.moon [Mon, 26 Jul 2021 01:27:56 +0000 (10:27 +0900)]
[ Fix ] Fix android build error with meson

This commit includes,
  . Fix android build issues
    : openblas link and download

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
3 years ago[Fix] Initialize member variable
hyeonseok lee [Thu, 22 Jul 2021 04:33:17 +0000 (13:33 +0900)]
[Fix] Initialize member variable

 - Initialize std::function variable in constructor

Self evaluation:

Build test: [X]Passed [ ]Failed [ ]Skipped
Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
3 years ago[ TEST ] fix compilation error
jijoong.moon [Thu, 22 Jul 2021 13:03:51 +0000 (22:03 +0900)]
[ TEST ] fix compilation error

This commit includes,
  . compilation error fix : unused + wrong type comparison

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
3 years ago[unittest] multi output unittest
hyeonseok lee [Thu, 8 Jul 2021 10:32:09 +0000 (19:32 +0900)]
[unittest] multi output unittest

 - Modify unittest to test multi output case
 - Make setLabel function in neuralnet
   to set labels at all loss layers

Resolve #1367

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
3 years ago[network_graph] revise addlosslayer for multi output
hyeonseok lee [Fri, 25 Jun 2021 12:32:48 +0000 (21:32 +0900)]
[network_graph] revise addlosslayer for multi output

 - reorder addlosslayer to run before topologicalsort
 - Replace the node_list rather than deleting it when the loss is entropy

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
3 years ago[graph_core] Implement input_list, output_list for multi input, output
hyeonseok lee [Fri, 25 Jun 2021 06:39:53 +0000 (15:39 +0900)]
[graph_core] Implement input_list, output_list for multi input, output

- Make input_list, output_list and their getters to support multiple inputs and outputs

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
3 years ago[layer] Update dropout layer to V2
Parichay Kapoor [Wed, 21 Jul 2021 07:47:32 +0000 (16:47 +0900)]
[layer] Update dropout layer to V2

Update dropout layer to V2 design.

Signed-off-by: Parichay Kapoor <kparichay@gmail.com>
3 years ago[coverity] Fixes related to coverity
Parichay Kapoor [Wed, 21 Jul 2021 03:18:46 +0000 (12:18 +0900)]
[coverity] Fixes related to coverity

Add fixes related to coverity.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[svace] Fix svace issues
Parichay Kapoor [Tue, 20 Jul 2021 11:15:50 +0000 (20:15 +0900)]
[svace] Fix svace issues

Fix svace issues for layer_v2 branch

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[layer_v2] Merge commit for branch layer_v2
Parichay Kapoor [Tue, 20 Jul 2021 09:51:19 +0000 (18:51 +0900)]
[layer_v2] Merge commit for branch layer_v2

This commit forms the merge commit, including minor updates made while
rebasing layer_v2 onto the main branch so that it can be applied.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[LayerV1] Delete for LayerV1
Parichay Kapoor [Thu, 15 Jul 2021 02:05:59 +0000 (11:05 +0900)]
[LayerV1] Delete for LayerV1

This patch deletes the LayerV1 headers and their implementations.
Some of the relevant code from LayerV1 is moved to either LayerNode or
LayerDevel.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[layernode] Add print functionality to LayerNode
Parichay Kapoor [Wed, 14 Jul 2021 12:48:44 +0000 (21:48 +0900)]
[layernode] Add print functionality to LayerNode

Add print functionality from LayerV1 to LayerNode.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[layer] Update NNStreamer layer to V2
Parichay Kapoor [Wed, 14 Jul 2021 12:10:50 +0000 (21:10 +0900)]
[layer] Update NNStreamer layer to V2

Update NNStreamer layer to V2 design.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[layer] Enable permute layer for V2
Parichay Kapoor [Wed, 14 Jul 2021 12:03:22 +0000 (21:03 +0900)]
[layer] Enable permute layer for V2

Enable permute layer for V2 design.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[unittest] Enable backbone unittests
Parichay Kapoor [Wed, 14 Jul 2021 11:33:17 +0000 (20:33 +0900)]
[unittest] Enable backbone unittests

Enable backbone unittests for modelfile

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[layer] Enable time dist layer for V2
Parichay Kapoor [Wed, 14 Jul 2021 10:49:50 +0000 (19:49 +0900)]
[layer] Enable time dist layer for V2

Enable time dist layer for LayerV2 design.
This patch tries to simulate the InitContext and the RunContext inside
the time dist layer so that proper shapes of variables can be passed to
the internal layer. Further, context changing function calls are
replicated on the actual InitContext/RunContext by detecting changes and
making those function calls again.

LayerNode was updated to ensure that a layer does not get distributed
repeatedly.
Some more getter APIs were added for TensorDim and LayerContext.

Unittests related to distribute have also been enabled with this patch.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[layernode] Remove LayerNode dependence on LayerV1
Parichay Kapoor [Wed, 14 Jul 2021 08:46:55 +0000 (17:46 +0900)]
[layernode] Remove LayerNode dependence on LayerV1

Remove LayerNode dependence on LayerV1.
Further, NetworkGraph dependence on LayerV1 is also removed.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[weight] Add missing initialization in constructor
Parichay Kapoor [Wed, 14 Jul 2021 08:46:00 +0000 (17:46 +0900)]
[weight] Add missing initialization in constructor

Add missing initialization in constructor for the weight class.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[layer] Initialize missing variables for LayerImpl
Parichay Kapoor [Wed, 14 Jul 2021 08:45:11 +0000 (17:45 +0900)]
[layer] Initialize missing variables for LayerImpl

Add initialization for the missing variables for LayerImpl class.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[application] Add common tests for simpleshot layers
Parichay Kapoor [Tue, 13 Jul 2021 10:18:13 +0000 (19:18 +0900)]
[application] Add common tests for simpleshot layers

Add common standalone tests for the simpleshot layers.
This allows basic semantics testing for the simpleshot layers.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[application] SimpleShot application update for V2
Parichay Kapoor [Mon, 12 Jul 2021 09:21:17 +0000 (18:21 +0900)]
[application] SimpleShot application update for V2

SimpleShot application update for layer V2 design.
Further, task_runner has been updated to use only one kind of loss layer.

Unittests of the layers have not been enabled; they will be enabled with unittest_layers.

Signed-off-by: Parichay Kapoor <kparichay@gmail.com>
3 years ago[application] Update centroid KNN layer for V2
Parichay Kapoor [Mon, 12 Jul 2021 09:20:47 +0000 (18:20 +0900)]
[application] Update centroid KNN layer for V2

Update centroid knn layer for V2 design.

Signed-off-by: Parichay Kapoor <kparichay@gmail.com>
3 years ago[application] Update l2norm layer for V2
Parichay Kapoor [Mon, 12 Jul 2021 09:20:06 +0000 (18:20 +0900)]
[application] Update l2norm layer for V2

Update l2norm layer for V2 design.

Signed-off-by: Parichay Kapoor <kparichay@gmail.com>
3 years ago[application] Update centering layer for V2
Parichay Kapoor [Mon, 12 Jul 2021 09:19:41 +0000 (18:19 +0900)]
[application] Update centering layer for V2

Update centering layer for V2 design.

Signed-off-by: Parichay Kapoor <kparichay@gmail.com>
3 years ago[test] Added common unittest for custom layers
Parichay Kapoor [Tue, 13 Jul 2021 09:51:48 +0000 (18:51 +0900)]
[test] Added common unittest for custom layers

This patch adds common unittests for custom layers.
This demonstrates how layer developers can use the unittests made by
nntrainer developers to test their own layers.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[application] Enable custom application
Parichay Kapoor [Mon, 12 Jul 2021 07:27:05 +0000 (16:27 +0900)]
[application] Enable custom application

Enable custom application for v2 layer design.
This also includes updating the power layer and mae loss layer implementation
with layerv2 and updating their corresponding unittests.

Signed-off-by: Parichay Kapoor <kparichay@gmail.com>
3 years ago[layer] Update plugged layer for V2 design
Parichay Kapoor [Mon, 12 Jul 2021 07:26:15 +0000 (16:26 +0900)]
[layer] Update plugged layer for V2 design

Update plugged layer for V2 design.
The corresponding updates in app_context have also been made.

Signed-off-by: Parichay Kapoor <kparichay@gmail.com>
3 years ago[application] Enable draw classification app
Parichay Kapoor [Mon, 12 Jul 2021 03:04:15 +0000 (12:04 +0900)]
[application] Enable draw classification app

Enable draw classification application.
Corresponding bug fixes are also added.

Signed-off-by: Parichay Kapoor <kparichay@gmail.com>
3 years ago[layer] Enable tflite layer for V2
Parichay Kapoor [Mon, 12 Jul 2021 03:03:34 +0000 (12:03 +0900)]
[layer] Enable tflite layer for V2

This patch enables tflite layer for V2 design.

Signed-off-by: Parichay Kapoor <kparichay@gmail.com>
3 years ago[layer] Remove extra property from embedding
Parichay Kapoor [Sun, 11 Jul 2021 19:15:12 +0000 (04:15 +0900)]
[layer] Remove extra property from embedding

Remove extra property "in_length" from embedding layer.
in_length was used to set and provide the number of inputs to be
provided to the embedding layer. However, this must be fixed
based on the input provided from the previous layer.
This patch removes this property and lets this value be inferred
from the given input dimensions.

Signed-off-by: Parichay Kapoor <kparichay@gmail.com>
3 years ago[applications] Enable working applications with v2
Parichay Kapoor [Sun, 11 Jul 2021 19:05:48 +0000 (04:05 +0900)]
[applications] Enable working applications with v2

Enable working applications with v2.
Some minor updates are added to the applications to match the v2 implementation.

Signed-off-by: Parichay Kapoor <kparichay@gmail.com>
3 years ago[layer] Update concat layer to V2
Parichay Kapoor [Sun, 11 Jul 2021 19:03:58 +0000 (04:03 +0900)]
[layer] Update concat layer to V2

Update concat layer to V2 design.

Signed-off-by: Parichay Kapoor <kparichay@gmail.com>
3 years ago[layer] Update embedding for layerv2
Parichay Kapoor [Sun, 11 Jul 2021 18:44:29 +0000 (03:44 +0900)]
[layer] Update embedding for layerv2

Update embedding layer for layerv2 design.

Signed-off-by: Parichay Kapoor <kparichay@gmail.com>
3 years ago[layer] Enable split layer for V2 design
Parichay Kapoor [Sun, 11 Jul 2021 18:21:29 +0000 (03:21 +0900)]
[layer] Enable split layer for V2 design

Enable split layer for v2 design.

Signed-off-by: Parichay Kapoor <kparichay@gmail.com>
3 years ago[layer] Update preprocess layers for layerV2
Parichay Kapoor [Sun, 11 Jul 2021 14:05:34 +0000 (23:05 +0900)]
[layer] Update preprocess layers for layerV2

Update preprocess layers for layerv2 design.
Enable corresponding unittests.

Signed-off-by: Parichay Kapoor <kparichay@gmail.com>
3 years ago[layer] Update GRU for layer_v2
Parichay Kapoor [Sun, 11 Jul 2021 13:43:48 +0000 (22:43 +0900)]
[layer] Update GRU for layer_v2

Update GRU implementation using layer_v2 design.
Update and enable corresponding unittests.

Signed-off-by: Parichay Kapoor <kparichay@gmail.com>
3 years ago[layer] Update LSTM to request temporary tensors
Parichay Kapoor [Sun, 11 Jul 2021 13:12:56 +0000 (22:12 +0900)]
[layer] Update LSTM to request temporary tensors

Update LSTM implementation to reduce the use of temporary tensors
and request temporary tensors for the ones that are necessary.

Signed-off-by: Parichay Kapoor <kparichay@gmail.com>
3 years ago[layer] Update LSTM for layer_v2
Parichay Kapoor [Sun, 11 Jul 2021 12:54:42 +0000 (21:54 +0900)]
[layer] Update LSTM for layer_v2

Update LSTM for layer_v2.
Also enable corresponding unittests and add common unittests.

Signed-off-by: Parichay Kapoor <kparichay@gmail.com>
3 years ago[rnn] Cleanup RNN implementation
Parichay Kapoor [Thu, 8 Jul 2021 07:26:04 +0000 (16:26 +0900)]
[rnn] Cleanup RNN implementation

Cleanup RNN implementation.
Reduce the usage of temporarily allocated memory and reuse existing memory as much as possible.
Further, request hidden state memory from the context rather than creating it internally.

Signed-off-by: Parichay Kapoor <kparichay@gmail.com>
3 years ago[layer] Update RNN to layerv2
Parichay Kapoor [Thu, 8 Jul 2021 06:50:33 +0000 (15:50 +0900)]
[layer] Update RNN to layerv2

Update RNN to layerv2 design.
Corresponding unittests have been added and enabled.

Signed-off-by: Parichay Kapoor <kparichay@gmail.com>
3 years ago[tests] Add more layer common unittests
Parichay Kapoor [Wed, 7 Jul 2021 13:03:51 +0000 (22:03 +0900)]
[tests] Add more layer common unittests

Add more layer common unittests without involving runContext for the
layers.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[unittest] Enable graph unittests
Parichay Kapoor [Wed, 7 Jul 2021 11:20:34 +0000 (20:20 +0900)]
[unittest] Enable graph unittests

Enable graph unittests which create a resnet-like model for testing.
Enable the corresponding resnet-like addition in the models unittest.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[layer] Update multiout layer for V2
Parichay Kapoor [Wed, 7 Jul 2021 11:15:58 +0000 (20:15 +0900)]
[layer] Update multiout layer for V2

Update multiout layer for V2 design.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[layer] Update addition layer to V2
Parichay Kapoor [Wed, 7 Jul 2021 10:31:10 +0000 (19:31 +0900)]
[layer] Update addition layer to V2

Update addition layer for V2 design.
Add corresponding unittests.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[layer] Remove num_inputs/num_outputs properties
Parichay Kapoor [Wed, 7 Jul 2021 06:55:21 +0000 (15:55 +0900)]
[layer] Remove num_inputs/num_outputs properties

Remove num_inputs and num_outputs layer properties.
The properties were being set with addition, concat and multioutput
layers. However, their usage had been deprecated as this information was
being extracted from the input_layers connections.

This patch removes these properties.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[test] Add semantic tests for activation layer
Parichay Kapoor [Wed, 7 Jul 2021 06:10:31 +0000 (15:10 +0900)]
[test] Add semantic tests for activation layer

Add semantic tests for activation layer

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[test/models] Add golden tests for loss layer
Parichay Kapoor [Wed, 7 Jul 2021 02:56:31 +0000 (11:56 +0900)]
[test/models] Add golden tests for loss layer

Add golden tests for the loss added as a layer.
This patch modifies the existing models test architecture so that a model
can be described with the loss layer added externally.

More tests will be added once more models tests are enabled.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[test/modelfile] Add loss layer unittests
Parichay Kapoor [Wed, 7 Jul 2021 01:48:00 +0000 (10:48 +0900)]
[test/modelfile] Add loss layer unittests

Add loss layer unittests for various loss configurations to be
supported. These tests do not ensure correct graph formation.
This will be done soon with models unittest.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[network] Loss support as a layer
Parichay Kapoor [Wed, 7 Jul 2021 01:44:39 +0000 (10:44 +0900)]
[network] Loss support as a layer

Support for loss as a layer was already added with the API; however, the
verification of this feature was missing.
This patch ensures that the added loss layer is verified and its
required conditions are checked before finalizing the graph.

The corresponding unittests will be added soon.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[pooling] Update global max pooling
Parichay Kapoor [Wed, 7 Jul 2021 05:57:32 +0000 (14:57 +0900)]
[pooling] Update global max pooling

Update global max pooling to work with the requested tensor.
Enable the corresponding unittest as well.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[pooling] Update pooling to use helper tensors
Parichay Kapoor [Tue, 6 Jul 2021 11:14:49 +0000 (20:14 +0900)]
[pooling] Update pooling to use helper tensors

Update the pooling layer to use helper tensors.
Instead of using vector memory, the pooling layer will now use tensors
managed by nntrainer.

As the memory is supposed to hold integers, this patch currently interprets
float memory as integer memory. Note that no tensor operation is
performed on this memory, as it would corrupt the data. Manual reading
and writing of data is done as was done with the vector, but memory
management is moved out of the pooling layer with this patch.

This can be done formally when other data types are supported by the
tensor class.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
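A hedged sketch of the float-as-integer trick described above (the helper functions are hypothetical, not nntrainer API; it assumes the helper tensor is exposed as a raw float buffer): the index bits are copied in and out verbatim, and no arithmetic is ever done on the buffer.

```cpp
#include <cstddef>
#include <cstring>
#include <vector>

// write raw index bits into a float-typed helper buffer, no conversion
inline void store_index(float *buf, std::size_t pos, unsigned int idx) {
  static_assert(sizeof(float) == sizeof(unsigned int), "bit width must match");
  std::memcpy(&buf[pos], &idx, sizeof(idx));
}

// read the raw bits back as an integer index
inline unsigned int load_index(const float *buf, std::size_t pos) {
  unsigned int idx;
  std::memcpy(&idx, &buf[pos], sizeof(idx));
  return idx;
}

int main() {
  std::vector<float> helper(4); // stands in for the requested helper tensor
  store_index(helper.data(), 0, 42u);
  return load_index(helper.data(), 0) == 42u ? 0 : 1;
}
```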
3 years ago[pooling] Update to LayerV2
Parichay Kapoor [Tue, 6 Jul 2021 05:55:58 +0000 (14:55 +0900)]
[pooling] Update to LayerV2

Update pooling layer to Layerv2 design.
Corresponding unittests for common and models are added and enabled
respectively.

setBatch() for layer node has been updated.
Further, some minor updates have been added to layer context to check if
they are ready to use.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[flatten] Update to LayerV2
Parichay Kapoor [Tue, 6 Jul 2021 04:51:53 +0000 (13:51 +0900)]
[flatten] Update to LayerV2

Update flatten layer to LayerV2.
Further, enable all the corresponding unittests.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[conv2d] Update to Layerv2
Parichay Kapoor [Mon, 5 Jul 2021 11:40:51 +0000 (20:40 +0900)]
[conv2d] Update to Layerv2

Update to Layerv2 format for convolution.
Enable the corresponding modelfile and common unittests.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[tensor] Enable requesting additional tensors with batchnorm
Parichay Kapoor [Mon, 5 Jul 2021 07:23:48 +0000 (16:23 +0900)]
[tensor] Enable requesting additional tensors with batchnorm

Enable requesting additional tensors with the batch normalization layer.
This patch also includes the manager allocating the additional tensors
as well as the batch normalization layer requesting the additional
tensors.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
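A rough sketch of the request-then-allocate idea (all names here are hypothetical stand-ins, not the real InitLayerContext/Manager API): the layer records the tensors it needs at init time, and the manager allocates them later.

```cpp
#include <string>
#include <vector>

struct TensorSpec {            // assumed minimal description of a request
  std::vector<int> dim;
  std::string name;
};

class InitContextSketch {      // stand-in for an init-time layer context
public:
  // returns an index the layer later uses to fetch the allocated tensor
  unsigned int requestTensor(const TensorSpec &spec) {
    requests.push_back(spec);
    return static_cast<unsigned int>(requests.size() - 1);
  }
  std::vector<TensorSpec> requests; // the manager reads these and allocates
};

class BatchNormSketch {
public:
  void finalize(InitContextSketch &ctx, const std::vector<int> &channel_dim) {
    // per-channel helper tensors requested up front
    dev_idx = ctx.requestTensor({channel_dim, "deviation"});
    var_idx = ctx.requestTensor({channel_dim, "variance"});
  }
  unsigned int dev_idx = 0, var_idx = 0;
};
```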
3 years ago[batchnorm] Update to LayerV2
Parichay Kapoor [Mon, 5 Jul 2021 07:07:49 +0000 (16:07 +0900)]
[batchnorm] Update to LayerV2

Update batch norm to layer v2 style.
Add corresponding common unittests and enable modelfile
and models unittest.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[test] Add common unittest for layers
Parichay Kapoor [Fri, 2 Jul 2021 06:53:57 +0000 (15:53 +0900)]
[test] Add common unittest for layers

Add common unittests for layers.
Also add validation for the context and some more checks in layer_node.
Further, some checks that already exist in LayerNode have been removed
from LayerImpl.
The common tests have been split into two parts:
1. Standalone: the environment must be manually created in these tests.
Further, these tests should not depend on any implementations/headers
other than the ones on which LayerDevel also depends, like LayerNode,
Manager, etc.
2. Dependent: the environment is created using the other
implementations.
Both of these tests ensure that a layer works in a correct environment and
that the environment creation in the standalone version is also correct.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
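A small illustrative googletest sketch of the "standalone" flavor described above, assuming a hypothetical FakeLayer stand-in rather than the real nntrainer harness: the fixture builds the environment by hand and only exercises behavior every layer should satisfy.

```cpp
#include <gtest/gtest.h>
#include <memory>
#include <string>

struct FakeLayer {                       // stand-in for a layer implementation
  std::string getType() const { return "fake"; }
  bool setProperty(const std::string &) { return true; }
};

class LayerCommonTest : public ::testing::Test {
protected:
  void SetUp() override { layer = std::make_unique<FakeLayer>(); }
  std::unique_ptr<FakeLayer> layer;      // environment created manually
};

TEST_F(LayerCommonTest, TypeIsNotEmpty) {
  EXPECT_FALSE(layer->getType().empty());
}

TEST_F(LayerCommonTest, AcceptsProperty) {
  EXPECT_TRUE(layer->setProperty("name=fake0"));
}
```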
3 years ago[layercontext] Add unsafe methods
Parichay Kapoor [Wed, 7 Jul 2021 01:11:25 +0000 (10:11 +0900)]
[layercontext] Add unsafe methods

The getGradient methods always check that memory has been allocated
for the gradients. However, for labels we do not want to allocate memory,
as the label is allocated by the dataset. Yet, we still want to be able to
set it at the output gradient. So, a getGradientUnsafe is added which
allows returning an empty tensor reference so that the gradient can be set.

This new interface does not provide new functionality for the layer
developer but rather provides an unsafe method with the existing
functionality.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
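A simplified sketch of the checked vs. unsafe accessor pair (hypothetical types; the real Var_Grad/Tensor interfaces differ): the unsafe getter skips the allocation check so an empty gradient slot can still be handed out and assigned, e.g. with a label.

```cpp
#include <stdexcept>
#include <vector>

struct TensorSketch {
  std::vector<float> data;                // empty == not yet allocated
  bool empty() const { return data.empty(); }
};

class VarGradSketch {
public:
  TensorSketch &getGradient() {           // checked accessor
    if (grad.empty())
      throw std::runtime_error("gradient memory is not allocated");
    return grad;
  }
  TensorSketch &getGradientUnsafe() noexcept { // no allocation check
    return grad;                          // caller may set it, e.g. to a label
  }

private:
  TensorSketch grad;
};
```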
3 years ago[layernode] Update getNumInputs/Outputs
Parichay Kapoor [Thu, 1 Jul 2021 08:37:15 +0000 (17:37 +0900)]
[layernode] Update getNumInputs/Outputs

Separate getNumInputs/Outputs semantics for inputs/outputs
and for connections as a node. The number of inputs/outputs will always be
at least 1, but the number of input/output connections can be 0 for
input/output nodes of the graph.
This patch separates the two concepts and their usage.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
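A minimal sketch of the two counters being separated (hypothetical node type): the tensor-count getters are always at least one, while the connection-count getters may be zero at the graph boundary.

```cpp
#include <cstddef>
#include <vector>

struct NodeSketch {
  std::vector<int> input_dims;              // one entry per input tensor
  std::vector<NodeSketch *> input_nodes;    // incoming connections

  // tensor count: every node has at least one input/output tensor
  std::size_t getNumInputs() const { return input_dims.size(); }
  // connection count: zero for the graph's input/output nodes
  std::size_t getNumInputConnections() const { return input_nodes.size(); }
};
```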
3 years ago[unittest] Enable models unittests
Parichay Kapoor [Thu, 1 Jul 2021 07:25:36 +0000 (16:25 +0900)]
[unittest] Enable models unittests

Enable the models unittests for layerv2.
Corresponding bugfixes are also added.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[manager] Support input/output tensor allocation
Parichay Kapoor [Thu, 1 Jul 2021 07:24:32 +0000 (16:24 +0900)]
[manager] Support input/output tensor allocation

Support input/output tensor allocation for layerv2 design

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[tests] Enable more modelfile unittests
Parichay Kapoor [Wed, 30 Jun 2021 14:31:49 +0000 (23:31 +0900)]
[tests] Enable more modelfile unittests

Enable more unittests for modelfile which depended on activation layers.
Also updated some of the unittests with the change in behavior of the
network graphs.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[layer] Update Activation layer to LayerV2
Parichay Kapoor [Wed, 30 Jun 2021 14:30:54 +0000 (23:30 +0900)]
[layer] Update Activation layer to LayerV2

Update activation layer implementation with layerV2 design.
Also add minor updates to input layer.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[test] Enable modelfile unittest
Parichay Kapoor [Wed, 30 Jun 2021 04:40:25 +0000 (13:40 +0900)]
[test] Enable modelfile unittest

Enable the modelfile unittests with the newly updated LayerV2 to test
building and initialization of the models.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[layer] Cleanup layer_factory
Parichay Kapoor [Fri, 25 Jun 2021 06:06:47 +0000 (15:06 +0900)]
[layer] Cleanup layer_factory

Cleanup layer_factory and use app_context at all places to create layer.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[api/network] Update api/network for new losses
Parichay Kapoor [Fri, 25 Jun 2021 05:56:30 +0000 (14:56 +0900)]
[api/network] Update api/network for new losses

Update API/network including app_context to work with new losses:
- register all the losses
- update API implementation, and add other loss layers
- deprecate layer factory
- also set other elements in RunLayerContext properly

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[losslayer] Update loss layer with LayerV2 design
Parichay Kapoor [Wed, 23 Jun 2021 07:04:39 +0000 (16:04 +0900)]
[losslayer] Update loss layer with LayerV2 design

This patch updates the loss layer with the LayerV2 design.
In order to limit the interface to the updated Layer interface,
different types of loss layers are split into individual classes
extending the LossLayer class. LossLayer is an abstract class
providing some common utility functions to all the loss functions.

Also added loss to RunLayerContext, which allows setting and getting the
loss. However, note that although this loss value can be set/get by any
layer for itself, it will only be used by the model for loss layers.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
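A hedged sketch of the class split described above (simplified, hypothetical signatures): an abstract loss base with shared helpers, and each concrete loss as its own class whose forward result would be stored back as the layer's loss value.

```cpp
#include <cstddef>
#include <vector>

class LossLayerSketch {
public:
  virtual ~LossLayerSketch() = default;
  virtual float forward(const std::vector<float> &pred,
                        const std::vector<float> &label) = 0;

protected:
  // common utility shared by all concrete losses
  static float mean(const std::vector<float> &v) {
    float s = 0.0f;
    for (float x : v)
      s += x;
    return v.empty() ? 0.0f : s / v.size();
  }
};

class MSELossSketch final : public LossLayerSketch {
public:
  float forward(const std::vector<float> &pred,
                const std::vector<float> &label) override {
    std::vector<float> sq(pred.size());
    for (std::size_t i = 0; i < pred.size(); ++i) {
      float d = pred[i] - label[i];
      sq[i] = d * d;
    }
    return mean(sq); // the model would read this back as the layer's loss
  }
};
```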
3 years ago[layernode] Maintain input_dim with LayerContext
Parichay Kapoor [Fri, 2 Jul 2021 03:50:27 +0000 (12:50 +0900)]
[layernode] Maintain input_dim with LayerContext

Input dimensions have been removed from the layer node.
Input dimensions are now directly managed by InitLayerContext.
This has a minor overhead (at load, if dimensions are edited many times,
which is rare) but is less confusing and less prone to bugs.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[inputlayer] Update input layer with LayerV2 design
Parichay Kapoor [Wed, 23 Jun 2021 06:08:16 +0000 (15:08 +0900)]
[inputlayer] Update input layer with LayerV2 design

Update input layer to upgrade to LayerV2 design

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[var_grad] Remove cloneTransposeVariableOnly interface
Parichay Kapoor [Wed, 30 Jun 2021 05:59:02 +0000 (14:59 +0900)]
[var_grad] Remove cloneTransposeVariableOnly interface

Remove cloneTransposeVariableOnly interface which is not used anymore.
Also add todo in layer_context.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[layerV2] Update build for Layerv2
Parichay Kapoor [Thu, 24 Jun 2021 08:43:00 +0000 (17:43 +0900)]
[layerV2] Update build for Layerv2

Update the build for Layerv2.
This includes turning off model-related unittests, creating Layer
objects instead of Layerv1, and adding certain implementations missing in LayerNode.
There is some temporary code in this patch just to make the CI/build
pass. It will be updated as other layers are converted to Layerv2.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[layer/factory] Update layer factory for LayerV2
Parichay Kapoor [Tue, 22 Jun 2021 02:32:12 +0000 (11:32 +0900)]
[layer/factory] Update layer factory for LayerV2

Update the layer factory for LayerV2 to create layers with the LayerV2
signature.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[layer/node] Update layer and node for LayerV2
Parichay Kapoor [Tue, 22 Jun 2021 02:30:43 +0000 (11:30 +0900)]
[layer/node] Update layer and node for LayerV2

Update node_exporter to export weights from run_context.
Add corresponding const getters and setters in layer_context.
Add some fixes in var_grad.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[fc] Update fc layer for LayerV2
Parichay Kapoor [Tue, 22 Jun 2021 02:29:57 +0000 (11:29 +0900)]
[fc] Update fc layer for LayerV2

Update fc layer for LayerV2

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[layer/optimizer] Reduce usage of getObject() for optimizer
Parichay Kapoor [Thu, 24 Jun 2021 06:59:51 +0000 (15:59 +0900)]
[layer/optimizer] Reduce usage of getObject() for optimizer

Reduce the usage of getObject() for optimizer.
This patch changes the optimizer interface to work on each weight
individually and adds a getWeightObject() for the LayerContext.
This allows removal of getObject() from the neural network class for
backwarding.

This patch also disables dynamic_training_optimization.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
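A rough sketch of the per-weight optimizer interface (hypothetical types, not the real Optimizer/Weight API): the optimizer applies a gradient to one weight at a time, so backwarding can iterate weights through the context instead of fetching the whole layer object.

```cpp
#include <cstddef>
#include <vector>

struct WeightSketch {
  std::vector<float> var;
  std::vector<float> grad;
};

class SGDSketch {
public:
  explicit SGDSketch(float lr) : lr(lr) {}
  // operate on a single weight, not on a whole layer object
  void applyGradient(WeightSketch &w) const {
    for (std::size_t i = 0; i < w.var.size(); ++i)
      w.var[i] -= lr * w.grad[i];
  }

private:
  float lr;
};

// conceptual backwarding loop: for each node, for each weight in its
// run context, call optimizer.applyGradient(weight);
```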
3 years ago[network/neuralnet] Reduce dependence on LayerV1
Parichay Kapoor [Thu, 24 Jun 2021 06:22:48 +0000 (15:22 +0900)]
[network/neuralnet] Reduce dependence on LayerV1

Reduce dependence on LayerNode()->getObject() which returns LayerV1
object.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[backbone] Remove support for scalesize
Parichay Kapoor [Thu, 24 Jun 2021 05:06:26 +0000 (14:06 +0900)]
[backbone] Remove support for scalesize

Remove support for scalesize for the backbone.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[layernode] Update throw to retval in setProperty
Parichay Kapoor [Thu, 24 Jun 2021 04:47:23 +0000 (13:47 +0900)]
[layernode] Update throw to retval in setProperty

Update throw to retval in setProperty when the property
is not set by LayerNode. Illegal properties and non-captured properties
were being mixed in this case.
Now the two are separated by returning false. If setProperty() throws, it
must propagate upwards, as that indicates an error in the given
property.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
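A small sketch of the separated error paths (hypothetical property handling, not the actual LayerNode code): return false for a property this node does not capture, but throw for a captured property with an illegal value so the error propagates upwards.

```cpp
#include <stdexcept>
#include <string>

class LayerNodeSketch {
public:
  // returns true if the property was consumed here
  bool setProperty(const std::string &key, const std::string &value) {
    if (key == "name") {
      if (value.empty())
        throw std::invalid_argument("name must not be empty"); // illegal value
      name = value;
      return true;
    }
    return false; // not captured here; caller forwards it to the layer itself
  }

private:
  std::string name;
};
```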
3 years ago[executionMode] Added mode of execution
Parichay Kapoor [Fri, 25 Jun 2021 10:24:16 +0000 (19:24 +0900)]
[executionMode] Added mode of execution

Added a mode of execution. This would be passed to the layers
as well as used internally, rather than just keeping a bool training flag
to denote the current mode of execution.
The header can be kept separate so as to be able to import it wherever
needed without being header-heavy.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
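A minimal sketch of such a mode enum living in a small standalone header (the enumerator names are assumptions, not necessarily the ones added here), replacing a bare bool-training flag.

```cpp
// execution mode passed to layers and used internally
enum class ExecutionMode {
  TRAIN,     // weights are updated
  INFERENCE, // forward only
  VALIDATE   // forward with loss, no update
};

inline bool isTraining(ExecutionMode mode) {
  return mode == ExecutionMode::TRAIN;
}
```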
3 years ago[var_grad] Update trainable to need_gradient
Parichay Kapoor [Fri, 25 Jun 2021 10:05:00 +0000 (19:05 +0900)]
[var_grad] Update trainable to need_gradient

Update the trainable logic to need_gradient for var_grad for verbosity.
If need_gradient is set, the var_grad will have a gradient set for it.
If it is a weight with need_gradient set, gradients will be applied
every iteration.
If need_gradient is not set, the gradient will be a null tensor.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
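An illustrative sketch of the need_gradient semantics (hypothetical class, not the real Var_Grad): the gradient tensor only exists when need_gradient is set, otherwise it stays a null/empty tensor.

```cpp
#include <cstddef>
#include <vector>

class VarGradFlagSketch {
public:
  VarGradFlagSketch(std::size_t len, bool need_gradient)
    : var(len, 0.0f), need_gradient(need_gradient) {
    if (need_gradient)
      grad.assign(len, 0.0f); // otherwise left empty, i.e. a "null" tensor
  }

  bool needsGradient() const { return need_gradient; }
  bool hasGradient() const { return !grad.empty(); }

private:
  std::vector<float> var;
  std::vector<float> grad; // empty when need_gradient is false
  bool need_gradient;
};
```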
3 years ago[backbone] Update default backbone to be trainable
Parichay Kapoor [Wed, 23 Jun 2021 04:50:11 +0000 (13:50 +0900)]
[backbone] Update default backbone to be trainable

Update the default backbone to be trainable by default.
The case of tflite_layer and nnstreamer_layer will be handled
separately when handling trainable for layers like activation, etc. later.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>