Jihoon Lee [Tue, 13 Jul 2021 06:09:43 +0000 (15:09 +0900)]
[devel] Add layer_devel to devel
Add layer_devel to devel; without this, the build fails with upstream/main.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Fri, 9 Jul 2021 06:20:32 +0000 (15:20 +0900)]
[dataset/cleanup] Remove type from dataset
This patch removes `datasetUsageType` from the dataset interface.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 8 Jul 2021 09:58:43 +0000 (18:58 +0900)]
[dataset] split train / val / test databuffer
This patch splits the train / val / test datasets.
It is now also possible to set each dataset on the model separately.
**Major Changes**
1. `auto dataset = createDataset(train_cb, val_cb, test_cb)` -> `auto dataset_train = createDataset(train_cb)`
1. `NN.setDataset(dataset);` -> `NN.setDataset(DATA_TRAIN, dataset_train)`
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 8 Jul 2021 10:27:50 +0000 (19:27 +0900)]
[dataset] Clean up dataset enums
`nntrainer::DataType` and `nntrainer::BufferType` are duplicated from
ccapi, which was adding complication. This patch simply replaces those
types with the ccapi `DatasetType` / `DatasetDataUsageType`.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 8 Jul 2021 06:57:23 +0000 (15:57 +0900)]
[dataset] Remove label data
Before this patch, label.dat was used only to get the number of classes.
Unfortunately, this clashed with how databuffer_generator derives the
number of classes, creating an inconsistency. This patch fixes the issue
by deprecating the label data.
* Checked TCT and there is no critical issue that makes tests fail with
this PR.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Tue, 22 Jun 2021 11:33:36 +0000 (20:33 +0900)]
[CAPI] Propose save/load api
**Motivations**
1. A fine-grained API is required to save and load
**Changes proposed in this PR:**
- Add `save` / `load` functions to model.h
- Mark `readModel()` deprecated (to be removed later)
- Mark `saveModel()` to be changed to `exportTo`
- Add `ModelSaveLoadFormat`
- Create capi-machine-learning-training-tizen-internal for internal api
support
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 8 Jul 2021 04:47:28 +0000 (13:47 +0900)]
[CAPI] Propose dataset api sets
**Changes proposed in this PR:**
- `ml_train_dataset_create(ml_train_dataset_h *dataset)`
- `ml_train_dataset_add_generator(dataset, usage, callback, user_data)`
- `ml_train_dataset_add_file(dataset, usage, file)`
- `ml_train_dataset_set_property_for_usage(dataset, usage)`
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Tue, 6 Jul 2021 03:29:02 +0000 (12:29 +0900)]
[Pooling] Apply padding property
This patch applies the padding property, with tests.
It also deletes `enum class PaddingType`.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Mon, 5 Jul 2021 12:09:04 +0000 (21:09 +0900)]
[Conv2D] Apply padding props to conv2d
This patch applies the padding property to conv2d.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Tue, 13 Jul 2021 03:00:50 +0000 (12:00 +0900)]
[spec/pkg] Bugfix for dependency
Some of the packages state their `requires` with wrong names (missing
the package name as a prefix). This does not show up in builds or
unit/app tests, but fails when these packages are used externally, as
the required packages don't exist.
For example, the `nntrainer-devel-static` package depends on the `devel`
package instead of `nntrainer-devel`. This patch provides the
corresponding fix.
Resolves #1399
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Mon, 5 Jul 2021 11:45:20 +0000 (20:45 +0900)]
[Padding] Add padding compute function
This patch implements padding2d::compute with a test
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Wed, 30 Jun 2021 10:09:34 +0000 (19:09 +0900)]
[CAPI] Add ml_train_model_get_input|output_dims
This commit contains a capi proposal for
`ml_train_model_get_input_dimensions` and
`ml_train_model_get_output_dimensions`.
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Mon, 12 Jul 2021 02:07:26 +0000 (11:07 +0900)]
[Resnet] Sync resnet app with validated model
This patch syncs resnet application with validated model.
Changes include:
1. s/setIteration/updateIteration
2. normalize input
3. fixed batch size of 128 -> args received from the command line
4. output layer changed to copy
5. silently ignore failures to read epoch idx and iteration
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Mon, 5 Jul 2021 07:35:19 +0000 (16:35 +0900)]
[Padding] Add padding verification
This patch implements `Padding2D::isValid` and its tests.
The Padding2D property is valid when
1. the string is "valid" or "same", or
2. it is a comma-separated list of non-negative integer values of size 1, 2, or 4
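The validity rule above can be sketched as a small standalone checker. This is an illustrative sketch only (the function name and signature are assumptions, not nntrainer's actual `Padding2D::isValid`):

```cpp
#include <algorithm>
#include <cctype>
#include <sstream>
#include <string>
#include <vector>

// Sketch of the rule described above: a padding string is valid when it
// is "valid"/"same", or a comma-separated list of 1, 2, or 4
// non-negative integers. Illustrative only, not nntrainer's API.
bool isValidPadding2D(const std::string &value) {
  if (value == "valid" || value == "same")
    return true;

  std::vector<std::string> tokens;
  std::stringstream ss(value);
  std::string token;
  while (std::getline(ss, token, ','))
    tokens.push_back(token);

  if (tokens.size() != 1 && tokens.size() != 2 && tokens.size() != 4)
    return false;

  for (const auto &t : tokens) {
    // every token must be a non-negative integer, e.g. "0" or "12"
    std::string trimmed;
    for (char c : t)
      if (!std::isspace(static_cast<unsigned char>(c)))
        trimmed += c;
    if (trimmed.empty() ||
        !std::all_of(trimmed.begin(), trimmed.end(),
                     [](unsigned char c) { return std::isdigit(c) != 0; }))
      return false;
  }
  return true;
}
```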
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 8 Jul 2021 03:30:04 +0000 (12:30 +0900)]
[Fix] File name sanitization
As the colon (':') is not permitted in the Windows file system, having
files with a colon in their names prevents cloning our repo. This patch
fixes the issue.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Wed, 30 Jun 2021 03:26:43 +0000 (12:26 +0900)]
[docs] Add readme for resnet
This patch adds a readme for resnet, including citations.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Mon, 5 Jul 2021 05:35:27 +0000 (14:35 +0900)]
[Padding] Add padding2d prop header
This patch adds the padding2d property header.
The Padding2D prop is saved as a string, and the integer values are
computed when needed.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 1 Jul 2021 05:13:09 +0000 (14:13 +0900)]
[Fix] Weight initializer stddev calculation
The weight initializer was fitted only to the fully connected layer
case; this patch extends it to work properly with other weights as
well.
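The idea behind such a fix can be sketched as computing the stddev from fan-in/fan-out derived from the weight's own shape, rather than hard-coding the fully-connected interpretation. The formulas below are the standard Xavier/He rules; the function names are illustrative, not nntrainer's actual initializer code:

```cpp
#include <cmath>
#include <cstddef>

// Hedged sketch: derive stddev from fan-in/fan-out of the weight shape.
// For a conv kernel of shape (out_ch, in_ch, kh, kw):
//   fan_in  = in_ch * kh * kw,  fan_out = out_ch * kh * kw,
// which reduces to (in, out) for a fully connected weight.
double xavierNormalStddev(std::size_t fan_in, std::size_t fan_out) {
  return std::sqrt(2.0 / static_cast<double>(fan_in + fan_out));
}

double heNormalStddev(std::size_t fan_in) {
  return std::sqrt(2.0 / static_cast<double>(fan_in));
}
```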
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Fri, 25 Jun 2021 10:31:08 +0000 (19:31 +0900)]
[Resnet] Connect the model with cifar100
**Changes proposed in this PR:**
- Implement cifar100dataloader
- Connect the data loader
- Display arguments/dataset information/time
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Fri, 2 Jul 2021 04:12:56 +0000 (13:12 +0900)]
[Test/Bn] Add conv2d model test
This patch adds a conv2d + bn model test, which was not properly tested before.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Fri, 25 Jun 2021 02:12:48 +0000 (11:12 +0900)]
[resnet] Implement test run to resnet
**Changes proposed**
- Add dataloader with interface `next()`.
- Add RandomDataLoader as an example
- Minor bug fixes to the model architecture regarding name
- Add routine to train the model
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
jijoong.moon [Tue, 22 Jun 2021 00:53:12 +0000 (09:53 +0900)]
[ Recurrent ] Implement Dropout for Recurrent Net
In this commit, dropout for recurrent networks is introduced.
A dropout property is added; if an element of the random tensor is
smaller than the dropout rate, the corresponding element is set to zero.
Elements that are not set to zero are scaled by 1.0/(1.0-dropout).
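The zero-or-scale rule above is the standard inverted-dropout scheme; it can be sketched on a plain vector as below. This is an illustrative sketch, not nntrainer's Tensor API:

```cpp
#include <cstddef>
#include <random>
#include <vector>

// Sketch of the inverted-dropout rule described above: elements whose
// random draw falls below the dropout rate are zeroed, the rest are
// scaled by 1/(1 - rate) so the expected activation is unchanged.
std::vector<float> applyDropout(const std::vector<float> &input, float rate,
                                unsigned seed) {
  std::mt19937 rng(seed);
  std::uniform_real_distribution<float> dist(0.0f, 1.0f);
  const float scale = 1.0f / (1.0f - rate);

  std::vector<float> out(input.size());
  for (std::size_t i = 0; i < input.size(); ++i) {
    out[i] = (dist(rng) < rate) ? 0.0f : input[i] * scale;
  }
  return out;
}
```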
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
Jihoon Lee [Wed, 30 Jun 2021 07:42:12 +0000 (16:42 +0900)]
[Fix/trivial] Change rpm group description
**Changes proposed in this PR:**
- Change group description to machine learning/ML Framework
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 24 Jun 2021 11:05:45 +0000 (20:05 +0900)]
[Resnet] Create resnet model
This patch adds the code that creates the resnet model.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 24 Jun 2021 08:36:45 +0000 (17:36 +0900)]
[Resnet/skeleton] Add helper functions
**Changes proposed in this PR:**
- Add `withKey()`, `resnetBlock()`, `createResnet18()`
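A `withKey()`-style helper typically renders a "key=value" property string, joining multiple values with commas. The sketch below is a guess at the shape of such a helper; the exact signature in the application may differ:

```cpp
#include <cstddef>
#include <sstream>
#include <string>
#include <vector>

// Hedged sketch of a withKey()-style helper: build "key=value" strings
// for layer properties, joining multiple values with commas.
std::string withKey(const std::string &key, const std::string &value) {
  return key + "=" + value;
}

std::string withKey(const std::string &key,
                    const std::vector<std::string> &values) {
  std::ostringstream ss;
  ss << key << "=";
  for (std::size_t i = 0; i < values.size(); ++i) {
    if (i != 0)
      ss << ",";
    ss << values[i];
  }
  return ss.str();
}
```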
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 24 Jun 2021 05:13:00 +0000 (14:13 +0900)]
[Skeleton] Add resnet application skeleton
This patch adds resnet application skeleton
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
hyeonseok lee [Wed, 23 Jun 2021 02:21:15 +0000 (11:21 +0900)]
[Fix] coverity, svace issues
Coverity
1. Deleted the noexcept keyword where a thrown exception can occur
2. Initialize member variables in the constructor
3. Correct bitwise operation
resolves: 1238192, 1238193, 1238195, 1238196, 1238298
Svace
1. Catch unhandled throw
resolve: 464062
Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
Jihoon Lee [Mon, 21 Jun 2021 11:43:59 +0000 (20:43 +0900)]
[CustomLoss] Update example
**Changes proposed in this PR:**
- Update example using LayerClient
- Add LayerClient example to AppTest
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Mon, 21 Jun 2021 11:23:19 +0000 (20:23 +0900)]
[CustomLoss] Implement mae loss layer
This patch implements the mae loss layer forward and backward passes,
and also moves other functions to the source file instead of inlining
them.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Wed, 23 Jun 2021 06:49:10 +0000 (15:49 +0900)]
[layer_v2] Fixes to merge layer_v2 with main branch
Fixes to merge layer_v2 branch with main branch.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Thu, 17 Jun 2021 14:07:09 +0000 (23:07 +0900)]
[LayerNode] Change flatten and distribute to prop
**Changes proposed in this PR:**
- Fix bug on `loadProperties`
- Add flatten and distribute to prop
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
hyeonseok lee [Thu, 17 Jun 2021 08:02:57 +0000 (17:02 +0900)]
[Optimizer] Implement getOptimizerVariableDim
- Implement getOptimizerVariableDim in optimizer
- Implement requestOptimizerVariable in manager
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
Jihoon Lee [Thu, 17 Jun 2021 09:48:00 +0000 (18:48 +0900)]
[LayerNode] Add layer(devel) to the node
**Changes proposed in this PR:**
- Add `Layer` to the node
- Remove `node_exporter.h` from `layer_node.h`
- Change the compile-time switch for checking layer v1 vs layer v2 into
a runtime switch
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 17 Jun 2021 08:00:28 +0000 (17:00 +0900)]
[AppContext] Integrate layer-devel
This patch integrates layer devel into appcontext and the plugin
features. From this patch on, it is possible to include LayerV1 and
Layer at the same time.
A test is also added.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Thu, 17 Jun 2021 07:33:33 +0000 (16:33 +0900)]
[manager] Support initialization/allocation of weights
Support initialization and allocation of weights for LayerV2
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Thu, 17 Jun 2021 06:13:40 +0000 (15:13 +0900)]
[network] Remove check for double activation
Remove the check for double activation, it is now allowed to have
two activation operations one after the other.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Tue, 22 Jun 2021 05:02:19 +0000 (14:02 +0900)]
[layercontext] Minor bugfix for layer context
Minor bugfix for layer context
Removes the extra trainable parameter from layer context
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Thu, 17 Jun 2021 05:52:27 +0000 (14:52 +0900)]
[networkgraph] Network graph updated for Layer V2
Network graph updated to work with LayerV2
This involves setting input dimensions in InitContext, and setting
up RunContext for each layer before its execution.
Further, some helper functions are also added in LayerNode.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Thu, 17 Jun 2021 06:15:31 +0000 (15:15 +0900)]
[Test/Prepare] Add layer semantics test
For layer_v2, there will be a more structured way to access tests.
This patch adds a skeleton for the layer semantics test.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 17 Jun 2021 01:30:01 +0000 (10:30 +0900)]
[Layer] Add trainable prop
This patch adds the trainable property to layer.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Wed, 16 Jun 2021 08:19:49 +0000 (17:19 +0900)]
[manager] Add support for request Inputs/outputs
Added support for requesting inputs and outputs.
Also updated the implementation to use a common function for easier
management.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
hyeonseok lee [Thu, 17 Jun 2021 02:14:07 +0000 (11:14 +0900)]
[layer-internal] refactoring weight, input, output to layer_node
- Implement getter for variable and grad of weight, input, output
- Implement getNumWeights
Self evaluation:
Build test: [X]Passed [ ]Failed [ ]Skipped
Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
Parichay Kapoor [Wed, 16 Jun 2021 10:55:19 +0000 (19:55 +0900)]
[graph] Support creating RunLayerContext
Implement the function to create RunLayerContext given the
InitLayerContext and update it in the given LayerNode.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Wed, 16 Jun 2021 10:33:56 +0000 (19:33 +0900)]
[Properties] Add float and boolean cases
Add float and boolean cases to be used later.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Fri, 21 May 2021 08:48:15 +0000 (17:48 +0900)]
[Props] Add concept of empty to property
This patch adds an empty concept to property, instead of having an
insensible value inside a property as a default.
Before this patch:
```cpp
auto a = Property<int>();
EXPECT_EQ(a.get(), -1); // let's assume -1 is default value which might not be sensible.
```
after this patch
```cpp
auto a = Property<int>();
EXPECT_THROW(a.get(), std::invalid_argument);
EXPECT_TRUE(a.empty());
```
so the underlying value is always either null or valid.
Now it is natural to distinguish mandatory props from non-mandatory
props.
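The empty-or-valid invariant above can be sketched with `std::optional` in a few lines. This is a minimal illustrative sketch, not nntrainer's actual `Property` class (which is richer):

```cpp
#include <optional>
#include <stdexcept>

// Minimal sketch of the "empty" property concept: get() throws while no
// value is held, so no insensible default can leak out.
template <typename T> class Property {
public:
  bool empty() const { return !value_.has_value(); }

  const T &get() const {
    if (!value_)
      throw std::invalid_argument("property is empty");
    return *value_;
  }

  void set(const T &v) { value_ = v; }

private:
  std::optional<T> value_;
};
```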
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Wed, 16 Jun 2021 07:22:24 +0000 (16:22 +0900)]
[Layer] Scaffold layer impl
**Major changes**
- Create `LayerImpl`
- Change `LayerDevel` signature
s/int initialize/void finalize
s/int setProperty/void setProperty
- Move createLayer to layerImpl
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Wed, 16 Jun 2021 08:41:46 +0000 (17:41 +0900)]
[Layer] Add const to the layer::getName()
getName() does not modify the object, so this patch adds const to
layer::getName().
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
hyeonseok lee [Wed, 16 Jun 2021 05:53:21 +0000 (14:53 +0900)]
[layer-internal] refactoring getInputDimension getOutputDimension
implement getInputDimension, getOutputDimension in layer_node
Self evaluation:
Build test: [X]Passed [ ]Failed [ ]Skipped
Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
Parichay Kapoor [Thu, 17 Jun 2021 01:52:59 +0000 (10:52 +0900)]
[neuralnet] Bug fix for graph usage
Bug fix patch to use constant iterators for the graph.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Fri, 11 Jun 2021 05:15:54 +0000 (14:15 +0900)]
[graph] Remove non-const iterators
Remove non-const iterators for the graph.
Update the corresponding usages for the graph.
Note: the pattern `for (auto const &node : graph)` has been
removed, as it can call the non-const begin()/end() in the background.
The workaround is to use std::as_const(graph), but this
is only available from c++17, so this patch uses
the basic const-iterator for-loop pattern.
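The overload-selection pitfall described above can be demonstrated with a toy container (illustrative only, not the actual graph class): on a non-const object, a ranged-for with `auto const &` still picks the non-const `begin()`/`end()`, while `std::as_const` forces the const overloads.

```cpp
#include <utility>
#include <vector>

// Toy container demonstrating the issue: the non-const begin() records
// that it was chosen, even when the loop variable is `auto const &`.
struct Graph {
  std::vector<int> nodes;
  bool mutable_iterated = false;

  std::vector<int>::iterator begin() {
    mutable_iterated = true; // non-const overload was selected
    return nodes.begin();
  }
  std::vector<int>::iterator end() { return nodes.end(); }

  std::vector<int>::const_iterator begin() const { return nodes.cbegin(); }
  std::vector<int>::const_iterator end() const { return nodes.cend(); }
};
```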
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Fri, 11 Jun 2021 03:28:04 +0000 (12:28 +0900)]
[graph] Graph iterator for sorted and unsorted
Updated the graph iterator to run over the unsorted list of nodes
if the graph is not yet sorted. If the graph is sorted,
the iterator iterates over the sorted nodes.
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Fri, 11 Jun 2021 03:13:50 +0000 (12:13 +0900)]
[graph] Update node_names to unordered_set
Update node_names, which stores the names of all nodes
in the graph for uniqueness, to use an unordered_set.
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Fri, 11 Jun 2021 03:08:26 +0000 (12:08 +0900)]
[graph] Make adjacency list lazily
Update the graph to build the adjacency list lazily, when required
during the topological sort.
Until then, keep node_list, which simply contains all the nodes.
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Wed, 16 Jun 2021 05:32:29 +0000 (14:32 +0900)]
[manager] Support request Tensors and Weights
Added support for request Tensors and Weights
Moved spec of Tensors and Weights to VarGrad and Weights correspondingly
Also created constructors for weights and var_grad with their specs
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
hyeonseok lee [Tue, 15 Jun 2021 12:38:31 +0000 (21:38 +0900)]
[layer_internal] refactoring read, save
implement read, save function in layer_node
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
Jihoon Lee [Wed, 16 Jun 2021 03:20:12 +0000 (12:20 +0900)]
[Test/Scaffolding] Prepare test for the new layer
For now, the layer test is too cluttered to allow agile changes.
This PR mainly separates the tests into two parts to make the test
suite scalable:
1. Layer-only tests living inside `unittest_layers_{layer_name}`
2. Common tests like setProperty and golden tests in `layers_common_tests.h`
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Tue, 15 Jun 2021 07:43:08 +0000 (16:43 +0900)]
[Refactor] s/Layer/LayerV1
Change the class name of Layer to LayerV1 to prepare for migration.
This should pass the CI test.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Wed, 16 Jun 2021 02:50:09 +0000 (11:50 +0900)]
[context] Layer context creation scaffolding
Added todo list for the layer context creation
Added minor new interfaces for the layer context
Disable in-place optimization and inference memory optimization
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Wed, 16 Jun 2021 03:03:56 +0000 (12:03 +0900)]
[addition] Bug fix addition layer calcDerivative
Bug fix for the addition layer's calcDerivative().
The addition layer assigns the same derivative tensor memory back to
the layers connected to it. If those layers are in-place,
this can lead to wrong results.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Fri, 11 Jun 2021 04:42:13 +0000 (13:42 +0900)]
[Plugin/Test] Add preset plugin for general test
This extracts common tests to be used multiple times.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 10 Jun 2021 10:14:57 +0000 (19:14 +0900)]
[Custom/Loss] Add scaffolding for loss example
This patch generates a skeleton code for mae custom loss example.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Fri, 11 Jun 2021 11:13:24 +0000 (20:13 +0900)]
[CC17] Update type to inline static
This patch updates `type` to be an inline static data member inside the
object, for clarity.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
jijoong.moon [Wed, 16 Jun 2021 08:20:26 +0000 (17:20 +0900)]
[ GRU ] Add GRU Unittest
This commit includes,
. unittests of gru layer
. add keras code to compare
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Thu, 10 Jun 2021 12:01:01 +0000 (21:01 +0900)]
[ Recurrent ] Fix activation type
using hidden_state_activation_type
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Thu, 10 Jun 2021 04:57:47 +0000 (13:57 +0900)]
[ GRU ] Implement forwarding / backwarding of GRU Layer
This commit includes,
. forwarding implementation
. calcGradient implementation
. calcDerivative implementation
. gru_basic unittest
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 16 Jun 2021 12:08:38 +0000 (21:08 +0900)]
[ Unit Test ] Fix LSTM / RNN Unittest
This commit includes,
. fix wrong unittest cases of lstm & rnn
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Fri, 28 May 2021 03:45:12 +0000 (12:45 +0900)]
[ GRU ] Skeleton Code for GRU
This commit includes,
. Skeleton Code for GRU Layer
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
Parichay Kapoor [Tue, 15 Jun 2021 07:32:05 +0000 (16:32 +0900)]
[docs] Update docs for tflite version
Updated the docs to require tensorflow-lite version 2.3.0 or higher,
and added missing dependencies for a local meson build as well.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Tue, 15 Jun 2021 06:35:27 +0000 (15:35 +0900)]
[API/Devel] Retouch devel header
**Changes proposed in this PR:**
- Change function signature to exploit forward declaration
- s/export_to/exportTo
- Change header guard to be unique
**Self evaluation:**
1. Build test: [ ]Passed [ ]Failed [X]Skipped
2. Run test: [ ]Passed [ ]Failed [X]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Wed, 16 Jun 2021 03:17:55 +0000 (12:17 +0900)]
[neuralnet] Neural network bugfix
Two merged PRs caused the main branch to fail.
This patch provides a temporary fix.
The proper fix will be done with layer_v2.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Wed, 16 Jun 2021 03:17:00 +0000 (12:17 +0900)]
[unittests] Update unittests for new gtest version
Remove gtest version check comments and update unittests for the new
gtest versions.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Thu, 10 Jun 2021 04:21:03 +0000 (13:21 +0900)]
[Loss] Apply requireLabel to feed label
Currently, the loss layer is the only one that can accept a label,
which prevents building a custom loss layer.
**Major Changes**
- Apply requireLabel to feed the label
- The last layer is no longer assumed to be the loss layer when feeding
the label (it still is in other places; to be dealt with)
**Minor Changes**
- Feeding multiple labels is now possible (splitting the label is future work)
- [Fix] The trainable count includes the loss layer
- [Fix] Handle the case when no layer is trainable in
neuralnet::backwarding
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Fri, 11 Jun 2021 09:02:56 +0000 (18:02 +0900)]
[layer] Move name to LayerNode
Move the layer identifier name to LayerNode.
Now each node in the graph has a name as its direct identifier.
This patch also disables unittest_nntrainer_layers for a while,
until the layer refactoring is done. However, other unittests
and apptests remain enabled to ensure that everything else is
working.
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Mon, 14 Jun 2021 02:45:16 +0000 (11:45 +0900)]
[split layer] Initialize class member bugfix
The split layer has a class member, leading_helper_dim,
which remained uninitialized and caused bugs
when setBatchSize() was called before initialization of the layer.
This patch fixes the bug.
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Thu, 10 Jun 2021 04:38:00 +0000 (13:38 +0900)]
[layer context] Propose layer context and layer_devel
This patch proposes layer context and layer_devel.
The layer context provides basic necessities for each layer
like inputs, outputs, weights, tensors. Weights are the trainable tensors
required by the layers which can be requested with the context interfaces.
Correspondingly, any other tensor requirements by the layer can also be
requested with the context.
There are two types of context :
1. InitContext - provides context for initialization, with input_dimensions
filled in. This contains the specifications of all the inputs/outputs/weights/tensors
which will be made available during execution.
2. RunContext - provides allocated weights/inputs/outputs/tensors along with
some of the basic properties of the layer like trainable, etc.
This design allows layer devel to be stateless (a derived layer can be stateful with its
own member objects) and to behave as a layer op.
Further, this removes dependencies on weights/var_grads, as the context simplifies the
interface to just tensors.
This patch also removes the dependency on the manager from the layer_devel header.
The layer context for each layer is owned by the corresponding LayerNode.
Many of the layer getters like getInputs/Outputs/Weights/etc. can now
be moved to LayerNode, which does not concern the custom layer writer
and simplifies the layer operation code as well.
See also #986
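The two-phase design described above can be sketched with toy structs. Everything below is an illustrative shape only (the type and member names are assumptions, not nntrainer's actual classes): a layer declares its requirements against an InitContext, and later runs against a RunContext holding the allocated tensors, which keeps the layer itself stateless.

```cpp
#include <string>
#include <vector>

// Toy specification of a tensor: a name plus a dimension.
struct TensorSpec {
  std::string name;
  std::vector<unsigned> dim;
};

// Phase 1: the graph fills input_dims; the layer requests its weights.
struct InitContext {
  std::vector<TensorSpec> input_dims;   // filled in by the graph
  std::vector<TensorSpec> weight_specs; // requested by the layer

  void requestWeight(const TensorSpec &spec) { weight_specs.push_back(spec); }
};

// Phase 2: allocated tensors plus basic run properties like trainable.
struct RunContext {
  std::vector<std::vector<float>> weights; // allocated per weight spec
  bool trainable = false;
};

// A stateless layer op only touches the contexts it is handed.
struct FullyConnectedSketch {
  void finalize(InitContext &ctx) {
    // request one weight shaped after the (assumed 1-D) input
    ctx.requestWeight({"fc:weight", ctx.input_dims.at(0).dim});
  }
};
```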
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Fri, 11 Jun 2021 10:52:31 +0000 (19:52 +0900)]
[C++17] Bump base version to c++17
This patch bumps baseline version to c++17.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Mon, 14 Jun 2021 02:54:32 +0000 (11:54 +0900)]
[deb/pkg] Updates for debian packaging
Update the tolerance for the sqrt unittest for Debian arm64 packaging and
increase the timeout for the appdraw classification app for Debian armhf packaging.
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Fri, 14 May 2021 10:55:58 +0000 (19:55 +0900)]
[Props] Implement input_layers as a property
This patch implements input_layers as a property with the following changes:
**Changes proposed in this PR:**
- Add ConnectionSpec object for generic purpose
- Add InputSpec as a property
- Add corresponding tests
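The shape of the idea can be sketched as below. This is a hypothetical sketch only: the member and method names mirror the PR description (`ConnectionSpec`, `InputSpec`), not the actual nntrainer property implementation.

```cpp
#include <cassert>
#include <string>
#include <utility>
#include <vector>

// Generic-purpose connection description: which layers feed this one.
class ConnectionSpec {
public:
  explicit ConnectionSpec(std::vector<std::string> layers = {})
    : layer_ids(std::move(layers)) {}
  const std::vector<std::string> &getLayerIds() const { return layer_ids; }

private:
  std::vector<std::string> layer_ids;
};

// input_layers exposed as a typed property wrapping a ConnectionSpec.
class InputSpec {
public:
  void set(ConnectionSpec v) { value = std::move(v); }
  const ConnectionSpec &get() const { return value; }
  static std::string getKey() { return "input_layers"; }

private:
  ConnectionSpec value;
};
```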
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Fri, 11 Jun 2021 10:35:39 +0000 (19:35 +0900)]
[Test/models] Add addition test
This patch adds an addition test, while adding support for reading multiple
inputs and outputs inside `unittest_nntrainer_models.cpp`
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Fri, 11 Jun 2021 10:11:51 +0000 (19:11 +0900)]
[Retouch] unittest_models test cases
**Changes proposed in this PR:**
- Reduce scale of inputs
- Add iterations for the recurrent layer family tcs (and delete the `_n` variants)
- Clean duplicated codes in genModelTests.py
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Mon, 14 Jun 2021 01:47:45 +0000 (10:47 +0900)]
[deb/pkg] Increase c-api unittest timeout
Increase the c-api unittest timeout even further to
pass the armhf build on Launchpad.
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Thu, 10 Jun 2021 08:45:20 +0000 (17:45 +0900)]
[Docs] Update custom documentation for loss layer
This patch updates the custom-layer documentation for the loss layer.
It also deletes a redundant `pow.cpp` that sneaked in.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Fri, 11 Jun 2021 09:34:37 +0000 (18:34 +0900)]
[deb] Fix lower tolerance bug + increase timeouts
Invert the check for the lower error tolerance for the Debian build.
Also increase timeouts for the nntrainer capi/ccapi unittests
as they time out on the armhf architecture on Launchpad.
See also #1251
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Thu, 10 Jun 2021 04:42:09 +0000 (13:42 +0900)]
[Trivial] Change return -> retval
When stating a return value in doxygen it is recommended to use @retval;
this patch applies that convention.
**Self evaluation:**
1. Build test: [ ]Passed [ ]Failed [X]Skipped
2. Run test: [ ]Passed [ ]Failed [X]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Thu, 10 Jun 2021 08:43:56 +0000 (17:43 +0900)]
[interpreter] Updated export_to usage with LayerNode
Updated export_to function usage based on LayerNode
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Mon, 7 Jun 2021 06:52:50 +0000 (15:52 +0900)]
[layerNode] Refactor get/set number of inputs/outputs
Refactor get/set number of inputs/outputs.
This is now handled by layer_node.
Updating the input/output layers updates the count.
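A hedged sketch of the refactor (class and method names hypothetical, not the actual nntrainer code): LayerNode owns the connection lists, and the input/output counts are derived from them rather than stored and updated separately, so they can never drift out of sync.

```cpp
#include <cassert>
#include <cstddef>
#include <string>
#include <utility>
#include <vector>

class LayerNodeSketch {
public:
  void setInputLayers(std::vector<std::string> in) {
    input_layers = std::move(in);
  }
  void setOutputLayers(std::vector<std::string> out) {
    output_layers = std::move(out);
  }
  // Counts stay consistent because they are computed on demand
  // from the connection lists themselves.
  std::size_t getNumInputs() const { return input_layers.size(); }
  std::size_t getNumOutputs() const { return output_layers.size(); }

private:
  std::vector<std::string> input_layers;
  std::vector<std::string> output_layers;
};
```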
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Mon, 7 Jun 2021 06:20:16 +0000 (15:20 +0900)]
[layer/node] Move input/output layers to LayerNode
This patch moves input/output layers to LayerNode.
Also creates an interface for accessing/setting input/output layers
from within the LayerNode.
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Fri, 4 Jun 2021 03:51:10 +0000 (12:51 +0900)]
[api] Added api dependency in pc files
Added the API dependency in the pc files for the capi and ccapi.
The dependency is on capi-ml-common, which provides the ml-api-common.h header.
Further, the ml-api-common.h header from the nntrainer repo is also removed.
Updated the Android build to download ml-api-common.h from the nnstreamer daily builds.
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
jijoong.moon [Fri, 28 May 2021 02:24:46 +0000 (11:24 +0900)]
[ RNN ] Add Multi-Layer RNN Unit Tests
This commit includes:
. Multi-Layered RNN unit tests
. Some fixes needed to run them
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
Jihoon Lee [Thu, 10 Jun 2021 08:30:35 +0000 (17:30 +0900)]
[Trivial] Fix NDK warning -Wbraced-scalar-init
This patch fixes the trivial NDK warnings below
```
././../nntrainer/layers/lstm.cpp:315:25: warning: braces around scalar initializer [-Wbraced-scalar-init]
Tensor dh_nx = Tensor({derivative_.width()});
^~~~~~~~~~~~~~~~~~~~~
././../nntrainer/layers/lstm.cpp:316:25: warning: braces around scalar initializer [-Wbraced-scalar-init]
Tensor dc_nx = Tensor({derivative_.width()});
^~~~~~~~~~~~~~~~~~~~~
././../nntrainer/layers/lstm.cpp:352:26: warning: braces around scalar initializer [-Wbraced-scalar-init]
hs_prev = Tensor({hs_t.width()});
^~~~~~~~~~~~~~
././../nntrainer/layers/lstm.cpp:354:26: warning: braces around scalar initializer [-Wbraced-scalar-init]
cs_prev = Tensor({cs_t.width()});
```
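For reference, a minimal reproduction of the warning pattern and the assumed fix (this is a stand-in struct, not the nntrainer Tensor class): clang's `-Wbraced-scalar-init` fires when a braced list wraps a single scalar constructor argument, and dropping the braces silences it without changing meaning.

```cpp
#include <cassert>

// Stand-in for a constructor taking a single scalar dimension.
struct Dim {
  unsigned w;
  explicit Dim(unsigned w_) : w(w_) {}
};

unsigned width_of(unsigned n) {
  // Dim d = Dim({n});  // braces around a scalar -> -Wbraced-scalar-init
  Dim d = Dim(n);       // plain parentheses: same meaning, no warning
  return d.w;
}
```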
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Thu, 10 Jun 2021 08:29:34 +0000 (17:29 +0900)]
[test/model] Cleanup unittest_nntrainer_models
Cleaned up unittest_nntrainer_models.
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Tue, 8 Jun 2021 05:30:11 +0000 (14:30 +0900)]
[layer] Support inPlace() as an interface for the layer
Support inPlace() as an interface for the layer.
If it returns true, the manager may place the layer's inputs/outputs
in the same memory location, but is not required to.
Corresponding graph code is also updated.
Resolves #867
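A hedged sketch of the interface described above (class names hypothetical, not the actual nntrainer hierarchy): `inPlace()` is a permissive hint that input/output buffers MAY share memory; the manager can take the hint or ignore it.

```cpp
#include <cassert>

class LayerSketch {
public:
  virtual ~LayerSketch() = default;
  // Conservative default: do not allow buffer sharing.
  virtual bool inPlace() const { return false; }
};

class ActivationSketch : public LayerSketch {
public:
  // Element-wise op: the output can safely overwrite the input.
  bool inPlace() const override { return true; }
};

// Manager-side decision: share buffers only when the layer allows it
// AND the memory planner wants to (the hint is permissive, not binding).
bool shareBuffers(const LayerSketch &layer, bool planner_wants_sharing) {
  return layer.inPlace() && planner_wants_sharing;
}
```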
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Tue, 8 Jun 2021 06:07:11 +0000 (15:07 +0900)]
[Layer] Add requireLabel() function
Add a simple function `Layer::requireLabel()`.
This function is used to detect whether a certain layer requires a label.
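A hedged sketch of the idea (class names hypothetical, not the actual nntrainer hierarchy): `requireLabel()` lets the model find the layers that must be fed a label, typically loss layers.

```cpp
#include <cassert>
#include <cstddef>
#include <memory>
#include <vector>

class LayerSketch {
public:
  virtual ~LayerSketch() = default;
  // Most layers do not consume a label directly.
  virtual bool requireLabel() const { return false; }
};

class CrossEntropyLossSketch : public LayerSketch {
public:
  bool requireLabel() const override { return true; }
};

// Count how many layers in a model need labels fed to them.
std::size_t countLabelConsumers(
  const std::vector<std::unique_ptr<LayerSketch>> &layers) {
  std::size_t n = 0;
  for (const auto &l : layers)
    if (l->requireLabel())
      ++n;
  return n;
}
```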
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Wed, 2 Jun 2021 05:34:14 +0000 (14:34 +0900)]
[APP] Add tensorflow based product rating
Added a TensorFlow-based product rating application.
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Wed, 2 Jun 2021 07:19:59 +0000 (16:19 +0900)]
[SplitLayer] Support flexible dimension
This patch adds support for a flexible dimension in the split layer.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Cc: Parichay Kapoor <pk.kapoor@samsung.com>
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Mon, 10 May 2021 02:53:39 +0000 (11:53 +0900)]
[Permute] Add permute layer forward/backward
This patch implements forward/backward along with a corresponding test.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
jijoong.moon [Fri, 21 May 2021 02:53:36 +0000 (11:53 +0900)]
[ RNN ] Add unittest case & return_sequence
This PR includes:
. unittests for the rnn layer
. support for return_sequence in the rnn layer
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>