Parichay Kapoor [Wed, 30 Jun 2021 14:30:54 +0000 (23:30 +0900)]
[layer] Update Activation layer to LayerV2
Update activation layer implementation with layerV2 design.
Also add minor updates to input layer.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Wed, 30 Jun 2021 04:40:25 +0000 (13:40 +0900)]
[test] Enable modelfile unittest
Enable modelfile unittests with the newly updated LayerV2 to test
building and initialization of the models.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Fri, 25 Jun 2021 06:06:47 +0000 (15:06 +0900)]
[layer] Cleanup layer_factory
Clean up layer_factory and use app_context in all places where layers are created.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Fri, 25 Jun 2021 05:56:30 +0000 (14:56 +0900)]
[api/network] Update api/network for new losses
Update the API/network, including app_context, to work with the new losses:
- register all the losses
- update API implementation, and add other loss layers
- deprecate layer factory
- also set other elements in RunLayerContext properly
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Wed, 23 Jun 2021 07:04:39 +0000 (16:04 +0900)]
[losslayer] Update loss layer with LayerV2 design
This patch updates the loss layer to the LayerV2 design.
In order to limit the interface to the updated Layer interface,
the different types of loss layers are split into individual classes
extending the LossLayer class. LossLayer is an abstract class
providing some common utility functions to all the loss functions.
Also added loss to RunLayerContext, which allows setting and getting
the loss. Note that although this loss value can be set/get by any
layer for itself, it will only be used by the model for loss layers.
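To illustrate the split, here is a minimal sketch (not the actual nntrainer code; the class shapes and helper names are only illustrative) of loss types as individual subclasses of an abstract LossLayer, reporting the loss through RunLayerContext:
```cpp
#include <cstddef>
#include <vector>

struct RunLayerContext {
  void setLoss(float l) { loss = l; } // any layer may set this for itself
  float getLoss() const { return loss; }
  float loss = 0.0f;
};

class LossLayer {
public:
  virtual ~LossLayer() = default;
  virtual void forwarding(RunLayerContext &ctx, const std::vector<float> &pred,
                          const std::vector<float> &label) = 0;

protected:
  // common utility shared by all loss subclasses
  static float mean(const std::vector<float> &v) {
    float sum = 0.0f;
    for (float x : v)
      sum += x;
    return v.empty() ? 0.0f : sum / v.size();
  }
};

class MSELossLayer final : public LossLayer {
public:
  void forwarding(RunLayerContext &ctx, const std::vector<float> &pred,
                  const std::vector<float> &label) override {
    std::vector<float> sq(pred.size());
    for (std::size_t i = 0; i < pred.size(); ++i) {
      const float d = pred[i] - label[i];
      sq[i] = d * d;
    }
    ctx.setLoss(mean(sq)); // the model reads this only for loss layers
  }
};
```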
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
[loss layer v2] loss layer v2
loss layer v2
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Fri, 2 Jul 2021 03:50:27 +0000 (12:50 +0900)]
[layernode] Maintain input_dim with LayerContext
Input dimensions have been removed from the layer node.
Input dimensions are now managed directly by InitLayerContext.
This has minor overhead (at load time, if dimensions are edited many
times, which is rare) but is less confusing and less prone to bugs.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Wed, 23 Jun 2021 06:08:16 +0000 (15:08 +0900)]
[inputlayer] Update input layer with LayerV2 design
Update input layer to upgrade to LayerV2 design
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Wed, 30 Jun 2021 05:59:02 +0000 (14:59 +0900)]
[var_grad] Remove cloneTransposeVariableOnly interface
Remove the cloneTransposeVariableOnly interface, which is not used anymore.
Also add a todo in layer_context.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Thu, 24 Jun 2021 08:43:00 +0000 (17:43 +0900)]
[layerV2] Update build for LayerV2
Update the build for LayerV2.
This includes turning off model-related unittests, creating Layer
objects instead of LayerV1, and adding certain implementations that were
missing in LayerNode.
There is some temporary code in this patch just to make the CI/build
pass. It will be updated as other layers are converted to LayerV2.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Tue, 22 Jun 2021 02:32:12 +0000 (11:32 +0900)]
[layer/factory] Update layer factory for LayerV2
Update the layer factory to create layers with the LayerV2 signature.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Tue, 22 Jun 2021 02:30:43 +0000 (11:30 +0900)]
[layer/node] Update layer and node for LayerV2
Update node_exporter to export weights from run_context
Add corresponding const getters and setters in layer_context
Apply some fixes in var_grad
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Tue, 22 Jun 2021 02:29:57 +0000 (11:29 +0900)]
[fc] Update fc layer for LayerV2
Update fc layer for LayerV2
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Thu, 24 Jun 2021 06:59:51 +0000 (15:59 +0900)]
[layer/optimizer] Reduce usage of getObject() for optimizer
Reduce the usage of getObject() for the optimizer.
This patch changes the optimizer interface to work on each weight
individually and adds getWeightObject() to the LayerContext.
This allows removing getObject() from the neural network class for
backwarding.
This patch also disables dynamic_training_optimization.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Thu, 24 Jun 2021 06:22:48 +0000 (15:22 +0900)]
[network/neuralnet] Reduce dependence on LayerV1
Reduce the dependence on LayerNode()->getObject(), which returns a
LayerV1 object.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Thu, 24 Jun 2021 05:06:26 +0000 (14:06 +0900)]
[backbone] Remove support for scalesize
Remove support for scalesize for the backbone.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Thu, 24 Jun 2021 04:47:23 +0000 (13:47 +0900)]
[layernode] Update throw to retval in setProperty
Replace throw with a return value in setProperty() when the property
is not handled by LayerNode. Illegal properties and non-captured
properties were being conflated in this case.
Now the two are separated by returning false for non-captured
properties. If setProperty() throws, the exception must propagate
upwards, as it indicates an error in the given property.
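As a rough sketch of the convention (illustrative, not the exact LayerNode code), an unhandled property returns false so the caller can pass it on, while an invalid value for a handled property throws:
```cpp
#include <stdexcept>
#include <string>

bool setProperty(const std::string &key, const std::string &value) {
  if (key == "name") {
    if (value.empty())
      throw std::invalid_argument("name must not be empty"); // illegal value
    // ... store the name ...
    return true; // property captured by LayerNode
  }
  return false; // not captured here; the layer itself may handle it
}
```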
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Fri, 25 Jun 2021 10:24:16 +0000 (19:24 +0900)]
[executionMode] Added mode of execution
Added a mode of execution. This is passed to the layers
as well as used internally, rather than just keeping a bool training
flag to denote the current mode of execution.
The header is kept separate so that it can be included wherever
needed without being header heavy.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Fri, 25 Jun 2021 10:05:00 +0000 (19:05 +0900)]
[var_grad] Update trainable to need_gradient
Rename trainable to need_gradient for var_grad, for clarity.
If need_gradient is set, the var_grad will have a gradient allocated for it.
If it is a weight with need_gradient set, gradients will be applied
every iteration.
If need_gradient is not set, the gradient will be a null tensor.
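A rough sketch of these semantics, with a stand-in Tensor type (the real Var_Grad differs in detail):
```cpp
#include <cstddef>
#include <memory>
#include <vector>

using Tensor = std::vector<float>; // stand-in for nntrainer's Tensor

class VarGradSketch {
public:
  VarGradSketch(std::size_t len, bool need_gradient) :
    var(std::make_shared<Tensor>(len)),
    grad(need_gradient ? std::make_shared<Tensor>(len) : nullptr) {}

  bool needsGradient() const { return grad != nullptr; }

private:
  std::shared_ptr<Tensor> var;
  std::shared_ptr<Tensor> grad; // stays null when need_gradient is unset
};
```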
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Wed, 23 Jun 2021 04:50:11 +0000 (13:50 +0900)]
[backbone] Update default backbone to be trainable
Make the backbone trainable by default.
The tflite_layer and nnstreamer_layer cases will be handled
separately when trainable is handled for layers like activation later.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Tue, 22 Jun 2021 08:35:37 +0000 (17:35 +0900)]
[layer] Update getTrainable with supportBackwarding
This patch introduces supportBackwarding.
Currently getTrainable does two jobs:
1. check if the layer is trainable
2. check if the layer can do backwarding
Now the two are separated.
Further, the trainable property has been moved to LayerNode.
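Roughly, the separation looks like this (an illustrative sketch, not the exact interfaces):
```cpp
class Layer {
public:
  virtual ~Layer() = default;
  // property of the layer type: can it compute derivatives at all?
  virtual bool supportBackwarding() const = 0;
};

class LayerNode {
public:
  // per-node, user-settable flag, independent of the layer type
  bool getTrainable() const { return trainable; }
  void setTrainable(bool t) { trainable = t; }

private:
  bool trainable = true;
};
```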
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Tue, 22 Jun 2021 04:33:51 +0000 (13:33 +0900)]
[layernode] Move activation to LayerNode
Move activation from Layer internal to LayerNode
Added a bug fix to avoid looping on the distribute property
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Mon, 21 Jun 2021 06:18:22 +0000 (15:18 +0900)]
[LayerImpl] Add weight/bias properties to LayerImpl
Add weight/bias-related properties such as initializer, regularizer,
etc. to LayerImpl. This creates a differentiating factor between LayerImpl
and LayerDevel, as LayerImpl provides a base class for layers with
weight/bias properties loaded.
Added descriptions to LayerImpl and LayerNode for the properties they
support, and the work they do/must do.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Mon, 21 Jun 2021 02:57:15 +0000 (11:57 +0900)]
[layernode] Move loss to layer node
Move the loss for the registered weights to the layer node.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Fri, 18 Jun 2021 05:16:40 +0000 (14:16 +0900)]
[layer context] Add interfaces for setBatch
Add the interfaces to the layer context that a layer developer requires
when overriding setBatch().
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Fri, 18 Jun 2021 04:26:29 +0000 (13:26 +0900)]
[layernode] Layer interfaces support in LayerNode
Add support for the Layer (layer devel) interfaces in LayerNode.
The LayerNode interfaces take no arguments; LayerNode fills in
the corresponding context needed by the Layer.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Thu, 17 Jun 2021 05:52:27 +0000 (14:52 +0900)]
[manager] Memory allocation for non-weight tensors
Memory allocation for non-weight tensors has been added,
including inputs, outputs and tensors.
For now, this does not include any optimization except the
gradient-based optimization.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Fri, 9 Jul 2021 13:21:40 +0000 (22:21 +0900)]
[Dataset] Add random producer
Add an onehot random producer which generates fake data
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Fri, 9 Jul 2021 12:07:50 +0000 (21:07 +0900)]
[Dataset] Introduce data producer
This patch introduces the data producer, which abstracts a single
iteration creator.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
hyeonseok lee [Tue, 20 Jul 2021 09:32:14 +0000 (18:32 +0900)]
[Fix] Logical expression
- Fix logical expression
- Initialize member variable
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
jijoong.moon [Mon, 5 Jul 2021 12:50:58 +0000 (21:50 +0900)]
[ Layer ] implementation of DropOut Layer
In this commit,
. implementation of DropOutLayer
Resolves:
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
Jihoon Lee [Fri, 9 Jul 2021 10:43:01 +0000 (19:43 +0900)]
[Dataset/CAPI] implement dataset ctor
**Changes proposed in this PR:**
- Implement `ml_train_dataset_create()`
- Implement `ml_train_add_generator()`
- Implement `ml_train_add_file()`
- Add test as per the implementation.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Tue, 20 Jul 2021 05:49:48 +0000 (14:49 +0900)]
[application] MNIST application
Update MNIST application benchmark values based on the updated weight
initialization.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Tue, 20 Jul 2021 05:45:20 +0000 (14:45 +0900)]
[activation] Update implementation for in-place
Update the implementation to handle in-place and non-in-place scenarios.
With the new memory scheme coming in, the in-place activation optimization
and derivative optimization are disabled. This patch updates the activation
function implementations to work in both in-place and non-in-place
scenarios.
RNN, LSTM and GRU use activation functions internally with always-in-place
settings, so both modes are supported.
This patch specifically covers the activation function scenario.
Generic support for in-place will be done when the manager is updated.
This resolves the MNIST training bug.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Mon, 19 Jul 2021 06:30:52 +0000 (15:30 +0900)]
[application] Disable MNIST unittest
The MNIST unittest was not running, as patch
c87963c5433514bccdf0d13f733dd15de4356105
added a file check but returned 0 upon failure.
This patch removes the return on failure, leaving just a warning.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Fri, 9 Jul 2021 09:49:58 +0000 (18:49 +0900)]
[capi] Implement dataset_set_property_for_usage
This patch implements `ml_train_dataset_set_property_for_usage` and its
test.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Fri, 9 Jul 2021 08:10:10 +0000 (17:10 +0900)]
[dataset/cleanup] Remove usage from dataset impl
As each dataset now has one usage (train, valid or test),
this patch removes usage handling from the dataset impl.
**Major Changes**
1. merge the train/test/val members into one
2. clarify some variable names
3. move global variables to members of an instance
4. remove databuffer_utils.h as it is no longer used
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Tue, 13 Jul 2021 06:09:43 +0000 (15:09 +0900)]
[devel] Add layer_devel to devel
Add layer_devel to devel; without this, the build fails with
upstream/main.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Fri, 9 Jul 2021 06:20:32 +0000 (15:20 +0900)]
[dataset/cleanup] Remove type from dataset
This patch removes `datasetUsageType` from dataset interface
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 8 Jul 2021 09:58:43 +0000 (18:58 +0900)]
[dataset] split train / val / test databuffer
This patch splits the train / val / test datasets.
It is now also possible to set each dataset separately on the model.
**Major Changes**
1. `auto dataset = createDataset(train_cb, val_cb, test_cb)` -> `auto dataset_train = createDataset(train_cb)`
2. `NN.setDataset(dataset);` -> `NN.setDataset(DATA_TRAIN, dataset_train)`
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 8 Jul 2021 10:27:50 +0000 (19:27 +0900)]
[dataset] Clean up dataset enums
`nntrainer::DataType` and `nntrainer::BufferType` are duplicated from
ccapi, which was adding complication. This patch simply replaces those
types with the ccapi `DatasetType` / `DatasetDataUsageType`.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 8 Jul 2021 06:57:23 +0000 (15:57 +0900)]
[dataset] Remove label data
Before this patch, label.dat was only used to get the number of classes.
Unfortunately, this clashed with how databuffer_generator gets the
number of classes, creating some inconsistency. This patch fixes
the issue by deprecating label data.
* Checked TCT and there is no critical issue that makes tests fail with
this PR.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Tue, 22 Jun 2021 11:33:36 +0000 (20:33 +0900)]
[CAPI] Propose save/load api
**Motivations**
1. A fine-grained api is required to save and load
**Changes proposed in this PR:**
- Add `save` and `load` functions to model.h
- Mark `readModel()` deprecated (to be removed later)
- Mark `saveModel()` to be changed to `exportTo`
- Add `ModelSaveLoadFormat`
- Create capi-machine-learning-training-tizen-internal for internal api
support
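A minimal usage sketch of the proposal, assuming the ccapi model.h header; only `save`, `load` and `ModelSaveLoadFormat` come from this PR, while the `FORMAT_BIN` enumerator name is a hypothetical placeholder:
```cpp
#include <model.h>

void saveAndReload(ml::train::Model &model) {
  // FORMAT_BIN is a placeholder name, not confirmed by this PR
  model.save("mnist.bin", ml::train::ModelSaveLoadFormat::FORMAT_BIN);
  model.load("mnist.bin", ml::train::ModelSaveLoadFormat::FORMAT_BIN);
}
```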
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 8 Jul 2021 04:47:28 +0000 (13:47 +0900)]
[CAPI] Propose dataset api sets
**Changes proposed in this PR:**
- `ml_train_dataset_create(ml_train_dataset_h *dataset)`
- `ml_train_dataset_add_generator(dataset, usage, callback, user_data)`
- `ml_train_dataset_add_file(dataset, usage, file)`
- `ml_train_dataset_set_property_for_usage(dataset, usage)`
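A hedged usage sketch of the proposed set (error handling abbreviated; the `ML_TRAIN_DATASET_DATA_USAGE_TRAIN` enumerator name and the property string are illustrative guesses):
```cpp
#include <nntrainer.h>

int setupTrainDataset(void) {
  ml_train_dataset_h dataset;
  int status = ml_train_dataset_create(&dataset);
  if (status != ML_ERROR_NONE)
    return status;
  /* attach a file-based source for the train usage */
  status = ml_train_dataset_add_file(
    dataset, ML_TRAIN_DATASET_DATA_USAGE_TRAIN, "train_set.dat");
  /* per-usage property; the trailing NULL ends the vararg list */
  status = ml_train_dataset_set_property_for_usage(
    dataset, ML_TRAIN_DATASET_DATA_USAGE_TRAIN, "buffer_size=100", NULL);
  return status;
}
```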
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Tue, 6 Jul 2021 03:29:02 +0000 (12:29 +0900)]
[Pooling] Apply padding property
This patch applies padding property with tests
Also deletes `enum class PaddingType`
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Mon, 5 Jul 2021 12:09:04 +0000 (21:09 +0900)]
[Conv2D] Apply padding props to conv2d
This patch applies padding property to conv2d
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Tue, 13 Jul 2021 03:00:50 +0000 (12:00 +0900)]
[spec/pkg] Bugfix for dependency
Some of the packages state their `requires` with wrong names (missing
the package name as prefix). This does not show up in the build and unit/app
tests but fails when using these packages externally, as the required
packages don't exist.
For example, the `nntrainer-devel-static` package depends on the `devel`
package, and not on `nntrainer-devel`. This patch provides the
corresponding fix.
Resolves #1399
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Mon, 5 Jul 2021 11:45:20 +0000 (20:45 +0900)]
[Padding] Add padding compute function
This patch implements padding2d::compute with a test
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Wed, 30 Jun 2021 10:09:34 +0000 (19:09 +0900)]
[CAPI] Add ml_train_model_get_input|output_dims
This commit contains a capi proposal for
`ml_train_model_get_input_dimensions` and
`ml_train_model_get_output_dimensions`.
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Mon, 12 Jul 2021 02:07:26 +0000 (11:07 +0900)]
[Resnet] Sync resnet app with validated model
This patch syncs the resnet application with the validated model.
Changes include:
1. s/setIteration/updateIteration
2. normalize input
3. fixed batch size of 128 -> args received from sysargs
4. output layer to copy
5. silently ignore failures to read epoch idx and iteration
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Mon, 5 Jul 2021 07:35:19 +0000 (16:35 +0900)]
[Padding] Add padding verification
This patch implements `Padding2D::isValid` and its tests.
The Padding2D property is valid when:
1. the string is "valid" or "same"
2. it is a comma-separated list of non-negative integer values of size 1, 2, or 4
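A sketch of that rule (illustrative, not the actual `Padding2D::isValid` source):
```cpp
#include <algorithm>
#include <regex>
#include <string>

bool isValidPadding(const std::string &v) {
  if (v == "valid" || v == "same")
    return true;
  // comma-separated non-negative integers, at most four entries
  static const std::regex uint_list(R"(^\s*\d+\s*(,\s*\d+\s*){0,3}$)");
  if (!std::regex_match(v, uint_list))
    return false;
  // only list sizes 1, 2 and 4 are allowed (i.e. 0, 1 or 3 commas)
  const auto commas = std::count(v.begin(), v.end(), ',');
  return commas == 0 || commas == 1 || commas == 3;
}
```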
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 8 Jul 2021 03:30:04 +0000 (12:30 +0900)]
[Fix] File name sanitization
As the colon (':') is not permitted in the Windows file system, having
files with a colon prohibits cloning our repo. This patch fixes the issue.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Wed, 30 Jun 2021 03:26:43 +0000 (12:26 +0900)]
[docs] Add readme for resnet
This patch adds a readme for resnet, including citations.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Mon, 5 Jul 2021 05:35:27 +0000 (14:35 +0900)]
[Padding] Add padding2d prop header
This patch adds the padding2d property header.
The Padding2D prop will be saved as a string, and the integer values are
computed when needed.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 1 Jul 2021 05:13:09 +0000 (14:13 +0900)]
[Fix] Weight initializer stddev calculation
The weight initializer was fitted only to the fully connected layer case;
this patch extends the weight initializer to work properly with other
weights as well.
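For example, a Xavier-style stddev can be derived from each weight's own fan values instead of the fully connected (in x out) assumption; the sketch below shows the common convention (assumed here, not quoted from the patch):
```cpp
#include <cmath>

// common convention: for a conv kernel of shape (out_ch, in_ch, kh, kw),
// fan_in = in_ch * kh * kw and fan_out = out_ch * kh * kw
float xavierNormalStddev(unsigned int fan_in, unsigned int fan_out) {
  return std::sqrt(2.0f / static_cast<float>(fan_in + fan_out));
}
```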
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Fri, 25 Jun 2021 10:31:08 +0000 (19:31 +0900)]
[Resnet] Connect the model with cifar100
**Changes proposed in this PR:**
- Implement cifar100dataloader
- Connect the data loader
- Display arguments/dataset information/time
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Fri, 2 Jul 2021 04:12:56 +0000 (13:12 +0900)]
[Test/Bn] Add conv2d model test
This patch adds a conv2d + bn model test, which was not properly tested before.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Fri, 25 Jun 2021 02:12:48 +0000 (11:12 +0900)]
[resnet] Implement test run to resnet
**Changes proposed**
- Add dataloader with interface `next()`.
- Add RandomDataLoader as an example
- Minor bug fixes to the model architecture regarding name
- Add routine to train the model
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
jijoong.moon [Tue, 22 Jun 2021 00:53:12 +0000 (09:53 +0900)]
[ Recurrent ] Implement Dropout for Recurrent Net
In this commit, dropout for the recurrent network is introduced.
A dropout property is introduced, and if an element of the random
tensor is smaller than the dropout rate, it is set to zero.
An element which is not set to zero is scaled by
1.0/(1.0-dropout).
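This is the usual inverted dropout; a minimal sketch of the masking rule described above:
```cpp
#include <random>
#include <vector>

void applyDropout(std::vector<float> &x, float rate, std::mt19937 &rng) {
  std::uniform_real_distribution<float> dist(0.0f, 1.0f);
  const float scale = 1.0f / (1.0f - rate);
  for (float &v : x) {
    if (dist(rng) < rate)
      v = 0.0f;   // dropped: random draw fell below the dropout rate
    else
      v *= scale; // kept: rescaled so the expected activation is unchanged
  }
}
```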
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
Jihoon Lee [Wed, 30 Jun 2021 07:42:12 +0000 (16:42 +0900)]
[Fix/trivial] Change rpm group description
**Changes proposed in this PR:**
- Change group description to machine learning/ML Framework
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 24 Jun 2021 11:05:45 +0000 (20:05 +0900)]
[Resnet] Create resnet model
This patch adds the code that creates the resnet model.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 24 Jun 2021 08:36:45 +0000 (17:36 +0900)]
[Resnet/skeleton] Add helper functions
**Changes proposed in this PR:**
- Add `withKey()`, `resnetBlock()`, `createResnet18()`
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 24 Jun 2021 05:13:00 +0000 (14:13 +0900)]
[Skeleton] Add resnet application skeleton
This patch adds resnet application skeleton
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
hyeonseok lee [Wed, 23 Jun 2021 02:21:15 +0000 (11:21 +0900)]
[Fix] coverity, svace issues
Coverity
1. Deleted the noexcept keyword where a throw can occur.
2. Initialize member variables in the constructor
3. Correct bitwise operation
resolves: 1238192, 1238193, 1238195, 1238196, 1238298
Svace
1. Catch unhandled throw
resolves: 464062
Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
Jihoon Lee [Mon, 21 Jun 2021 11:43:59 +0000 (20:43 +0900)]
[CustomLoss] Update example
**Changes proposed in this PR:**
- Update example using LayerClient
- Add LayerClient example to AppTest
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Mon, 21 Jun 2021 11:23:19 +0000 (20:23 +0900)]
[CustomLoss] Implement mae loss layer
This patch implements the mae loss layer forward and backward, and also
moves other functions to the source file instead of inlining them.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Wed, 23 Jun 2021 06:49:10 +0000 (15:49 +0900)]
[layer_v2] Fixes to merge layer_v2 with main branch
Fixes to merge layer_v2 branch with main branch.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Thu, 17 Jun 2021 14:07:09 +0000 (23:07 +0900)]
[LayerNode] Change flatten and distribute to prop
**Changes proposed in this PR:**
- Fix bug on `loadProperties`
- Add flatten and distribute to prop
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
hyeonseok lee [Thu, 17 Jun 2021 08:02:57 +0000 (17:02 +0900)]
[Optimizer] Implement getOptimizerVariableDim
- Implement getOptimizerVariableDim in optimizer
- Implement requestOptimizerVariable in manager
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
Jihoon Lee [Thu, 17 Jun 2021 09:48:00 +0000 (18:48 +0900)]
[LayerNode] Add layer(devel) to the node
**Changes proposed in this PR:**
- Add `Layer` to the node
- Remove `node_exporter.h` from `layer_node.h`
- Change the compile-time switch for checking layer v1 vs layer v2 to a
runtime switch
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 17 Jun 2021 08:00:28 +0000 (17:00 +0900)]
[AppContext] Integrate layer-devel
This patch integrates layer devel into appcontext and the plugin features.
From this patch on, it is possible to include LayerV1 and Layer at the
same time.
Also adds a test.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Thu, 17 Jun 2021 07:33:33 +0000 (16:33 +0900)]
[manager] Support initialization/allocation of weights
Support initialization and allocation of weights for LayerV2
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Thu, 17 Jun 2021 06:13:40 +0000 (15:13 +0900)]
[network] Remove check for double activation
Remove the check for double activation; it is now allowed to have
two activation operations one after the other.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Tue, 22 Jun 2021 05:02:19 +0000 (14:02 +0900)]
[layercontext] Minor bugfix for layer context
Minor bugfix for layer context
Removes the extra trainable parameter from layer context
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Thu, 17 Jun 2021 05:52:27 +0000 (14:52 +0900)]
[networkgraph] Network graph updated for Layer V2
The network graph is updated to work with LayerV2.
This involves setting the input dimensions in InitContext, and setting
up the RunContext for each layer before its execution.
Further, some helper functions are also added in LayerNode.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Thu, 17 Jun 2021 06:15:31 +0000 (15:15 +0900)]
[Test/Prepare] Add layer semantics test
For layer_v2, there will be a more structured way to organize tests.
This patch adds a skeleton for the layer semantics test.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 17 Jun 2021 01:30:01 +0000 (10:30 +0900)]
[Layer] Add trainable prop
This patch adds the trainable property to the layer.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Wed, 16 Jun 2021 08:19:49 +0000 (17:19 +0900)]
[manager] Add support for request Inputs/outputs
Added support for requesting inputs and outputs.
Also updated the implementation to use a common function for easier
management.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
hyeonseok lee [Thu, 17 Jun 2021 02:14:07 +0000 (11:14 +0900)]
[layer-internal] refactoring weight, input, output to layer_node
- Implement getters for the variable and grad of weight, input, output
- Implement getNumWeights
Self evaluation:
Build test: [X]Passed [ ]Failed [ ]Skipped
Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
Parichay Kapoor [Wed, 16 Jun 2021 10:55:19 +0000 (19:55 +0900)]
[graph] Support creating RunLayerContext
Implement the function to create RunLayerContext given the
InitLayerContext and update it in the given LayerNode.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Wed, 16 Jun 2021 10:33:56 +0000 (19:33 +0900)]
[Properties] Add float and boolean cases
Add float and boolean cases to be used later.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Fri, 21 May 2021 08:48:15 +0000 (17:48 +0900)]
[Props] Add concept of empty to property
This patch adds the concept of empty to a property, instead of having
an insensible default value inside the property.
before this patch,
```cpp
auto a = Property<int>();
EXPECT_EQ(a.get(), -1); // let's assume -1 is default value which might not be sensible.
```
after this patch
```cpp
auto a = Property<int>();
EXPECT_THROW(a.get(), std::invalid_argument);
EXPECT_TRUE(a.empty());
```
so the underlying value always remains either null or valid.
Now it becomes natural to distinguish mandatory props from
non-mandatory props.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Wed, 16 Jun 2021 07:22:24 +0000 (16:22 +0900)]
[Layer] Scaffold layer impl
**Major changes**
- Create `LayerImpl`
- Change `LayerDevel` signature
s/int initialize/void finalize
s/int setProperty/void setProperty
- Move createLayer to layerImpl
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Wed, 16 Jun 2021 08:41:46 +0000 (17:41 +0900)]
[Layer] Add const to the layer::getName()
getName() does not modify the object, so this patch adds const to
layer::getName().
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
hyeonseok lee [Wed, 16 Jun 2021 05:53:21 +0000 (14:53 +0900)]
[layer-internal] refactoring getInputDimension getOutputDimension
implement getInputDimension, getOutputDimension in layer_node
Self evaluation:
Build test: [X]Passed [ ]Failed [ ]Skipped
Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
Parichay Kapoor [Thu, 17 Jun 2021 01:52:59 +0000 (10:52 +0900)]
[neuralnet] Bug fix for graph usage
Bug fix patch to use constant iterators for the graph.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Fri, 11 Jun 2021 05:15:54 +0000 (14:15 +0900)]
[graph] Remove non-const iterators
Remove non-const iterators for the graph.
Update the corresponding usages for the graph.
Note: the pattern `for (auto const &node : graph)` has been
removed, as it can call the non-const begin()/end() in the background.
The workaround for this is to use std::as_const(graph), but that
is only available from C++17, so this patch uses
the basic const-iterator for-loop pattern.
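A minimal illustration of the point (not the nntrainer graph class): once only const iterators are exposed, a range-for over a non-const graph no longer compiles, so the explicit const-iterator loop is used instead:
```cpp
#include <vector>

class Graph {
public:
  using const_iterator = std::vector<int>::const_iterator;
  const_iterator cbegin() const { return nodes.cbegin(); }
  const_iterator cend() const { return nodes.cend(); }
  // note: no begin()/end(), so `for (auto &n : g)` will not compile

private:
  std::vector<int> nodes{1, 2, 3};
};

void traverse(Graph &g) {
  for (auto it = g.cbegin(); it != g.cend(); ++it) {
    // ... read-only use of *it ...
  }
}
```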
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Fri, 11 Jun 2021 03:28:04 +0000 (12:28 +0900)]
[graph] Graph iterator for sorted and unsorted
Updated the graph iterator to run over the unsorted list of nodes
if the graph is not yet sorted. If the graph is sorted,
the iterator iterates over the sorted nodes.
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Fri, 11 Jun 2021 03:13:50 +0000 (12:13 +0900)]
[graph] Update node_names to unordered_set
Update node_names, which stores all the names of the nodes
in the graph for uniqueness checking, to use an unordered_set.
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Fri, 11 Jun 2021 03:08:26 +0000 (12:08 +0900)]
[graph] Make adjacency list lazily
Update the graph to build the adjacency list lazily, when required
during the topological sort.
Until then, keep the node_list, which simply contains all the nodes.
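A sketch of the lazy construction (structure and names are illustrative only):
```cpp
#include <map>
#include <string>
#include <utility>
#include <vector>

struct Node {
  std::string name;
  std::vector<std::string> inputs;
};

class GraphSketch {
public:
  void addNode(Node n) { node_list.push_back(std::move(n)); }

  void topologicalSort() {
    buildAdjacency(); // the adjacency list is created lazily, only here
    // ... sort over `adj`, e.g. with Kahn's algorithm ...
  }

private:
  void buildAdjacency() {
    adj.clear();
    for (const auto &n : node_list)
      for (const auto &in : n.inputs)
        adj[in].push_back(n.name); // edge: producer -> consumer
  }

  std::vector<Node> node_list; // always kept: simply all the nodes
  std::map<std::string, std::vector<std::string>> adj; // built on demand
};
```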
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Wed, 16 Jun 2021 05:32:29 +0000 (14:32 +0900)]
[manager] Support request Tensors and Weights
Added support for requesting Tensors and Weights.
Moved the specs of Tensors and Weights to Var_Grad and Weight correspondingly.
Also created constructors for weights and var_grad with their specs.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
hyeonseok lee [Tue, 15 Jun 2021 12:38:31 +0000 (21:38 +0900)]
[layer_internal] refactoring read, save
implement the read and save functions in layer_node
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
Jihoon Lee [Wed, 16 Jun 2021 03:20:12 +0000 (12:20 +0900)]
[Test/Scaffolding] Prepare test for the new layer
For now, the layer test is too cluttered, so it is not agile enough to
make any change.
This PR mainly separates the tests into two parts to make the test
itself scalable:
1. Layer-only tests living inside `unittest_layers_{layer_name}`
2. Common tests like setProperty and golden tests in `layers_common_tests.h`
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Tue, 15 Jun 2021 07:43:08 +0000 (16:43 +0900)]
[Refactor] s/Layer/LayerV1
Change the class name of Layer to LayerV1 to prepare the migration.
This should pass the CI test.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Wed, 16 Jun 2021 02:50:09 +0000 (11:50 +0900)]
[context] Layer context creation scaffolding
Added a todo list for the layer context creation.
Added minor new interfaces for the layer context.
Disabled in-place optimization and inference memory optimization.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Wed, 16 Jun 2021 03:03:56 +0000 (12:03 +0900)]
[addition] Bug fix addition layer calcDerivative
Bug fix for the addition layer's calcDerivative().
The addition layer assigns the same derivative tensor memory back to its
connected layers. If the other layers connecting with the addition layer
are in-place, this can lead to wrong results.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Fri, 11 Jun 2021 04:42:13 +0000 (13:42 +0900)]
[Plugin/Test] Add preset plugin for general test
This extracts the common tests so they can be used multiple times.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 10 Jun 2021 10:14:57 +0000 (19:14 +0900)]
[Custom/Loss] Add scaffolding for loss example
This patch generates skeleton code for the mae custom loss example.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Fri, 11 Jun 2021 11:13:24 +0000 (20:13 +0900)]
[CC17] Update type to inline static
This patch updates `type` to be an inline static data member inside the
object, for clarity.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
jijoong.moon [Wed, 16 Jun 2021 08:20:26 +0000 (17:20 +0900)]
[ GRU ] Add GRU Unittest
This commit includes,
. unittests of the gru layer
. keras code to compare against
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>