platform/core/ml/nntrainer.git
3 years ago[tf match] Verified with 1x1 conv
Parichay Kapoor [Fri, 21 Aug 2020 03:19:48 +0000 (12:19 +0900)]
[tf match] Verified with 1x1 conv

Verified with 1x1 conv, but 2x2 conv (the 2nd conv) creates the issue.
This commit helps with reproduction.

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[mnist] Update tensorflow training for mnist application
Parichay Kapoor [Thu, 20 Aug 2020 05:39:09 +0000 (14:39 +0900)]
[mnist] Update tensorflow training for mnist application

Update the tensorflow training example for the mnist application.
With the same initialization as nntrainer (shown with zero initialization),
it matches nntrainer's final accuracy as well as its loss.

See also #133

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[Example] Minor bugfix
Parichay Kapoor [Tue, 25 Aug 2020 07:51:57 +0000 (16:51 +0900)]
[Example] Minor bugfix

Added a minor bugfix to the application ini file

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years agoFix typo in capi-nntrainer spec
Jihoon Lee [Tue, 25 Aug 2020 08:45:28 +0000 (17:45 +0900)]
Fix typo in capi-nntrainer spec

**Changes proposed in this PR:**
- Fix a typo in capi-nntrainer that was causing an unexpected exit

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
3 years ago[Example] Update docs
Jihoon Lee [Tue, 25 Aug 2020 05:45:15 +0000 (14:45 +0900)]
[Example] Update docs

**Changes proposed in this PR:**
- Update docs
- Add demo footage
- Delete batch from model input shape

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
3 years ago[Example] Add missing model file
Jihoon Lee [Tue, 25 Aug 2020 07:44:46 +0000 (16:44 +0900)]
[Example] Add missing model file

Add the missing model file that was excluded by .gitignore; it is force-added.

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
3 years ago[TCM] Fix no assertion test
Jihoon Lee [Mon, 24 Aug 2020 10:22:51 +0000 (19:22 +0900)]
[TCM] Fix no assertion test

This patch fixes tests that have no assertions

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
3 years ago[ Docs ] Add how to run test
Jihoon Lee [Mon, 24 Aug 2020 10:37:27 +0000 (19:37 +0900)]
[ Docs ] Add how to run test

**Changes proposed in this PR:**
- How to run unittest
- How to run sample app test

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
3 years ago[ Application ] change nntrainer_knn to knn_sample accepted/tizen/unified/20200825.033250 submit/tizen/20200824.102155 submit/trunk/20200824.100641
jijoongmoon [Fri, 21 Aug 2020 10:57:22 +0000 (19:57 +0900)]
[ Application ] change nntrainer_knn to knn_sample

The current implementation of KNN does not use nntrainer. However, it is
still a good transfer-learning example using KNN.
Therefore, we change the executable name from nntrainer_knn to
knn_sample.
A main.cpp marked NYI is added instead, until we implement it with
nntrainer.

Issue #457

Signed-off-by: jijoongmoon <jijoong.moon@samsung.com>
3 years ago[ Coverity ] Fix coverity issues.
jijoong.moon [Mon, 24 Aug 2020 01:50:33 +0000 (10:50 +0900)]
[ Coverity ] Fix coverity issues.

This PR includes fixes for coverity issues.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
3 years ago[loss] Rename cost to loss
Parichay Kapoor [Mon, 24 Aug 2020 04:38:33 +0000 (13:38 +0900)]
[loss] Rename cost to loss

Rename cost to loss

See also #239

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[ini] Update INI configuration
Parichay Kapoor [Mon, 24 Aug 2020 02:48:06 +0000 (11:48 +0900)]
[ini] Update INI configuration

Update ini configuration with the below changes:
- filter -> filters (for convolution)
- pooling_size -> pool_size
- epoch -> epochs
- minibatch -> batch_size

See also #239
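
For reference, a minimal lookup-table sketch of the renames listed above (illustrative only; this helper is not part of nntrainer):

```cpp
#include <iostream>
#include <map>
#include <string>

int main() {
  // Old INI keys mapped to the renamed ones described above.
  const std::map<std::string, std::string> renamed_keys = {
    {"filter", "filters"}, // convolution layers
    {"pooling_size", "pool_size"},
    {"epoch", "epochs"},
    {"minibatch", "batch_size"},
  };

  for (const auto &kv : renamed_keys)
    std::cout << kv.first << " -> " << kv.second << '\n';
  return 0;
}
```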

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[init] Update weight initializations
Parichay Kapoor [Mon, 24 Aug 2020 02:09:47 +0000 (11:09 +0900)]
[init] Update weight initializations

Add proper bias initialization rather than just bias_init_zero
Rename weight_ini to weight_initializer
Add the zero initializer to the list of supported initializers

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[INI] This patch refactors the INI naming for saving model file
Parichay Kapoor [Fri, 21 Aug 2020 06:45:13 +0000 (15:45 +0900)]
[INI] This patch refactors the INI naming for saving model file

This patch updates the INI key for saving the model file
from model to save_path

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
3 years ago[license] Fix SPDX license usage
Parichay Kapoor [Mon, 24 Aug 2020 05:29:58 +0000 (14:29 +0900)]
[license] Fix SPDX license usage

Fix SPDX license usage

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[docs] Updated docs to github MD lint
Parichay Kapoor [Mon, 24 Aug 2020 00:36:08 +0000 (09:36 +0900)]
[docs] Updated docs to github MD lint

Updated the docs per GitHub MD lint
Also added minor fixes

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[Tensor] Set param size cannot update size
Parichay Kapoor [Fri, 21 Aug 2020 04:53:07 +0000 (13:53 +0900)]
[Tensor] Set param size cannot update size

Add a restriction so that the param size for layer weights cannot be updated once set.
As of now, this restricts changing the filter size for convolution once it is set.
But this needs to be fixed in convolution: Conv2D should have just one weight and one bias,
not split by filter size.

Resolves #293
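
A rough sketch of the restriction described above (hypothetical names; not the actual nntrainer code):

```cpp
#include <cstddef>
#include <stdexcept>

// Hypothetical layer base: once the weight param size is set,
// attempts to change it are rejected.
class LayerParams {
public:
  void setParamSize(std::size_t size) {
    if (param_size != 0 && param_size != size)
      throw std::invalid_argument("param size cannot be updated once set");
    param_size = size;
  }

private:
  std::size_t param_size = 0;
};
```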

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[ini] Update Network section to Model
Parichay Kapoor [Fri, 21 Aug 2020 05:09:00 +0000 (14:09 +0900)]
[ini] Update Network section to Model

Update section name from Network to Model
Also update internal naming in loadModel and the corresponding unittests

Resolves #318

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[config] Separate loading model with config from neuralnetwork class
Parichay Kapoor [Wed, 5 Aug 2020 10:31:30 +0000 (19:31 +0900)]
[config] Separate loading model with config from neuralnetwork class

Loading the model from a given ini file is separated from the main neuralnetwork class.
This will allow loading from other types of config files and keep the neuralnetwork class cleaner.

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[tensor] restricted setDim is now reshape
Parichay Kapoor [Thu, 20 Aug 2020 06:06:26 +0000 (15:06 +0900)]
[tensor] restricted setDim is now reshape

Tensor setDim() is renamed to reshape(), as a tensor should only be allowed to reshape.
Resizing it invalidates the invariant for other holders of the same tensor data (see also #412).

This patch renames setDim() to reshape(), which throws if there is an attempt to change the size.
Further, getDim() now returns a copy of the dimension to disallow changing a tensor's dimension directly.

Also, some aliases in Tensor (like getWidth()) have been removed; width()-like functions are kept instead
to maintain coherency with the TensorDim() interface.

Resolves #412
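
A minimal sketch of the new contract (hypothetical class; the real Tensor differs): reshape() keeps the element count fixed and throws otherwise, and getDim() hands back a copy:

```cpp
#include <array>
#include <numeric>
#include <stdexcept>

// Hypothetical tensor: reshape() may rearrange dimensions but never
// change the total number of elements; otherwise it throws.
class SimpleTensor {
public:
  explicit SimpleTensor(std::array<unsigned, 4> dim) : dim_(dim) {}

  void reshape(std::array<unsigned, 4> new_dim) {
    if (count(new_dim) != count(dim_))
      throw std::invalid_argument("reshape cannot change tensor size");
    dim_ = new_dim;
  }

  // Returns a copy, so callers cannot change the dimension in place.
  std::array<unsigned, 4> getDim() const { return dim_; }

private:
  static unsigned count(const std::array<unsigned, 4> &d) {
    return std::accumulate(d.begin(), d.end(), 1u,
                           [](unsigned a, unsigned b) { return a * b; });
  }

  std::array<unsigned, 4> dim_;
};
```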

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[ Documents ] Add Documentation
jijoongmoon [Wed, 19 Aug 2020 06:24:35 +0000 (15:24 +0900)]
[ Documents ] Add Documentation

. getting-started.md
. how-to-run-exampels.md
. how-to-use-testcases.md
. Applicatin/Classification/README.md

Signed-off-by: jijoongmoon <jijoong.moon@samsung.com>
4 years ago[neuralnet] Train loop index variable reuse remove
Parichay Kapoor [Wed, 19 Aug 2020 06:48:50 +0000 (15:48 +0900)]
[neuralnet] Train loop index variable reuse remove

Multiple nested loops with the same variable names were being used; fix that.
Also, reporting of the training epoch should start from 1 and not from 0.

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[ MNIST ] Comparison with Tensorflow
jijoong.moon [Wed, 5 Aug 2020 05:19:22 +0000 (14:19 +0900)]
[ MNIST ] Comparison with Tensorflow

This PR includes:
 . mnist tensorflow example to compare
 . data generation for tensorflow
 . input for mnist ini for small size training set
 . fix no_op_prime for activation layer
 . add sqrtDouble
 . remove conv2d rotate_180

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
4 years ago[layers] Make input and hidden consistent for layers
Parichay Kapoor [Wed, 19 Aug 2020 04:25:05 +0000 (13:25 +0900)]
[layers] Make input and hidden consistent for layers

Set input and hidden consistent for all layers.

Resolves #420

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[layers] Remove last layer property
Parichay Kapoor [Wed, 19 Aug 2020 05:41:29 +0000 (14:41 +0900)]
[layers] Remove last layer property

Remove last layer property from all the layers as it is redundant

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[bn_layer] Remove unused gradients
Parichay Kapoor [Wed, 19 Aug 2020 04:06:28 +0000 (13:06 +0900)]
[bn_layer] Remove unused gradients

Remove unused gradients for mu and variance in the batch normalization layers to avoid unnecessary memory usage.

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[Example] Add display result accepted/tizen/unified/20200820.034726 submit/tizen/20200819.035140
Jihoon Lee [Thu, 13 Aug 2020 02:34:26 +0000 (11:34 +0900)]
[Example] Add display result

**Changes proposed in this PR:**
- Run training asynchronously
- Parse train result
- Change draw / result appearance

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years ago[ Bug ] Fix calculating training loss
jijoong.moon [Thu, 13 Aug 2020 11:05:49 +0000 (20:05 +0900)]
[ Bug ] Fix calculating training loss

In this PR, calculation of training loss is fixed.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
4 years ago[ Application ] fix model file path of classification example
jijoong.moon [Tue, 18 Aug 2020 05:50:26 +0000 (14:50 +0900)]
[ Application ] fix model file path of classification example

Previously, the model path was fixed as "../../res/mobilenetv2.tflite".
In this PR, data_path is used to define the path.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
4 years ago[capi] Comply with Tizen ACR comments
Parichay Kapoor [Tue, 18 Aug 2020 05:43:34 +0000 (14:43 +0900)]
[capi] Comply with Tizen ACR comments

This patch applies comments from the Tizen ACR:
- Comments to have % as prefix
- Remove ML_ERROR_CANNOT_ASSIGN_ADDRESS as it is unknown and not used
- Remove `experiment API` statements
- 6.x -> 6.0
- Add groups in api-common.h

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[TCM] Add few negative cases
Jihoon Lee [Thu, 13 Aug 2020 07:35:04 +0000 (16:35 +0900)]
[TCM] Add few negative cases

**Changes proposed in this PR:**
- Add losslayer / util negative cases
- Change ASSERT_EXCEPTION -> EXPECT_THROW

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years ago[ SVACE ] Fix svace issues accepted/tizen/unified/20200813.015359 submit/tizen/20200812.132119
jijoong.moon [Wed, 12 Aug 2020 13:03:40 +0000 (22:03 +0900)]
[ SAVCE ] Fix svace issues

This PR includes fixes for SVACE issues

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
4 years agoAdditional validation and fixes for network::init submit/tizen/20200812.105743
Jihoon Lee [Tue, 11 Aug 2020 06:20:03 +0000 (15:20 +0900)]
Additional validation and fixes for network::init

**Changes proposed in this PR:**
- Add the term `realization`. This term denotes that an enum type is turned
into an instantiation (see the sketch after this list)
- Change `initActivationLayer` -> `realizeActivationType`
- Change `initFlattenLayer` -> `realizeFlattenType`
- Delete `_make_act_layer`
- Add `TensorDim::isEmpty()`
- Clarify `checkValidation` -> `isInitializable` and move the beforehand
validation logic
- Fix logic that prevents double activation
- Move the layer name validation phase `addLayer` -> `isInitializable`
- Prevent realization of `ACT_NONE`
- Sort unittest_modelfile order and update comments
- Enable a few test cases
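
As a rough illustration of `realization` (hypothetical types; not the actual nntrainer code), an enum-typed property is turned into a concrete layer instance, and `ACT_NONE` is never realized:

```cpp
#include <memory>
#include <stdexcept>

enum class ActivationType { ACT_NONE, ACT_SIGMOID, ACT_SOFTMAX };

struct Layer { virtual ~Layer() = default; };
struct SigmoidLayer : Layer {};
struct SoftmaxLayer : Layer {};

// "Realization": an enum-typed property becomes a concrete layer object
// appended to the graph; ACT_NONE must never be realized.
std::unique_ptr<Layer> realizeActivationType(ActivationType type) {
  switch (type) {
  case ActivationType::ACT_SIGMOID:
    return std::make_unique<SigmoidLayer>();
  case ActivationType::ACT_SOFTMAX:
    return std::make_unique<SoftmaxLayer>();
  default:
    throw std::invalid_argument("ACT_NONE cannot be realized");
  }
}
```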

Resolves #390
See also #388

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years ago[CAPI] Add comments for tizen privileges
Parichay Kapoor [Wed, 12 Aug 2020 10:04:42 +0000 (19:04 +0900)]
[CAPI] Add comments for tizen privileges

Add comments for privileges which might be required when saving the model
to a file.

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[model] Change the semantics for setting the batchsize
Parichay Kapoor [Fri, 7 Aug 2020 11:45:07 +0000 (20:45 +0900)]
[model] Change the semantics for setting the batchsize

Batch size as a property was earlier set in input_shape for a layer
as well as for the model. However, this was conflicting and prone to errors.

This patch disables setting the batch size while setting the input_shape.
If the batch size is more than 1 in the input_shape, a warning is issued,
and this value is later overridden by the batch size set for the model.
The same batch size is also used for the dataset.

Resolves #299
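
A small sketch of the described behavior (illustrative only; the function name is hypothetical):

```cpp
#include <iostream>

// If input_shape carries a batch dimension greater than 1, warn and ignore it;
// the model-level batch_size wins and is also used for the dataset.
unsigned resolveBatchSize(unsigned input_shape_batch, unsigned model_batch_size) {
  if (input_shape_batch > 1)
    std::cerr << "warning: batch size given in input_shape is ignored\n";
  return model_batch_size;
}
```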

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[softmax] Bug fix for softmax
Parichay Kapoor [Tue, 11 Aug 2020 08:20:05 +0000 (17:20 +0900)]
[softmax] Bug fix for softmax

Softmax should be applied over one axis (the last axis in this patch) rather than over all axes.
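
For illustration (not the nntrainer implementation), softmax over the last axis of a row-major [rows, cols] buffer, computed independently per row:

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

void softmaxLastAxis(std::vector<float> &data, std::size_t rows, std::size_t cols) {
  for (std::size_t r = 0; r < rows; ++r) {
    float *row = data.data() + r * cols;
    float max = *std::max_element(row, row + cols); // numerical stability
    float sum = 0.0f;
    for (std::size_t c = 0; c < cols; ++c) {
      row[c] = std::exp(row[c] - max);
      sum += row[c];
    }
    for (std::size_t c = 0; c < cols; ++c)
      row[c] /= sum;
  }
}
```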

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[softmax] Bug fix for softmax
Parichay Kapoor [Tue, 11 Aug 2020 08:18:05 +0000 (17:18 +0900)]
[softmax] Bug fix for softmax

Softmax indexing bug fix

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[TensorDim] Change behaviour of default tensorDim constructor
Parichay Kapoor [Wed, 5 Aug 2020 01:29:05 +0000 (10:29 +0900)]
[TensorDim] Change behaviour of default tensorDim constructor

The default TensorDim constructor sets up a TensorDim of all 0s, which is invalid to use by nature.
Setting any dimension resets all the 0 dimensions to 1.
This removes memory allocation by the tensor in its default constructor, yet still allows using the tensor's default constructor.
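
A rough sketch of the described semantics (hypothetical class, not the actual TensorDim):

```cpp
#include <array>

// A default-constructed dimension is all zeros and therefore unusable;
// setting any axis promotes the remaining zeros to 1.
class Dim {
public:
  void setDim(unsigned axis, unsigned value) {
    if (isAllZero())
      dim_.fill(1);
    dim_[axis] = value;
  }

  bool isAllZero() const {
    for (unsigned d : dim_)
      if (d != 0)
        return false;
    return true;
  }

private:
  std::array<unsigned, 4> dim_{}; // batch, channel, height, width all start at 0
};
```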

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years agoAdd atomicity to loadFromConfig
Jihoon Lee [Mon, 10 Aug 2020 07:26:15 +0000 (16:26 +0900)]
Add atomicity to loadFromConfig

resolves #382

**Changes proposed in this PR:**
- Add swap for neuralNetwork
- Correct doxygen style for neuralnet members
- Add optimizer copy ctor

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years agoAdd safeguard to prevent loadFromConf twice
Jihoon Lee [Thu, 6 Aug 2020 07:53:05 +0000 (16:53 +0900)]
Add safeguard to prevent loadFromConf twice

**Changes proposed in this PR:**
- This patch disables loadFromConfig after loading

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years ago[capi] Add tizen feature checking accepted/tizen/unified/20200811.050512 submit/tizen/20200811.015558
Parichay Kapoor [Mon, 10 Aug 2020 05:19:44 +0000 (14:19 +0900)]
[capi] Add tizen feature checking

Added tizen feature checking for C-API
Added corresponding bypass for the tizen capi unittests

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[ Release ] Change version of nntrainer
jijoong.moon [Mon, 10 Aug 2020 08:02:43 +0000 (17:02 +0900)]
[ Release ] Change version of nntrainer

Release Version 0.1.0.rc1

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
4 years ago[Example] Change user scenario submit/tizen/20200810.070913
Jihoon Lee [Thu, 6 Aug 2020 07:41:52 +0000 (16:41 +0900)]
[Example] Change user scenario

In this patch, the user draws a smiling face and a sad face 5 times
each.

The data is split in half into a trainset/validationSet and used for training.

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years ago[Tensor] Enable default copy constructor
Jihoon Lee [Tue, 4 Aug 2020 11:42:07 +0000 (20:42 +0900)]
[Tensor] Enable default copy constructor

This patch enables default copy constructor to fix #281.

- Added sharedConstTensor for safety

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years ago[api] Update since tizen version accepted/tizen/unified/20200810.123036 submit/tizen/20200810.011036
Parichay Kapoor [Fri, 7 Aug 2020 08:09:41 +0000 (17:09 +0900)]
[api] Update since tizen version

Update since tizen version from 6.x to 6.0

See also #414

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[api] Add note that this API is experimental
Parichay Kapoor [Fri, 7 Aug 2020 08:06:28 +0000 (17:06 +0900)]
[api] Add note that this API is experimental

Added a note that this API is experimental and not stable
to most of the functions, so that it is visible in the SDK function descriptions

See Also #414

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[ MNIST ] Fix bug about ADAM implementation submit/tizen/20200807.102403
jijoong.moon [Thu, 30 Jul 2020 11:50:36 +0000 (20:50 +0900)]
[ MNIST ] Fix bug about ADAM implementation

There was a bug in the ADAM optimizer calculation.
This PR provides the fix.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
4 years ago[ README ] Update README file
jijoong.moon [Wed, 5 Aug 2020 06:11:12 +0000 (15:11 +0900)]
[ README ] Update README file

Update README file to give proper information about NNTrainer

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
4 years agoTizen C-API application bug fix
Jihoon Lee [Tue, 4 Aug 2020 11:31:44 +0000 (20:31 +0900)]
Tizen C-API application bug fix

Issue #377 has occurred since #403.

Found that the current classification assumes the trainingSet is used as the
validationSet.

Fixed accordingly, which resolves #377

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years agoAttach nntrainer api
Jihoon Lee [Tue, 28 Jul 2020 04:59:08 +0000 (13:59 +0900)]
Attach nntrainer api

**Changes proposed in this PR:**
- Attach nntrainer api for training

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years agoAdd neuralnet test cases that is not working
Jihoon Lee [Thu, 30 Jul 2020 12:41:58 +0000 (21:41 +0900)]
Add neuralnet test cases that is not working

Add test cases. Some cases are not working, so they are disabled.

After fixing the bugs in #388 they can be opened again

**Changes proposed in this PR:**
- Fix some typos
- Add some debug logs

See also #388

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years ago[Example] Add feature extraction part
Jihoon Lee [Tue, 28 Jul 2020 00:59:42 +0000 (09:59 +0900)]
[Example] Add feature extraction part

**Changes proposed in this PR:**
- Add feature extraction using `ml-pipeline-api` (saved to training.dat)
- Handle exit canvas / proceed / reset button
- Add path parsing utils
- Add logger macro

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years agoRefactor neuralnet::init
Jihoon Lee [Wed, 29 Jul 2020 11:34:24 +0000 (20:34 +0900)]
Refactor neuralnet::init

**Changes proposed in this PR:**
- Add determinancy and check to
- Delete cost / bn_follow from layer
- Enable negative test for modelfile
- Fix IniSection::setEntry wrongly using the key for erase behavior
- Delete `VERIFY_SET_DIMENSION()`

Resolves #316
See also #374

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years ago[Layer] Update forward and backward layer functions
Parichay Kapoor [Mon, 3 Aug 2020 09:08:02 +0000 (18:08 +0900)]
[Layer] Update forward and backward layer functions

Update the forwarding and backwarding layer functions to support
multiple inputs and outputs.
This is crucial to support concat, element-wise add, and similar kinds of layers,
as well as skip connections.
Further, forwarding no longer returns a status but rather throws.
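
A hypothetical sketch of such an interface (names and types are illustrative, not the actual nntrainer API): forwarding takes and returns a list of tensors and throws instead of returning a status:

```cpp
#include <memory>
#include <stdexcept>
#include <vector>

struct Tensor {};
using sharedTensor = std::shared_ptr<Tensor>;

// Hypothetical interface: a layer consumes and produces a list of tensors,
// and signals errors by throwing instead of returning a status code.
class Layer {
public:
  virtual ~Layer() = default;

  virtual std::vector<sharedTensor>
  forwarding(const std::vector<sharedTensor> &inputs) {
    if (inputs.empty())
      throw std::invalid_argument("layer requires at least one input");
    return inputs; // identity placeholder
  }
};
```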

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[unittest] Tizen C-API unittest bug fix
Parichay Kapoor [Mon, 3 Aug 2020 10:30:35 +0000 (19:30 +0900)]
[unittest] Tizen C-API unittest bug fix

Bugs in the train-with-generator positive unittests are fixed.
Now, it is also training.

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years agoPrint log file and line for better debugging
Jihoon Lee [Fri, 31 Jul 2020 02:19:07 +0000 (11:19 +0900)]
Print log file and line for better debugging

**Changes proposed in this PR:**
- Add file and line to the logger
- Delete log tag for linux distro (as it is obvious and consumes space)

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years ago[C-API] C-API updated with tizen ACR
Parichay Kapoor [Thu, 23 Jul 2020 05:31:07 +0000 (14:31 +0900)]
[C-API] C-API updated with tizen ACR

Updating C-API with tizen ACR
Also add corresponding changes in the package

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years agoUpdate setDataset / loadDataset
Jihoon Lee [Thu, 30 Jul 2020 02:16:46 +0000 (11:16 +0900)]
Update setDataset / loadDataset

When a path is given and it is invalid, the call should fail. This patch updates
the behavior accordingly.

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [ ]Passed [ ]Failed [X]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years ago[layers] Remove extra forwarding definition from layers
Parichay Kapoor [Thu, 30 Jul 2020 07:22:07 +0000 (16:22 +0900)]
[layers] Remove extra forwarding definition from layers

Remove the extra forwarding declaration and definition from layers.
forwarding(in, out, status) is only to be supported by the loss layer;
it is not needed by other layers and should not be supported.

**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[ Coverity ] Fix Coverity Issues
jijoong.moon [Mon, 3 Aug 2020 07:15:26 +0000 (16:15 +0900)]
[ Coverity ] Fix Coverity Issues

This PR includes fixes for Coverity issues.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
4 years agoChange exception::invalid_prop -> not_supported
Jihoon Lee [Wed, 29 Jul 2020 11:06:41 +0000 (20:06 +0900)]
Change exception::invalid_prop -> not_supported

Change invalid_prop -> not_supported for generality

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years ago[ SVACE ] Fix svace issues. accepted/tizen/unified/20200731.145711 submit/tizen/20200731.001526 submit/trunk/20200730.111511
jijoong.moon [Thu, 30 Jul 2020 02:34:33 +0000 (11:34 +0900)]
[ SVACE ] Fix svace issues.

This PR includes fixes for svace issues

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
4 years agoFix bug where logging happens twice
Jihoon Lee [Wed, 29 Jul 2020 11:59:45 +0000 (20:59 +0900)]
Fix bug where logging happens twice

This patch fixes a bug where logging happens twice in a non-Tizen environment

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years agoUnify optimizer initialize
Jihoon Lee [Wed, 29 Jul 2020 11:24:57 +0000 (20:24 +0900)]
Unify optimizer initialize

Many layers have setOptimizer implemented, while the base layer can handle it for all.

**Changes proposed in this PR:**
- Delete unnecessary setOptimizer
- Unify optimizer::initialize signature

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years agoChange neuralnet setConfig to throw
Jihoon Lee [Tue, 28 Jul 2020 08:59:09 +0000 (17:59 +0900)]
Change neuralnet setConfig to throw

Change `Neuralnet::setConfig` to throw instead of returning a status

**This PR also patches:**
- Minor typos
- Delete unused function
- Fix neuralnet ctor delegation order

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years ago[ Meson ] Add app sanity test
Jihoon Lee [Wed, 29 Jul 2020 07:57:07 +0000 (16:57 +0900)]
[ Meson ] Add app sanity test

This patch currently checks that the example apps run fine.

To make this better, we should add parameterized app tests with golden
results plus some negative cases rather than running 1 test per app;
this is not done because it is not the highest priority.

This patch only ensures that things run fine for the current setup (ini
and other stuff).

So these tests do not guarantee that an app runs fine in every
case.

resolves #375
see also #374

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years ago[ ADAM ] validation of adam optimizer
jijoong.moon [Mon, 27 Jul 2020 02:14:56 +0000 (11:14 +0900)]
[ ADAM ] validation of adam optimizer

validation of adam

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
4 years agoKNN notation change
brainer [Tue, 28 Jul 2020 16:26:02 +0000 (01:26 +0900)]
KNN notation change

Wrong notation in KNN is changed to k-NN

Signed-off-by: brainer <jeansanghyun@gmail.com>
4 years ago[ Coverity ] Fix Coverity Issues accepted/tizen/unified/20200728.135447 submit/tizen/20200728.023042
jijoong.moon [Fri, 24 Jul 2020 12:15:21 +0000 (21:15 +0900)]
[ Coverity ] Fix Coverity Issues

This PR includes fixes for Coverity issues.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
4 years ago[ Unit Test ] Add Verification of Conv2D
jijoong.moon [Thu, 23 Jul 2020 05:47:22 +0000 (14:47 +0900)]
[ Unit Test ] Add Verification of Conv2D

Finalize Conv2D verification with tensorflow

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
4 years ago[API] Make C-API thread safe
Parichay Kapoor [Wed, 22 Jul 2020 03:17:34 +0000 (12:17 +0900)]
[API] Make C-API thread safe

Added thread-locking safety for the ml_train C-API.
There is a global lock to assist with the destruction of objects.
Each object gets its own lock, which maintains safety when accessing/modifying that object.
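
A minimal sketch of this locking scheme (illustrative only; the actual C-API internals differ):

```cpp
#include <mutex>

// One global lock serializes object destruction; each object also carries
// its own lock that guards access and modification.
static std::mutex global_lock;

struct MLTrainObject {
  std::mutex lock;
  // ... handle state ...
};

void modifyObject(MLTrainObject &obj) {
  std::lock_guard<std::mutex> guard(obj.lock); // per-object safety
  // ... read or modify the object ...
}

void destroyObject(MLTrainObject &obj) {
  std::lock_guard<std::mutex> guard(global_lock); // serialize destruction
  // ... release the object's resources ...
}
```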

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[unittest] Moved unittest activation out of utils
Parichay Kapoor [Tue, 21 Jul 2020 09:32:15 +0000 (18:32 +0900)]
[unittest] Moved unittest activation out of utils

Moved the activation unittests out of util.
Set the error tolerance level to 1e-6.
This has exposed a bug with the batch normalization unittest: its golden values are 0.
So, it is disabled for now.
@zhoonit please check it

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[activation] Update signature of activation prime
Parichay Kapoor [Tue, 21 Jul 2020 09:29:57 +0000 (18:29 +0900)]
[activation] Update signature of activation prime

Update the signature to accommodate the derivative in activation_prime.
This allows softmax_prime to be managed with the current activation_layer class.
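
Roughly, the change looks like the following sketch (hypothetical signature and names, not the actual nntrainer code): the derivative flowing back is passed into the prime function so the chain rule is applied inside the activation layer itself, which is the hook softmax needs:

```cpp
#include <cstddef>
#include <vector>

using Tensor = std::vector<float>;

// Hypothetical prime function taking the incoming derivative in addition
// to the forward output; shown here for sigmoid.
Tensor sigmoid_prime(const Tensor &sigmoid_out, const Tensor &derivative) {
  Tensor out(sigmoid_out.size());
  for (std::size_t i = 0; i < sigmoid_out.size(); ++i)
    out[i] = sigmoid_out[i] * (1.0f - sigmoid_out[i]) * derivative[i];
  return out;
}
```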

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years agoBump to 6.0
Jihoon Lee [Wed, 22 Jul 2020 07:28:27 +0000 (16:28 +0900)]
Bump to 6.0

**Changes proposed in this PR:**
- Bump project version to 6.0 in order to use nntrainer

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years agoFix rpm spec file dependency
Jihoon Lee [Wed, 22 Jul 2020 10:23:17 +0000 (19:23 +0900)]
Fix rpm spec file dependency

This patch fixes the rpm spec file dependency for `capi-nntrainer`

Resolves #357

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years ago[neuralNet] Moving batch size property to compile time
Parichay Kapoor [Tue, 21 Jul 2020 06:46:12 +0000 (15:46 +0900)]
[neuralNet] Moving batch size property to compile time

The batch size property should be set before compiling for now.
Initializing the layers allocates memory for the layers.
Setting the batch size property with the training call requires the layers to be re-initialized,
which isn't supported yet.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[neuralnet] Bug fix for setTrainConfig
Parichay Kapoor [Tue, 21 Jul 2020 09:55:07 +0000 (18:55 +0900)]
[neuralnet] Bug fix for setTrainConfig

Add a bug fix for setTrainConfig
The break in the default case of the switch was hiding the error

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[Example] Add drawing part
Jihoon Lee [Tue, 21 Jul 2020 02:35:52 +0000 (11:35 +0900)]
[Example] Add drawing part

This PR adds the drawing part of example/customShortcut

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years ago[sigmoid] Rename sigmoidePrime to sigmoidPrime accepted/tizen/unified/20200721.042553 submit/tizen/20200721.005828
Parichay Kapoor [Mon, 20 Jul 2020 07:37:07 +0000 (16:37 +0900)]
[sigmoid] Rename sigmoidePrime to sigmoidPrime

Rename sigmoidePrime to sigmoidPrime

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[activation] Move activation functions to activation file
Parichay Kapoor [Mon, 20 Jul 2020 07:35:16 +0000 (16:35 +0900)]
[activation] Move activation functions to activation file

Move activation operator functions to activation file as static members

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[ Coverity ] Fix coverity issues
jijoong.moon [Mon, 20 Jul 2020 11:23:06 +0000 (20:23 +0900)]
[ Coverity ] Fix coverity issues

This PR includes fixes for coverity issues

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
4 years ago[ Coverity ] Fix Coverity Issues
jijoong.moon [Mon, 20 Jul 2020 04:42:59 +0000 (13:42 +0900)]
[ Coverity ] Fix Coverity Issues

This PR fixes coverity issues.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
4 years ago[activation] Update input arguments of softmax
Parichay Kapoor [Mon, 20 Jul 2020 04:49:09 +0000 (13:49 +0900)]
[activation] Update input arguments of softmax

Update the input arguments of softmax from Tensor to Tensor const & to match the activation
function signature

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[Loss] Update MSR to MSE
Parichay Kapoor [Mon, 20 Jul 2020 05:09:38 +0000 (14:09 +0900)]
[Loss] Update MSR to MSE

Update the name of mean squared error from MSR to MSE

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[unittest] Add unittest for backwarding of loss layers
Parichay Kapoor [Mon, 20 Jul 2020 04:01:14 +0000 (13:01 +0900)]
[unittest] Add unittest for backwarding of loss layers

This patch adds unittests for the backwarding of loss layers with fc and activation layers.
Major bug fixes for the backwarding of all the loss layers to match tensorflow.
Major hot fix for the softmax derivative - the activation layer semantics need to be updated to
properly fix the softmax layer derivative.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[ Bug ] Fix initialize buffer for adam optimizer submit/tizen/20200807.102403
jijoong.moon [Fri, 17 Jul 2020 05:58:33 +0000 (14:58 +0900)]
[ Bug ] Fix initialize buffer for adam optimizer

. Add initialize(std::shared_ptr<UpdatableParam> params, ...)
. Update initialize conv2d for optimizer
. Update to do addition for calculating derivatives of the filter

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
4 years agoFix summary logic error
Jihoon Lee [Mon, 20 Jul 2020 02:07:28 +0000 (11:07 +0900)]
Fix summary logic error

Fix flag to be parsed properly by verbosity

cc. @jijoongmoon

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years ago[CAPI] Implement summary capi
Jihoon Lee [Fri, 17 Jul 2020 07:54:49 +0000 (16:54 +0900)]
[CAPI] Implement summary capi

This PR mainly adds `NeuralNetwork::print`. Note that it should be heavily
refactored for proper functionality. See the @todo's in this patch.

**Changes proposed in this PR:**
- add `NeuralNetwork::print`
- implement `model_get_summary`
- Add capi test for `model_get_summary`
- move `ml_train_summary_type_e` to `api-common`
- minor bug fix

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years agoChange nntrainer include in applications
Jihoon Lee [Fri, 17 Jul 2020 08:16:38 +0000 (17:16 +0900)]
Change nntrainer include in applications

The nntrainer include should include the api_common header as well.
Android.mk in the applications has been fixed accordingly.

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years agoAdd header guard to nntrainer-api-common.h
Jihoon Lee [Fri, 17 Jul 2020 07:34:29 +0000 (16:34 +0900)]
Add header guard to nntrainer-api-common.h

Add a header guard to nntrainer-api-common to prevent redeclaration.
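
The guard takes the usual form (the macro name shown here is illustrative; the actual one may differ):

```cpp
#ifndef __NNTRAINER_API_COMMON_H__
#define __NNTRAINER_API_COMMON_H__

/* ... declarations shared across the C API headers ... */

#endif /* __NNTRAINER_API_COMMON_H__ */
```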

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years agoFix neuralnetwork property logic
Jihoon Lee [Fri, 17 Jul 2020 05:46:15 +0000 (14:46 +0900)]
Fix neuralnetwork property logic

exception::invalid_property for a not-allowed property should not be
ignored in some cases. So, the parsing process is patched accordingly.

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years ago[fc/activation] Unittest for fc, activation layers and sgd optimizer
Parichay Kapoor [Fri, 17 Jul 2020 03:53:34 +0000 (12:53 +0900)]
[fc/activation] Unittest for fc, activation layers and sgd optimizer

Add a combined unittest for fc and activation layers (sigmoid and softmax) backwarding.
Note that this is done without the loss.
As SGD is used for the updates, this also acts as a test of the SGD optimizer.

V2:
Using paramsAt() in unittest
Moved paramsAt() to public from protected

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years agoAdd test for modelfiles
Jihoon Lee [Thu, 16 Jul 2020 01:55:57 +0000 (10:55 +0900)]
Add test for modelfiles

Add automated test scaffolding to test modelfile configuration &
 NN initiation.

**Changes proposed in this PR:**
- Add IniSection class to control ini (for testing)
- Add Ini test fixture for parameterized ini testing
- Add `unittest_nntrainer_modelfile.cpp`

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years ago[api] remove duplicate declarations
Parichay Kapoor [Fri, 17 Jul 2020 05:12:02 +0000 (14:12 +0900)]
[api] remove duplicate declarations

Remove duplicate declarations arising from the auto merge

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[c-api] Function namespace correction
Parichay Kapoor [Fri, 17 Jul 2020 05:32:41 +0000 (14:32 +0900)]
[c-api] Function namespace correction

get_summary function namespace correction

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[layer] Sum over batch for gradients
Parichay Kapoor [Thu, 16 Jul 2020 11:44:44 +0000 (20:44 +0900)]
[layer] Sum over batch for gradients

Gradients of the layer should be summed over the batch rather than averaged.
This is consistent with tensorflow's implementation.
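
Illustration only (not nntrainer code): the per-sample gradients are accumulated by summation over the batch, not divided by the batch size:

```cpp
#include <cstddef>
#include <vector>

// Per-sample gradients are accumulated by summation over the batch,
// not divided by the batch size.
std::vector<float>
batchGradient(const std::vector<std::vector<float>> &per_sample_grads) {
  std::vector<float> grad(
      per_sample_grads.empty() ? 0 : per_sample_grads[0].size(), 0.0f);
  for (const auto &g : per_sample_grads)
    for (std::size_t i = 0; i < g.size(); ++i)
      grad[i] += g[i]; // sum, not average
  return grad;
}
```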

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[optimzer/layers] Gradient dimension should match weight dim
Parichay Kapoor [Thu, 16 Jul 2020 11:39:29 +0000 (20:39 +0900)]
[optimzer/layers] Gradient dimension should match weight dim

The gradient dimension should match the weight dimension.
Currently, the optimizer applies averaging of gradients, which is not correct.
Apply averaging of gradients before calling applyGradients.

Resolves #280

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years ago[api] remove duplicate declaration
Parichay Kapoor [Thu, 16 Jul 2020 10:49:43 +0000 (19:49 +0900)]
[api] remove duplicate declaration

Remove duplicate declaration of ml_train_model_run_async

cc. @jijoongmoon

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
4 years agoFollow-up after #308
Jihoon Lee [Thu, 16 Jul 2020 02:50:15 +0000 (11:50 +0900)]
Follow-up after #308

**Changes proposed in this PR:**
- Now friendship between layer and network is over
- Move `setProperty(propType)` to public
- Add custom exception in `_error.h`
- Change `setProperty` `std::out_of_range` ->
`exception::invalid_property`
- Add error code boundary to `nntrainer_exception_boundary`
- Change inherited docs to @copydoc

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

resolves #315

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
4 years ago[API] Finalize the first draft of C-API
Parichay Kapoor [Wed, 8 Jul 2020 12:49:26 +0000 (21:49 +0900)]
[API] Finalize the first draft of C-API

Finalize the first draft of C-API

V2:
Applied review comments
Updated namespace to `ml_train_*`
Added module doc
Added enums for loss and optimizer
Added loss as a parameter for model_compile

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>