Jihoon Lee [Thu, 13 Aug 2020 07:35:04 +0000 (16:35 +0900)]
[TCM] Add a few negative cases
**Changes proposed in this PR:**
- Add loss layer / util negative cases
- Change ASSERT_EXCEPTION -> EXPECT_THROW
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
jijoong.moon [Wed, 12 Aug 2020 13:03:40 +0000 (22:03 +0900)]
[ SVACE ] Fix SVACE issues
This PR includes fixes for SVACE issues
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
Jihoon Lee [Tue, 11 Aug 2020 06:20:03 +0000 (15:20 +0900)]
Additional validation and fixes for network::init
**Changes proposed in this PR:**
- Add the term `realization`, denoting that an enum type is turned
into an instantiation
- Change `initActivationLayer` -> `realizeActivationType`
- Change `initFlattenLayer` -> `realizeFlattenType`
- Delete `_make_act_layer`
- Add `TensorDim::isEmpty()`
- Clarify `checkValidation` -> `isInitializable` and move up-front
validation logic
- Fix logic that prevents double activation
- Move the layer name validation phase `addLayer` -> `isInitializable`
- Prevent realization of `ACT_NONE`
- Sort unittest_modelfile order and update comments
- Enable a few test cases
Resolves #390
See also #388
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Wed, 12 Aug 2020 10:04:42 +0000 (19:04 +0900)]
[CAPI] Add comments for tizen privileges
Add comments for the privileges which might be required when saving the model
to a file.
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Fri, 7 Aug 2020 11:45:07 +0000 (20:45 +0900)]
[model] Change the semantics for setting the batchsize
The batch size property could previously be set both via a layer's
input_shape and for the model. However, this was conflicting and error-prone.
This patch disables setting the batch size while setting input_shape.
If the batch size in input_shape is more than 1, a warning is issued,
and the value is later overridden by the batch size set for the model.
The same batch size is also used for the dataset.
Resolves #299
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
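The override semantics above can be sketched in a few lines; this is an illustrative model of the described behavior (the function name and warning text are hypothetical, not nntrainer's code):

```python
def resolve_batch_size(input_shape_batch, model_batch_size):
    """Hypothetical sketch: a batch size > 1 given via input_shape only
    triggers a warning; the model-level batch size wins and is also
    used for the dataset."""
    if input_shape_batch > 1:
        print("warning: batch size set via input_shape is ignored")
    return model_batch_size
```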
Parichay Kapoor [Tue, 11 Aug 2020 08:20:05 +0000 (17:20 +0900)]
[softmax] Bug fix for softmax
Softmax should be applied over a single axis (the last axis in this patch) rather than over all axes.
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
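The distinction matters because normalizing over every element at once would couple unrelated rows. A minimal sketch of per-row (last-axis) softmax, illustrative rather than nntrainer's implementation:

```python
import math

def softmax_last_axis(rows):
    """Apply softmax to each row independently (the last axis),
    rather than normalizing over every element of the tensor."""
    out = []
    for row in rows:
        m = max(row)  # subtract the max for numerical stability
        exps = [math.exp(v - m) for v in row]
        s = sum(exps)
        out.append([e / s for e in exps])
    return out
```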
Parichay Kapoor [Tue, 11 Aug 2020 08:18:05 +0000 (17:18 +0900)]
[softmax] Bug fix for softmax
Softmax indexing bug fix
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Wed, 5 Aug 2020 01:29:05 +0000 (10:29 +0900)]
[TensorDim] Change behaviour of default tensorDim constructor
The default TensorDim constructor sets up a TensorDim of all 0s, which is inherently invalid to use.
Setting any dimension resets all the 0 dimensions to 1.
This removes the memory allocation by the tensor in the default constructor, yet still allows using the tensor's default constructor
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
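A toy model of the described constructor behavior; class and method names here are illustrative, not nntrainer's actual TensorDim API:

```python
class TensorDimSketch:
    """Default-constructed dims are all 0 (invalid, so no memory needs
    to be allocated); setting any dimension promotes the remaining 0s
    to 1, as described in the commit above."""

    def __init__(self):
        self.dims = [0, 0, 0, 0]  # batch, channel, height, width

    def set_dim(self, index, value):
        # any still-unset (0) dimension becomes 1 before the assignment
        self.dims = [d if d != 0 else 1 for d in self.dims]
        self.dims[index] = value

    def is_empty(self):
        return all(d == 0 for d in self.dims)
```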
Jihoon Lee [Mon, 10 Aug 2020 07:26:15 +0000 (16:26 +0900)]
Add atomicity to loadFromConfig
resolves #382
**Changes proposed in this PR:**
- Add swap for NeuralNetwork
- Correct doxygen style for neuralnet members
- Add optimizer copy ctor
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 6 Aug 2020 07:53:05 +0000 (16:53 +0900)]
Add a safeguard to prevent calling loadFromConfig twice
**Changes proposed in this PR:**
- This patch disables loadFromConfig after loading
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Mon, 10 Aug 2020 05:19:44 +0000 (14:19 +0900)]
[capi] Add tizen feature checking
Added Tizen feature checking for the C-API,
and a corresponding bypass for the Tizen C-API unit tests
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
jijoong.moon [Mon, 10 Aug 2020 08:02:43 +0000 (17:02 +0900)]
[ Release ] Change version of nntrainer
Release Version 0.1.0.rc1
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
Jihoon Lee [Thu, 6 Aug 2020 07:41:52 +0000 (16:41 +0900)]
[Example] Change user scenario
In this patch, the user draws a smiling face and a sad face five times
each.
The drawings are split in half into a training set and a validation set
and used for training.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Tue, 4 Aug 2020 11:42:07 +0000 (20:42 +0900)]
[Tensor] Enable default copy constructor
This patch enables default copy constructor to fix #281.
- Added sharedConstTensor for safety
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Fri, 7 Aug 2020 08:09:41 +0000 (17:09 +0900)]
[api] Update since tizen version
Update the since-Tizen version from 6.x to 6.0
See also #414
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Fri, 7 Aug 2020 08:06:28 +0000 (17:06 +0900)]
[api] Add note that this API is experimental
Added a note that this API is experimental and not stable
to most of the functions, so that it is visible in the SDK function descriptions
See Also #414
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
jijoong.moon [Thu, 30 Jul 2020 11:50:36 +0000 (20:50 +0900)]
[ MNIST ] Fix bug about ADAM implementation
There was a bug in the Adam optimizer calculation.
This PR provides a fix.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
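For reference, a textbook Adam update step with bias correction is sketched below; this is the standard formulation, not the exact nntrainer code that was fixed:

```python
import math

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a scalar weight; t is the 1-based step count."""
    m = beta1 * m + (1 - beta1) * grad          # first moment estimate
    v = beta2 * v + (1 - beta2) * grad * grad   # second moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (math.sqrt(v_hat) + eps)
    return w, m, v
```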
jijoong.moon [Wed, 5 Aug 2020 06:11:12 +0000 (15:11 +0900)]
[ README ] Update README file
Update README file to give proper information about NNTrainer
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
Jihoon Lee [Tue, 4 Aug 2020 11:31:44 +0000 (20:31 +0900)]
Tizen C-API application bug fix
Issue #377 has occurred since #403.
Found that the current classification example assumes the training set is
also used as the validation set.
Fixed accordingly, which resolves #377
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Tue, 28 Jul 2020 04:59:08 +0000 (13:59 +0900)]
Attach nntrainer api
**Changes proposed in this PR:**
- Attach nntrainer api for training
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 30 Jul 2020 12:41:58 +0000 (21:41 +0900)]
Add neuralnet test cases that are not working
Add test cases. Some cases are not working, so they are disabled.
After fixing the bugs in #388 they can be enabled again.
**Changes proposed in this PR:**
- Fix some typos
- Add some debug logs
See also #388
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Tue, 28 Jul 2020 00:59:42 +0000 (09:59 +0900)]
[Example] Add feature extraction part
**Changes proposed in this PR:**
- Add feature extraction using `ml-pipeline-api` (saved to training.dat)
- Handle exit canvas / proceed / reset button
- Add path parsing utils
- Add logger macro
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Wed, 29 Jul 2020 11:34:24 +0000 (20:34 +0900)]
Refactor neuralnet::init
**Changes proposed in this PR:**
- Add determinancy and check to
- Delete cost / bn_follow from layer
- Enable negative test for modelfile
- Fix IniSection::setEntry wrongly using the key for erase behavior
- Delete `VERIFY_SET_DIMENSION()`
Resolves #316
See also #374
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Mon, 3 Aug 2020 09:08:02 +0000 (18:08 +0900)]
[Layer] Update forward and backward layer functions
Update the forwarding and backwarding layer functions to support
multiple inputs and outputs.
This is crucial to support layers such as concat and element-wise add,
as well as skip connections.
Further, forwarding no longer returns a status but throws instead.
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Mon, 3 Aug 2020 10:30:35 +0000 (19:30 +0900)]
[unittest] Tizen C-API unittest bug fix
Bugs in the train-with-generator positive unit test are fixed.
Now it also trains.
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Fri, 31 Jul 2020 02:19:07 +0000 (11:19 +0900)]
Print log file and line for better debugging
**Changes proposed in this PR:**
- Add file and line to the logger
- Delete log tag for linux distro (as it is obvious and consumes space)
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Thu, 23 Jul 2020 05:31:07 +0000 (14:31 +0900)]
[C-API] C-API updated with tizen ACR
Updating C-API with tizen ACR
Also add corresponding changes in the package
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Thu, 30 Jul 2020 02:16:46 +0000 (11:16 +0900)]
Update setDataset / loadDataset
When a path is given but is invalid, the call should fail. This patch
updates the behavior accordingly.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [ ]Passed [ ]Failed [X]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Thu, 30 Jul 2020 07:22:07 +0000 (16:22 +0900)]
[layers] Remove extra forwarding definition from layers
Remove the extra forwarding declaration and definition from layers.
forwarding(in, out, status) is only to be supported by the loss layer;
it is not needed by other layers and should not be supported.
**Self evaluation:**
1. Build test: [x]Passed [ ]Failed [ ]Skipped
2. Run test: [x]Passed [ ]Failed [ ]Skipped
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
jijoong.moon [Mon, 3 Aug 2020 07:15:26 +0000 (16:15 +0900)]
[ Coverity ] Fix Coverity Issues
This PR includes Coverity issues fixes.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
Jihoon Lee [Wed, 29 Jul 2020 11:06:41 +0000 (20:06 +0900)]
Change exception::invalid_prop -> not_supported
Change `invalid_prop` -> `not_supported` for generality
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
jijoong.moon [Thu, 30 Jul 2020 02:34:33 +0000 (11:34 +0900)]
[ SVACE ] Fix svace issues.
This PR includes fixes about svace
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
Jihoon Lee [Wed, 29 Jul 2020 11:59:45 +0000 (20:59 +0900)]
Fix bug where logging happens twice
This patch fixes a bug where logging happened twice in non-Tizen environments
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Wed, 29 Jul 2020 11:24:57 +0000 (20:24 +0900)]
Unify optimizer initialize
Many layers have setOptimizer implemented, while Layer can handle it for all.
**Changes proposed in this PR:**
- Delete unnecessary setOptimizer
- Unify optimizer::initialize signature
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Tue, 28 Jul 2020 08:59:09 +0000 (17:59 +0900)]
Change neuralnet setConfig to throw
Change `Neuralnet::setConfig` to throw instead of return status
**This PR also patches:**
- Minor typos
- Delete unused function
- Fix neuralnet ctor delegation order
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Wed, 29 Jul 2020 07:57:07 +0000 (16:57 +0900)]
[ Meson ] Add app sanity test
This patch currently checks whether the example apps run fine.
To improve this, we should add parameterized app tests with golden
results plus some negative cases, rather than running one test per app;
this is not done because it is not the highest priority.
This patch only ensures the apps run fine for the current setup (ini
and other configuration), so these tests do not guarantee the apps run
fine in every case.
resolves #375
see also #374
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
jijoong.moon [Mon, 27 Jul 2020 02:14:56 +0000 (11:14 +0900)]
[ ADAM ] Validate the Adam optimizer
Validation of the Adam optimizer
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
brainer [Tue, 28 Jul 2020 16:26:02 +0000 (01:26 +0900)]
KNN notation change
The incorrect notation KNN is changed to k-NN
Signed-off-by: brainer <jeansanghyun@gmail.com>
jijoong.moon [Fri, 24 Jul 2020 12:15:21 +0000 (21:15 +0900)]
[ Coverity ] Fix Coverity Issues
This PR includes fixes for Coverity issues.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Thu, 23 Jul 2020 05:47:22 +0000 (14:47 +0900)]
[ Unit Test ] Add Verification of Conv2D
Finalize Conv2D verification with tensorflow
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
Parichay Kapoor [Wed, 22 Jul 2020 03:17:34 +0000 (12:17 +0900)]
[API] Make C-API thread safe
Added thread-locking safety for the ml_train C-API.
There is a global lock to assist with the destruction of objects.
Each object gets its own lock, which maintains safety when accessing or modifying that object
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
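The locking scheme described (one global lock plus a per-object lock) can be sketched as a handle table; all names below are illustrative, not the C-API's actual internals:

```python
import threading

# one global lock guards creation/destruction of handles;
# each handle additionally carries its own lock
_global_lock = threading.Lock()
_handles = {}

def create(handle_id, obj):
    with _global_lock:
        _handles[handle_id] = {"obj": obj, "lock": threading.Lock()}

def with_handle(handle_id, fn):
    with _global_lock:                  # look up under the global lock
        entry = _handles.get(handle_id)
    if entry is None:
        raise KeyError(handle_id)
    with entry["lock"]:                 # access under the per-object lock
        return fn(entry["obj"])

def destroy(handle_id):
    with _global_lock:                  # destruction uses the global lock
        entry = _handles.pop(handle_id)
    with entry["lock"]:
        pass                            # wait for in-flight users to finish
```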
Parichay Kapoor [Tue, 21 Jul 2020 09:32:15 +0000 (18:32 +0900)]
[unittest] Moved unittest activation out of utils
Moved the activation unit tests out of util.
Set the error tolerance to 1e-6.
This has exposed a bug in the batch normalization unit test: its golden
values are 0, so it is disabled for now.
@zhoonit please check it
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Tue, 21 Jul 2020 09:29:57 +0000 (18:29 +0900)]
[activation] Update signature of activation prime
Update the signature to accommodate the derivative in activation_prime
This allows softmax_prime to be managed with current activation_layer class
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Wed, 22 Jul 2020 07:28:27 +0000 (16:28 +0900)]
Bump to 6.0
**Changes proposed in this PR:**
- Bump project version to 6.0 in order to use nntrainer
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Wed, 22 Jul 2020 10:23:17 +0000 (19:23 +0900)]
Fix rpm spec file dependency
This patch fixes the rpm spec file dependency for `capi-nntrainer`
Resolves #357
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Tue, 21 Jul 2020 06:46:12 +0000 (15:46 +0900)]
[neuralNet] Moving batch size property to compile time
The batch size property should be set before compiling, for now.
Initializing the layers allocates memory for them, and setting the batch
size property with the training call would require the layers to be
re-initialized, which isn't supported yet
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Tue, 21 Jul 2020 09:55:07 +0000 (18:55 +0900)]
[neuralnet] Bug fix for setTrainConfig
Add a bug fix for setTrainConfig.
A break in the switch's default case was hiding the error
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Tue, 21 Jul 2020 02:35:52 +0000 (11:35 +0900)]
[Example] Add drawing part
This PR adds the drawing part of example/customShortcut
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Mon, 20 Jul 2020 07:37:07 +0000 (16:37 +0900)]
[sigmoid] Rename sigmoidePrime to sigmoidPrime
Rename sigmoidePrime to sigmoidPrime
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Mon, 20 Jul 2020 07:35:16 +0000 (16:35 +0900)]
[activation] Move activation functions to activation file
Move activation operator functions to activation file as static members
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
jijoong.moon [Mon, 20 Jul 2020 11:23:06 +0000 (20:23 +0900)]
[ Coverity ] Fix coverity issues
This PR includes fixes about coverity issues
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Mon, 20 Jul 2020 04:42:59 +0000 (13:42 +0900)]
[ Coverity ] Fix Coverity Issues
This PR fix coverity issues.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
Parichay Kapoor [Mon, 20 Jul 2020 04:49:09 +0000 (13:49 +0900)]
[activation] Update input arguments of softmax
Update the input argument of softmax from Tensor to Tensor const & to match
the activation function signature
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Mon, 20 Jul 2020 05:09:38 +0000 (14:09 +0900)]
[Loss] Update MSR to MSE
Update the name of mean squared error from MSR to MSE
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Mon, 20 Jul 2020 04:01:14 +0000 (13:01 +0900)]
[unittest] Add unittest for backwarding of loss layers
This patch adds a unit test for the backwarding of loss layers with FC and activation layers.
It includes major bug fixes for the backwarding of all the loss layers to match TensorFlow,
and a major hot fix for the softmax derivative; the activation layer semantics need to be
updated to properly fix the softmax layer derivative.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
jijoong.moon [Fri, 17 Jul 2020 05:58:33 +0000 (14:58 +0900)]
[ Bug ] Fix initializing the buffer for the Adam optimizer
. Add initialize(std::shared_ptr<UpdatableParam> params, ...)
. Update initialize of conv2d for the optimizer
. Update to do addition when calculating the derivatives of the filter
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
Jihoon Lee [Mon, 20 Jul 2020 02:07:28 +0000 (11:07 +0900)]
Fix summary logic error
Fix flag to be parsed properly by verbosity
cc. @jijoongmoon
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Fri, 17 Jul 2020 07:54:49 +0000 (16:54 +0900)]
[CAPI] Implement summary capi
This PR mainly adds `NeuralNetwork::print`. Note that it should be heavily
refactored for proper functionality. Confer the @todo's in this patch
**Changes proposed in this PR:**
- add `NeuralNetwork::print`
- implement `model_get_summary`
- Add capi test for `model_get_summary`
- move `ml_train_summary_type_e` to `api-common`
- minor bug fix
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Fri, 17 Jul 2020 08:16:38 +0000 (17:16 +0900)]
Change nntrainer include in applications
The nntrainer include should include the api_common header as well.
Android.mk in the applications has been fixed accordingly
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Fri, 17 Jul 2020 07:34:29 +0000 (16:34 +0900)]
Add header guard to nntrainer-api-common.h
Add a header guard to nntrainer-api-common to prevent redeclaration.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Fri, 17 Jul 2020 05:46:15 +0000 (14:46 +0900)]
Fix neuralnetwork property logic
exception::invalid_property for a disallowed property should not be
ignored in some cases, so the parsing process is patched accordingly
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Fri, 17 Jul 2020 03:53:34 +0000 (12:53 +0900)]
[fc/activation] Unittest for fc, activation layers and sgd optimizer
Add combined unittest for fc and activation layers (sigmoid and softmax) backwarding
Note that this is done without the loss
As SGD is used for updates, this also acts as testing optimizer sgd
V2:
Using paramsAt() in unittest
Moved paramsAt() to public from protected
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Thu, 16 Jul 2020 01:55:57 +0000 (10:55 +0900)]
Add test for modelfiles
Add automated test scaffolding to test modelfile configuration &
NN initiation.
**Changes proposed in this PR:**
- Add IniSection class to control ini (for testing)
- Add an Ini test fixture for parameterized ini testing
- Add `unittest_nntrainer_modelfile.cpp`
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Fri, 17 Jul 2020 05:12:02 +0000 (14:12 +0900)]
[api] remove duplicate declarations
Remove duplicate declarations arising due to an auto merge
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Fri, 17 Jul 2020 05:32:41 +0000 (14:32 +0900)]
[c-api] Function namespace correction
get_summary function namespace correction
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Thu, 16 Jul 2020 11:44:44 +0000 (20:44 +0900)]
[layer] Sum over batch for gradients
Layer gradients should be summed over the batch rather than averaged.
This is consistent with TensorFlow's implementation
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
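A minimal numeric sketch of the change, summing per-sample gradients over the batch axis instead of averaging them (illustrative only, not nntrainer's tensor code):

```python
def weight_grad_sum_over_batch(per_sample_grads):
    """Accumulate the weight gradient by summing over the batch axis,
    matching TensorFlow's convention, instead of dividing by batch size."""
    total = [0.0] * len(per_sample_grads[0])
    for g in per_sample_grads:
        total = [t + gi for t, gi in zip(total, g)]
    return total
```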
Parichay Kapoor [Thu, 16 Jul 2020 11:39:29 +0000 (20:39 +0900)]
[optimzer/layers] Gradient dimension should match weight dim
The gradient dimension should match the weight dimension.
Currently the optimizer applies averaging of gradients, which is not correct;
apply the averaging of gradients before calling applyGradients.
Resolves #280
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Thu, 16 Jul 2020 10:49:43 +0000 (19:49 +0900)]
[api] remove duplicate declaration
Remove duplicate declaration of ml_train_model_run_async
cc. @jijoonmoon
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Thu, 16 Jul 2020 02:50:15 +0000 (11:50 +0900)]
Follow-up after #308
**Changes proposed in this PR:**
- Now friendship between layer and network is over
- Move `setProperty(propType)` to public
- Add custom exception in `_error.h`
- Change `setProperty` `std::out_of_range` ->
`exception::invalid_property`
- Add error code boundary to `nntrainer_exception_boundary`
- Change inherited docs to @copydoc
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
resolves #315
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Wed, 8 Jul 2020 12:49:26 +0000 (21:49 +0900)]
[API] Finalize the first draft of C-API
Finalize the first draft of C-API
V2:
Applied review comments
Updated namespace to `ml_train_*`
Added module doc
Added enums for loss and optimizer
Added loss as a parameter for model_compile
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Wed, 15 Jul 2020 07:43:22 +0000 (16:43 +0900)]
[CAPI] Add functions to be supported later
Add functions to be supported later in the internal capi header.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Tue, 14 Jul 2020 09:48:17 +0000 (18:48 +0900)]
[CAPI] Added more unittests for tizen capi
Add unit tests for the Tizen C-API related to datasets
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Fri, 3 Jul 2020 07:51:54 +0000 (16:51 +0900)]
[loss/FC] Unittest for forward propagation
This PR adds unit tests for forward propagation of the FC layer with loss and activations.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Thu, 16 Jul 2020 05:14:54 +0000 (14:14 +0900)]
[softmax] Bug fix
Softmax should be computed for each batch element.
However, the current implementation computes over the channel dimension.
This patch fixes it, and also fixes softmax with cross entropy loss
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Thu, 16 Jul 2020 05:12:43 +0000 (14:12 +0900)]
[Loss] Bug fix for loss
Added bug fix for loss forwarding
- for sigmoid with cross entropy, the formula was correct but the implementation was wrong and also inverted the sign of the output
- for MSE, an average is needed rather than a sum
- for softmax with cross entropy, dividing by the input width is not needed, but there is still a mismatch
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
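The MSE point above (average rather than sum) in a minimal, illustrative form:

```python
def mse(pred, target):
    """Mean squared error: average the squared differences over the
    number of elements instead of just summing them."""
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)
```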
Parichay Kapoor [Wed, 15 Jul 2020 10:57:53 +0000 (19:57 +0900)]
[tensor] Bug fix for average
Bug fix for average to use the correct denominator
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Tue, 14 Jul 2020 11:11:16 +0000 (20:11 +0900)]
Enable activation layer in ini
This PR exposes the activation layer via ini. As #308 handles properties
automatically, nothing more needs to be set.
See also #210
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Tue, 14 Jul 2020 09:38:15 +0000 (18:38 +0900)]
Refactor nntrainer load from config
`loadFromConfig` duplicates much of the logic in layer::setProperty and
others.
This PR refactors `loadFromConfig` to reuse the logic already present.
**Changes proposed in this PR:**
- Add `std::out_of_range` exception to setProperty to represent validity
- Change error to warning when input_dim is not present at the head of
the network(for ini)
- Change `weightIni` -> `weight_ini` in `ini` for consistency
- Change unittest accordingly
- Separate dataset, network parser
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Wed, 15 Jul 2020 08:00:50 +0000 (17:00 +0900)]
[common-api] Add common api header
Add common api header for nntrainer.
This common header includes declarations common to all the APIs
and is used in nntrainer as well.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Tue, 14 Jul 2020 04:30:09 +0000 (13:30 +0900)]
[API] Update C-API dataset handling
Update C-API to add dataset handling
Changes added to data buffer:
- Bug fix for data buffer using already freed memory
- Data buffer set properties moved out from neural network
- Data buffer generator now takes list of array for multiple inputs
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
jijoong.moon [Mon, 13 Jul 2020 06:52:44 +0000 (15:52 +0900)]
[ Application ] mnist with tensorflow
This PR includes:
. MNIST Tensorflow Example to compare with nntrainer.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
Jihoon Lee [Thu, 9 Jul 2020 00:33:15 +0000 (09:33 +0900)]
[CAPI] Add ml_nnmodel_get_summary
**Changes proposed in this PR:**
- Add ml_nnmodel_get_summary
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Mon, 13 Jul 2020 03:41:11 +0000 (12:41 +0900)]
[API] Update API namespace and use enum for opitmizer
Setting the optimizer now uses an enum rather than a string.
The namespace of the layers has been updated.
Below is a summary of the updates:
ml_nnmodel_* -> ml_train_model_*
ml_nnlayer_* -> ml_train_layer_*
ml_nnopt_* -> ml_train_optimizer_*
*_delete() -> *_destroy()
ml_nnmodel_train_with_file() and ml_nnmodel_train_with_generator() have been kept;
these will be updated in an upcoming PR where the dataset interface for the API is updated
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Tue, 14 Jul 2020 09:19:37 +0000 (18:19 +0900)]
[error code] Common error codes re-define issue
Many ML_ERROR error codes are defined twice, in nntrainer/include/nntrainer_error.h and api/capi/include/platform/ml-api-common.h.
This patch reuses the definitions from ml-api-common.h in nntrainer_error.h.
This allows nntrainer.h to be included in source code, as it contains certain callback declarations which are used in the library.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Parichay Kapoor [Tue, 14 Jul 2020 09:06:03 +0000 (18:06 +0900)]
[data buffer] Bug fix for data buffer
data buffer updateData() passes a reference to an on-stack variable,
which is deleted after the function call exits. However, the callee
keeps running on a new thread, causing problems.
Added a bug fix for this.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
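The general shape of the fix, copying the data before handing it to the worker thread so the worker never reads a buffer the caller has already released, can be sketched as follows (illustrative, not the actual data buffer code):

```python
import threading

def spawn_worker(batch):
    """Hand the worker its own copy of the data, so it cannot observe
    a stack buffer that has gone out of scope in the caller."""
    snapshot = list(batch)          # copy before the caller's buffer dies
    result = []

    def worker():
        result.extend(v * 2 for v in snapshot)

    t = threading.Thread(target=worker)
    t.start()
    t.join()
    return result
```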
Jihoon Lee [Tue, 14 Jul 2020 04:01:49 +0000 (13:01 +0900)]
Fix NeuralNetwork::finalize
Deleting a layer in finalize used a wrong index. This PR fixes the issue
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Mon, 13 Jul 2020 01:34:51 +0000 (10:34 +0900)]
Add layer print
**Changes proposed in this PR:**
- Add print option to layer
- Add `Layer::print` function
- Add hook point to print function for derived layer.
- Add `ostream &operator<<` for the Layer type
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
jijoong.moon [Fri, 10 Jul 2020 11:28:44 +0000 (20:28 +0900)]
[ API ] capi-nntrainer packaging
This PR includes:
. RPM packaging of capi-nntrainer for Tizen
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
Dongju Chae [Tue, 14 Jul 2020 08:21:17 +0000 (17:21 +0900)]
[Fix/Coverage] Fix the coverage generator to use python3
This patch fixes the coverage generator to use python3.
It seems that the Python 2 package was removed in nntrainer.spec.
Signed-off-by: Dongju Chae <dongju.chae@samsung.com>
Jihoon Lee [Fri, 10 Jul 2020 07:17:44 +0000 (16:17 +0900)]
Add setProperty for each property type
Since Layer::setProperty iterates through the property vector, the only part
that needs overriding is setting each particular property.
This PR adds a per-type setProperty to address the issue.
W.r.t. #270, this patch enables layers to check whether a given property type
is valid for the user.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Fri, 10 Jul 2020 04:33:48 +0000 (13:33 +0900)]
Add throw_status to map status to throw
A few functions use C-style error status; this PR scaffolds around the
C-style status codes so that errors are raised via `std::exception`
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Mon, 13 Jul 2020 04:09:22 +0000 (13:09 +0900)]
[API] Bug fix in C-API get_layer
The ml_nnmodel_get_layer function was adding another layer to struct NeuralNetwork if the layer wasn't found in layers_map.
This function is supposed to only add the layer to layers_map, not modify struct NeuralNetwork.
This PR applies the above bug fix.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
Jihoon Lee [Thu, 9 Jul 2020 11:21:30 +0000 (20:21 +0900)]
Refactor bn layer test fixture
**Changes proposed in this PR:**
- The BN layer TF fixture is merged into the BNLayer fixture
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Thu, 9 Jul 2020 10:01:49 +0000 (19:01 +0900)]
Update example to c++14 and enable exceptions
**Changes proposed in this PR:**
- Update /Application/jni to use C++14 and enable exceptions
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Jihoon Lee [Tue, 7 Jul 2020 12:44:59 +0000 (21:44 +0900)]
Add UpdatableParam to manage weights / gradients
**Changes proposed in this PR:**
- Add `UpdatableParam`
- Change `Optimizer::apply_gradient` signature
- Attach `UpdatableParam` to manage weights
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
Parichay Kapoor [Mon, 6 Jul 2020 05:51:59 +0000 (14:51 +0900)]
[API] Search layer by name in C-API
Support searching of layer by name in C-API
This is part of the more changes to support this functionality
Related issue - #260
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
jijoong.moon [Thu, 2 Jul 2020 10:08:18 +0000 (19:08 +0900)]
[ MNIST ] mnist application
This PR provides mnist application which includes:
. 1 Input Layer
. 2 Conv2d
. 2 Pooling2d
. 1 Flatten
. 1 Fully Connected Layer
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Thu, 9 Jul 2020 12:19:01 +0000 (21:19 +0900)]
Fix to run validation process during train
Currently validation does not work, because the dimension of the hidden
tensor differs when a single-batch input is used.
This PR handles the issue.
Self evaluation:
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Mon, 6 Jul 2020 01:24:10 +0000 (10:24 +0900)]
[ Unit Test ] Generate TensorFlow output and comparison
This PR provides:
. Generate TensorFlow output & gradient output for conv2d, pooling2d
. Compare with nntrainer outputs
. TODO: compare gradients after the getGradient func is implemented.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
Parichay Kapoor [Wed, 8 Jul 2020 05:26:04 +0000 (14:26 +0900)]
[API] Update C-API semantics
Update the C-API semantics for layer and optimizer.
While a layer or optimizer is in use, do not allow its deletion.
Further, once a layer or optimizer is set/added to a model, its ownership is transferred to the model,
so there is no need to delete it separately: deleting the model will delete it as well.
Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>