platform/core/ml/nntrainer.git
2 years ago[BUILD] fix supported gmock version
Jiho Chu [Thu, 30 Jun 2022 02:25:21 +0000 (11:25 +0900)]
[BUILD] fix supported gmock version

The google mock API changed in 1.10.0, and MOCK_METHOD is not
supported below version 1.10.0.
The 'profiler' test is only enabled for a suitable gmock.

The MOCK_METHOD macro needs gmock version >= 1.10.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
2 years ago[ Application ] Fix VGG using CCAPI
jijoong.moon [Mon, 27 Jun 2022 09:59:07 +0000 (18:59 +0900)]
[ Application ] Fix VGG using CCAPI

This patch includes the fixes of VGG Application.
- Remove the training data generation with batch size.
  : Recently we updated the data generation callback to fill one data
  set at a time
- Remove including the nntrainer internals: neuralnet.h
  : Updated to use CCAPI
- Reuse the data generation with cifar_dataloader.h in Resnet

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[TEST] Add unittest related to setProperty for dataset
Hyunil [Wed, 22 Jun 2022 09:01:39 +0000 (18:01 +0900)]
[TEST] Add unittest related to setProperty for dataset

Add unittest for ml_train_dataset_set_property_for_mode_with_single_param()

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[CAPI] Add set property with single param for dataset
Hyunil [Wed, 22 Jun 2022 08:51:55 +0000 (17:51 +0900)]
[CAPI] Add set property with single param for dataset

Add ml_train_dataset_set_property_for_mode_with_single_param().
ml_train_dataset_set_property_for_mode() has a va_list, and this is a problem with DllImport in C#.
A va_list must be passed differently for each architecture, and this is not guaranteed to
work on some architectures. This API receives the parameters as a single string from C# DllImport.

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[ BUILD ] fix binary size of libnntrainer with android build script
jijoong.moon [Wed, 22 Jun 2022 06:15:03 +0000 (15:15 +0900)]
[ BUILD ] fix binary size of libnntrainer with android build script

This patch fixes the wrong size of the nntrainer library, libnntrainer.so,
when using the android build script.

- change the ndk-build options with NDK_LIBS_OUT

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[neuralnet] check tflite interpreter is enabled when export
hyeonseok lee [Tue, 21 Jun 2022 09:41:56 +0000 (18:41 +0900)]
[neuralnet] check tflite interpreter is enabled when export

 - Added ifdef statement to check whether the tflite interpreter is enabled
 - Added override specifier in get/setWeights
 - Restore precision ostream format

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[profiler] restore precision
hyeonseok lee [Mon, 20 Jun 2022 07:35:26 +0000 (16:35 +0900)]
[profiler] restore precision

 - After printing profile information, restore the precision

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[fix] fix svace issue
hyeonseok lee [Fri, 17 Jun 2022 06:45:53 +0000 (15:45 +0900)]
[fix] fix svace issue

 - Delete duplicated unittest
 - Added try catch statement to catch exception

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[Application] add step learning rate scheduler example
hyeonseok lee [Thu, 16 Jun 2022 08:48:23 +0000 (17:48 +0900)]
[Application] add step learning rate scheduler example

 - Added how to use step learning rate scheduler in mnist application

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[ TEST ] add golden data for grucell_fc
jijoong.moon [Wed, 15 Jun 2022 12:53:36 +0000 (21:53 +0900)]
[ TEST ] add golden data for grucell_fc

This patch includes the golden data of grucell_fc unit test.

Resolves:

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[ Recurrent ] property for dynamic time sequence
jijoong.moon [Wed, 15 Jun 2022 11:37:08 +0000 (20:37 +0900)]
[ Recurrent ] property for dynamic time sequence

This patch provides a property for dynamic time sequence in the recurrent
realizer. The semantics of this is "dynamic_time_seq = true/false".

Add a grucell + Fully Connected unit test case for reference of dynamic
time sequence

Related : #1933

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[TEST] Add unittest related to setProperty for optimizer
Hyunil [Thu, 16 Jun 2022 00:35:29 +0000 (09:35 +0900)]
[TEST] Add unittest related to setProperty for optimizer

- Add unittest for ml_train_optimizer_set_property_with_single_param()

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[CAPI] Add set property with single param for optimizer
Hyunil [Thu, 16 Jun 2022 00:18:01 +0000 (09:18 +0900)]
[CAPI] Add set property with single param for optimizer

Add ml_train_optimizer_set_property_with_single_param().
ml_train_optimizer_set_property() has a va_list, and this is a problem with DllImport in C#.
A va_list must be passed differently for each architecture, and this is not guaranteed to
work on some architectures. This API receives the parameters as a single string from C# DllImport.

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[trivial] fix typo
DonghakPark [Wed, 15 Jun 2022 07:29:57 +0000 (16:29 +0900)]
[trivial] fix typo

- Fix typo
- Fix ist --> <sup>i</sup> for readability

Signed-off-by: Donghak Park <donghak.park@samsung.com>
2 years ago[CAPI] Add unittest related to setProperty for layer
Hyunil [Tue, 14 Jun 2022 06:10:33 +0000 (15:10 +0900)]
[CAPI] Add unittest related to setProperty for layer

- Add unittest for ml_train_layer_set_property_with_single_param()

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[CAPI] Add ml_train_layer_set_property_with_single_param() internally for C# DllImport
Hyunil [Mon, 13 Jun 2022 09:17:27 +0000 (18:17 +0900)]
[CAPI] Add ml_train_layer_set_property_with_single_param() internally for C# DllImport

ml_train_layer_set_property() has a va_list, and this is a problem with DllImport in C#.
A va_list must be passed differently for each architecture, and this is not guaranteed to
work on some architectures. This API receives the parameters as a single string from C# DllImport.

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[CAPI] Modify model_compile_with_single_param() and ml_train_model_run_with_single_pa...
Hyunil [Mon, 13 Jun 2022 02:11:18 +0000 (11:11 +0900)]
[CAPI] Modify model_compile_with_single_param() and ml_train_model_run_with_single_param()

Modified the internal APIs created for the C# va_list issue, since capi can receive a
single string in the form "key=value | key=value" and all objects have loadProperties
in setProperty, which can split it with '|'.
So the original code was reverted, and new code that calls capi directly was added.

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[Spec] Add flatbuffers-devel BuildRequires as default accepted/tizen/unified/20220616.141953 submit/tizen/20220615.013221
Sangjung Woo [Wed, 15 Jun 2022 01:53:49 +0000 (10:53 +0900)]
[Spec] Add flatbuffers-devel BuildRequires as default

The compiler module always uses the flatc command, but the 'flatbuffers-devel'
dependency was added only when the Tizen version is lower than 6.5.
For this reason, a build-break issue occurs when building for
Tizen 7.0. This patch fixes this bug.

Signed-off-by: Sangjung Woo <sangjung.woo@samsung.com>
2 years ago[ Tensor ] remove rank 2 limitation for dot op
jijoong.moon [Fri, 10 Jun 2022 10:50:30 +0000 (19:50 +0900)]
[ Tensor ] remove rank 2 limitation for dot op

This patch removes the limitation of rank 2 for the dot op.
It is expected to compute a 4D tensor as 2D with [BxCxH, W]
dimension.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[unittest] added zoneout mask unittest
hyeonseok lee [Fri, 10 Jun 2022 06:59:00 +0000 (15:59 +0900)]
[unittest] added zoneout mask unittest

 - Test that the zoneout mask is generated according to the zoneout rate

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[layer] fix typo
hyeonseok lee [Fri, 6 May 2022 02:21:04 +0000 (11:21 +0900)]
[layer] fix typo

 - Fix typo
 - Delete duplicated code

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[ Doc ] Update Readme.md
jijoong.moon [Mon, 13 Jun 2022 00:45:35 +0000 (09:45 +0900)]
[ Doc ] Update Readme.md

Add new doc which explains NNTrainer internal

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[docs] Update configuration-ini.md file
Hyunil [Fri, 3 Jun 2022 06:18:11 +0000 (15:18 +0900)]
[docs] Update configuration-ini.md file

- Remove learning_rate from oprimizer section
- Add learning rate scheduler section
- Remove dataset section
- Add train set, validation set and test set section
- Add many types to layer section
- Add table about type, key, value, default value and description for each layers
- Update configuration file example

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[ Packaging ] Packaging for Tizen 6.0
jijoong.moon [Thu, 2 Jun 2022 10:57:51 +0000 (19:57 +0900)]
[ Packaging ] Packaging for Tizen 6.0

This patch includes fixes to support the Tizen 6.0 build
 . Fix .spec & meson.build

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[bug_fix] bug fix zoneout lstmcell layer
hyeonseok lee [Tue, 7 Jun 2022 07:06:00 +0000 (16:06 +0900)]
[bug_fix] bug fix zoneout lstmcell layer

 - This patch enables zoneout even when the zoneout rate is smaller than epsilon

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[CAPI] Add unittest related to compile and train
Hyunil [Tue, 24 May 2022 02:02:45 +0000 (11:02 +0900)]
[CAPI] Add unittest related to compile and train

- Add unittest for ml_train_model_compile_with_single_param()
- Add unittest for ml_train_model_run_with_single_param()

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[Property] Modify error log
Hyunil [Mon, 23 May 2022 03:32:45 +0000 (12:32 +0900)]
[Property] Modify error log

- Add double quotation marks to the log to indicate what is wrong with "key=value, key=value"
- Add example keys and values

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[CAPI] Add ml_train_model_run_with_single_param() internally for C# DllImport
Hyunil [Tue, 24 May 2022 05:39:02 +0000 (14:39 +0900)]
[CAPI] Add ml_train_model_run_with_single_param() internally for C# DllImport

ml_train_model_run() has a va_list, and this is a problem with DllImport in C#.
A va_list must be passed differently for each architecture, and this is not
guaranteed to work on some architectures. This API receives the model run
params as a single string from C# DllImport.

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[CAPI] Add ml_train_model_compile_with_single_param() internally for C# DllImport
Hyunil [Tue, 17 May 2022 03:07:44 +0000 (12:07 +0900)]
[CAPI] Add ml_train_model_compile_with_single_param() internally for C# DllImport

ml_train_model_compile() has a va_list, and this is a problem with DllImport in C#.
A va_list must be passed differently for each architecture, and this is not
guaranteed to work on some architectures. This API receives the model compile
params as a single string from C# DllImport.

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[Utils] Add Memory profile feature to Profiler
Jiho Chu [Mon, 30 May 2022 04:56:38 +0000 (13:56 +0900)]
[Utils] Add Memory profile feature to Profiler

This patch implements the memory profiling feature.

Profile is refactored to handle memory statistics information.
The unnecessary dependency from ProfileListener to the Profile
class is removed, and the inner information is redesigned to handle
both time and memory profiling.

For the memory profile, the below information is managed:

event: ALLOC | DEALLOC
current size: total allocated memory size
info: user friendly tag for the analysis
duration: time interval between alloc and dealloc

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
2 years ago[Test] Modify profiler test to add Memory event
Jiho Chu [Mon, 30 May 2022 04:54:42 +0000 (13:54 +0900)]
[Test] Modify profiler test to add Memory event

This patch implements the profiler test.

The whole code is refactored to test both 'time' and 'memory' profiling.
It uses fixture tests, and gmock is used to check the callback functions.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
2 years ago[ccapi/nntrainer] Add getters for compiled, initialized and loadedFromConfig
JRazek [Fri, 27 May 2022 17:28:39 +0000 (19:28 +0200)]
[ccapi/nntrainer] Add getters for compiled, initialized and loadedFromConfig

This commit adds bool getters for the states of compiled, initialized and loadedFromConfig in ml::train::Model class.

Signed-off-by: JRazek <jakub.razek2@gmail.com>
2 years ago[ccapi] include common header in meson
hyeonseok lee [Fri, 20 May 2022 03:56:06 +0000 (12:56 +0900)]
[ccapi] include common header in meson

 - Newly created common header file was missing in meson build

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[ Application ] Update Logistic Regression to use CCAPI
jijoong.moon [Mon, 16 May 2022 11:10:11 +0000 (20:10 +0900)]
[ Application ] Update Logistic Regression to use CCAPI

This patch includes
 . Update Logistic Regression to use CCAPI
 . Update the inference process
 . Static link for openmp

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[ccapi] Add exports interface.
seongwoo [Wed, 18 May 2022 07:43:47 +0000 (16:43 +0900)]
[ccapi] Add exports interface.

This patch adds the "exports" interface and an implementation in neuralnet which literally exports the model according to the given export method.

Also, this patch includes introducing `common.h` to ccapi, which defines ExportMethods.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: seongwoo <mhs4670go@naver.com>
2 years ago[layers] flatten layer consider target_shape
seongwoo [Thu, 12 May 2022 07:59:31 +0000 (16:59 +0900)]
[layers] flatten layer consider target_shape

This patch makes a flatten layer consider target_shape and use it during exporting.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: seongwoo <mhs4670go@naver.com>
2 years ago[ Compiler ] add loss layer realizer
seongwoo [Thu, 12 May 2022 07:50:18 +0000 (16:50 +0900)]
[ Compiler ] add loss layer realizer

This patch includes loss realizer implementation and tests which removes the loss layer for exports.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: seongwoo <mhs4670go@naver.com>
2 years agoUpdate how-to-run-example-android.md
Udit Jain [Fri, 13 May 2022 07:08:22 +0000 (16:08 +0900)]
Update how-to-run-example-android.md

Updated documentation to include some additional things I needed and added some troubleshooting cases.

Signed-off-by: Udit <udit.jain@samsung.com>
2 years ago[unittest] add negative unittest
hyeonseok lee [Thu, 12 May 2022 10:51:29 +0000 (19:51 +0900)]
[unittest] add negative unittest

 - Added negative unittest

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[unittest] added assertion in unittest
hyeonseok lee [Thu, 12 May 2022 07:25:42 +0000 (16:25 +0900)]
[unittest] added assertion in unittest

 - For the unittests which have no assertion, add assertions to the test

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[unittest] remove duplicated unittest
hyeonseok lee [Fri, 13 May 2022 13:06:39 +0000 (22:06 +0900)]
[unittest] remove duplicated unittest

 - remove duplicated unittest

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago:[test] Add some negative tests to unittest
Hyunil [Thu, 12 May 2022 11:04:36 +0000 (20:04 +0900)]
:[test] Add some negative tests to unittest

There are some case where there is a positive test but no negative test

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[ CCAPI ] add copy model configuration & get/set Weight of layer
jijoong.moon [Tue, 26 Apr 2022 12:35:53 +0000 (21:35 +0900)]
[ CCAPI ] add copy model configuration & get/set Weight of layer

This patch enables the API for copying the model configuration and
get/set weight tensor data with a vector of float pointers.

The semantics of these APIs are:
  . copyConfiguration(ml::train::Model &from)
  . std::vector<float*> getWeights()
  . void setWeights(const std::vector<float*>)

This only copies the model configuration; it requires compile and
initialization before training.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[rnncell] generate testcase for multi in/out of rnncell
hyeonseok lee [Thu, 28 Apr 2022 07:48:01 +0000 (16:48 +0900)]
[rnncell] generate testcase for multi in/out of rnncell

 - Generate multi in/out rnncell and replace testcase of 1 input/output of rnncell

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[rnncell] implement multiout for rnncell
hyeonseok lee [Tue, 26 Apr 2022 09:57:21 +0000 (18:57 +0900)]
[rnncell] implement multiout for rnncell

 - Calculating the derivative includes duplicated processing with
   calculating the gradient. Needs to be optimized.
 - Remove hidden state and get the previous hidden state from the input
 - Remove timestep property
 - Disabled rnncell unittest. A proper unittest will follow.

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[neuralnet] set epoch_idx to zero only if train from scratch
hyeonseok lee [Fri, 22 Apr 2022 05:56:45 +0000 (14:56 +0900)]
[neuralnet] set epoch_idx to zero only if train from scratch

 - Until now, the epoch property meant how many epochs to run,
   but from now on it means the final epoch to run until

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[model] added getter for current epoch_idx
hyeonseok lee [Fri, 22 Apr 2022 08:28:39 +0000 (17:28 +0900)]
[model] added getter for current epoch_idx

 - Through this function the user can get the epoch_idx

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[model] support stop training
hyeonseok lee [Fri, 22 Apr 2022 04:53:09 +0000 (13:53 +0900)]
[model] support stop training

 - During training, the function will check every epoch to decide
   whether to stop the training or not.

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[lstm] implement calcGradient for bidirectional lstm
hyeonseok lee [Mon, 24 Jan 2022 05:36:41 +0000 (14:36 +0900)]
[lstm] implement calcGradient for bidirectional lstm

 - Implement calcGradient for bidirectional lstm
 - Added test case for bidirectional lstm

close #1726

Self evaluation:

Build test: [X]Passed [ ]Failed [ ]Skipped
Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[ CAPI ] change set_feature_state to compatible with tizen api
jijoong.moon [Fri, 6 May 2022 02:18:38 +0000 (11:18 +0900)]
[ CAPI ] change set_feature_state to compatible with tizen api

This patch makes the set_feature_state variable compatible with the Tizen
ML API. It requires an ML feature parameter to set and get, and fixes it
accordingly.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[ Unit Test ] remove mse loss layer in makeCompiledGraph
jijoong.moon [Mon, 2 May 2022 01:53:04 +0000 (10:53 +0900)]
[ Unit Test ] remove mse loss layer in makeCompiledGraph

This patch removes the mse loss layer in makeCompiledGraph

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[Fix] Clean code for resnet application
Jiho Chu [Wed, 13 Apr 2022 11:10:35 +0000 (20:10 +0900)]
[Fix] Clean code for resnet application

- fix typo
- Use const for catched exception
- use EXIT_SUCCESS macro
- use error code

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
2 years ago[ TEST ] add more test cases for batch normalization realizer
jijoong.moon [Wed, 20 Apr 2022 07:58:21 +0000 (16:58 +0900)]
[ TEST ] add more test cases for batch normalization realizer

This patch adds the test case for the batch normalization realizer with a
kind of resnet basic block which includes the multiout layer.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[ Compiler ] Implement bn realizer with test
jijoong.moon [Tue, 19 Apr 2022 07:32:32 +0000 (16:32 +0900)]
[ Compiler ] Implement bn realizer with test

This patch completes the bn realizer for inference.
This realizer is only for inference, and therefore it assumes a 1-to-1
connection (one input and one output for the bn layer) in the
model graph. That means if there are multiple connections, a
multi-out layer follows, which makes sure the bn layer has a 1-to-1
in/output during compilation.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[ Compiler ] implement BN realizer
jijoong.moon [Mon, 18 Apr 2022 05:55:34 +0000 (14:55 +0900)]
[ Compiler ] implement BN realizer

This patch implements the BN realizer, which removes the batch
normalization layer for inference.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[ Interpreter ] add batch normalziation realizer
jijoong.moon [Wed, 20 Apr 2022 00:38:31 +0000 (09:38 +0900)]
[ Interpreter ] add batch normalziation realizer

This patch removes the batch normalization layers with bn_realizer in
the GraphRepresentation for TfliteInterpreter, which is for
inference.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[ Compiler ] Implement bn realizer with test
jijoong.moon [Mon, 18 Apr 2022 05:55:34 +0000 (14:55 +0900)]
[ Compiler ] Implement bn realizer with test

This patch completes the bn realizer for inference.
This realizer is only for inference, and therefore it assumes a 1-to-1
connection (one input and one output for the bn layer) in the
model graph. That means if there are multiple connections, a
multi-out layer follows, which makes sure the bn layer has a 1-to-1
in/output during compilation.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[CAPI] Add check if model is null to ml_train_model_get_summary()
Hyunil [Fri, 22 Apr 2022 01:06:27 +0000 (10:06 +0900)]
[CAPI] Add check if model is null to ml_train_model_get_summary()

- Check if model is null or not before checking it in ml_train_model_get_summary_util()

Self evaluation:

Build test: [X]Passed [ ]Failed [ ]Skipped
Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[ Android ] fix the warning message
jijoong.moon [Thu, 21 Apr 2022 01:19:13 +0000 (10:19 +0900)]
[ Android ] fix the warning message

This patch fixes the warning message for the android build

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[zoneout] enable zoneout only when used
hyeonseok lee [Wed, 9 Feb 2022 08:15:03 +0000 (17:15 +0900)]
[zoneout] enable zoneout only when used

 - Allocate and use the zoneout mask and lstm_cell_state only when they are used

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[grucell] remove temporary tensor
hyeonseok lee [Wed, 5 Jan 2022 03:42:45 +0000 (12:42 +0900)]
[grucell] remove temporary tensor

 - Reduce temporary tensor

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[zoneout lstmcell] share zoneout mask tensors
hyeonseok lee [Wed, 5 Jan 2022 03:39:55 +0000 (12:39 +0900)]
[zoneout lstmcell] share zoneout mask tensors

 - Make zoneout_mask tensors shared when the cell is unrolled

Self evaluation:

Build test: [X]Passed [ ]Failed [ ]Skipped
Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[optimizer/lr] Add learning rate scheduler to ccapi
Parichay Kapoor [Mon, 13 Dec 2021 10:04:17 +0000 (19:04 +0900)]
[optimizer/lr] Add learning rate scheduler to ccapi

Open learning rate scheduler for the ccapi with the optimizer.
Corresponding changes are made and some tests are updated to use the new
way to set the learning rate.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
2 years ago[lr] Support step wise decay scheduler
Parichay Kapoor [Mon, 13 Dec 2021 05:25:29 +0000 (14:25 +0900)]
[lr] Support step wise decay scheduler

Support the step wise decay learning rate scheduler, where the iterations
and learning rates are taken as parameters.
The corresponding unittests are also added.

See Also #1776

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
2 years ago[test] Bug fix for unittest
Parichay Kapoor [Mon, 13 Dec 2021 02:35:13 +0000 (11:35 +0900)]
[test] Bug fix for unittest

ccapi and capi use the same filenames for their ini files, which can be
buggy if the unittests are run in parallel or the unittests differ.
Make the ini unittest names independent.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
2 years ago[modelloader] Support learning rate scheduler
Parichay Kapoor [Fri, 10 Dec 2021 17:47:07 +0000 (02:47 +0900)]
[modelloader] Support learning rate scheduler

Support learning rate scheduler with model loader along with a unittest.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
2 years ago[optimizer] Cleanup optimizers
Parichay Kapoor [Fri, 10 Dec 2021 16:29:12 +0000 (01:29 +0900)]
[optimizer] Cleanup optimizers

Clean up the optimizer interface and the existing implementations.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
2 years ago[optimizer] Update to use learning rate schedulers
Parichay Kapoor [Fri, 10 Dec 2021 15:49:03 +0000 (00:49 +0900)]
[optimizer] Update to use learning rate schedulers

Update the optimizer to use the learning rate scheduler in the training
and weight update.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
2 years ago[optimizer] Use optimizer wrapped
Parichay Kapoor [Fri, 10 Dec 2021 15:00:24 +0000 (00:00 +0900)]
[optimizer] Use optimizer wrapped

Update to use the optimizer wrapped instead of the optimizer directly, and
prepare for using the learning rate scheduler.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
2 years ago[optimizer] Add optimizer wrapped
Parichay Kapoor [Fri, 10 Dec 2021 12:29:28 +0000 (21:29 +0900)]
[optimizer] Add optimizer wrapped

Add optimizer wrapped which wraps the optimizer and the learning rate
scheduler.
In order to be backward compatible, each optimizer must support setting
the learning rate, decay rate and decay steps, even for new optimizers.
To make this extensible without each optimizer storing this information
and merging with the learning rate schedulers, and not creating new
interfaces, optimizer wrapped is added.
Optimizer wraps around optimizer, and owns both the optimizer and
learning rate scheduler. If the properties of LR or decay are passed to
the optimizer, they are intercepted by the optimizer wrapped and passed
to the learning rate scheduler appropriately.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
2 years ago[test] Add finalize tests for lr schedulers
Parichay Kapoor [Fri, 10 Dec 2021 03:59:32 +0000 (12:59 +0900)]
[test] Add finalize tests for lr schedulers

Add tests for finalize member function for the learning rate schedulers.

Signed-off-by: Parichay Kapoor <pk.kapoor@samsung.com>
2 years ago[ Compiler ] add batch norm layer realizer
jijoong.moon [Mon, 18 Apr 2022 01:39:00 +0000 (10:39 +0900)]
[ Compiler ] add batch norm layer realizer

This patch includes skeleton code for bn realizer which removes the
batch normalization layer for inference

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[Doc] Change nnstreamer CI server domain name.
gichan [Wed, 20 Apr 2022 01:55:41 +0000 (10:55 +0900)]
[Doc] Change nnstreamer CI server domain name.

Change nnstreamer CI server domain name.
nnstreamer.mooo.com -> ci.nnstreamer.ai

Signed-off-by: gichan <gichan2.jang@samsung.com>
2 years ago[ BLAS ] use openblas function to set num threads accepted/tizen/unified/20220419.142304 submit/tizen/20220418.053521 submit/tizen/20220418.061930
jijoong.moon [Thu, 14 Apr 2022 11:58:01 +0000 (20:58 +0900)]
[ BLAS ] use openblas function to set num threads

This patch uses openblas_set_num_threads() to set the number of threads
for BLAS computation, instead of setting environment variables

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[capi] added unittest for get layer
hyeonseok lee [Wed, 6 Apr 2022 14:05:53 +0000 (23:05 +0900)]
[capi] added unittest for get layer

 - Added 4 unittest for capi ml_train_model_get_layer
 - The get layer API is validated using the get summary function,
   but this should be replaced by layer getProperty (#1875)

Self evaluation:

Build test: [X]Passed [ ]Failed [ ]Skipped
Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[neuralnet] enhance print function to print graph info
hyeonseok lee [Fri, 8 Apr 2022 11:41:39 +0000 (20:41 +0900)]
[neuralnet] enhance print function to print graph info

 - Enhanced the print function in neuralnet to print layer connections;
   printing input tensor info is omitted.
 - Enhanced the print function in layer_node to print layer properties

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
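A layer-connection printout of the kind described can be sketched as follows (hypothetical names; nntrainer's real implementation lives in its neuralnet and layer_node classes):

```python
def print_graph(layers):
    """layers: list of (name, layer_type, input_names) in topological order.
    Returns one line per layer showing which layers feed into it."""
    lines = []
    for name, ltype, inputs in layers:
        src = ", ".join(inputs) if inputs else "(input)"
        lines.append(f"{name} [{ltype}] <- {src}")
    return "\n".join(lines)

out = print_graph([
    ("input0", "input", []),
    ("fc0", "fully_connected", ["input0"]),
    ("bn0", "batch_normalization", ["fc0"]),
])
print(out)
```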
2 years ago[ TEST ] disable app draw classification if nnstreamer is disabled
jijoong.moon [Wed, 13 Apr 2022 08:50:45 +0000 (17:50 +0900)]
[ TEST ] disable app draw classification if nnstreamer is disabled

This patch disables the test when nnstreamer is disabled, since the
test requires nnstreamer to be enabled.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[Exporter] move builtin options for tflite into tflite_opnode
jijoong.moon [Mon, 11 Apr 2022 13:28:45 +0000 (22:28 +0900)]
[Exporter] move builtin options for tflite into tflite_opnode

This patch adds a getter for Flatbuffer builtin options in TfOpNodes.
More cases will be added to get the builtin options.

Resolves:

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[ MESON ] fix bug when meson test
jijoong.moon [Mon, 11 Apr 2022 10:46:06 +0000 (19:46 +0900)]
[ MESON ] fix bug when meson test

This patch copies the golden data for enable-tflite-interpreter when
running meson test.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[neuralnet] Add null character when loading optimizer type from bin file
hyeonseok lee [Wed, 6 Apr 2022 09:51:24 +0000 (18:51 +0900)]
[neuralnet] Add null character when loading optimizer type from bin file

 - Added missing null character when loading the optimizer type from the bin file

Self evaluation:

Build test: [X]Passed [ ]Failed [ ]Skipped
Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
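The underlying bug class: reading a string field from a binary file into a fixed-size buffer without appending the terminating '\0' before using it as a C string. A Python sketch of a length-prefixed read that avoids the problem by construction (the format and names here are hypothetical, not nntrainer's actual file layout):

```python
import io
import struct

def write_type(stream, opt_type):
    # Store the string length-prefixed, as a typical binary format would.
    data = opt_type.encode("ascii")
    stream.write(struct.pack("I", len(data)))
    stream.write(data)

def read_type(stream):
    (n,) = struct.unpack("I", stream.read(4))
    raw = stream.read(n)
    # In C++, the equivalent char buffer must get an explicit '\0'
    # appended after the read; here decoding stops at exactly n bytes.
    return raw.decode("ascii")

buf = io.BytesIO()
write_type(buf, "adam")
buf.seek(0)
opt_type = read_type(buf)
```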
2 years ago[CI] fix android build
jijoong.moon [Wed, 6 Apr 2022 07:58:02 +0000 (16:58 +0900)]
[CI] fix android build

This patch fixes a CI failure when building for Android using the
tools/package_android.sh script.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[CCAPI] Rearrange enumeration of layer type accepted/tizen/unified/20220405.155809 submit/tizen/20220405.072656
Hyunil [Thu, 31 Mar 2022 09:01:31 +0000 (18:01 +0900)]
[CCAPI] Rearrange enumeration of layer type

- Classified enumerations as neural network, simple transformation, and loss layer
- To support the newly added TCT, matched the position of enumerations between nntrainer-api-common.h and layer.h

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:    [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[README] update reviewers
jijoong.moon [Fri, 1 Apr 2022 01:17:14 +0000 (10:17 +0900)]
[README] update reviewers

This patch updates the reviewers in the README

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[COVERITY] remove unreachable code
jijoong.moon [Fri, 25 Mar 2022 02:51:40 +0000 (11:51 +0900)]
[COVERITY] remove unreachable code

This patch removes unreachable code

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[CAPI] Move getlayer api to public
jijoong.moon [Thu, 24 Mar 2022 01:46:10 +0000 (10:46 +0900)]
[CAPI] Move getlayer api to public

This patch moves the get_layer api to public

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years agojni/Android: build error fix
MyungJoo Ham [Fri, 25 Mar 2022 05:45:55 +0000 (14:45 +0900)]
jni/Android: build error fix

Add ML_API_COMMON=1 for Android.mk build

Signed-off-by: MyungJoo Ham <myungjoo.ham@samsung.com>
2 years agoML-API dependency clean-up.
MyungJoo Ham [Wed, 23 Mar 2022 10:15:05 +0000 (19:15 +0900)]
ML-API dependency clean-up.

If it's not Tizen, ML-API (C) is not mandatory.
Allow building without dependencies on ML-API by default.

Fixes #1853

Signed-off-by: MyungJoo Ham <myungjoo.ham@samsung.com>
2 years agoapi: clean up dependencies
MyungJoo Ham [Wed, 23 Mar 2022 10:14:32 +0000 (19:14 +0900)]
api: clean up dependencies

Remove unnecessary dependencies.

Signed-off-by: MyungJoo Ham <myungjoo.ham@samsung.com>
2 years agotest: remove unnecessary capi dependencies
MyungJoo Ham [Wed, 23 Mar 2022 10:13:00 +0000 (19:13 +0900)]
test: remove unnecessary capi dependencies

Many test cases do not require the C API.
Reexamined API dependencies and corrected them.

Signed-off-by: MyungJoo Ham <myungjoo.ham@samsung.com>
2 years agoapplication: remove unnecessary capi dep.
MyungJoo Ham [Wed, 23 Mar 2022 10:11:55 +0000 (19:11 +0900)]
application: remove unnecessary capi dep.

1. Remove unnecessary capi dependencies.
2. Don't build apps requiring capi if capi is not available.

Signed-off-by: MyungJoo Ham <myungjoo.ham@samsung.com>
2 years agomeson: dependency on ml-api should be optional
MyungJoo Ham [Fri, 18 Mar 2022 08:20:33 +0000 (17:20 +0900)]
meson: dependency on ml-api should be optional

Users should be able to build nntrainer in a system without
ML-API and nnstreamer by default.

ML-API and nnstreamer dependency should only be mandated
in related systems (a few embedded systems).

Let's keep it "auto" so that most external users can forget
about nnstreamer and ML-API.

Addresses #1853

Signed-off-by: MyungJoo Ham <myungjoo.ham@samsung.com>
2 years ago[CAPI] Expose Layer Enums
jijoong.moon [Wed, 23 Mar 2022 11:12:43 +0000 (20:12 +0900)]
[CAPI] Expose Layer Enums

This patch contains layer enums to be exposed.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[COVERITY] remove unreachable code accepted/tizen/unified/20220330.021238 submit/tizen/20220325.053530
jijoong.moon [Fri, 25 Mar 2022 02:51:40 +0000 (11:51 +0900)]
[COVERITY] remove unreachable code

This patch removes unreachable code

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years agoPortability: unittest uninit vars.
MyungJoo Ham [Wed, 16 Mar 2022 07:56:56 +0000 (16:56 +0900)]
Portability: unittest uninit vars.

1. gtest code has maybe-uninitialized warnings. Suppress them.
2. Fixed maybe-uninitialized warnings from unittest code.

Signed-off-by: MyungJoo Ham <myungjoo.ham@samsung.com>
2 years agoPortability: g++-11 has different std policy.
MyungJoo Ham [Wed, 16 Mar 2022 07:39:44 +0000 (16:39 +0900)]
Portability: g++-11 has different std policy.

It requires stdexcept and limits for std::invalid_argument and std::numeric_limits.

Fixes #1857

Signed-off-by: MyungJoo Ham <myungjoo.ham@samsung.com>
2 years agoPortability: gcc structured binding declaration error
MyungJoo Ham [Wed, 16 Mar 2022 07:38:00 +0000 (16:38 +0900)]
Portability: gcc structured binding declaration error

gcc-7 does not fully comply with C++17 for structured binding declarations.
This allows maybe-unused globally as a workaround.

Signed-off-by: MyungJoo Ham <myungjoo.ham@samsung.com>
2 years ago[fix] check return value
hyeonseok lee [Thu, 17 Mar 2022 04:33:49 +0000 (13:33 +0900)]
[fix] check return value

 - Added missing statement to check called function works properly

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years agoChange loading meta information behavior
Jihoon Lee [Fri, 11 Mar 2022 15:42:54 +0000 (00:42 +0900)]
Change loading meta information behavior

**Before this PR**

Optimizer variables were loaded from load_path every time, so calling
model->train() repeatedly became unintuitive:

1. model->train() loads from the original load path,
thus the iteration number rolls back to the first one.
2. The same happens for the adam weights.
3. model->load() after model->initialize() is a no-op
because loadedWeight becomes true

**After this PR**

1. The model loads from load_path only at initialize time
2. model->load is not implicitly overridden

**Additional Changes**

1. Optimizer weights became part of weights, and are now available after initialize()
2. The save format became coherent with the load format
3. Some unused variables were deleted

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
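The after-state described above can be sketched as a model that loads from load_path exactly once, at initialize time (hypothetical names; the real flow is in nntrainer's neuralnet code):

```python
class Model:
    def __init__(self, load_path=None):
        self.load_path = load_path
        self.iteration = 0

    def initialize(self):
        # After this change: load from load_path exactly once, here.
        if self.load_path is not None:
            self._load(self.load_path)

    def _load(self, path):
        self.iteration = 100  # pretend the checkpoint stored iteration 100

    def train(self):
        # Before the change, _load() effectively ran here on every call,
        # rolling the iteration counter back each time; now train() resumes.
        self.iteration += 1

m = Model(load_path="ckpt.bin")
m.initialize()
m.train()
m.train()  # iteration keeps advancing instead of resetting to 100
```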
2 years agoRevert "Load Optimizer Variables"
Jihoon Lee [Thu, 10 Mar 2022 09:40:10 +0000 (18:40 +0900)]
Revert "Load Optimizer Variables"

This reverts commit c669732b1f52f4aad3114839fe1ebba0f5d95f27.

As this commit contains some compatibility breaking changes, this commit
should be merged with where nntrainer is being used.

Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
2 years agoLoad Optimizer Variables
jijoong.moon [Thu, 3 Mar 2022 03:11:04 +0000 (12:11 +0900)]
Load Optimizer Variables

In this PR,
 1. A new property of the adam optimizer, "load_var", is added to set
    loading of momentum variables
 2. Updated reading and saving of the binary file
    using ml::train::OptimizerType
 3. Updated read in layer_node to skip the optimizer variables if
    load_var is set to "false"
 4. is_load_var() is added in optimizer_devel

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
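The read-path change in item 3 can be sketched as conditionally skipping the optimizer-variable section of the file (a hypothetical binary layout, not nntrainer's actual format):

```python
import io
import struct

def read_weights(stream, n_weights, load_var=True):
    """Read n_weights floats, then either read or skip a same-sized
    optimizer (momentum) section, mirroring the load_var property."""
    weights = list(struct.unpack(f"{n_weights}f", stream.read(4 * n_weights)))
    if load_var:
        opt = list(struct.unpack(f"{n_weights}f", stream.read(4 * n_weights)))
    else:
        stream.seek(4 * n_weights, io.SEEK_CUR)  # skip momentum variables
        opt = None
    return weights, opt

# Two weights followed by two momentum values in the stream.
buf = io.BytesIO(struct.pack("4f", 1.0, 2.0, 0.1, 0.2))
w, opt = read_weights(buf, 2, load_var=False)
```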