platform/core/ml/nntrainer.git
2 years ago[typo] Fix typo
SeoHyungjun [Mon, 31 Oct 2022 04:35:59 +0000 (13:35 +0900)]
[typo] Fix typo

Fix the typo in the README.md file

Signed-off-by: SeoHyungjun <hyungjun.seo@samsung.com>
2 years ago[typo] Fix typo error
DonghakPark [Thu, 27 Oct 2022 11:58:41 +0000 (20:58 +0900)]
[typo] Fix typo error

Fix the typo errors found with a spell checker in:
- Application/
- api/ccapi/
- nntrainer/

Signed-off-by: DonghakPark <donghak.park@samsung.com>
2 years agoFix typo error
Seungbaek Hong [Mon, 31 Oct 2022 07:12:35 +0000 (16:12 +0900)]
Fix typo error

Fix the typo in the getting-started.md file.

Signed-off-by: Seungbaek Hong <sb92.hong@samsung.net>
2 years ago[fix] fix coverity_tizen issue
DonghakPark [Thu, 20 Oct 2022 10:53:07 +0000 (19:53 +0900)]
[fix] fix coverity_tizen issue
- Add try/catch statements to catch exceptions
- Add NNTR_THROW_IF to check return values (see the sketch below)

Signed-off-by: DonghakPark <donghak.park@samsung.com>
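A minimal sketch of the pattern behind this fix, using a simplified stand-in for the NNTR_THROW_IF macro (the real macro lives in nntrainer's error headers; the stand-in below is an assumption for illustration only):

```
#include <stdexcept>

// Simplified stand-in for NNTR_THROW_IF: throw the given exception type when
// the condition holds, instead of silently ignoring a bad return value.
#define THROW_IF(cond, exception_t) \
  if (cond)                         \
  throw exception_t("condition failed: " #cond)

int checkedDivide(int numerator, int denominator) {
  THROW_IF(denominator == 0, std::invalid_argument); // checked return path
  return numerator / denominator;
}

int main() {
  try {
    checkedDivide(1, 0);
  } catch (const std::invalid_argument &) {
    // the error path flagged by coverity is now handled explicitly
  }
  return 0;
}
```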
2 years ago[unittest] refine unittest
hyeonseok lee [Tue, 27 Sep 2022 03:02:55 +0000 (12:02 +0900)]
[unittest] refine unittest

 - Use epsilon to compare float variables (see the sketch below)
 - Use the unused enum LayerCreateSetPropertyOptions
 - Fix typos

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
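A minimal googletest sketch of the float comparison refinement mentioned above; EXPECT_NEAR compares against an explicit tolerance instead of exact equality:

```
#include <gtest/gtest.h>

// Comparing floats with a tolerance avoids spurious failures from rounding.
TEST(FloatCompare, nearEquality_p) {
  const float expected = 0.1f + 0.2f;
  const float actual = 0.3f;
  EXPECT_NEAR(expected, actual, 1e-6f); // passes within the given epsilon
}
```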
2 years ago[unittest] remove unittest_nntrainer_layers
hyeonseok lee [Mon, 26 Sep 2022 10:47:47 +0000 (19:47 +0900)]
[unittest] remove unittest_nntrainer_layers

 - Remove unittest_nntrainer_layers.cpp because this test has been disabled.
 - Rename unittest_layers_v2 to unittest

Close #2002

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[unittest] migrate unittest from unittest_nntrainer_layers to unittest_layers
hyeonseok lee [Tue, 27 Sep 2022 03:00:20 +0000 (12:00 +0900)]
[unittest] migrate unittest from unittest_nntrainer_layers to unittest_layers

 - Migrate the negative property unittests from unittest_nntrainer_layers to
   unittest_layers
 - Added LayerPropertySemantics, which inherits from LayerSemantics,
   to test negative properties only
 - Added LayerPropertySemantics to the Simpleshot and pow layer unittests

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[typo] Fix typo error
DonghakPark [Fri, 21 Oct 2022 00:23:06 +0000 (09:23 +0900)]
[typo] Fix typo error

Fix the typo errors in the directories below:
- Applications/
- nntrainer/optimizers/

Signed-off-by: DonghakPark <donghak.park@samsung.com>
2 years ago[Tensor] Add constructor for user swap path
Jiho Chu [Mon, 17 Oct 2022 02:18:17 +0000 (11:18 +0900)]
[Tensor] Add constructor for user swap path

The swap file path can now be changed via a model property.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
2 years ago[Model] Add memory swap path property
Jiho Chu [Fri, 14 Oct 2022 13:19:37 +0000 (22:19 +0900)]
[Model] Add memory swap path property

The MemorySwapPath property is added to define the path for swap files.
If it is not defined, the files are created in the current directory (see the sketch below).

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
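A minimal sketch, assuming the two properties are exposed to the ccapi model as the keys "memory_swap" and "memory_swap_path" (key names inferred from these commit messages; the path value is hypothetical):

```
#include <model.h> // ccapi: ml::train::createModel / Model::setProperty

int main() {
  auto model = ml::train::createModel(ml::train::ModelType::NEURAL_NET);
  // Enable the swap-backed pool and point it at a user-chosen directory.
  model->setProperty({"memory_swap=true",
                      "memory_swap_path=/tmp/nntrainer_swap"}); // hypothetical path
  return 0;
}
```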
2 years ago[README] Fix typo error in README
seungbaek [Tue, 18 Oct 2022 02:46:07 +0000 (11:46 +0900)]
[README] Fix typo error in README

Fixed the typo error in the title of the paper below.
- NNTrainer: Light-Weight On-Device Training Framework, arXiv, 2022

Signed-off-by: seungbaek <sb92.hong@samsung.com>
2 years ago[memory] extract MemoryData class from memory_pool to another file
hyeonseok lee [Fri, 14 Oct 2022 12:07:52 +0000 (21:07 +0900)]
[memory] extract MemoryData class from memory_pool to another file

 - Extract the MemoryData class from memory_pool.h into a new memory_data.h file
 - Add doxygen

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[neuralnet] check stop_cb by every layer
hyeonseok lee [Fri, 14 Oct 2022 09:28:20 +0000 (18:28 +0900)]
[neuralnet] check stop_cb by every layer

 - Check stop_cb before doing forwarding/calcGradient/calcDerivative in every layer
   to stop training more quickly.

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[TEST] fix for cache pool modification
Jiho Chu [Thu, 25 Aug 2022 08:02:01 +0000 (17:02 +0900)]
[TEST] fix for cache pool modification

The tests are modified for the cache pool implementation.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
2 years ago[Property] Add memory_swap property
Jiho Chu [Tue, 19 Jul 2022 11:33:49 +0000 (20:33 +0900)]
[Property] Add memory_swap property

The 'memory_swap' property is added; it can be used to select
which pool to use: the memory pool or the cache pool.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
2 years ago[Memory] Implement memory pool for swap device
Jiho Chu [Tue, 19 Jul 2022 11:30:54 +0000 (20:30 +0900)]
[Memory] Implement memory pool for swap device

The memory pool is modified to support the swap device.
A new memory data class is introduced to store allocation information.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
2 years ago[Tensor] Add swap device
Jiho Chu [Fri, 1 Jul 2022 13:22:43 +0000 (22:22 +0900)]
[Tensor] Add swap device

A swap device class is introduced.
It operates with cache memory and keeps data permanently on the device.
The storage size is fixed when the device is opened, so unused data space
is filled with garbage data. When the device is closed, the storage
file is removed.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
2 years ago[Memory] Add cache pool
Jiho Chu [Fri, 1 Jul 2022 12:27:09 +0000 (21:27 +0900)]
[Memory] Add cache pool

Initial implementation of the cache pool.
It inherits from the memory pool to utilize optimized memory information.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
2 years agodebian: fix occasional debuild error
MyungJoo Ham [Tue, 4 Oct 2022 12:36:32 +0000 (21:36 +0900)]
debian: fix occasional debuild error

Sometimes debuild fails with dh_clean errors:

```
dh_clean: error: find .  \( \( \
\( -path .\*/.git -o -path .\*/.svn -o -path .\*/.bzr -o -path .\*/.hg -o -path .\*/CVS -o -path .\*/.pc -o -path .\*/_darcs \) -prune -o -type f -a \
        \( -name '#*#' -o -name '.*~' -o -name '*~' -o -name DEADJOE \
 -o -name '*.orig' -o -name '*.rej' -o -name '*.bak' \
 -o -name '.*.orig' -o -name .*.rej -o -name '.SUMS' \
 -o -name TAGS -o \( -path '*/.deps/*' -a -name '*.P' \) \
\) -exec rm -f {} + \) -o \
\( -type d -a -name autom4te.cache -prune -exec rm -rf {} + \) \) returned exit code 1
make: *** [debian/rules:28: clean] Error 25
```

Override dh_clean to avoid this.

Signed-off-by: MyungJoo Ham <myungjoo.ham@samsung.com>
2 years ago[trivial] fix ahub issues
hyeonseok lee [Tue, 4 Oct 2022 05:54:42 +0000 (14:54 +0900)]
[trivial] fix ahub issues

 - Remove noexcept from the copy function because it throws when the data is non-contiguous
 - Initialize the char array
 - Delete the array, not a single element ("[]" was missing)

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[padding1d] change padding1d key to padding
hyeonseok lee [Fri, 30 Sep 2022 02:33:20 +0000 (11:33 +0900)]
[padding1d] change padding1d key to padding

 - Change the Padding1D key to padding so it uses the same key as Padding2D.

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[neuralnet] add log about training
hyeonseok lee [Thu, 29 Sep 2022 09:19:26 +0000 (18:19 +0900)]
[neuralnet] add log about training

 - Added logs when training starts/finishes
 - Added a log when the current epoch getter is called

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[Profile] Add memory statistics & annotation
Jiho Chu [Thu, 25 Aug 2022 04:36:08 +0000 (13:36 +0900)]
[Profile] Add memory statistics & annotation

This patch adds two features:
- Provide the PROFILE_MEM_ANNOTATE macro (see the sketch below)
- Print the average and maximum memory usage.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
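A rough sketch of the annotation idea, using a simplified stand-in macro (the real PROFILE_MEM_ANNOTATE signature is an assumption based on this commit message):

```
#include <iostream>
#include <string>
#include <utility>

// Simplified stand-in: attach a user-friendly tag to the current scope so that
// memory ALLOC/DEALLOC events recorded there can be grouped in the report.
struct ScopedMemTag {
  std::string tag;
  explicit ScopedMemTag(std::string t) : tag(std::move(t)) {
    std::cout << "mem annotate: " << tag << '\n';
  }
};
#define PROFILE_MEM_ANNOTATE(tag) ScopedMemTag profile_mem_tag_(tag)

void forwardStep() {
  PROFILE_MEM_ANNOTATE("conv2d forward buffers"); // hypothetical tag
  // ... allocations made here would be attributed to the tag above ...
}

int main() { forwardStep(); }
```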
2 years ago[lstm] make lstm_core as class
hyeonseok lee [Thu, 8 Sep 2022 08:53:59 +0000 (17:53 +0900)]
[lstm] make lstm_core as class

 - Make lstm_core a class so that lstm/lstm_cell/zoneout_lstmcell can inherit from it.

close #1997

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[Release] NNTrainer 0.4.0 Release
jijoong.moon [Mon, 26 Sep 2022 05:29:12 +0000 (14:29 +0900)]
[Release] NNTrainer 0.4.0 Release

NNTrainer v0.4.0 is released.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[bugfix] initialize local variable accepted/tizen/unified/20220927.132348
hyeonseok lee [Mon, 26 Sep 2022 04:57:21 +0000 (13:57 +0900)]
[bugfix] initialize local variable

 - Initialize local variable

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[unittest] added negative testcase
hyeonseok lee [Wed, 21 Sep 2022 03:14:28 +0000 (12:14 +0900)]
[unittest] added negative testcase

 - Added negative testcase

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[unittest] rename negative testcase name to _n place at the end
hyeonseok lee [Wed, 21 Sep 2022 02:40:14 +0000 (11:40 +0900)]
[unittest] rename negative testcase name to _n place at the end

 - Since the indicator _p/_n was in the middle of the testcase name, all test cases
   were regarded as positive testcases (the default). So the indicator is now placed
   at the end (see the sketch below).
 - Added '_' right before 'n' to indicate a negative testcase.

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
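A minimal googletest sketch of the naming convention described above: the trailing _n marks a negative case and _p a positive one (the test bodies are illustrative only):

```
#include <gtest/gtest.h>
#include <stdexcept>

// Positive case: suffix _p.
TEST(LayerProperty, setKnownProperty_p) { EXPECT_NO_THROW((void)0); }

// Negative case: suffix _n, so tooling that counts negative tests picks it up.
TEST(LayerProperty, setUnknownProperty_n) {
  EXPECT_THROW(throw std::invalid_argument("unknown property"),
               std::invalid_argument);
}
```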
2 years ago[ README ] add new features in README
jijoong.moon [Fri, 16 Sep 2022 06:22:21 +0000 (15:22 +0900)]
[ README ] add new features in README

Add new features:
  . positional encoding layer
  . Multi-head attention layer
  . layer normalization layer
  . kld loss
  . learning rate schedule

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[layer] convert throw to nntr_throw in layer finalize accepted/tizen/unified/20220919.021604
hyeonseok lee [Mon, 5 Sep 2022 05:32:57 +0000 (14:32 +0900)]
[layer] convert throw to nntr_throw in layer finalize

 - Instead of using throw directly, use NNTR_THROW_IF in layer finalize

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[unittest] generate transformer unittest
hyeonseok lee [Mon, 22 Aug 2022 08:00:41 +0000 (17:00 +0900)]
[unittest] generate transformer unittest

 - Generate transformer encoder layer unittest
 - Generate transformer decoder layer unittest
 - Generate transformer unittest

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[unittest] layer normalization
hyeonseok lee [Tue, 2 Aug 2022 12:03:30 +0000 (21:03 +0900)]
[unittest] layer normalization

 - generate layer normalization layer unittest

Self evaluation:

Build test: [X]Passed [ ]Failed [ ]Skipped
Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[layer normalization] implement layer normalization
hyeonseok lee [Wed, 27 Jul 2022 10:02:51 +0000 (19:02 +0900)]
[layer normalization] implement layer normalization

 - Implement the layer normalization layer based on the batch normalization layer
   (the normalization formula is sketched below)

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
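For reference, the standard layer normalization computed per sample over the feature dimension (a textbook formula, not copied from the patch):

```
\mu = \frac{1}{H}\sum_{i=1}^{H} x_i, \qquad
\sigma^2 = \frac{1}{H}\sum_{i=1}^{H} (x_i - \mu)^2, \qquad
y_i = \gamma \, \frac{x_i - \mu}{\sqrt{\sigma^2 + \epsilon}} + \beta
```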
2 years ago[unittest] generate positional encoding unittest
hyeonseok lee [Thu, 25 Aug 2022 14:07:27 +0000 (23:07 +0900)]
[unittest] generate positional encoding unittest

 - Generate positional encoding layer/model unittest.

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[positional encoding] implement positional encoding layer
hyeonseok lee [Thu, 25 Aug 2022 14:03:54 +0000 (23:03 +0900)]
[positional encoding] implement positional encoding layer

 - Positional encoding only needs to be calculated once,
   so give it the maximum lifespan (formula sketched below)

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
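For reference, the sinusoidal positional encoding from the Transformer paper; it depends only on the position, which is why it can be computed once and kept for the tensor's maximum lifespan:

```
PE_{(pos,\,2i)} = \sin\!\left(\frac{pos}{10000^{2i/d_{model}}}\right), \qquad
PE_{(pos,\,2i+1)} = \cos\!\left(\frac{pos}{10000^{2i/d_{model}}}\right)
```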
2 years ago[multi head attention] added unittest
hyeonseok lee [Fri, 15 Jul 2022 05:51:02 +0000 (14:51 +0900)]
[multi head attention] added unittest

 - Added layer/model unittest for multi head attention
 - Change the == operator overload to pass if both tensors have NaN values

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[multi_head_attention] implement calcDerivative, calcGradient
hyeonseok lee [Fri, 15 Jul 2022 05:46:20 +0000 (14:46 +0900)]
[multi_head_attention] implement calcDerivative, calcGradient

 - Implement multi head attention calcDerivative, calcGradient
 - Needs to support bool type attention mask

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[multi head attention] implement calcCommonDerivative
hyeonseok lee [Wed, 13 Jul 2022 02:25:08 +0000 (11:25 +0900)]
[multi head attention] implement calcCommonDerivative

 - implement multi head attention calcCommonDerivative

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[multi head attention] implement forwarding
hyeonseok lee [Wed, 13 Jul 2022 02:22:43 +0000 (11:22 +0900)]
[multi head attention] implement forwarding

 - Implement multi head attention forwarding (see the formula sketch below)

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
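For reference, the standard scaled dot-product attention that a multi head attention forward pass computes per head before concatenating the heads (textbook formula, not the patch's code):

```
\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{QK^{\top}}{\sqrt{d_k}}\right)V, \qquad
\mathrm{MultiHead}(Q, K, V) = \mathrm{Concat}(\mathrm{head}_1, \dots, \mathrm{head}_h)\,W^{O}
```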
2 years ago[multi head attention] implement finalize
hyeonseok lee [Tue, 19 Jul 2022 03:22:05 +0000 (12:22 +0900)]
[multi head attention] implement finalize

 - Implement multi head attention finalize
 - Remove finalizeCommon
 - Remove the ProvideAttentionMask property and the inout_idx member variable
   because NNTrainer assumes that multi head attention has at least 3 inputs

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[api] added newly implemented layer enum
hyeonseok lee [Thu, 25 Aug 2022 14:41:52 +0000 (23:41 +0900)]
[api] added newly implemented layer enum

 - Added newly implemented layer enum to nntrainer-api-common.h

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[Trivial] Fix Typo
DonghakPark [Mon, 29 Aug 2022 06:22:25 +0000 (15:22 +0900)]
[Trivial] Fix Typo

Fix Typo in nntrainer/compiler/*

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: DonghakPark <donghak.park@samsung.com>
2 years ago[ LSTM ] Optimize LSTM Gradient calculation
jijoong.moon [Thu, 4 Aug 2022 04:22:57 +0000 (13:22 +0900)]
[ LSTM ] Optimize LSTM Gradient calculation

Gradient computation of LSTM takes over 60% of the total computation. This patch
includes an optimization using tensor dimensions which only have
width.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[Application] Add profile for VGG
jijoong.moon [Wed, 3 Aug 2022 06:46:51 +0000 (15:46 +0900)]
[Application] Add profile for VGG

This patch adds profiling to the VGG application.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[multi head attention] Added multi head attention scaffold
hyeonseok lee [Wed, 13 Jul 2022 02:17:29 +0000 (11:17 +0900)]
[multi head attention] Added multi head attention scaffold

 - Added calcCommonDerivative
 - Added finalizeCommon

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[ Layer ] parallelization along batch for pooling forward computation
jijoong.moon [Wed, 3 Aug 2022 06:47:56 +0000 (15:47 +0900)]
[ Layer ] parallelization along batch for pooling forward computation

This patch includes parallelization along the batch direction in the forward
computation of the Pooling 2D layer.

Resolves:

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[ LAYERS ] LSTM : parallelization along batch direction (calGradient)
jijoong.moon [Wed, 3 Aug 2022 06:44:36 +0000 (15:44 +0900)]
[ LAYERS ] LSTM : parallelization along batch direction (calGradient)

This patch includes parallelization along the batch direction for
the calculation of the LSTM gradient.
Also, the thread id is added to the thread callback parameters for internal use.

Resolves:

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[Property] Remove if-else statement in DroupOutRate::isValid()
hyunil park [Fri, 19 Aug 2022 09:51:37 +0000 (18:51 +0900)]
[Property] Remove if-else statement in DroupOutRate::isValid()

- Remove if-else statement

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: hyunil park <hyunil46.park@samsung.com>
2 years ago[ UTIL ] add frequency data in mem_usage.sh
jijoong.moon [Fri, 19 Aug 2022 01:50:06 +0000 (10:50 +0900)]
[ UTIL ] add frequency data in mem_usage.sh

Sometimes, a CPU frequency check is required.
This patch adds CPU frequency logging to mem_usage.sh.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[neuralnet] adjust epoch_idx when stop_cb is called
hyeonseok lee [Thu, 18 Aug 2022 12:51:12 +0000 (21:51 +0900)]
[neuralnet] adjust epoch_idx when stop_cb is called

 - Assume that stop_cb is called immediately, so reduce the current epoch_idx by 1

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[Application] Add profile for VGG
jijoong.moon [Wed, 3 Aug 2022 06:46:51 +0000 (15:46 +0900)]
[Application] Add profile for VGG

This patch adds profiling to the VGG application.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[jni] revise android build
hyeonseok lee [Wed, 3 Aug 2022 04:31:32 +0000 (13:31 +0900)]
[jni] revise android build

 - Remove the jni/Android.mk file; jni/Android.mk.in will be used instead
 - Revise docs to use tools/package_android.sh when building for Android

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years agoFix inappropriate SPDX license tag
MyungJoo Ham [Thu, 11 Aug 2022 05:52:04 +0000 (14:52 +0900)]
Fix inappropriate SPDX license tag

SPDX-License_Identifier --> SPDX-License-Identifier

Added one more # to the first line for doxygen.

Signed-off-by: MyungJoo Ham <myungjoo.ham@samsung.com>
2 years ago[trivial] Reorganize README.md files
DonghakPark [Fri, 5 Aug 2022 04:23:52 +0000 (13:23 +0900)]
[trivial] Reorganize README.md files

This is a proposal PR for issue #1974

- Remove Example contents from nntrainer/README.md
- Add Example contents in nntrainer/Application/README.md
- Close #1974

Signed-off-by: DonghakPark <donghak.park@samsung.com>
2 years ago[ Activation ] improve tanh computation
jijoong.moon [Mon, 18 Jul 2022 04:43:16 +0000 (13:43 +0900)]
[ Activation ] improve tanh computation

This patch improves the computation of tanh.
Rather than calling the tanh function, it is faster to use
sigmoid (see the sketch below):

tanh(x) = 2.0*sigmoid(2.0*x) -1.0;

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
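A minimal sketch verifying the identity used by this patch (plain C++, not the patch itself):

```
#include <cassert>
#include <cmath>

static float sigmoid(float x) { return 1.0f / (1.0f + std::exp(-x)); }

// tanh(x) == 2*sigmoid(2x) - 1, which avoids a separate tanh evaluation.
static float tanh_via_sigmoid(float x) { return 2.0f * sigmoid(2.0f * x) - 1.0f; }

int main() {
  for (float x = -3.0f; x <= 3.0f; x += 0.5f)
    assert(std::fabs(tanh_via_sigmoid(x) - std::tanh(x)) < 1e-5f);
  return 0;
}
```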
2 years ago[ Layer ] Conv2d Gradient Computation with Multi-Threads
jijoong.moon [Mon, 18 Jul 2022 04:40:43 +0000 (13:40 +0900)]
[ Layer ] Conv2d Gradient Computation with Multi-Threads

This patch includes multi-threading for gradient computation of conv2d
layer.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[Application] Add AlexNet(Fused) Application, Merge cifar_dataloader into utils
DonghakPark [Mon, 25 Jul 2022 04:09:06 +0000 (13:09 +0900)]
[Application] Add AlexNet(Fused) Application, Merge cifar_dataloader into utils

- Add AlexNet(Fused) Application
- Update meson.build (add Alexnet subdir)
- Add main.cpp (AlexNet), alex.ini, README.md
- Merge cifar_dataloader into utils/datagen/cifar
- Close #1969

Signed-off-by: Donghak Park <donghak.park@samsung.com>
2 years ago[conv] support causal padding in conv1d
hyeonseok lee [Tue, 5 Jul 2022 11:36:36 +0000 (20:36 +0900)]
[conv] support causal padding in conv1d

 - Replace padding2d with padding1d in conv1d
 - Enable causal property in conv1d
 - Added unittest for causal padding in conv1d

Close #1947

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[Application] Update README.md File
DonghakPark [Thu, 14 Jul 2022 06:45:58 +0000 (15:45 +0900)]
[Application] Update README.md File
- Update the res/mnist.ini part of the README (split Optimizer and LearningRateScheduler)
- Align code comments

Signed-off-by: DongHak Park <donghak.park@samsung.com>

2 years ago[ Layers ] parallelize Forwarding of conv2d
jijoong.moon [Thu, 7 Jul 2022 13:25:45 +0000 (22:25 +0900)]
[ Layers ] parallelize Forwarding of conv2d

This patch includes the batch-direction parallelization of forwarding
in the Conv2D layer.

**Changes proposed in this PR:**
- Added TOC generator for README.md

Resolves:

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[trivial] fix ahub issue
hyeonseok lee [Mon, 18 Jul 2022 02:43:09 +0000 (11:43 +0900)]
[trivial] fix ahub issue

 - Added try catch statement
 - Delete structurally dead code

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[ Utils ] create NNTrThread Features
jijoong.moon [Thu, 7 Jul 2022 09:32:37 +0000 (18:32 +0900)]
[ Utils ] create NNTrThread Features

This patch includes the NNTrThreads features. They will be used for the
multi-thread features of nntrainer, such as a thread pool and for-loop
multi-threading along the batch direction.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[ Layers ] Add Parallelization along batch direction
jijoong.moon [Thu, 7 Jul 2022 09:24:54 +0000 (18:24 +0900)]
[ Layers ] Add Parallelization along batch direction

This patch demonstrates batch-direction parallelization with conv2d
calcDerivatives (see the sketch below).
. Add the meson option with the 'nntr-num-threads' key and an int value.
. Add an extra compile option, NNTR_NUM_THREADS (default value is 1).

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
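A minimal sketch of for-loop multi-threading along the batch direction with a fixed thread count, in the spirit of the NNTR_NUM_THREADS option above (plain C++ threads; the actual nntrainer implementation and its API are not shown here):

```
#include <algorithm>
#include <functional>
#include <thread>
#include <vector>

// Split [0, batch) into contiguous chunks and run work(begin, end) on each
// chunk in its own thread, then join.
void forEachBatchChunk(unsigned batch, unsigned num_threads,
                       const std::function<void(unsigned, unsigned)> &work) {
  std::vector<std::thread> workers;
  const unsigned chunk = (batch + num_threads - 1) / num_threads;
  for (unsigned t = 0; t < num_threads; ++t) {
    const unsigned begin = t * chunk;
    const unsigned end = std::min(batch, begin + chunk);
    if (begin >= end)
      break;
    workers.emplace_back(work, begin, end);
  }
  for (auto &w : workers)
    w.join();
}

int main() {
  forEachBatchChunk(8, 4, [](unsigned begin, unsigned end) {
    for (unsigned b = begin; b < end; ++b) {
      // compute derivatives for sample b
    }
  });
  return 0;
}
```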
2 years ago[Application] Rename main_sample.cpp to main.cpp in KNN
Hyunil [Mon, 11 Jul 2022 06:32:22 +0000 (15:32 +0900)]
[Application] Rename main_sample.cpp to main.cpp in KNN

- Rename the filename main_sample.cpp to main.cpp
- Modify meson.build and Android.mk

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[Application] Remove unused file of KNN
Hyunil [Mon, 11 Jul 2022 06:21:20 +0000 (15:21 +0900)]
[Application] Remove unused file of KNN

- Remove main.cpp

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[conv] support dilation property
hyeonseok lee [Tue, 28 Jun 2022 05:21:44 +0000 (14:21 +0900)]
[conv] support dilation property

 - Support dilation property in conv1d/conv2d layer
 - Added unittest with dilation

Close #1922

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[ Trivial ] Fix the "Deprecated-declarations" error
jijoong.moon [Fri, 1 Jul 2022 14:16:31 +0000 (23:16 +0900)]
[ Trivial ] Fix the "Deprecated-declarations" error

This patch replaces 'INSTANTIATE_TEST_CASE_P', which is deprecated,
with 'INSTANTIATE_TEST_SUITE_P' (see the sketch below).

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
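A minimal value-parameterized googletest sketch using the non-deprecated macro name:

```
#include <gtest/gtest.h>

class ParamSketch : public ::testing::TestWithParam<int> {};

TEST_P(ParamSketch, isNonNegative_p) { EXPECT_GE(GetParam(), 0); }

// INSTANTIATE_TEST_CASE_P is deprecated in recent googletest releases;
// the suite-based spelling below replaces it.
INSTANTIATE_TEST_SUITE_P(Values, ParamSketch, ::testing::Values(0, 1, 2));
```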
2 years ago[BUILD] fix supported gmock version
Jiho Chu [Thu, 30 Jun 2022 02:25:21 +0000 (11:25 +0900)]
[BUILD] fix supported gmock version

The google mock API changed in 1.10.0,
and MOCK_METHOD is not supported below version 1.10.0.
The 'profiler' test is only enabled for a suitable gmock.

The MOCK_METHOD macro needs a newer version (>= 1.10) of gmock (see the sketch below).

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
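A minimal gmock sketch of the macro in question; this (return, name, args, specs) form of MOCK_METHOD only exists in gmock 1.10 and later, which is why the profiler test is gated on the detected version:

```
#include <gmock/gmock.h>

class Listener {
public:
  virtual ~Listener() = default;
  virtual void onNotify(int event) = 0;
};

class MockListener : public Listener {
public:
  // Requires gmock >= 1.10; older releases only offer the MOCK_METHODn macros.
  MOCK_METHOD(void, onNotify, (int event), (override));
};
```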
2 years ago[ Application ] Fix VGG using CCAPI
jijoong.moon [Mon, 27 Jun 2022 09:59:07 +0000 (18:59 +0900)]
[ Application ] Fix VGG using CCAPI

This patch includes fixes for the VGG application.
- Remove the training data generation with batch size.
  : Recently we updated the data generation callback to fill one data
  set at a time
- Remove including the nntrainer internals: neuralnet.h
  : Update to use CCAPI
- Reuse the data generation from cifar_dataloader.h in Resnet

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[TEST] Add unittest related to setProperty for dataset
Hyunil [Wed, 22 Jun 2022 09:01:39 +0000 (18:01 +0900)]
[TEST] Add unittest related to setProperty for dataset

Add unittest for ml_train_dataset_set_property_for_mode_with_single_param()

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[CAPI] Add set property with single param for dataset
Hyunil [Wed, 22 Jun 2022 08:51:55 +0000 (17:51 +0900)]
[CAPI] Add set property with single param for dataset

Add ml_train_dataset_set_property_for_mode_with_single_param().
ml_train_dataset_set_property_for_mode() has a va_list, and this is a problem with DllImport in C#.
va_list must be passed dynamically for each architecture, and this is not guaranteed to
work for some architectures. This API receives the param as a single string from C# DllImport.

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[ BUILD ] fix binary size of libnntrainer with android build script
jijoong.moon [Wed, 22 Jun 2022 06:15:03 +0000 (15:15 +0900)]
[ BUILD ] fix binary size of libnntrainer with android build script

This patch fixes the wrong size of the nntrainer library, libnntrainer.so,
when using the android build script.

- change the ndk-build options with NDK_LIBS_OUT

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[neuralnet] check tflite interpreter is enabled when export
hyeonseok lee [Tue, 21 Jun 2022 09:41:56 +0000 (18:41 +0900)]
[neuralnet] check tflite interpreter is enabled when export

 - Added an ifdef statement to check whether the tflite interpreter is enabled
 - Added override specifier in get/setWeights
 - Restore precision ostream format

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[profiler] restore precision
hyeonseok lee [Mon, 20 Jun 2022 07:35:26 +0000 (16:35 +0900)]
[profiler] restore precision

 - After printing the profile information, restore the precision (see the sketch below)

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
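A minimal sketch of saving and restoring the stream precision around profiler output, so later logging is not stuck with the profiler's formatting:

```
#include <iomanip>
#include <iostream>

int main() {
  std::ostream &out = std::cout;
  const std::streamsize old_precision = out.precision();

  out << std::fixed << std::setprecision(3) << 3.14159265 << '\n'; // profiler-style print

  out.precision(old_precision);          // restore the saved precision
  out.unsetf(std::ios_base::floatfield); // drop std::fixed again
  out << 3.14159265 << '\n';             // back to default formatting
  return 0;
}
```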
2 years ago[fix] fix svace issue
hyeonseok lee [Fri, 17 Jun 2022 06:45:53 +0000 (15:45 +0900)]
[fix] fix svace issue

 - Delete duplicated unittest
 - Added try catch statement to catch exception

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[Application] add step learning rate scheduler example
hyeonseok lee [Thu, 16 Jun 2022 08:48:23 +0000 (17:48 +0900)]
[Application] add step learning rate scheduler example

 - Added how to use step learning rate scheduler in mnist application

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[ TEST ] add golden data for grucell_fc
jijoong.moon [Wed, 15 Jun 2022 12:53:36 +0000 (21:53 +0900)]
[ TEST ] add golden data for grucell_fc

This patch includes the golden data of grucell_fc unit test.

Resolves:

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[ Recurrent ] property for dynamic time sequence
jijoong.moon [Wed, 15 Jun 2022 11:37:08 +0000 (20:37 +0900)]
[ Recurrent ] property for dynamic time sequence

. This patch provides a property for dynamic time sequence in the recurrent
realizer. The semantic of this is "dynamic_time_seq = true/false".

. Add a grucell + Fully Connected unit test case as a reference for dynamic
time sequence.

Related : #1933

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[TEST] Add unittest related to setProperty for optimizer
Hyunil [Thu, 16 Jun 2022 00:35:29 +0000 (09:35 +0900)]
[TEST] Add unittest related to setProperty for optimizer

- Add unittest for ml_train_optimizer_set_property_with_single_param()

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[CAPI] Add set property with single param for optimizer
Hyunil [Thu, 16 Jun 2022 00:18:01 +0000 (09:18 +0900)]
[CAPI] Add set property with single param for optimizer

Add ml_train_optimizer_set_property_with_single_param().
ml_train_optimizer_set_property() has a va_list, and this is a problem with DllImport in C#.
va_list must be passed dynamically for each architecture, and this is not guaranteed to
work for some architectures. This API receives the param as a single string from C# DllImport.

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[trivial] fix typo
DonghakPark [Wed, 15 Jun 2022 07:29:57 +0000 (16:29 +0900)]
[trivial] fix typo

- Fix typo
- Fix ist --> <sup>i</sup> for readability

Signed-off-by: Donghak Park <donghak.park@samsung.com>
2 years ago[CAPI] Add unittest related to setProperty for layer
Hyunil [Tue, 14 Jun 2022 06:10:33 +0000 (15:10 +0900)]
[CAPI] Add unittest related to setProperty for layer

- Add unittest for ml_train_layer_set_property_with_single_param()

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[CAPI] Add ml_train_layer_set_property_with_single_param() internally for C# DllImport
Hyunil [Mon, 13 Jun 2022 09:17:27 +0000 (18:17 +0900)]
[CAPI] Add ml_train_layer_set_property_with_single_param() internally for C# DllImport

ml_train_layer_set_property() has a va_list, and this is a problem with DllImport in C#.
va_list must be passed dynamically for each architecture, and this is not guaranteed to
work for some architectures. This API receives the param as a single string from C# DllImport.

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[CAPI] Modify model_compile_with_single_param() and ml_train_model_run_with_single_pa...
Hyunil [Mon, 13 Jun 2022 02:11:18 +0000 (11:11 +0900)]
[CAPI] Modify model_compile_with_single_param() and ml_train_model_run_with_single_param()

Modified the internal APIs created for the C# va_list issue, since capi can receive a
single string in the form "key=value | key=value" and all objects have loadProperties
in setProperty, which can split it with '|' (see the sketch below).
So the original code was reverted, and new code was added to call capi directly.

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
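A minimal sketch of splitting the "key=value | key=value" string that these *_with_single_param() variants accept (illustrative parsing only, not the actual loadProperties code):

```
#include <iostream>
#include <sstream>
#include <string>
#include <utility>
#include <vector>

// Split "key=value | key=value" into (key, value) pairs, trimming spaces.
std::vector<std::pair<std::string, std::string>>
parseSingleParam(const std::string &params) {
  auto trim = [](std::string s) {
    s.erase(0, s.find_first_not_of(' '));
    s.erase(s.find_last_not_of(' ') + 1);
    return s;
  };
  std::vector<std::pair<std::string, std::string>> props;
  std::stringstream ss(params);
  std::string token;
  while (std::getline(ss, token, '|')) {
    const auto eq = token.find('=');
    if (eq == std::string::npos)
      continue;
    props.emplace_back(trim(token.substr(0, eq)), trim(token.substr(eq + 1)));
  }
  return props;
}

int main() {
  for (const auto &kv : parseSingleParam("loss=cross | batch_size=32"))
    std::cout << kv.first << " -> " << kv.second << '\n';
  return 0;
}
```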
2 years ago[Spec] Add flatbuffers-devel BuildRequires as default accepted/tizen/unified/20220616.141953 submit/tizen/20220615.013221
Sangjung Woo [Wed, 15 Jun 2022 01:53:49 +0000 (10:53 +0900)]
[Spec] Add flatbuffers-devel BuildRequires as default

The compiler module always uses the flatc command, but the 'flatbuffers-devel'
dependency was only added when the Tizen version is lower than 6.5.
For this reason, a build break occurs when building for
Tizen 7.0. This patch fixes this bug.

Signed-off-by: Sangjung Woo <sangjung.woo@samsung.com>
2 years ago[ Tensor ] remove rank 2 limitation for dot op
jijoong.moon [Fri, 10 Jun 2022 10:50:30 +0000 (19:50 +0900)]
[ Tensor ] remove rank 2 limitation for dot op

This patch removes the rank-2 limitation for the dot op.
It is expected to compute with a 4D tensor as 2D with [BxCxH, W]
dimensions.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[unittest] added zoneout mask unittest
hyeonseok lee [Fri, 10 Jun 2022 06:59:00 +0000 (15:59 +0900)]
[unittest] added zoneout mask unittest

 - Test that the zoneout mask is generated according to the zoneout rate

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[layer] fix typo
hyeonseok lee [Fri, 6 May 2022 02:21:04 +0000 (11:21 +0900)]
[layer] fix typo

 - Fix typo
 - Delete duplicated code

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[ Doc ] Update Readme.md
jijoong.moon [Mon, 13 Jun 2022 00:45:35 +0000 (09:45 +0900)]
[ Doc ] Update Readme.md

Add a new doc which explains NNTrainer internals

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[docs] Update configuration-ini.md file
Hyunil [Fri, 3 Jun 2022 06:18:11 +0000 (15:18 +0900)]
[docs] Update configuration-ini.md file

- Remove learning_rate from the optimizer section
- Add learning rate scheduler section
- Remove dataset section
- Add train set, validation set and test set section
- Add many types to layer section
- Add table about type, key, value, default value and description for each layers
- Update configuration file example

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[ Packaging ] Packaging for Tizen 6.0
jijoong.moon [Thu, 2 Jun 2022 10:57:51 +0000 (19:57 +0900)]
[ Packaging ] Packaging for Tizen 6.0

This patch includes fixes to support the Tizen 6.0 build
 . Fix .spec & meson.build

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
2 years ago[bug_fix] bug fix zoneout lstmcell layer
hyeonseok lee [Tue, 7 Jun 2022 07:06:00 +0000 (16:06 +0900)]
[bug_fix] bug fix zoneout lstmcell layer

 - This patch enables zoneout even when the zoneout rate is smaller than epsilon

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
2 years ago[CAPI] Add unittest related to compile and train
Hyunil [Tue, 24 May 2022 02:02:45 +0000 (11:02 +0900)]
[CAPI] Add unittest related to compile and train

- Add unittest for ml_train_model_compile_with_single_param()
- Add unittest for ml_train_model_run_with_single_param()

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[Property] Modify error log
Hyunil [Mon, 23 May 2022 03:32:45 +0000 (12:32 +0900)]
[Property] Modify error log

- Add double quotation marks to the log to indicate what is wrong with "key=value, key=value"
- Add an example key and value

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[CAPI] Add ml_train_model_run_with_single_param() internally for C# DllImport
Hyunil [Tue, 24 May 2022 05:39:02 +0000 (14:39 +0900)]
[CAPI] Add ml_train_model_run_with_single_param() internally for C# DllImport

ml_train_model_run() has a va_list, and this is a problem with DllImport in C#.
va_list must be passed dynamically for each architecture, and this is not guaranteed to
work for some architectures. This API receives the model run params as a single string
from C# DllImport.

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[CAPI] Add ml_train_model_compile_with_single_param() internally for C# DllImport
Hyunil [Tue, 17 May 2022 03:07:44 +0000 (12:07 +0900)]
[CAPI] Add ml_train_model_compile_with_single_param() internally for C# DllImport

ml_train_model_compile() has a va_list, and this is a problem with DllImport in C#.
va_list must be passed dynamically for each architecture, and this is not guaranteed to
work for some architectures. This API receives the model compile params as a single string
from C# DllImport.

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Hyunil <hyunil46.park@samsung.com>
2 years ago[Utils] Add Memory profile feature to Profiler
Jiho Chu [Mon, 30 May 2022 04:56:38 +0000 (13:56 +0900)]
[Utils] Add Memory profile feature to Profiler

This patch implements the memory profiling feature.

The Profiler is refactored to handle memory statistics information.
The unnecessary dependency from ProfileListener to the Profiler
class is removed, and the inner information is redesigned to handle
both time and memory profiling.

For the memory profile, the information below is managed (see the sketch after this list):

event: ALLOC | DEALLOC
current size: total allocated memory size
info: user friendly tag for the analysis
duration: time interval between alloc and dealloc

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
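A rough sketch of the per-event record implied by the fields listed above (names and types are illustrative, not the actual Profiler data structures):

```
#include <chrono>
#include <cstddef>
#include <string>

enum class MemEvent { ALLOC, DEALLOC };

struct MemProfileEntry {
  MemEvent event;                      // event: ALLOC | DEALLOC
  std::size_t current_size;            // total allocated memory size after the event
  std::string info;                    // user friendly tag for the analysis
  std::chrono::microseconds duration;  // time interval between alloc and dealloc
};

int main() {
  MemProfileEntry entry{MemEvent::ALLOC, 4096, "conv2d forward buffers",
                        std::chrono::microseconds{0}};
  (void)entry;
  return 0;
}
```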
2 years ago[Test] Modify profiler test to add Memory event
Jiho Chu [Mon, 30 May 2022 04:54:42 +0000 (13:54 +0900)]
[Test] Modify profiler test to add Memory event

This patch implements the profiler test.

The whole code is refactored to test both 'time' and 'memory' profiling.
It uses fixture tests, and gmock is used to check the callback functions.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
2 years ago[ccapi/nntrainer] Add getters for compiled, initialized and loadedFromConfig
JRazek [Fri, 27 May 2022 17:28:39 +0000 (19:28 +0200)]
[ccapi/nntrainer] Add getters for compiled, initialized and loadedFromConfig

This commit adds bool getters for the states of compiled, initialized and loadedFromConfig in ml::train::Model class.

Signed-off-by: JRazek <jakub.razek2@gmail.com>
2 years ago[ccapi] include common header in meson
hyeonseok lee [Fri, 20 May 2022 03:56:06 +0000 (12:56 +0900)]
[ccapi] include common header in meson

 - The newly created common header file was missing from the meson build

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>