platform/core/ml/nntrainer.git
16 months ago[TEST] Add cache loader test
Jiho Chu [Wed, 14 Dec 2022 05:59:16 +0000 (14:59 +0900)]
[TEST] Add cache loader test

This patch provides a unit test for the cache loader class.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
16 months ago[SWAP] Add cache loader class
Jiho Chu [Wed, 14 Dec 2022 05:55:46 +0000 (14:55 +0900)]
[SWAP] Add cache loader class

This patch adds the CacheLoader class.
It provides methods to load cache elements by execution order, and it
uses the task executor to manage asynchronous loading.
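
For context, a minimal illustrative sketch of how such a loader can be shaped
(class name, members, and signatures here are assumptions, not the actual
nntrainer interface):

```cpp
#include <functional>
#include <iostream>
#include <memory>

struct CachePool { // stand-in for the real cache pool
  void loadExec(unsigned int order) {
    std::cout << "load cache elements for order " << order << '\n';
  }
};

class CacheLoaderSketch {
public:
  explicit CacheLoaderSketch(std::shared_ptr<CachePool> pool) : pool(pool) {}

  /// synchronously load every cache element needed at this execution order
  void load(unsigned int order) { pool->loadExec(order); }

  /// asynchronous variant; the real class would push this onto the task
  /// executor's work queue instead of running it inline
  void loadAsync(unsigned int order, std::function<void(int)> done) {
    pool->loadExec(order);
    done(0);
  }

private:
  std::shared_ptr<CachePool> pool;
};

int main() {
  CacheLoaderSketch loader(std::make_shared<CachePool>());
  loader.load(1);
  loader.loadAsync(2, [](int status) { std::cout << "done: " << status << '\n'; });
}
```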

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
16 months ago[Profile] Add cache policy column
Jiho Chu [Thu, 15 Dec 2022 08:18:06 +0000 (17:18 +0900)]
[Profile] Add cache policy column

It adds columns for cache policy information.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
16 months ago[SWAP] Add memory swap policy
Jiho Chu [Thu, 15 Dec 2022 07:41:30 +0000 (16:41 +0900)]
[SWAP] Add memory swap policy

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
16 months ago[SWAP] extract CacheElem class to new file
Jiho Chu [Wed, 14 Dec 2022 05:57:36 +0000 (14:57 +0900)]
[SWAP] extract CacheElem class to new file

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
16 months ago[SWAP] task executor: Remove stop and clean feature
Jiho Chu [Thu, 15 Dec 2022 08:42:40 +0000 (17:42 +0900)]
[SWAP] task executor: Remove stop and clean feature

The executor is changed to use a work queue to manage executions.
The stop and cancel behaviors add some overhead, so they are
removed temporarily for performance reasons.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
16 months ago[API] set default value on createModel function
Seungbaek Hong [Mon, 26 Dec 2022 02:45:13 +0000 (11:45 +0900)]
[API] set default value on createModel function

The default value of the "type" parameter of the "createModel"
function is set to NEURAL_NET.

The createModel function receives the model type as a parameter, but
the only value that can currently be set is NEURAL_NET.
(KNN is also not supported by the createModel function in the API.)

Thus, it is inefficient to have to set the parameter value every time
when there is no other option.
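
A minimal usage sketch of the difference (assuming the usual ccapi include
path; error handling omitted):

```cpp
#include <model.h> // ccapi header; include path may differ per install

int main() {
  // before: the type had to be spelled out even though NEURAL_NET is the only option
  auto model_explicit = ml::train::createModel(ml::train::ModelType::NEURAL_NET);

  // after this patch: the default value makes this call equivalent
  auto model = ml::train::createModel();
  return 0;
}
```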

**Self evaluation:**
1. Build test: [x]Passed []Failed []Skipped
2. Run test: [x]Passed []Failed []Skipped

Signed-off-by: Seungbaek Hong <sb92.hong@samsung.com>
16 months ago[Utils] Modify memory usage script
Jiho Chu [Tue, 27 Dec 2022 07:16:17 +0000 (16:16 +0900)]
[Utils] Modify memory usage script

It uses /proc/$PID/smaps information to check current memory usage.

Please refer to the doc:
https://man7.org/linux/man-pages/man5/proc.5.html

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
16 months agoModify trace timing
Jiho Chu [Thu, 5 Jan 2023 05:15:19 +0000 (14:15 +0900)]
Modify trace timing

This patch modifies the trace timing.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
16 months ago[Utils] Add trace feature
Jiho Chu [Fri, 23 Dec 2022 07:01:20 +0000 (16:01 +0900)]
[Utils] Add trace feature

The trace feature makes it easy to trace information throughout the
training steps. It currently covers memory and time tracing, and further
information can be added.
The memory trace is written to memory_trace_$PID.log by default,
while the time trace is written to time_trace_$PID.log.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
16 months ago[Trivial] Modify for code consistency
SeoHyungjun [Wed, 18 Jan 2023 07:00:32 +0000 (16:00 +0900)]
[Trivial] Modify for code consistency

flatten_realizer.cpp and activation_realizer.cpp declare the same
local variable, but the code was ordered differently. Minor
modifications were made for code consistency.

Signed-off-by: SeoHyungjun <hyungjun.seo@samsung.com>
16 months ago[trivial] modify flatten_realizer script for consistent codes
Seungbaek Hong [Mon, 2 Jan 2023 07:06:49 +0000 (16:06 +0900)]
[trivial] modify flatten_realizer script for consistent codes

Modify the "flatten_realizer.cpp" script so that the code of
"flatten_realizer.cpp" and "activation_realizer.cpp" stays consistent.

"flatten_realizer" and "activation_realizer" contain exactly the same
implementation, just written in slightly different ways.

It's a very trivial matter, but it is better to write the same
implementation with the same code, so it was modified.

**Self evaluation:**
1. Build test: [x]Passed []Failed []Skipped
2. Run test: [x]Passed []Failed []Skipped

Signed-off-by: Seungbaek Hong <sb92.hong@samsung.com>
16 months ago[layer] enable identity layer to support inplace
hyeonseok lee [Wed, 21 Dec 2022 08:34:15 +0000 (17:34 +0900)]
[layer] enable identity layer to support inplace

 - Until now the identity layer was not considered for in-place support in the network graph.
 - This commit enables the identity layer to support in-place execution.
 - Added a helper function for the identity layer

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
17 months ago[test] add test cases that a specific layer is non-trainable
Seungbaek Hong [Thu, 22 Dec 2022 10:32:26 +0000 (19:32 +0900)]
[test] add test cases that a specific layer is non-trainable

Add test cases in which a specific layer is non-trainable.

- Tests are based on a pytorch model with two fc hidden layers.
- Add a test when the first hidden layer is set to non-trainable.
- Add a test when the second hidden layer is set to non-trainable.

**Self evaluation:**
1. Build test: [x]Passed []Failed []Skipped
2. Run test: [x]Passed []Failed []Skipped

Signed-off-by: Seungbaek Hong <sb92.hong@samsung.com>
17 months ago[Tensorflow] Update resnet18 example ( TF1 to TF2 )
DonghakPark [Wed, 21 Dec 2022 03:28:24 +0000 (12:28 +0900)]
[Tensorflow] Update resnet18 example ( TF1 to TF2 )

In Tensorflow 2.x, tf.compat / tf.Session usages are deprecated.
- Update the random seed setting
- Tested on tensorflow 2.11

Signed-off-by: DonghakPark <donghak.park@samsung.com>
17 months ago[Application] mnist: Fix tf example
Jiho Chu [Tue, 20 Dec 2022 10:24:25 +0000 (19:24 +0900)]
[Application] mnist: Fix tf example

Modified for tensorflow 2.10.0.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
17 months ago[model] Change the default value of parameter in inference function
Seungbaek Hong [Fri, 16 Dec 2022 07:36:41 +0000 (16:36 +0900)]
[model] Change the default value of parameter in inference function

Change the default value of the "free_mem" parameter to false in the inference function.

Since the "free memory" option can be used only when the model is in the training mode, the desired results may not be obtained if the "free_mem" parameter is set to true when we run inference without training the model.

Therefore, it is better to set the "free_mem" option to true only when it is needed.

**Self evaluation:**
1. Build test: [x]Passed []Failed []Skipped
2. Run test: [x]Passed []Failed []Skipped

Signed-off-by: Seungbaek Hong <sb92.hong@samsung.net>
17 months ago[FlatBuffers] Update Tensorflow Lite FlatBuffer Schema
DonghakPark [Thu, 15 Dec 2022 06:32:59 +0000 (15:32 +0900)]
[FlatBuffers] Update Tensorflow Lite FlatBuffer Schema

Update the Tensorflow Lite FlatBuffer schema from version 3a to version 3b.
- Rename fields in SignatureDef.
- Keeps compatibility with versions 3 and 3a.

This is needed to update the Tensorflow version.

Signed-off-by: DonghakPark <donghak.park@samsung.com>
17 months ago[swap_device] change lseek return type from int to off_t
hyeonseok lee [Tue, 6 Dec 2022 10:30:00 +0000 (19:30 +0900)]
[swap_device] change lseek return type from int to off_t

 - The return value of lseek is the offset of the requested position. A 4-byte signed int
   cannot represent offsets beyond about 2GB, so any larger seek request causes an overflow.
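
An illustrative snippet of the type change (simplified; the real code lives in
swap_device):

```cpp
#include <sys/types.h>
#include <unistd.h>

// Storing the result in off_t keeps large file offsets from overflowing a
// 4-byte int (on 32-bit builds this also needs _FILE_OFFSET_BITS=64).
off_t seek_to(int fd, off_t offset) {
  off_t ret = lseek(fd, offset, SEEK_SET); // resulting offset, or (off_t)-1 on error
  return ret;
}
```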

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
17 months ago[utils] add getLocaltime function
Seungbaek Hong [Mon, 19 Dec 2022 08:15:29 +0000 (17:15 +0900)]
[utils] add getLocaltime function

Because Windows doesn't support the "localtime_r" function, the "getLocaltime"
function was added to use the "localtime_s" function on Windows and the
"localtime_r" function on Linux.

**Self evaluation:**
1. Build test: [x]Passed []Failed []Skipped
2. Run test: [x]Passed []Failed []Skipped
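
A sketch of the wrapper idea (the actual nntrainer implementation may differ
in name and signature):

```cpp
#include <ctime>

// Dispatch to the platform-specific re-entrant localtime variant.
struct tm *getLocaltimeSketch(struct tm *tp) {
  time_t t = time(nullptr);
#if defined(_WIN32)
  localtime_s(tp, &t);        // MSVC variant: (dest, source), returns errno_t
  return tp;
#else
  return localtime_r(&t, tp); // POSIX variant: (source, dest)
#endif
}
```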

Signed-off-by: Seungbaek Hong <sb92.hong@samsung.net>
17 months ago[Trivial] Fix incorrect pointer usage
SeoHyungjun [Tue, 20 Dec 2022 10:53:58 +0000 (19:53 +0900)]
[Trivial] Fix incorrect pointer usage

The functions 'ml_tensors_data_destroy' and 'ml_tensors_info_destroy' take handle parameters ('ml_tensors_data_h' and 'ml_tensors_info_h', respectively).
However, several call sites were passing address values.
Added a dereference operator to match the parameter types.
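
An illustrative example of the pattern being fixed (the function and variable
names are hypothetical):

```cpp
#include <ml-api-common.h> // ML API header providing the handle types (assumed include)

// When the handles are held through output pointers, the destroy calls need
// the handle values themselves, so the pointers must be dereferenced.
void release(ml_tensors_data_h *weight, ml_tensors_info_h *info) {
  ml_tensors_data_destroy(*weight); // not ml_tensors_data_destroy(weight)
  ml_tensors_info_destroy(*info);   // not ml_tensors_info_destroy(info)
}
```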

Signed-off-by: SeoHyungjun <hyungjun.seo@samsung.com>
17 months ago[Trivial] Fix svace issue
SeoHyungjun [Fri, 16 Dec 2022 07:40:55 +0000 (16:40 +0900)]
[Trivial] Fix svace issue

Running the increment operator after erasing an iterator is a logical error.
If erase is required, assign its return value to the iterator. (The return value of erase is the next iterator.)
If not, run the increment operator.
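
The usual safe pattern, as an illustrative sketch:

```cpp
#include <list>

void drop_negatives(std::list<int> &values) {
  for (auto it = values.begin(); it != values.end();) {
    if (*it < 0)
      it = values.erase(it); // erase returns the iterator to the next element
    else
      ++it;
  }
}
```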

Signed-off-by: SeoHyungjun <hyungjun.seo@samsung.com>
17 months ago[Trivial] Fix svace issue
SeoHyungjun [Fri, 16 Dec 2022 07:14:16 +0000 (16:14 +0900)]
[Trivial] Fix svace issue

The for loop ran only once due to an unconditional return.
To resolve this issue, a conditional statement was added so that it returns only when the status is not ML_ERROR_NONE.

Signed-off-by: SeoHyungjun <hyungjun.seo@samsung.com>
17 months ago[DataGen] Fix RandomDataLoader params
DonghakPark [Fri, 9 Dec 2022 03:18:40 +0000 (12:18 +0900)]
[DataGen] Fix RandomDataLoader params

Rename the RandomDataLoader parameter (iteration_for_one_epoch) --> data_size
- RandomDataLoader generates as much data as data_size
- if a developer sets iteration_for_one_epoch to the iteration count, it doesn't work
- for example, if a developer wants BATCH 64 and ITER 1, iteration_for_one_epoch has to be set to 64, which doesn't make sense
- rename it for clarity

Signed-off-by: DonghakPark <donghak.park@samsung.com>
17 months ago[typo] Fix typo
SeoHyungjun [Fri, 9 Dec 2022 03:02:12 +0000 (12:02 +0900)]
[typo] Fix typo

- fix a typo in README.md file
- add reviewers

Signed-off-by: SeoHyungjun <hyungjun.seo@samsung.com>
18 months ago[typo] Fix typo
SeoHyungjun [Fri, 9 Dec 2022 01:44:10 +0000 (10:44 +0900)]
[typo] Fix typo
fix the typo error in network_graph.h

the cbegin function returns a forward const iterator.
so, removed 'reverse' from the retval description.

Signed-off-by: SeoHyungjun <hyungjun.seo@samsung.com>
18 months ago[Application] resnet: Add profiling
Jiho Chu [Thu, 1 Dec 2022 09:52:40 +0000 (18:52 +0900)]
[Application] resnet: Add profiling

This patch adds profiling code for resnet.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
18 months ago[Resnet] Fix FakeDataLoader bug
Jiho Chu [Fri, 2 Dec 2022 07:54:50 +0000 (16:54 +0900)]
[Resnet] Fix FakeDataLoader bug

This fixes a data loader bug related to the iteration number.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
18 months ago[README] Delete duplicate layer description and Fix typo errors.
Seungbaek Hong [Thu, 1 Dec 2022 02:32:15 +0000 (11:32 +0900)]
[README] Delete duplicate layer description and Fix typo errors.

- Delete the pooling2D layer description because it was duplicated.
- Fix some typo errors.

Signed-off-by: Seungbaek Hong <sb92.hong@samsung.net>
18 months ago[TEST] Add load/unload cache pool test
Jiho Chu [Fri, 28 Oct 2022 08:14:12 +0000 (17:14 +0900)]
[TEST] Add load/unload cache pool test

This patch adds tests for the load/unload operations.

Loading/unloading of memory data is tested both by execution order and
for the active elements.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
18 months ago[SWAP] Add load/unload methods to cachepool
Jiho Chu [Fri, 28 Oct 2022 03:11:04 +0000 (12:11 +0900)]
[SWAP] Add load/unload methods to cachepool

This patch adds load and unload methods.

loadExec/unloadExec validate/invalidate cache elements by
execution order.
loadActives/unloadActives swap in/out the whole active
cache data, which will be used for pause/resume behaviors.
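
A rough sketch of the shape of this interface (signatures are assumptions for
illustration; see cache_pool.h for the real declarations):

```cpp
class CachePoolInterfaceSketch {
public:
  virtual ~CachePoolInterfaceSketch() = default;
  /// validate (swap in) cache elements used at this execution order
  virtual void loadExec(unsigned int order) = 0;
  /// invalidate (swap out) cache elements used at this execution order
  virtual void unloadExec(unsigned int order) = 0;
  /// swap in every currently active cache element (e.g. on resume)
  virtual void loadActives() = 0;
  /// swap out every currently active cache element (e.g. on pause)
  virtual void unloadActives() = 0;
};
```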

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
18 months ago[API] add ml_train_model_get_weight api
jijoong.moon [Tue, 29 Nov 2022 02:45:02 +0000 (11:45 +0900)]
[API] add ml_train_model_get_weight api

This patch adds the
``` c
ml_train_model_get_weight(ml_train_model_h model, const
char *layer_name, ml_tensors_data_h *weight, ml_tensors_info_h *info)
```

Now, developers are able to get the weight data & weight information.
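
A minimal usage sketch (the layer name "fc1" is hypothetical; error handling
abbreviated):

```cpp
#include <cstdio>
#include <nntrainer.h> // C API header

int show_weight_count(ml_train_model_h model) {
  ml_tensors_data_h weight = NULL;
  ml_tensors_info_h info = NULL;

  int status = ml_train_model_get_weight(model, "fc1", &weight, &info);
  if (status != ML_ERROR_NONE)
    return status;

  unsigned int count = 0;
  ml_tensors_info_get_count(info, &count); // number of weight tensors
  std::printf("fc1 has %u weight tensors\n", count);

  ml_tensors_data_destroy(weight);
  ml_tensors_info_destroy(info);
  return ML_ERROR_NONE;
}
```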

ml_train_layer_get_weight will be added later.

Resolves: #2045

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
18 months ago[utils] Add getRealpath function
Seungbaek Hong [Wed, 23 Nov 2022 10:44:10 +0000 (19:44 +0900)]
[utils] Add getRealpath function

Since the "realpath" function is not available on the Windows operating system, the "getRealpath" function has been added and used instead so that the code can also run on Windows.

- Add "getRealpath" function to utils.
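
A sketch of the idea (the real implementation may differ in signature and
buffer handling):

```cpp
#include <cstdlib>

// Use _fullpath on Windows, where realpath() is unavailable, and realpath()
// elsewhere. The caller provides a buffer of at least _MAX_PATH / PATH_MAX bytes.
char *getRealpathSketch(const char *name, char *resolved) {
#ifdef _WIN32
  return _fullpath(resolved, name, _MAX_PATH);
#else
  return realpath(name, resolved);
#endif
}
```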

**Self evaluation:**
1. Build test: [x]Passed []Failed []Skipped
2. Run test: [x]Passed []Failed []Skipped

Signed-off-by: Seungbaek Hong <sb92.hong@samsung.net>
18 months ago[TEST] Add task unittest
Jiho Chu [Mon, 7 Nov 2022 12:11:27 +0000 (21:11 +0900)]
[TEST] Add task unittest

This patch implements unit tests for task behaviors.

A mock class for TaskExecutor is introduced to check that the work thread
runs correctly and that the completion check works as expected.
Many tests are added to check the correctness of asynchronous tasks.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
18 months ago[Tensor] Add TaskExecutor class
Jiho Chu [Mon, 7 Nov 2022 02:09:32 +0000 (11:09 +0900)]
[Tensor] Add TaskExecutor class

This patch adds the TaskExecutor class.

This class manages task execution by task type.
Synchronous tasks are executed immediately, while asynchronous tasks
are executed on a worker thread.
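
An illustrative sketch of the idea (not the actual nntrainer class; the real
executor keeps its own work queue and task types):

```cpp
#include <functional>
#include <future>
#include <iostream>

class TaskExecutorSketch {
public:
  // synchronous work runs inline on the caller's thread
  void runSync(const std::function<void()> &work) { work(); }

  // asynchronous work runs on a separate thread
  std::future<void> runAsync(const std::function<void()> &work) {
    return std::async(std::launch::async, work);
  }
};

int main() {
  TaskExecutorSketch exec;
  exec.runSync([] { std::cout << "sync task\n"; });
  auto done = exec.runAsync([] { std::cout << "async task\n"; });
  done.wait();
}
```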

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
18 months ago[Tensor] Add Task class
Jiho Chu [Mon, 7 Nov 2022 02:08:40 +0000 (11:08 +0900)]
[Tensor] Add Task class

This patch adds the Task class.

A Task has several properties for its state and work information.
There are two types of Task, synchronous and asynchronous, and their
behaviors differ when they are executed by the TaskExecutor.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
18 months ago[ln] optimize layer normalization layer input memory
hyeonseok lee [Tue, 29 Nov 2022 02:51:15 +0000 (11:51 +0900)]
[ln] optimize layer normalization layer input memory

 - Like the bn layer, the ln layer does not need its input during backwarding.
 - This commit sets the lifespan of the ln layer input to FORWARD_FUNC_LIFESPAN.

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
18 months ago[Logger] Modify initialization error message of logger instance
hyunil park [Fri, 18 Nov 2022 00:47:26 +0000 (09:47 +0900)]
[Logger] Modify initialization error message of logger instance

- A runtime error occurs when an app is executed in a place other
  than the home directory, except on Android and Tizen
- Currently, Android and Tizen don't create a logger
- If a user runs an app using nntrainer in a place where
  the logger cannot be initialized, the path is shown for help
- On Ubuntu, for example, the logger cannot be initialized in '/usr/bin/',
  '/usr/lib/', etc.

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: hyunil park <hyunil46.park@samsung.com>
18 months agoRevert "[ CI ] Temperal Fix for the CI about NNStreamer Backone"
gichan [Wed, 23 Nov 2022 03:10:39 +0000 (12:10 +0900)]
Revert "[ CI ] Temperal Fix for the CI about NNStreamer Backone"

This reverts commit 423097f793a7af0269255cba192fd4acf18fce62.

Signed-off-by: gichan <gichan2.jang@samsung.com>
18 months agoRevert "[ CI ] Temporal Fix for the CI about NNStreamer Backone"
gichan [Wed, 23 Nov 2022 03:10:23 +0000 (12:10 +0900)]
Revert "[ CI ] Temporal Fix for the CI about NNStreamer Backone"

This reverts commit e41011408c1dc51737e3884644c704fccee871a3.

Signed-off-by: gichan <gichan2.jang@samsung.com>
18 months ago[Trivial] Fix coverity issue
SeoHyungjun [Wed, 23 Nov 2022 08:29:13 +0000 (17:29 +0900)]
[Trivial] Fix coverity issue
Added a conditional statement so the code runs only if the result of find_if() is not the end iterator,
to solve the `INVALIDATE_ITERATOR` issue.

**Self evaluation:**
1. Build test: [X]Passed []Failed []Skipped
2. Run test: [X]Passed []Failed []Skipped

Signed-off-by: SeoHyungjun <hyungjun.seo@samsung.com>
18 months ago[Trivial] Fix svace issues
SeoHyungjun [Wed, 23 Nov 2022 07:21:47 +0000 (16:21 +0900)]
[Trivial] Fix svace issues
- Initialize 'is_virtual' in the default constructor of 'TfOpNode' in tflite_opnode.cpp
- Initialize 'fbb' in the default constructor of 'Exporter' in node_exporter.cpp

**Self evaluation:**
1. Build test: [X]Passed []Failed []Skipped
2. Run test: [X]Passed []Failed []Skipped

Signed-off-by: SeoHyungjun <hyungjun.seo@samsung.com>
18 months ago[tensor] Add default delete on tensor map
hyeonseok lee [Wed, 23 Nov 2022 03:12:34 +0000 (12:12 +0900)]
[tensor] Add default delete on tensor map

 - Omitting the default deleter of shared_ptr<MemoryData> causes a memory leak.

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
18 months ago[split layer] make a unittest to test split input dimension by split number
hyeonseok lee [Wed, 26 Oct 2022 06:32:59 +0000 (15:32 +0900)]
[split layer] make a unittest to test split input dimension by split number

 - Make a unittest to test splitting the input dimension by the split number
 - A Conv2d layer is added so that the model has a weight

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
18 months ago[split_layer] enhance split layer to split input dimension by given number
hyeonseok lee [Wed, 26 Oct 2022 06:08:48 +0000 (15:08 +0900)]
[split_layer] enhance split layer to split input dimension by given number

 - Until now the split layer split the input dimension by the input dimension size, which makes the output dimension 1.
   Now a split number can be set, which makes the output dimension (input dimension / split number).
 - In this commit the given split number is expected to divide the input dimension evenly.
   Otherwise an error is raised (see the example below).
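
For example (illustrative numbers): if the split axis has dimension 12 and
the split number is 3, each of the three outputs has dimension 4 along that
axis; a split number of 5 raises an error because 12 is not divisible by 5.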

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
18 months ago[ CI ] Temporal Fix for the CI about NNStreamer Backone
jijoong.moon [Thu, 10 Nov 2022 05:41:28 +0000 (14:41 +0900)]
[ CI ] Temporal Fix for the CI about NNStreamer Backone

Temporarily, we have to turn off the nnstreamer backbone until some issue is fixed.

Self evaluation:

Build test: [X]Passed [ ]Failed [ ]Skipped
Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
18 months ago[ CI ] Temperal Fix for the CI about NNStreamer Backone
jijoong.moon [Thu, 10 Nov 2022 02:00:40 +0000 (11:00 +0900)]
[ CI ] Temperal Fix for the CI about NNStreamer Backone

Temporarily, we have to turn off the nnstreamer backbone until some
issue is fixed.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
19 months ago[TEST] Add Cache pool test
Jiho Chu [Thu, 29 Sep 2022 02:48:27 +0000 (11:48 +0900)]
[TEST] Add Cache pool test

The legacy MemoryPool test is modified to cover both MemoryPool and CachePool.
A parameterized test is used for this. CachePool-specific tests are also
created.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
19 months ago[TENSOR] modified for CachePool test
Jiho Chu [Thu, 29 Sep 2022 02:39:16 +0000 (11:39 +0900)]
[TENSOR] modified for CachePool test

- Several methods are changed to virtual so they can be mocked in tests.
- Throw the same exceptions as MemoryPool

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
19 months ago[compiler] Revisit tflite_interpreter for TF Lite Export
DonghakPark [Fri, 21 Oct 2022 05:42:43 +0000 (14:42 +0900)]
[compiler] Revisit tflite_interpreter for TF Lite Export

This commit is for TF Lite export.

Revisit the FC layer reorder
- in the case of (FC - Conv - FC)

Disable in-place execution in the test case
- should be fixed

Signed-off-by: DonghakPark <donghak.park@samsung.com>
19 months ago[compiler] Revisit FullyConnected Layer Weights Transpose and Reorder for TF Lite...
DonghakPark [Mon, 26 Sep 2022 07:09:07 +0000 (16:09 +0900)]
[compiler] Revisit FullyConnected Layer Weights Transpose and Reorder for TF Lite Export

This commit is for TFlite export (REVISIT)
Transpose FullyConnected Layer's Weights (NCHW --> NHWC)

update at : 2022/09/26

Related : nnstreamer#1912
Signed-off-by: DonghakPark <donghak.park@samsung.com>
19 months ago[compiler] FullyConnected Layer Weights Transpose for TFLite Export
DonghakPark [Fri, 2 Sep 2022 08:53:22 +0000 (17:53 +0900)]
[compiler] FullyConnected Layer Weights Transpose for TFLite Export

This commit is for TFlite export.
Transpose FullyConnected Layer's weights (NCHW --> NHWC)

update at : 2022/09/13

Related : #1912

Signed-off-by: DonghakPark <donghak.park@samsung.com>
19 months ago[compiler] Revisit tflite interpreter
seongwoo [Thu, 19 May 2022 06:23:57 +0000 (15:23 +0900)]
[compiler] Revisit tflite interpreter

This patch revisits tflite interpreter.

1. Refactor the way of exporting tflite binary.
2. Support operators included in Resnet network.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: seongwoo <mhs4670go@naver.com>
19 months ago[Profile] Add annotate for memory profiling
Jiho Chu [Thu, 25 Aug 2022 07:02:36 +0000 (16:02 +0900)]
[Profile] Add annotate for memory profiling

It adds several annotations for memory profiling.
Each of them helps to identify the profiled steps.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
19 months ago[INI] Add memory swap properties
Jiho Chu [Wed, 26 Oct 2022 11:11:08 +0000 (20:11 +0900)]
[INI] Add memory swap properties

This patch adds the memory swap properties.

memory_swap: enable memory swap feature
memory_swap_path: path to save swap file

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
19 months ago[Context] Propagate property to model
Jiho Chu [Fri, 21 Oct 2022 04:56:40 +0000 (13:56 +0900)]
[Context] Propagate property to model

This patch is for app property propagation.
App properties are described in the global configuration file (ini),
and the properties are placed under sections, which only have a meaning
for human readability. That is, a section does not play any real role in
behavior. Propagated properties can be used anywhere in the network.
This patch includes 'memory_swap' and 'memory_swap_path', which are
used as Model flex properties.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
19 months ago[typo] Fix typo
SeoHyungjun [Mon, 31 Oct 2022 04:35:59 +0000 (13:35 +0900)]
[typo] Fix typo

fix the typo error in README.md file

Signed-off-by: SeoHyungjun <hyungjun.seo@samsung.com>
19 months ago[typo] Fix typo error
DonghakPark [Thu, 27 Oct 2022 11:58:41 +0000 (20:58 +0900)]
[typo] Fix typo error

Fix the typo error through spell checker
- Application/
- api/ccapi/
- nntrainer/

Signed-off-by: DonghakPark <donghak.park@samsung.com>
19 months agoFix typo error
Seungbaek Hong [Mon, 31 Oct 2022 07:12:35 +0000 (16:12 +0900)]
Fix typo error

Fix the typo error in getting-started.md file.

Signed-off-by: Seungbaek Hong <sb92.hong@samsung.net>
19 months ago[fix] fix coverity_tizen issue
DonghakPark [Thu, 20 Oct 2022 10:53:07 +0000 (19:53 +0900)]
[fix] fix coverity_tizen issue
- Add a try/catch statement to catch exceptions
- Add NNTR_THROW_IF to check the return value

Signed-off-by: DonghakPark <donghak.park@samsung.com>
19 months ago[unittest] refine unittest
hyeonseok lee [Tue, 27 Sep 2022 03:02:55 +0000 (12:02 +0900)]
[unittest] refine unittest

 - Use epsilon to compare float variable
 - Use unused enum LayerCreateSetPropertyOptions
 - fix typos

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
19 months ago[unittest] remove unittest_nntrainer_layers
hyeonseok lee [Mon, 26 Sep 2022 10:47:47 +0000 (19:47 +0900)]
[unittest] remove unittest_nntrainer_layers

 - Remove unittest_nntrainer_layers.cpp because this test has been disabled.
 - Rename unittest_layers_v2 to unittest

Close #2002

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
19 months ago[unittest] migrate unittest from unittest_nntrainer_layers to unittest_layers
hyeonseok lee [Tue, 27 Sep 2022 03:00:20 +0000 (12:00 +0900)]
[unittest] migrate unittest from unittest_nntrainer_layers to unittest_layers

 - Migrate the unittests for negative properties from unittest_nntrainer_layers to
   unittest_layers
 - Added LayerPropertySemantics, which inherits from LayerSemantics,
   to test negative properties only
 - Added LayerPropertySemantics to the Simpleshot and pow layer unittests

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
19 months ago[typo] Fix typo error
DonghakPark [Fri, 21 Oct 2022 00:23:06 +0000 (09:23 +0900)]
[typo] Fix typo error

Fix the typo errors in the directories below
- Applications/
- nntrainer/optimizers/

Signed-off-by: DonghakPark <donghak.park@samsung.com>
19 months ago[Tensor] Add constructor for user swap path
Jiho Chu [Mon, 17 Oct 2022 02:18:17 +0000 (11:18 +0900)]
[Tensor] Add constructor for user swap path

The swap file path can be changed via a model property.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
19 months ago[Model] Add memory swap path property
Jiho Chu [Fri, 14 Oct 2022 13:19:37 +0000 (22:19 +0900)]
[Model] Add memory swap path property

The MemorySwapPath property is added to define the path for swap files.
If it is not defined, the files are created in the current directory.
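
A hedged usage sketch via the ccapi (the property keys are the ones from this
patch series; the path value and the assumption that they are set through
setProperty are illustrative):

```cpp
#include <model.h> // ccapi header; include path may differ per install

int main() {
  auto model = ml::train::createModel(ml::train::ModelType::NEURAL_NET);
  // keys from this patch series; the values shown here are only examples
  model->setProperty({"memory_swap=true", "memory_swap_path=/tmp/nntrainer_swap"});
  return 0;
}
```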

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
19 months ago[README] Fix typo error in README
seungbaek [Tue, 18 Oct 2022 02:46:07 +0000 (11:46 +0900)]
[README] Fix typo error in README

Fixed the typo error in the title of the paper below.
- NNTrainer: Light-Weight On-Device Training Framework, arXiv, 2022

Signed-off-by: seungbaek <sb92.hong@samsung.com>
19 months ago[memory] extract MemoryData class from memory_pool to another file
hyeonseok lee [Fri, 14 Oct 2022 12:07:52 +0000 (21:07 +0900)]
[memory] extract MemoryData class from memory_pool to another file

 - Extract MemoryData class from memory_pool.h and make memory_data.h file
 - Add doxygen

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
19 months ago[neuralnet] check stop_cb by every layer
hyeonseok lee [Fri, 14 Oct 2022 09:28:20 +0000 (18:28 +0900)]
[neuralnet] check stop_cb by every layer

 - Check stop_cb before doing forwarding/calcGradient/calcDerivative for every layer
   to stop training more quickly.

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
19 months ago[TEST] fix for cache pool modification
Jiho Chu [Thu, 25 Aug 2022 08:02:01 +0000 (17:02 +0900)]
[TEST] fix for cache pool modification

It is modified for the cache pool implementation.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
19 months ago[Property] Add memory_swap property
Jiho Chu [Tue, 19 Jul 2022 11:33:49 +0000 (20:33 +0900)]
[Property] Add memory_swap property

'memory_swap' property is added, and it can be used to select
which pool to use, memory pool or cache pool.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
19 months ago[Memory] Implement memory pool for swap device
Jiho Chu [Tue, 19 Jul 2022 11:30:54 +0000 (20:30 +0900)]
[Memory] Implement memory pool for swap device

The memory pool is modified for the swap device.
A new memory data class is introduced, which stores allocation information.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
19 months ago[Tensor] Add swap device
Jiho Chu [Fri, 1 Jul 2022 13:22:43 +0000 (22:22 +0900)]
[Tensor] Add swap device

The swap device class is introduced.
It operates with the cache memory and keeps data persistently on the device.
The storage size is fixed when the device opens, and the sparse data area
is filled with garbage data. When the device closes, the storage
file is removed.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
19 months ago[Memory] Add cache pool
Jiho Chu [Fri, 1 Jul 2022 12:27:09 +0000 (21:27 +0900)]
[Memory] Add cache pool

Initial implementation of the cache pool.
It inherits from the memory pool to utilize the optimized memory information.

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
20 months agodebian: fix occasional debuild error
MyungJoo Ham [Tue, 4 Oct 2022 12:36:32 +0000 (21:36 +0900)]
debian: fix occasional debuild error

Sometimes debuild has errors with dh_clean:

```
dh_clean: error: find .  \( \( \
\( -path .\*/.git -o -path .\*/.svn -o -path .\*/.bzr -o -path .\*/.hg -o -path .\*/CVS -o -path .\*/.pc -o -path .\*/_darcs \) -prune -o -type f -a \
        \( -name '#*#' -o -name '.*~' -o -name '*~' -o -name DEADJOE \
 -o -name '*.orig' -o -name '*.rej' -o -name '*.bak' \
 -o -name '.*.orig' -o -name .*.rej -o -name '.SUMS' \
 -o -name TAGS -o \( -path '*/.deps/*' -a -name '*.P' \) \
\) -exec rm -f {} + \) -o \
\( -type d -a -name autom4te.cache -prune -exec rm -rf {} + \) \) returned exit code 1
make: *** [debian/rules:28: clean] Error 25
```

Override dh_clean to avoid this.

Signed-off-by: MyungJoo Ham <myungjoo.ham@samsung.com>
20 months ago[trivial] fix ahub issues
hyeonseok lee [Tue, 4 Oct 2022 05:54:42 +0000 (14:54 +0900)]
[trivial] fix ahub issues

 - Remove noexcept from the copy function because it throws when the data is non-contiguous
 - Initialize the char array
 - Delete the array, not a single element ("[]" was missing)

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
20 months ago[padding1d] change padding1d key to padding
hyeonseok lee [Fri, 30 Sep 2022 02:33:20 +0000 (11:33 +0900)]
[padding1d] change padding1d key to padding

 - Change the Padding1D key to padding to use the same key as Padding2D.

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
20 months ago[neuralnet] add log about training
hyeonseok lee [Thu, 29 Sep 2022 09:19:26 +0000 (18:19 +0900)]
[neuralnet] add log about training

 - Added log when start/finish training
 - Added log when get current epoch is called

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
20 months ago[Profile] Add memory statistics & annotation
Jiho Chu [Thu, 25 Aug 2022 04:36:08 +0000 (13:36 +0900)]
[Profile] Add memory statistics & annotation

It adds two functions:
- provide the PROFILE_MEM_ANNOTATE macro
- print the average and maximum memory usage.
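
Illustrative usage only (the macro name comes from this patch; the header
name, argument form, and build guard are assumptions):

```cpp
#include <profiler.h> // nntrainer profiler header (assumed name)

void train_step() {
  // tag this point in the memory profile; presumably compiled out
  // unless profiling is enabled
  PROFILE_MEM_ANNOTATE("forwarding start");
  // ... forwarding / backwarding ...
}
```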

Signed-off-by: Jiho Chu <jiho.chu@samsung.com>
20 months ago[lstm] make lstm_core as class
hyeonseok lee [Thu, 8 Sep 2022 08:53:59 +0000 (17:53 +0900)]
[lstm] make lstm_core as class

 - Make lstm_core a class so that lstm/lstm_cell/zoneout_lstmcell can inherit from it.

close #1997

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
20 months ago[Release] NNTrainer 0.4.0 Release
jijoong.moon [Mon, 26 Sep 2022 05:29:12 +0000 (14:29 +0900)]
[Release] NNTrainer 0.4.0 Release

NNTrainer v0.4.0 is released.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
20 months ago[bugfix] initialize local variable accepted/tizen/unified/20220927.132348
hyeonseok lee [Mon, 26 Sep 2022 04:57:21 +0000 (13:57 +0900)]
[bugfix] initialize local variable

 - Initialize local variable

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
20 months ago[unittest] added negative testcase
hyeonseok lee [Wed, 21 Sep 2022 03:14:28 +0000 (12:14 +0900)]
[unittest] added negative testcase

 - Added negative testcase

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
20 months ago[unittest] rename negative testcase name to _n place at the end
hyeonseok lee [Wed, 21 Sep 2022 02:40:14 +0000 (11:40 +0900)]
[unittest] rename negative testcase name to _n place at the end

 - Since the _p/_n indicator was in the middle of the testcase name, all test cases
   were regarded as positive testcases (the default). So place the indicator at the end.
 - Added '_' right before 'n' to indicate a negative testcase.

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
20 months ago[ README ] add new features in README
jijoong.moon [Fri, 16 Sep 2022 06:22:21 +0000 (15:22 +0900)]
[ README ] add new features in README

Add new features:
  . positional encoding layer
  . Multi-head attention layer
  . layer normalization layer
  . kld loss
  . learning rate schedule

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
21 months ago[layer] convert throw to nntr_throw in layer finalize accepted/tizen/unified/20220919.021604
hyeonseok lee [Mon, 5 Sep 2022 05:32:57 +0000 (14:32 +0900)]
[layer] convert throw to nntr_throw in layer finalize

 - Instead of directly using throw, use NNTR_THROW_IF in layer finalize

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
21 months ago[unittest] generate transformer unittest
hyeonseok lee [Mon, 22 Aug 2022 08:00:41 +0000 (17:00 +0900)]
[unittest] generate transformer unittest

 - Generate transformer encoder layer unittest
 - Generate transformer decoder layer unittest
 - Generate transformer unittest

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
21 months ago[unittest] layer normalization
hyeonseok lee [Tue, 2 Aug 2022 12:03:30 +0000 (21:03 +0900)]
[unittest] layer normalization

 - generate layer normalization layer unittest

Self evaluation:

Build test: [X]Passed [ ]Failed [ ]Skipped
Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
21 months ago[layer normalization] implement layer normalization
hyeonseok lee [Wed, 27 Jul 2022 10:02:51 +0000 (19:02 +0900)]
[layer normalization] implement layer normalization

 - implement layer normalization layer based on batch normalization
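
For reference, the standard per-sample layer-normalization formula this layer
computes (gamma/beta are the learnable scale and offset, epsilon is the usual
numerical-stability term):

```latex
\mu = \frac{1}{H}\sum_{i=1}^{H} x_i,\qquad
\sigma^2 = \frac{1}{H}\sum_{i=1}^{H} (x_i - \mu)^2,\qquad
y_i = \gamma \frac{x_i - \mu}{\sqrt{\sigma^2 + \epsilon}} + \beta
```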

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
21 months ago[unittest] generate positional encoding unittest
hyeonseok lee [Thu, 25 Aug 2022 14:07:27 +0000 (23:07 +0900)]
[unittest] generate positional encoding unittest

 - Generate positional encoding layer/model unittest.

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
21 months ago[positional encoding] implement positional encoding layer
hyeonseok lee [Thu, 25 Aug 2022 14:03:54 +0000 (23:03 +0900)]
[positional encoding] implement positional encoding layer

 - Positional encoding only needs to be calculated once,
   so its lifespan is set to the max lifespan

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
21 months ago[multi head attention] added unittest
hyeonseok lee [Fri, 15 Jul 2022 05:51:02 +0000 (14:51 +0900)]
[multi head attention] added unittest

 - Added layer/model unittest for multi head attention
 - Change the == operator overloading to pass if both tensors have nan values

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
21 months ago[multi_head_attention] implement calcDerivative, calcGradient
hyeonseok lee [Fri, 15 Jul 2022 05:46:20 +0000 (14:46 +0900)]
[multi_head_attention] implement calcDerivative, calcGradient

 - Implement multi head attention calcDerivative, calcGradient
 - Needs to support bool type attention mask

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
21 months ago[multi head attention] implement calcCommonDerivative
hyeonseok lee [Wed, 13 Jul 2022 02:25:08 +0000 (11:25 +0900)]
[multi head attention] implement calcCommonDerivative

 - implement multi head attention calcCommonDerivative

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
21 months ago[multi head attention] implement forwarding
hyeonseok lee [Wed, 13 Jul 2022 02:22:43 +0000 (11:22 +0900)]
[multi head attention] implement forwarding

 - implement multi head attention forwarding

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
21 months ago[multi head attention] implement finalize
hyeonseok lee [Tue, 19 Jul 2022 03:22:05 +0000 (12:22 +0900)]
[multi head attention] implement finalize

 - Implement multi head attention finalize
 - Remove finalizeCommon
 - Remove ProvideAttentionMask proeperty, inout_idx member variable
   cause NNTrainer assumes that input of multi head attention will be at least 3

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
21 months ago[api] added newly implemented layer enum
hyeonseok lee [Thu, 25 Aug 2022 14:41:52 +0000 (23:41 +0900)]
[api] added newly implemented layer enum

 - Added newly implemented layer enum to nntrainer-api-common.h

Signed-off-by: hyeonseok lee <hs89.lee@samsung.com>
21 months ago[Trivial] Fix Typo
DonghakPark [Mon, 29 Aug 2022 06:22:25 +0000 (15:22 +0900)]
[Trivial] Fix Typo

Fix Typo in nntrainer/compiler/*

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: DonghakPark <donghak.park@samsung.com>
21 months ago[ LSTM ] Optimize LSTM Gradient calculation
jijoong.moon [Thu, 4 Aug 2022 04:22:57 +0000 (13:22 +0900)]
[ LSTM ] Optimize LSTM Gradient calculation

Gradient computation of LSTM takes over 60% of the computation. This patch
includes an optimization using a tensor dimension which only has
width.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
21 months ago[Application] Add profile for VGG
jijoong.moon [Wed, 3 Aug 2022 06:46:51 +0000 (15:46 +0900)]
[Application] Add profile for VGG

This patch adds profiling to the VGG Application.

**Self evaluation:**
1. Build test:  [X]Passed [ ]Failed [ ]Skipped
2. Run test:  [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>