Jihoon Lee [Tue, 28 Apr 2020 02:06:39 +0000 (11:06 +0900)]
Split unittest for util func to a different source. (#61)
Split util_func unittest to `unittest_util_func.cpp`.
Build test: [X]Passed [ ]Failed [ ]Skipped
Run test: [ ]Passed [X]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
jijoong.moon [Tue, 28 Apr 2020 01:35:32 +0000 (10:35 +0900)]
Add Reviewer
Add Reviewer in README.md
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
Jihoon Lee [Mon, 27 Apr 2020 07:47:11 +0000 (16:47 +0900)]
[README] Add some explanation
* Update prerequisite
* Propose GIAG Build Environment
* Make example section
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
jijoong.moon [Mon, 27 Apr 2020 05:56:05 +0000 (14:56 +0900)]
Split files for DataBufferFromFile & DataBufferFromCallback
data_buffer.h & .cpp are too big to manage. Its inherited classes are
saved in separate files.
- data_buffer_func.h & data_buffer_func.cpp
- data_buffer_file.h & data_buffer_file.cpp
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
Jihoon Lee [Mon, 27 Apr 2020 08:24:30 +0000 (17:24 +0900)]
Move images to docs/images
This PR moves images to docs/images to clean up docs folder for later
use.
Self evaluation:
Build test: [ ]Passed [ ]Failed [X]Skipped
Run test: [ ]Passed [ ]Failed [X]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
jijoong.moon [Thu, 23 Apr 2020 06:11:58 +0000 (15:11 +0900)]
Introduce DataBufferFromCallback Class
With this class, it is possible to train with a user-specific data
generation callback function. This is the default way to get training
data unless there is a [DataSet] key in the configuration file.
- in Application/Classification/jni/main_func.cpp
NN.train(getMiniBatch_train, getMiniBatch_val, getMiniBatch_train)
Then, the data buffer thread calls these functions to get the newest
data, one mini batch at a time.
The format of this function should be:
/*
 * @brief Callback function to get user-specific data
 * @param[in] X data, 3D float vector type
 * @param[in] Y label, 3D float vector type
 * @param[out] status status for error handling
 * @retval true / false whether all data for this epoch has been generated
 */
bool func(std::vector<std::vector<std::vector<float>>> &X,
          std::vector<std::vector<std::vector<float>>> &Y, int &status)
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Fri, 24 Apr 2020 01:39:34 +0000 (10:39 +0900)]
Add Tensor unit tests
Add Tensor unit tests
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Thu, 23 Apr 2020 11:50:02 +0000 (20:50 +0900)]
Add Unit Test Cases for math utilities function
Add unit test cases for math utilities functions
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoongmoon [Wed, 22 Apr 2020 15:25:06 +0000 (00:25 +0900)]
Modify softmax calculation to handle large values.
Currently it is not numerically stable; this PR makes it stable.
Signed-off-by: jijoongmoon <jijoong.moon@samsung.com>
jijoongmoon [Tue, 21 Apr 2020 16:26:58 +0000 (01:26 +0900)]
Throw error exception from thread
Throw error exceptions from the data buffer thread to main.
With this PR, the program stops normally with proper error handling.
Signed-off-by: jijoongmoon <jijoong.moon@samsung.com>
jijoong.moon [Tue, 21 Apr 2020 11:16:23 +0000 (20:16 +0900)]
Introduce DataBufferFromDataFile Class
In order to handle various input generation cases, DataBuffer is made
a base class and several derived classes are introduced:
. DataBufferFromDataFile
. DataBufferFromCallback : NYI
. DataBufferFromFramework : NYI
. others : NYI
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Mon, 20 Apr 2020 08:40:21 +0000 (17:40 +0900)]
Add train function in NeuralNetwork Class.
Add train member function of NeuralNetwork Class.
- NeuralNetwork Class has a DataBuffer instance to handle data.
- New keywords are introduced to specify the data set:
  . TrainData, ValidData, TestData, LabelData
- The DataBuffer instance is initialized with parameters from the Neural
  Network instance.
- In the train function:
  . the Data Buffer instance runs (collecting or generating data),
  . data is fetched from the Data Buffer instance,
  . backwarding is performed,
  . progress is displayed (a Data Buffer member function),
  . validation runs if it is enabled.
- Add Classification Example with train member function
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoongmoon [Sun, 19 Apr 2020 14:47:31 +0000 (23:47 +0900)]
Initialize data buffer when neural network initializes.
Initialize the data buffer and set parameters from the neural network.
Signed-off-by: jijoongmoon <jijoong.moon@samsung.com>
Wook Song [Mon, 20 Apr 2020 10:51:45 +0000 (19:51 +0900)]
[README] Update Getting Started
This patch updates Getting Started section to reflect the recent
changes.
Signed-off-by: Wook Song <wook16.song@samsung.com>
jijoong.moon [Sun, 19 Apr 2020 23:48:26 +0000 (08:48 +0900)]
Add more badges
badges
- repo size
- issues
- pull requests
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Fri, 17 Apr 2020 06:47:42 +0000 (15:47 +0900)]
Add Unit Test Cases for Neural Network Initialization
Add Neural Network initialization unit tests, including layer
initialization.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Fri, 17 Apr 2020 07:40:00 +0000 (16:40 +0900)]
lower case for layer type.
Instead of camel case, snake case is used.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Thu, 16 Apr 2020 23:14:59 +0000 (08:14 +0900)]
Update clang-format
Update clang-format for better formatting
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Thu, 16 Apr 2020 11:48:44 +0000 (20:48 +0900)]
Add Unit Test for Neural Network
- Neural Network Initialization
- set Data Buffer
- set Optimizer
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoongmoon [Wed, 15 Apr 2020 14:56:23 +0000 (23:56 +0900)]
Add Coverage Test badge for nntrainer
Add Coverage Test badge for nntrainer in README.md
Signed-off-by: jijoongmoon <jijoong.moon@samsung.com>
Wook Song [Tue, 14 Apr 2020 15:07:44 +0000 (00:07 +0900)]
README: Correct typos and grammar
This is a trivial patch that corrects typos and grammar.
Signed-off-by: Wook Song <wook16.song@samsung.com>
jijoong.moon [Mon, 13 Apr 2020 09:46:12 +0000 (18:46 +0900)]
Add Unit Test Coverage
Add Unittest coverage
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Fri, 10 Apr 2020 09:23:54 +0000 (18:23 +0900)]
Naming Rules & change namespace
- Naming Rules
  . namespace : lower case (nntrainer)
  . type names (struct, class, etc.) : PascalCase (TypeNames)
  . variable names : snake_case (variable_names)
  . enum : upper case with underscore
- using namespace nntrainer
  . remove other namespaces
- add #ifdef __cplusplus
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Fri, 10 Apr 2020 02:11:46 +0000 (11:11 +0900)]
Add Unit Tests for the NNTrainer internal
Add unit tests to evaluate NNTrainer internals
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Thu, 9 Apr 2020 04:36:56 +0000 (13:36 +0900)]
Remove Duplicated Codes @UpdateData
Remove Duplicated Codes @UpdateData
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Thu, 9 Apr 2020 00:28:39 +0000 (09:28 +0900)]
Remove Duplicated Code @WeightInitialization
Remove Duplicated Code @WeightInitialization Func.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 8 Apr 2020 07:04:18 +0000 (16:04 +0900)]
Add Optimizer Class & separate math functions
- Add Optimizer Class and remove duplicated code.
  . Hyperparameters related to the optimizer are placed in the Optimizer
    Class.
  . OptParam, WeightDecayParam, OptType, WeightDecayType
- Separate math functions into util_func.h/cpp
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Tue, 7 Apr 2020 23:21:04 +0000 (08:21 +0900)]
Remove global variables to parse the configuration file
There is no reason for them to be global because only one function uses
these variables.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Tue, 7 Apr 2020 11:01:19 +0000 (20:01 +0900)]
Fix test bug
- Add test configuration file in test directory
- modify code to read the path correctly
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Tue, 7 Apr 2020 01:45:27 +0000 (10:45 +0900)]
Enable Logging
Modify code and build for logging
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Tue, 7 Apr 2020 01:36:08 +0000 (10:36 +0900)]
[API] Add Logger for NNTrainer
In order to log, implement a Logger class to use when there is no
system logging facility. If there is one, we are going to use it: for
example, Tizen dlog on Tizen and the Android log system on Android,
behind the same interface, nn_log[i,w,e,d].
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Mon, 6 Apr 2020 06:34:35 +0000 (15:34 +0900)]
[API] Add Neural Network Model Construction with Configuration File
- Add Construction of Neural Network Model with Configuration File.
. ml_nnmodel_construct_with_conf()
. Add one positive and one negative tests
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Mon, 6 Apr 2020 09:24:58 +0000 (18:24 +0900)]
Add testing during packaging
Run tests during the build.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Fri, 3 Apr 2020 05:16:11 +0000 (14:16 +0900)]
[API] Prototype of C API & test for Neural Network Model
- Add C API to construct / destruct Neural Network Model object
. ml_nnmodel_construct()
. ml_nnmodel_destruct()
- Add test cases
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 1 Apr 2020 06:37:06 +0000 (15:37 +0900)]
Remove gym-http-api files
Remove 'include/gym/gym.h'
Remove 'gym-bininding.cpp'
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Mon, 30 Mar 2020 09:23:44 +0000 (18:23 +0900)]
Enabling Android build
Enable Android build including Applications build
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Tue, 31 Mar 2020 07:20:35 +0000 (16:20 +0900)]
[SVACE] Check whether the test case index exceeds the file size
The value of the arithmetic expression (I * this->input_size + I *
this->class_num) * sizeof(float) is subject to overflow due to a
failure to cast operands to a larger data type before performing
arithmetic.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 1 Apr 2020 07:19:17 +0000 (16:19 +0900)]
Modify CODEOWNERS
Add new members & fix id
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Mon, 30 Mar 2020 03:09:03 +0000 (12:09 +0900)]
Add Debian packaging files
Add debian packaging files
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
Wook Song [Fri, 27 Mar 2020 08:07:02 +0000 (17:07 +0900)]
meson: Add fallback mechanism to find iniparser
In case there are no cmake files or pkg-config files for iniparser,
this patch adds a fallback mechanism to find it.
Signed-off-by: Wook Song <wook16.song@samsung.com>
jijoong.moon [Fri, 27 Mar 2020 05:39:29 +0000 (14:39 +0900)]
Change Build Framework & add application package
- From now on, we are going to use meson as the build framework instead
  of cmake.
- Add nntrainer-applications packages
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Fri, 27 Mar 2020 05:34:06 +0000 (14:34 +0900)]
Fix compile warning messages
Fix the compile warning messages. Depending on compile options, they
could become errors.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Fri, 27 Mar 2020 05:49:49 +0000 (14:49 +0900)]
fix image url
add blob/master/
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Thu, 26 Mar 2020 03:42:06 +0000 (12:42 +0900)]
Upload Pictures for README.md
Upload pictures for README.md at Application directories.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 25 Mar 2020 03:39:40 +0000 (12:39 +0900)]
Modify applyFunction for Tensor data type
Currently only element-by-element calculation is available with
applyFunction. However, there are cases that need the whole tensor
data, like softmax. For this reason, applyFunction is modified.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Tue, 24 Mar 2020 07:15:00 +0000 (16:15 +0900)]
Fix nntrainer home URL
Change nntrainer home URL to github.com/nnstreamer/nntrainer
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Tue, 24 Mar 2020 02:15:38 +0000 (11:15 +0900)]
Remove Categorical enum in cost function
The Categorical cost function is not needed; it duplicates cross entropy.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Fri, 20 Mar 2020 06:32:34 +0000 (15:32 +0900)]
Fix code for Clang format 4.0
fix code for clang format 4.0
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Fri, 20 Mar 2020 05:40:44 +0000 (14:40 +0900)]
fix .gitignore
remove newline error
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Fri, 20 Mar 2020 01:00:43 +0000 (10:00 +0900)]
add .gitignore
add .gitignore
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Fri, 20 Mar 2020 00:45:50 +0000 (09:45 +0900)]
Add CODEOWNERS
CODEOWNERS are
@AIP/nntrainer @myungjoo-ham @jijoong-moon @geunsik-lim @wook16-song
@helloahn @pk-kapoor @dongju-chae @sangjung-woo @gichan2-jang
@jy1210-jung @yongjoo1-ahn
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Thu, 19 Mar 2020 23:17:24 +0000 (08:17 +0900)]
add TensorDim Class
Add TensorDim class to handle tensor dimensions.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Fri, 20 Mar 2020 00:07:34 +0000 (09:07 +0900)]
Make Examples work properly
Modify Examples for the updated NNtrainer
- Classification
- KNN
- LogisticRegression
- ReinforcementLearning
- Training
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Thu, 19 Mar 2020 23:03:06 +0000 (08:03 +0900)]
Modify LogisticRegression & Training Example for updated nntrainer
Make Logistic Regression and Training Example work properly with
current nntrainer.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Thu, 19 Mar 2020 04:43:19 +0000 (13:43 +0900)]
Update Training & Classification Examples according to new format
CMakeLists.txt & configuration format is changed. Therefore, changes
are necessary.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Thu, 19 Mar 2020 04:34:48 +0000 (13:34 +0900)]
Add namespace Tensors for Tensor
For better readability, use Tensors::Tensor
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 18 Mar 2020 10:35:43 +0000 (19:35 +0900)]
Add nntrainer.spec for tizen packaging
. add nntrainer.spec
. add nntrainer.manifest
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 18 Mar 2020 11:00:40 +0000 (20:00 +0900)]
Fix error when calculating sum
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 18 Mar 2020 06:50:13 +0000 (15:50 +0900)]
Remove iniparser directory.
It is better to maintain the package without the iniparser source
directory; the iniparser installed on the platform will be used instead.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 18 Mar 2020 03:32:32 +0000 (12:32 +0900)]
Add using Cublas for matrix multiplication ( SGEMM )
Instead of CBLAS, it is possible to use cuBLAS for SGEMM.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 18 Mar 2020 00:19:45 +0000 (09:19 +0900)]
Add test example for the assign activation function per layer
test case for assigning activation per layer
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 18 Mar 2020 00:18:02 +0000 (09:18 +0900)]
Add progress bar to trace training process
Add progress bar to trace training.
The training set is used to validate the training algorithm; accuracy
should be at least 95%.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 18 Mar 2020 00:13:42 +0000 (09:13 +0900)]
Allow assigning an activation function per layer
Make it possible to assign an activation function to each layer.
Also, make softmax an activation function; only softmax and
sigmoid are available for the output layer.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Tue, 17 Mar 2020 00:40:34 +0000 (09:40 +0900)]
Fix Weight Decay L2Norm
The weight decay L2 norm implementation was wrong; this PR fixes it.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Mon, 16 Mar 2020 04:50:12 +0000 (13:50 +0900)]
Add Weight Decay (L2Norm)
Implement Weight Decay ( L2Norm Only )
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Mon, 16 Mar 2020 03:44:59 +0000 (12:44 +0900)]
add Weight initialization Method
Add Weight Initialization Method
"lecun_normal" : LeCun Normal Initialization
"lecun_uniform" : LeCun Uniform Initialization
"xavier_normal" : Xavier Normal Initialization
"xavier_uniform" : Xavier Uniform Initialization
"he_normal" : He Normal Initialization
"he_uniform" : He Uniform Initialization
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Mon, 16 Mar 2020 00:15:50 +0000 (09:15 +0900)]
Add Standardization in InputLayer
Add Standardization in InputLayer
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Sun, 15 Mar 2020 23:55:56 +0000 (08:55 +0900)]
Add normalization at InputLayer
Add normalization option for inputlayer
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Sun, 15 Mar 2020 23:40:27 +0000 (08:40 +0900)]
Make softmax optional
make softmax optional when the output is calculated
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Sun, 15 Mar 2020 23:02:03 +0000 (08:02 +0900)]
Update Bias when the optimizer is adam
Until now, the Adam optimizer updated weights only, so the bias update
is added accordingly.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Fri, 13 Mar 2020 08:53:30 +0000 (17:53 +0900)]
Add Relu Activation Function
Add Relu Activation Function
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Fri, 13 Mar 2020 07:29:32 +0000 (16:29 +0900)]
Modify databuffer to generate random input variables at first time
Modify databuffer to generate random input variables at first time
- Add Conditional variable to synchronize thread
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Fri, 13 Mar 2020 07:28:08 +0000 (16:28 +0900)]
Fix the compilation Error for Batch Normalization
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 26 Feb 2020 04:46:03 +0000 (13:46 +0900)]
Batch Normalization Draft
- This is an unstable version of Batch Normalization
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Fri, 13 Mar 2020 04:33:47 +0000 (13:33 +0900)]
Add Keras & Tensorflow Example for transfer learning
mobilenetv2 output (features) using cifar10 data
: 10 Classes x 100 image each --> trainingSet.dat
* trainingSet.dat * ValSet.dat
62720 62720
+--------------+ +--------------+
| | | |
10x100 | | 10x10 | |
| | | |
| | | |
+--------------+ +--------------+
- Layers
: InputFeatures(62720)->10 hidden FC + softmax -> 10 Classes
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 26 Feb 2020 04:42:46 +0000 (13:42 +0900)]
Add Subtract with float value in tensor & add sum according to axis
Add Subtract(float value)
Add sum(int axis)
: sum according to axis
- 0 : batch
- 1 : height
- 2 : width
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 26 Feb 2020 04:39:45 +0000 (13:39 +0900)]
Change Configuration format for batch normalization
Change Configuration format
- Delete Width, Height
- Add HiddenSize
Add BatchNormalizationLayer skeleton code
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Tue, 18 Feb 2020 08:14:48 +0000 (17:14 +0900)]
Add Softmax token in ini file
A Softmax token is added to enable softmax at the output layer.
It is a boolean; if set to true, softmax is enabled.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Tue, 18 Feb 2020 07:47:28 +0000 (16:47 +0900)]
Fix calculation of Logistics Regression
Fix calculation of Logistics Regression at outputlayer
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Mon, 17 Feb 2020 11:49:16 +0000 (20:49 +0900)]
Add DataBuffer for big data
- Add DataBuffer Class to read/write big data from file
- Multi-Threaded
- Automatic Updated
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Thu, 13 Feb 2020 22:49:47 +0000 (07:49 +0900)]
Calculation of Validation Loss & Accuracy
Calculate Validation Loss & Accuracy
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Thu, 13 Feb 2020 07:37:05 +0000 (16:37 +0900)]
Add Classification Example (cifar10 data set)
Classification Example is added. Data Set is cifar10.
The configuration is in Classification.ini.
The base model for the feature extractor is MobileNet v2 pretrained on
ImageNet data (224x224x3). The input feature size is 1280x7x7 = 62720.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Thu, 13 Feb 2020 01:50:11 +0000 (10:50 +0900)]
Implement Cross Entropy Cost Function
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 12 Feb 2020 05:13:54 +0000 (14:13 +0900)]
Using CBLAS for Tensor Calculation
Implement tensor calculation using CBLAS.
It can be enabled with "-DUSE_BLAS".
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Tue, 11 Feb 2020 06:05:39 +0000 (15:05 +0900)]
Decayed Learning Rate
Implement Decayed Learning Rate for better convergence
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Tue, 11 Feb 2020 02:12:42 +0000 (11:12 +0900)]
Fix Maintainer in README.md
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Mon, 10 Feb 2020 04:48:00 +0000 (13:48 +0900)]
Add README.md
README includes Descriptions of NNtrainer, how-to's and Open Source License.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Mon, 10 Feb 2020 01:36:13 +0000 (10:36 +0900)]
Use float data type instead of double.
For efficiency, we are going to use the float data type.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Fri, 7 Feb 2020 05:35:13 +0000 (14:35 +0900)]
add .gitmodules
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Fri, 7 Feb 2020 05:12:14 +0000 (14:12 +0900)]
Change Class Name Matrix to Tensor
Tensor is more appropriate than Matrix.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Fri, 7 Feb 2020 04:49:28 +0000 (13:49 +0900)]
Code Refactoring
Code Refactoring
- Move applications into the Application Directory
- Remove NeuralNet Directory
- Make include & src directories for the neural network
- Modify build configuration to build with libnntrainer.so
- Add nntrainer install & add pc.in for pkgconfig
- Add install path
**Changes proposed in this PR:**
- Added TOC generator for README.md
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 5 Feb 2020 08:04:33 +0000 (17:04 +0900)]
Make nntrainer shared library
Change Directory structure and make nntrainer shared library,
nntrainer.so
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Mon, 20 Jan 2020 01:07:01 +0000 (10:07 +0900)]
Add License file
Add Apache 2.0 full license file
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Thu, 5 Dec 2019 23:56:23 +0000 (08:56 +0900)]
Add Copyright for Apache 2.0 License
Add Copyright
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 4 Dec 2019 12:24:16 +0000 (21:24 +0900)]
add doxygen doc for DeepQ
add doxygen doc for DeepQ
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 4 Dec 2019 11:32:40 +0000 (20:32 +0900)]
add doxygen doc for KNN example
add doxygen doc for KNN example
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 4 Dec 2019 11:23:25 +0000 (20:23 +0900)]
add doxygen doc for Logistic regression
add doxygen doc for Logistic regression
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 4 Dec 2019 10:59:36 +0000 (19:59 +0900)]
add doxygen document for Training
add doxygen documents for Training
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 4 Dec 2019 10:16:53 +0000 (19:16 +0900)]
Add Doxygen Documentation for Environment
Add Doxygen Doc. for Environment
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>
jijoong.moon [Wed, 4 Dec 2019 08:15:47 +0000 (17:15 +0900)]
Add doxygen Documentation for NeuralNet
Add doxygen Documentation format of NeuralNet Directory
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: jijoong.moon <jijoong.moon@samsung.com>