platform/core/multimedia/inference-engine-mlapi.git
2 years ago add custom device capacity for SNPE engine 20/269220/2
Inki Dae [Tue, 11 Jan 2022 11:29:44 +0000 (20:29 +0900)]
add custom device capacity for SNPE engine

[Version] : 0.4.1-0
[Issue type] : bug fix

Added the custom device capacity for the SNPE engine. The SNPE engine
supports inference on the DSP device.
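
Since the custom device maps to the DSP here, a backend could translate the
requested target into the SNPE filter's custom property before opening the
model. A minimal sketch follows; the Target enum and the "Runtime:*" strings
are assumptions for illustration, not taken from this patch.

  #include <string>

  // Sketch only: map an illustrative target device to an SNPE custom
  // property string for ml_single_open_full(). Names are assumptions.
  enum class Target { CPU, GPU, CUSTOM };

  static std::string SnpeCustomOption(Target t)
  {
          switch (t) {
          case Target::CPU:    return "Runtime:CPU";
          case Target::GPU:    return "Runtime:GPU";
          case Target::CUSTOM: return "Runtime:DSP";  // custom device -> DSP
          }
          return "";
  }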

Change-Id: I6561916a7debbb874fb40455437ba7eb6c074a90
Signed-off-by: Inki Dae <inki.dae@samsung.com>
2 years ago Fix coverity issue(UNINIT) accepted/tizen/unified/20220110.140037 submit/tizen/20220105.080154 submit/tizen/20220105.081745
Hyunsoo Park [Wed, 5 Jan 2022 08:09:30 +0000 (17:09 +0900)]
Fix coverity issue(UNINIT)

Change-Id: I68042a27111ae7cd0de362ce318fb84118c81b3c
Signed-off-by: Hyunsoo Park <hance.park@samsung.com>
Fix merge conflict.
Signed-off-by: Inki Dae <inki.dae@samsung.com>
2 years ago add NNTRAINER backend support 37/268937/3
Inki Dae [Wed, 5 Jan 2022 04:13:58 +0000 (13:13 +0900)]
add NNTRAINER backend support

[Version] : 0.4.0-0
[Issue type] : new feature

Added NNTRAINER backend support. NNTRAINER is a training engine that
also provides an inference feature for its internal models.

Change-Id: If20ccbf8b709f0af6ac6b71f53c5995b6ec05a4c
Signed-off-by: Inki Dae <inki.dae@samsung.com>
2 years ago src: use ml_single_open_full api 79/268679/2
Inki Dae [Wed, 29 Dec 2021 09:23:42 +0000 (18:23 +0900)]
src: use ml_single_open_full api

[Version] : 1.3.2-0
[Issue type] : bug fix

Replaced ml_single_open with the ml_single_open_full API so that
various target devices can be used with the SNPE engine.

In the case of NNStreamer's SNPE tensor filter, the target device is
decided by a user-given custom property, so use the ml_single_open_full
API, which accepts such a custom property.
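
A rough before/after sketch of the replacement (not the patch itself):
ml_single_open_full takes the same arguments as ml_single_open plus a
trailing custom-option string that is forwarded to the tensor filter.
The property value mentioned below is only an example.

  #include <nnstreamer.h>
  #include <nnstreamer-single.h>

  /* Sketch: "custom" carries the user-given device selection string,
   * e.g. "Runtime:GPU" (example value). */
  static int open_snpe(const char *model, const char *custom, ml_single_h *single)
  {
          /* Before: ml_single_open() had no parameter for the custom property.
           * return ml_single_open(single, model, nullptr, nullptr,
           *                       ML_NNFW_TYPE_SNPE, ML_NNFW_HW_ANY); */

          /* After: forward the custom property to the SNPE tensor filter. */
          return ml_single_open_full(single, model, nullptr, nullptr,
                                     ML_NNFW_TYPE_SNPE, ML_NNFW_HW_ANY, custom);
  }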

Change-Id: I2a6f1ab2b619c59164e4043fcfb03dd0cea97ad6
Signed-off-by: Inki Dae <inki.dae@samsung.com>
2 years ago Refactoring InferenceMLAPI::Load() 67/267067/7
Seungbae Shin [Thu, 25 Nov 2021 03:30:02 +0000 (12:30 +0900)]
Refactoring InferenceMLAPI::Load()

[Version] : 0.3.1
[Issue type] : Refactoring

Change-Id: I370b08981fcdd79f916dfe9cc5ea4225ecf66764

2 years ago add SNPE tensor filter support 34/267034/1
Inki Dae [Wed, 24 Nov 2021 09:11:54 +0000 (18:11 +0900)]
add SNPE tensor filter support

[Version] : 0.3.0-0
[Issue type] : new feature

Change-Id: If3c8591938e35b0d84bf0c2c2f12bb0e50b84cd5
Signed-off-by: Inki Dae <inki.dae@samsung.com>
2 years ago fix NNTrainer inference issue 66/266466/6
Inki Dae [Fri, 12 Nov 2021 09:41:41 +0000 (18:41 +0900)]
fix NNTrainer inference issue

[Version] : 0.2.0-0
[Issue type] : bug fix

Fixed NNTrainer inference issue.

The NNTrainer backend needs ML tensor info for the input and output tensors
when loading a given model. So this patch creates the info before
requesting the model load, and then passes it to the ml_single_open
function as arguments.
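
A minimal sketch of that pattern: build the input/output ml_tensors_info up
front and hand it to ml_single_open. The shapes, tensor types, model path,
and the use of ML_NNFW_TYPE_NNTR_INF for the NNTrainer filter are
illustrative assumptions, not values from this patch.

  #include <nnstreamer.h>
  #include <nnstreamer-single.h>

  /* Sketch: create tensor info up front because the NNTrainer filter
   * cannot derive it from the model by itself. */
  static int open_nntrainer_model(ml_single_h *single)
  {
          ml_tensors_info_h in_info = nullptr, out_info = nullptr;
          ml_tensor_dimension in_dim = { 224, 224, 3, 1 };   /* example shape */
          ml_tensor_dimension out_dim = { 10, 1, 1, 1 };     /* example shape */

          ml_tensors_info_create(&in_info);
          ml_tensors_info_set_count(in_info, 1);
          ml_tensors_info_set_tensor_type(in_info, 0, ML_TENSOR_TYPE_FLOAT32);
          ml_tensors_info_set_tensor_dimension(in_info, 0, in_dim);

          ml_tensors_info_create(&out_info);
          ml_tensors_info_set_count(out_info, 1);
          ml_tensors_info_set_tensor_type(out_info, 0, ML_TENSOR_TYPE_FLOAT32);
          ml_tensors_info_set_tensor_dimension(out_info, 0, out_dim);

          int ret = ml_single_open(single, "/usr/share/model.ini" /* placeholder */,
                                   in_info, out_info,
                                   ML_NNFW_TYPE_NNTR_INF, ML_NNFW_HW_ANY);

          ml_tensors_info_destroy(in_info);
          ml_tensors_info_destroy(out_info);
          return ret;
  }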

Change-Id: I4e444b7b1d87c37249ddf2ac8b7c56aa7119602a
Signed-off-by: Inki Dae <inki.dae@samsung.com>
3 years ago add nntrainer backend support 41/266241/3
Inki Dae [Tue, 9 Nov 2021 11:14:01 +0000 (20:14 +0900)]
add nntrainer backend support

[Version] : 0.1.2-2
[Issue type] : new feature

Change-Id: Ic120d1d149058b5d95a48249a08baedada7d359f
Signed-off-by: Inki Dae <inki.dae@samsung.com>
3 years ago Consider user-given property info first 40/266240/3
Inki Dae [Tue, 8 Jun 2021 03:49:48 +0000 (12:49 +0900)]
Consider user-given property info first

[Version] : 0.0.2-2
[Issue type] : bug fix

Considered user-given property info first if it exists.

ONERT does not provide input and output tensor names,
so we have to use the names given by the user instead.

This patch checks whether user-given property information exists
and, if so, gets the names from that property information.
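
A hypothetical sketch of that check; the type and member names below are
invented for illustration and do not come from the patch.

  #include <map>
  #include <string>
  #include <vector>

  struct TensorInfo { std::vector<size_t> shape; };
  struct LayerProperty { std::map<std::string, TensorInfo> layers; };

  static std::vector<std::string>
  GetInputNames(const LayerProperty &userProperty,
                const std::vector<std::string> &backendNames)
  {
          // ONERT does not report tensor names, so fall back to the names
          // the user supplied through the property, if any were given.
          if (!userProperty.layers.empty()) {
                  std::vector<std::string> names;
                  for (const auto &kv : userProperty.layers)
                          names.push_back(kv.first);
                  return names;
          }
          return backendNames;
  }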

Change-Id: If2903026fe15dc3664591c0e4e472cf5cb2991e4
Signed-off-by: Inki Dae <inki.dae@samsung.com>
3 years ago Fix svace issue 98/257598/1 accepted/tizen/6.5/unified/20211028.122207 accepted/tizen/unified/20210509.123827 accepted/tizen/unified/20210608.131214 submit/tizen/20210428.062907 submit/tizen/20210506.010918 submit/tizen/20210507.005054 submit/tizen/20210513.034723 submit/tizen/20210513.045159 submit/tizen/20210604.014750 submit/tizen_6.5/20211028.162401 tizen_6.5.m2_release
Inki Dae [Wed, 28 Apr 2021 05:15:49 +0000 (14:15 +0900)]
Fix svace issue

This patch initializes two uninitialized variables.

Change-Id: I00aadafa97738a6f7c13417856c906556b92ee9a
Signed-off-by: Inki Dae <inki.dae@samsung.com>
3 years ago Change members of inference_engine_layer_property structure, submit/tizen/20210422.072212
Tae-Young Chung [Wed, 10 Mar 2021 09:12:57 +0000 (18:12 +0900)]
Change members of inference_engine_layer_property structure,
and change vector<inference_engine_tensor_buffer> to map<string, inference_engine_tensor_buffer>

This is based on
https://review.tizen.org/gerrit/#/c/platform/core/multimedia/inference-engine-interface/+/254892/
https://review.tizen.org/gerrit/#/c/platform/core/api/mediavision/+/254953/
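
A small sketch of what the container change means for callers, under assumed
type definitions: tensor buffers are now looked up by layer name rather than
by position.

  #include <cstddef>
  #include <map>
  #include <string>
  #include <vector>

  // Assumed, simplified stand-in for inference_engine_tensor_buffer.
  struct TensorBuffer {
          void *buffer;
          size_t size;
  };

  // Before: buffers addressed by position.
  using BuffersByIndex = std::vector<TensorBuffer>;

  // After: buffers addressed by layer name, so input/output layers can be
  // matched explicitly instead of relying on ordering.
  using BuffersByName = std::map<std::string, TensorBuffer>;

  static void *FindBuffer(BuffersByName &buffers, const std::string &layer)
  {
          auto it = buffers.find(layer);
          return it != buffers.end() ? it->second.buffer : nullptr;
  }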

Change-Id: I93eaa87c9ed5492bb308cb1ec0a35e86fd5b06dd
Signed-off-by: Tae-Young Chung <ty83.chung@samsung.com>
3 years ago Add SetCLTuner interface
Inki Dae [Thu, 4 Feb 2021 01:04:07 +0000 (10:04 +0900)]
Add SetCLTuner interface

Added the SetCLTuner interface to support the CLTuner feature of the
inference engine interface framework. The framework declares SetCLTuner
as a pure virtual function, so an implementation of this interface is
required.
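
A sketch of the required override, assuming a simplified CLTuner
configuration struct; the real declaration lives in
inference-engine-interface and may differ.

  #include <string>

  // Assumed, simplified stand-in for inference_engine_cltuner.
  struct ClTunerConfig {
          bool active;            // whether CLTuner is used at all
          bool update;            // whether to (re)generate the tuning file
          std::string file_path;  // where tuned parameters are stored
  };

  class InferenceMLAPISketch {
  public:
          // Required because the base interface declares SetCLTuner()
          // as a pure virtual function.
          int SetCLTuner(const ClTunerConfig *cltuner)
          {
                  (void)cltuner;  // nothing to tune in the ML API backend
                  return 0;       // success
          }
  };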

Change-Id: Ie20991e6562864bca285383443880367ea00b522
Signed-off-by: Inki Dae <inki.dae@samsung.com>
3 years ago [ML-API] Use capi-ml-inference package instead of capi-nnstreamer 85/253785/1 accepted/tizen/unified/20210218.080606 submit/tizen/20210217.032056
Sangjung Woo [Wed, 17 Feb 2021 05:05:54 +0000 (14:05 +0900)]
[ML-API] Use capi-ml-inference package instead of capi-nnstreamer

The capi-nnstreamer package and its pc file have been renamed to
capi-ml-inference. To build without errors, this patch updates the
related information.

Change-Id: I07bd118b90232d517fdb48f8bc1ef05b469bd477
Signed-off-by: Sangjung Woo <sangjung.woo@samsung.com>
4 years ago Fix build error on ARM64 19/247219/1 accepted/tizen/unified/20201112.124212 submit/tizen/20201109.053646 submit/tizen/20201110.032259
Inki Dae [Mon, 9 Nov 2020 05:03:49 +0000 (14:03 +0900)]
Fix build error on ARM64

Change-Id: I64476c001ca6ddb200fd2667c2b9d4f7d91ceabd
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago Fix undefined symbol issue 03/246503/4 submit/tizen/20201104.021236 submit/tizen/20201109.032237
Inki Dae [Fri, 30 Oct 2020 07:17:40 +0000 (16:17 +0900)]
Fix undefined symbol issue

ml_single_invoke_no_alloc has been renamed to ml_single_invoke_fast,
so fix the call accordingly.

Change-Id: I2a7d85c0adbc8b28a4139e5038e85ca1f3a4e03e
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago Fix memory leak issue 69/246369/2
Sangjung Woo [Mon, 26 Oct 2020 02:24:57 +0000 (11:24 +0900)]
Fix memory leak issue

* Apply the ml_single_invoke_no_alloc() ML API instead of
  ml_single_invoke() (see the sketch below).
* Remove unnecessary memory copies.
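
A sketch of the no-copy invoke pattern: the output ml_tensors_data is created
once from the model's output info and reused on every invoke, avoiding the
per-call allocation and memcpy that ml_single_invoke() implies.
ml_single_invoke_no_alloc() is an internal NNStreamer API, so its prototype
below is an assumption.

  #include <nnstreamer.h>
  #include <nnstreamer-single.h>

  /* Assumed prototype of the internal no-alloc invoke API
   * (later renamed to ml_single_invoke_fast). */
  extern "C" int ml_single_invoke_no_alloc(ml_single_h single,
                                           const ml_tensors_data_h input,
                                           ml_tensors_data_h output);

  /* Sketch: create the output data handle once and reuse it for every
   * invoke instead of letting ml_single_invoke() allocate a new one. */
  static int invoke_without_copy(ml_single_h single, ml_tensors_data_h input,
                                 ml_tensors_data_h *output /* reused handle */)
  {
          if (*output == nullptr) {
                  ml_tensors_info_h out_info = nullptr;

                  ml_single_get_output_info(single, &out_info);
                  ml_tensors_data_create(out_info, output);
                  ml_tensors_info_destroy(out_info);
          }
          return ml_single_invoke_no_alloc(single, input, *output);
  }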

Change-Id: I41c6eaf0afe35a4dd481ac57e942dd45f0fb1e4a
Signed-off-by: Sangjung Woo <sangjung.woo@samsung.com>
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago Update various tensor filters support 50/246250/2
Inki Dae [Wed, 14 Oct 2020 06:38:48 +0000 (15:38 +0900)]
Update various tensor filters support

Change-Id: I2ea104cae60ba5a9049fcc8eaa1b0ec78a220112
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago Use pre-defined tensor data handles 49/246249/1
Inki Dae [Tue, 15 Sep 2020 08:30:27 +0000 (17:30 +0900)]
Use pre-defined tensor data handles

For an invoke request, we don't have to query the input tensor information
again, because it is already obtained in GetInputTensorBuffers and
GetOutputTensorBuffers; use those pre-defined handles instead.

Change-Id: I6d2ac7fcb8d4ed129deb54eca7739038571b230e
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago Add ARMNN and TFLITE backend support 48/246248/1
Inki Dae [Fri, 11 Sep 2020 07:20:22 +0000 (16:20 +0900)]
Add ARMNN and TFLITE backend support

Change-Id: I2fde5e660950a3e2d9d2d1722c351cdd448f86a4
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago Merge "Support two more tensor type" into tizen accepted/tizen_6.0_unified accepted/tizen_6.0_unified_hotfix tizen_6.0 tizen_6.0_hotfix accepted/tizen/6.0/unified/20201030.121108 accepted/tizen/6.0/unified/hotfix/20201103.051119 accepted/tizen/unified/20200831.002556 submit/tizen/20200828.025650 submit/tizen/20200828.100528 submit/tizen_6.0/20201029.205103 submit/tizen_6.0_hotfix/20201102.192503 submit/tizen_6.0_hotfix/20201103.114803 tizen_6.0.m2_release
Inki Dae [Wed, 26 Aug 2020 06:35:27 +0000 (06:35 +0000)]
Merge "Support two more tensor type" into tizen

4 years ago Support two more tensor type 00/242200/1
Hyo Jong Kim [Tue, 25 Aug 2020 02:29:42 +0000 (11:29 +0900)]
Support two more tensor type

Support INT64 and UINT64

Change-Id: Iec1445e27bfeb245e1b38bd83679ea8f27e7bcf7
Signed-off-by: Hyo Jong Kim <hue.kim@samsung.com>
4 years ago Support multiple output tensor 98/242198/2
Hyo Jong Kim [Tue, 25 Aug 2020 02:20:46 +0000 (11:20 +0900)]
Support multiple output tensor

Get the information and the number of output tensors.
Set the output tensors according to that number.
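
A sketch of iterating over every output tensor using the public ML API
getters; how the raw buffers are consumed afterwards is only illustrative.

  #include <cstdio>
  #include <nnstreamer.h>
  #include <nnstreamer-single.h>

  /* Sketch: handle all output tensors, not just index 0. */
  static void walk_outputs(ml_single_h single, ml_tensors_data_h output)
  {
          ml_tensors_info_h out_info = nullptr;
          unsigned int count = 0;

          ml_single_get_output_info(single, &out_info);
          ml_tensors_info_get_count(out_info, &count);

          for (unsigned int i = 0; i < count; ++i) {
                  void *raw = nullptr;
                  size_t size = 0;

                  ml_tensors_data_get_tensor_data(output, i, &raw, &size);
                  std::printf("output[%u]: %zu bytes\n", i, size);
          }
          ml_tensors_info_destroy(out_info);
  }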

Change-Id: Ie803aa0aee194091006db29bd86a3d24a4f922df
Signed-off-by: Hyo Jong Kim <hue.kim@samsung.com>
4 years ago Fix svace issue[WGID=443239] 14/239014/1 accepted/tizen/unified/20200722.144612 submit/tizen/20200721.063155
Inki Dae [Tue, 21 Jul 2020 05:42:58 +0000 (14:42 +0900)]
Fix svace issue[WGID=443239]

Change-Id: I79e3b1c283f9813efb48caffd7c5c30b8c2afc95
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago Fix build error on aarch64 and x86_64 27/237227/1 accepted/tizen/unified/20200628.221629 submit/tizen/20200626.060446 submit/tizen/20200626.070253
Inki Dae [Fri, 26 Jun 2020 06:00:59 +0000 (15:00 +0900)]
Fix build error on aarch64 and x86_64

Change-Id: I4aa50fce7ed7b9c822a505be5d21f17bb5f040e9
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago Check if model file path is valid or not 27/236327/3 submit/tizen/20200626.050805
Inki Dae [Tue, 16 Jun 2020 08:21:46 +0000 (17:21 +0900)]
Check if model file path is valid or not

Change-Id: Id621bac742d9d2a5109462ffd284b956b0feae21
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago Change in-house NN Runtime backend name 22/236322/1
Inki Dae [Tue, 16 Jun 2020 07:47:35 +0000 (16:47 +0900)]
Change in-house NN Runtime backend name

The official name of NNFW is ONE (On-device Neural Engine),
so use it instead of NNFW.

Change-Id: I8bdb279451570074f11a85386c6725afe73ceab9
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago Add NNFW runtime support 68/236168/4
Inki Dae [Mon, 15 Jun 2020 09:25:23 +0000 (18:25 +0900)]
Add NNFW runtime support

This patch corrects the use of fixed values to support the NNFW runtime.
For this, it updates the tensor buffer information according to a given
model and adds a ConvertTensorType function, which converts an NNStreamer
tensor data type to the corresponding type of the MediaVision inference
engine.
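
A sketch of what such a conversion helper does, with a stand-in enum for the
inference engine side; the real type names live in inference-engine-interface
and are not reproduced here.

  #include <nnstreamer.h>

  // Stand-in for the inference engine's tensor data type (assumed names).
  enum class EngineTensorType { FLOAT32, UINT8, UNKNOWN };

  /* Sketch of a ConvertTensorType()-style helper: map an NNStreamer
   * tensor type to the inference engine's type. */
  static EngineTensorType ConvertTensorTypeSketch(ml_tensor_type_e type)
  {
          switch (type) {
          case ML_TENSOR_TYPE_FLOAT32:
                  return EngineTensorType::FLOAT32;
          case ML_TENSOR_TYPE_UINT8:
                  return EngineTensorType::UINT8;
          default:
                  return EngineTensorType::UNKNOWN;
          }
  }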

Change-Id: I28d80698feebe9efbb076bb8d979b0ce0d849fee
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago Set model path according to MLAPI backend 63/235963/3
Inki Dae [Thu, 11 Jun 2020 07:29:14 +0000 (16:29 +0900)]
Set model path according to MLAPI backend

NNFW - the in-house NN runtime - needs an NNPackage type of package,
which is a directory containing a model file and its meta file.
For more details, you can refer to
https://github.com/Samsung/ONE/tree/master/nnpackage/examples/one_op_in_tflite

The ML Single API framework of NNStreamer receives the full path of a given
model file from the user - in our case, the inference-engine-mlapi backend -
and finds the metadata in the directory where the given model file is located.

So the inference-engine-mlapi backend should pass the full path of
the given model file to the ML Single API framework.

Change-Id: I6bdd871d5b683dbd6e60fce0f6dbd052985cd514
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago Set supported_device_types according to MLAPI backend type 62/235962/1
Inki Dae [Thu, 11 Jun 2020 07:16:42 +0000 (16:16 +0900)]
Set supported_device_types according to MLAPI backend type

NNFW supports only a CPU- and GPU-accelerated NN runtime, so consider
using the NNFW tensor filter plugin of NNStreamer.
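
A sketch of filling the backend capacity for the ONE (NNFW) runtime, using
locally defined stand-ins for the inference-engine-interface device flags;
the real flag names and the capacity struct layout are assumptions.

  // Stand-ins modeled on inference-engine-interface device flags (assumed).
  enum {
          TARGET_CPU = 1 << 0,
          TARGET_GPU = 1 << 1,
  };

  struct CapacitySketch {
          int supported_accel_devices;
  };

  /* Sketch: the ONE (NNFW) runtime is accelerated on CPU and GPU only,
   * so the capacity reported for this backend is limited to those two. */
  static void FillNnfwCapacity(CapacitySketch &cap)
  {
          cap.supported_accel_devices = TARGET_CPU | TARGET_GPU;
  }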

Change-Id: I3ed4ae5018b984c812f8bad69eebbfdae69dd030
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago Fix initializer list coding rule 94/235394/1
Inki Dae [Thu, 4 Jun 2020 08:00:14 +0000 (17:00 +0900)]
Fix initializer list coding rule

Change-Id: Id95bf653d7b6274c4803a6b240783905e96300ce
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago Fix coding style based on Tizen SE C++ Coding Rule 60/235360/1
Inki Dae [Thu, 4 Jun 2020 05:47:00 +0000 (14:47 +0900)]
Fix coding style based on Tizen SE C++ Coding Rule

Tizen SE C++ Coding Rule:
https://code.sec.samsung.net/confluence/pages/viewpage.action?pageId=160925159

Change-Id: I1ae54a3676dc9cc0e06d4322eb612ceb07d7626c
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago Change a function name from SetPluginType to SetPrivateData 75/235275/2
Inki Dae [Wed, 3 Jun 2020 08:41:48 +0000 (17:41 +0900)]
Change a function name from SetPluginType to SetPrivateData

Change-Id: I4a0d2ea3345ab650f5ffe9072f8edc79f5fdca98
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago Change postfix of file name to "mlapi"
Inki Dae [Tue, 2 Jun 2020 09:36:54 +0000 (18:36 +0900)]
Change postfix of file name to "mlapi"

Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago Change a backend type from VIVANTE to MLAPI
Inki Dae [Tue, 2 Jun 2020 09:13:04 +0000 (18:13 +0900)]
Change a backend type from VIVANTE to MLAPI

Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago Add init code
Inki Dae [Wed, 27 May 2020 08:29:46 +0000 (17:29 +0900)]
Add init code

Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago Initial empty repository master
Tizen Infrastructure [Wed, 27 May 2020 08:16:04 +0000 (08:16 +0000)]
Initial empty repository