platform/core/multimedia/inference-engine-armnn.git
2 years ago  fix coverity issue 37/288737/1 accepted/tizen_unified tizen accepted/tizen/unified/20230316.174026
Inki Dae [Wed, 22 Feb 2023 07:29:02 +0000 (16:29 +0900)]
fix coverity issue

[Version] : 0.0.2
[Issue type] bug fix

CID : 1656258

Fixed a coverity issue: Dead default in switch.
An invalid format type is already handled by the switch-case statement,
so drop the unnecessary exception handling.
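
As a minimal, illustrative sketch of the pattern after the fix (the type and
function names here are assumptions, not the actual backend code), the
default branch alone is enough to catch an invalid format:

    #include <armnn/Types.hpp>

    // Previously an invalid format was rejected before the switch was
    // reached, which left the default branch dead. Handling it in the
    // default branch alone is sufficient.
    static int ConvertTensorType(int type, armnn::DataType &out)
    {
        switch (type) {
        case 1:
            out = armnn::DataType::Float16;
            return 0;
        case 2:
            out = armnn::DataType::Float32;
            return 0;
        default:
            // Invalid format types are handled here.
            return -1;
        }
    }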

Change-Id: I6dee7f5e83682b4c05001317d3f3845980cfb728
Signed-off-by: Inki Dae <inki.dae@samsung.com>
2 years ago  disable tflite parser support 55/285955/3 accepted/tizen/unified/20221226.070017
Inki Dae [Fri, 23 Dec 2022 05:25:57 +0000 (14:25 +0900)]
disable tflite parser support

Change-Id: I0d9c2d7dfef04df0bc4f8e08c2de5a5306a4fc9f
Signed-off-by: Inki Dae <inki.dae@samsung.com>
3 years ago  Fix typo 62/269262/1
Hyunsoo Park [Wed, 12 Jan 2022 07:00:13 +0000 (16:00 +0900)]
Fix typo

[Version] 0.0.1-5
[Issue type] clean up

Change-Id: I61464f10539fd9cd671056f8afad3fdd268b0dc5
Signed-off-by: Hyunsoo Park <hance.park@samsung.com>
3 years ago  Excludes tuning level when READ mode is set 08/265608/2 accepted/tizen_7.0_unified accepted/tizen_7.0_unified_hotfix tizen_7.0 tizen_7.0_hotfix tizen_devel accepted/tizen/7.0/unified/20221110.060517 accepted/tizen/7.0/unified/hotfix/20221116.105353 accepted/tizen/unified/20220110.140027 submit/tizen/20220105.080154 submit/tizen/20220105.081745 tizen_7.0_m2_release
Hyunsoo Park [Mon, 25 Oct 2021 06:31:04 +0000 (15:31 +0900)]
Excludes tuning level when READ mode is set

Change-Id: I33210956a9af5ef94614a582eda945dcab350724
Signed-off-by: Hyunsoo Park <hance.park@samsung.com>
4 years ago  Remove swap and clear codes in destructor 79/255979/1 accepted/tizen_6.5_unified tizen_6.5 accepted/tizen/6.5/unified/20211028.115235 accepted/tizen/unified/20210509.123814 accepted/tizen/unified/20210608.131159 submit/tizen/20210422.072212 submit/tizen/20210428.062907 submit/tizen/20210506.010918 submit/tizen/20210507.005054 submit/tizen/20210513.034723 submit/tizen/20210513.045159 submit/tizen/20210604.014750 submit/tizen/20210604.014824 submit/tizen_6.5/20211028.162401 tizen_6.5.m2_release
Tae-Young Chung [Fri, 26 Mar 2021 01:23:23 +0000 (10:23 +0900)]
Remove swap and clear codes in destructor

The std::map destructor deallocates all of the storage, so we don't
need to clear and swap member variables of std::map type.
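
A self-contained sketch of the idea (the class and member are stand-ins,
not the real backend types):

    #include <map>
    #include <string>

    struct Example {
        std::map<std::string, int> mTable;   // stand-in for the std::map members

        // Before: the destructor called mTable.clear() and then swapped
        // mTable with an empty map. Neither is needed; ~map() already
        // frees all of the storage when Example is destroyed.
        ~Example() = default;
    };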

Change-Id: Ide07a16e6b723495428dbaab89884989dc0cb2f6
Signed-off-by: Tae-Young Chung <ty83.chung@samsung.com>
4 years ago  Change members of inference_engine_layer_property structure, 97/254897/4
Tae-Young Chung [Wed, 10 Mar 2021 09:15:19 +0000 (18:15 +0900)]
Change members of inference_engine_layer_property structure,
and change vector<inference_engine_tensor_buffer> to map<string, inference_engine_tensor_buffer>

This is based on
https://review.tizen.org/gerrit/#/c/platform/core/multimedia/inference-engine-interface/+/254892/
https://review.tizen.org/gerrit/#/c/platform/core/api/mediavision/+/254953/

Change-Id: Iba73fa67f586287f0efe90ac2750fab9c6927e45
Signed-off-by: Tae-Young Chung <ty83.chung@samsung.com>
4 years ago  Enable optimization feature 13/253413/2
Inki Dae [Wed, 10 Feb 2021 01:17:41 +0000 (10:17 +0900)]
Enable optimization feature

Enabled the two optimization features below (a rough sketch follows the
reference),
    - converting the data type from fp32 to fp16.
    - using the Winograd algorithm[1] for matrix multiplication optimization,
      which reduces the number of multiplications required.

[1] https://en.wikipedia.org/wiki/Coppersmith%E2%80%93Winograd_algorithm
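
A rough sketch of how these options are typically enabled through the ARMNN
optimizer; the exact option names can differ between ARMNN releases, and
network/mRuntime are assumed to be in scope, so treat this as illustrative:

    #include <armnn/ArmNN.hpp>
    #include <armnn/BackendOptions.hpp>

    armnn::OptimizerOptions options;
    // 1) run fp32 networks with fp16 arithmetic where the backend supports it
    options.m_ReduceFp32ToFp16 = true;
    // 2) the Winograd/fast-math convolution path is exposed as a backend option
    options.m_ModelOptions.push_back(
        armnn::BackendOptions("GpuAcc", { { "FastMathEnabled", true } }));

    armnn::IOptimizedNetworkPtr optNet = armnn::Optimize(
        *network, { armnn::Compute::GpuAcc },
        mRuntime->GetDeviceSpec(), options);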

Change-Id: I9d37bc8e1cbb196ce46a2725a15ff09240826aaa
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago  Add ARMNN logging level support 58/253358/2
Inki Dae [Tue, 9 Feb 2021 07:11:27 +0000 (16:11 +0900)]
Add ARMNN logging level support

The ARMNN runtime engine supports five logging levels.
This patch allows the user to change the logging level at build time.
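
A hedged sketch of wiring a build-time definition to ARMNN logging (the
ARMNN_LOG_SEVERITY macro name is an assumption; armnn::ConfigureLogging is
the ARMNN entry point as far as I know):

    #include <armnn/Utils.hpp>

    #ifndef ARMNN_LOG_SEVERITY
    #define ARMNN_LOG_SEVERITY armnn::LogSeverity::Error
    #endif

    static void ConfigureBackendLogging()
    {
        // stdout on, debug console off, severity chosen at build time
        armnn::ConfigureLogging(true, false, ARMNN_LOG_SEVERITY);
    }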

Change-Id: I621158be707f21dbd357da24ee00e271b34692b8
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago  Convert specific tuning mode to the one for ARMNN 56/252756/3
Inki Dae [Fri, 29 Jan 2021 08:24:55 +0000 (17:24 +0900)]
Convert specific tuning mode to the one for ARMNN

Change-Id: I2a0d2bcd3b6846971720b542c78823069a92484b
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago  Subject: Add CLTuner support
Inki Dae [Tue, 2 Feb 2021 06:46:29 +0000 (15:46 +0900)]
Subject: Add CLTuner support

Added CLTuner support to the ARMNN engine.

The use of CLTuner is valid only in the case of a GpuAcc request.
CLTuner behaves differently according to the CLTuner configuration,
as below (see the sketch after this list),
    [cltuner->active is true]
        -> The ARMNN backend uses the CLTuner API of the ARMNN engine.
    [cltuner->active is false]
        -> The ARMNN backend skips the use of the CLTuner API
           of the ARMNN engine.
    [cltuner->update is true]
        -> The ARMNN backend requests tuning-file generation
           from the ARMNN engine.
    [cltuner->update is false]
        -> The ARMNN backend requests an inference from the ARMNN engine,
           and ARMNN will use the tuned file for the inference.
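
For clarity, a simplified sketch of how the two flags combine; the
inference_engine_cltuner layout is assumed from the description above and
the helper is illustrative, not the actual backend code:

    // Assumed shape of the CLTuner configuration handed to the backend.
    struct inference_engine_cltuner {
        bool active;   // use the CLTuner API at all?
        bool update;   // true: generate a tuned file, false: consume one
    };

    static bool IsTuningGenerationRequested(const inference_engine_cltuner &tuner)
    {
        if (!tuner.active)
            return false;      // CLTuner is skipped entirely
        return tuner.update;   // true  -> generate a tuning file
                               // false -> run inference with the tuned file
    }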

Change-Id: Id74aeab532d312f1db7b7f856667a53d07abb329
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago  Fix build warning
Inki Dae [Thu, 7 Jan 2021 08:01:50 +0000 (17:01 +0900)]
Fix build warning

This patch fixes the build warning below,
 warning: 'const void InferenceEngineImpl::ARMNNImpl::GraphDebugCallback(armnn::LayerGuid,
 unsigned int, armnn::ITensorHandle*)' defined but not used [-Wunused-function]
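
One hedged way such a warning is typically removed, assuming the callback is
only needed when the graph debug mode (see the commit below) is enabled; the
actual fix may differ:

    #if ARMNN_GRAPH_DEBUG
    // Compiled in only for graph-debug builds, so regular builds no longer
    // emit the -Wunused-function warning.
    static void GraphDebugCallback(armnn::LayerGuid, unsigned int,
                                   armnn::ITensorHandle *)
    {
        /* dump the layer output here */
    }
    #endif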

Change-Id: I736cb165ba29300b376ea834df379f78a595e9ce
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago  Add Graph debug mode support
Inki Dae [Thu, 7 Jan 2021 05:22:40 +0000 (14:22 +0900)]
Add Graph debug mode support

Added Graph debug mode support, which adds a debug layer to every layer
of a given model at runtime so that we can check the computation result
of each layer during inference.

To invoke graph debug mode, set the ARMNN_GRAPH_DEBUG flag to 1 in the
spec file.
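
A rough sketch of the ARMNN pieces this relies on (API names follow the
ARMNN runtime as far as I know; network, mRuntime and mNetworkId are assumed
to be in scope):

    armnn::OptimizerOptions options;
    #if ARMNN_GRAPH_DEBUG
    options.m_Debug = true;   // insert a debug layer after every layer
    #endif

    armnn::IOptimizedNetworkPtr optNet = armnn::Optimize(
        *network, { armnn::Compute::CpuAcc },
        mRuntime->GetDeviceSpec(), options);

    // Per-layer results are then delivered through a debug callback.
    mRuntime->RegisterDebugCallback(mNetworkId,
        [](armnn::LayerGuid, unsigned int, armnn::ITensorHandle *) {
            /* check the layer's computation result here */
        });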

Change-Id: Ia9cd25728903784ef25fd09c637c34a5cd5a7de4
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago  Use a proper notation for runtime context
Inki Dae [Tue, 2 Feb 2021 01:59:16 +0000 (10:59 +0900)]
Use a proper notation for runtime context

According to https://en.wikipedia.org/wiki/Hungarian_notation,
changing the prefix of mRuntime to sRuntime is better. By doing so,
we can identify whether or not this member should be initialized.

Change-Id: I465afccd98727242c186d409bc1dd69e36da936a
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago  Make sure to clear mRuntime at destructor 62/252562/1
Inki Dae [Fri, 29 Jan 2021 08:44:30 +0000 (17:44 +0900)]
Make sure to clear mRuntime at destructor

Change-Id: I2dddcffe5bec11eb27462ed3ce9eb53f5c44e6cf
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago  Drop w option 58/251058/1
Inki Dae [Thu, 7 Jan 2021 08:07:59 +0000 (17:07 +0900)]
Drop w option

Change-Id: I1b49728e7777ae018b0b6ea498968f2651f3b7b2
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago  Add Signed64 tensor data type support 52/242452/1 accepted/tizen_6.0_unified accepted/tizen_6.0_unified_hotfix tizen_6.0 tizen_6.0_hotfix accepted/tizen/6.0/unified/20201030.121051 accepted/tizen/6.0/unified/hotfix/20201103.051548 accepted/tizen/unified/20200911.043126 accepted/tizen/unified/20201112.124207 submit/tizen/20200828.025650 submit/tizen/20200910.024357 submit/tizen/20201104.021236 submit/tizen/20201109.032237 submit/tizen/20201109.053646 submit/tizen/20201110.032259 submit/tizen_6.0/20201029.205103 submit/tizen_6.0_hotfix/20201102.192503 submit/tizen_6.0_hotfix/20201103.114803 tizen_6.0.m2_release
Inki Dae [Thu, 27 Aug 2020 05:38:42 +0000 (14:38 +0900)]
Add Signed64 tensor data type support

Change-Id: Iae20f462062d155c3d7fc2891feb4fa989820a96
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago  Fix uninitialized variable issue 65/242365/2
Inki Dae [Tue, 25 Aug 2020 23:28:31 +0000 (08:28 +0900)]
Fix uninitialized variable issue

This fixes the coverity issues below,
 CID : 1147548, 1147784

Change-Id: I16b15715b197aa6b70930fe02288f486868cfd99
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago  Consider two and more models inference support 04/240604/1
Inki Dae [Mon, 10 Aug 2020 02:46:36 +0000 (11:46 +0900)]
Consider two and more models inference support

This patch supports inference with two or more models.

Mediavision has one backend engine context per model.
However, ARMNN requires one runtime instance for all models,
so this patch makes the backend use the same runtime instance
for two or more models.
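
A minimal sketch of the idea (the accessor name and lazy creation are
illustrative):

    #include <armnn/ArmNN.hpp>

    // One process-wide ARMNN runtime shared by every backend instance.
    static armnn::IRuntimePtr &GetSharedRuntime()
    {
        static armnn::IRuntimePtr sRuntime =
            armnn::IRuntime::Create(armnn::IRuntime::CreationOptions());
        return sRuntime;
    }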

Change-Id: Icc2a42fe20fefbc0b76806e7d107b5ad961bded0
Signed-off-by: Inki Dae <inki.dae@samsung.com>
4 years ago  Add init code for member variable 95/237995/1 accepted/tizen/unified/20200721.142307 submit/tizen/20200713.025722 submit/tizen/20200721.054132
Hyunsoo Park [Tue, 7 Jul 2020 08:10:43 +0000 (17:10 +0900)]
Add init code for member variable

Change-Id: I5906254f4327ce05933f4faf99aec2c96d6151d4
Signed-off-by: Hyunsoo Park <hance.park@samsung.com>
5 years ago  Fix initializer list coding rule 93/235393/1 accepted/tizen/unified/20200628.221632 submit/tizen/20200626.060446 submit/tizen/20200626.070253
Inki Dae [Thu, 4 Jun 2020 07:56:35 +0000 (16:56 +0900)]
Fix initializer list coding rule

Change-Id: I40a673c88db39892135e76f5685e0dc333a53eb0
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  Fix coding style based on Tizen SE C++ Coding Rule 57/235357/1
Inki Dae [Thu, 4 Jun 2020 05:38:01 +0000 (14:38 +0900)]
Fix coding style based on Tizen SE C++ Coding Rule

Tizen SE C++ Coding Rule:
https://code.sec.samsung.net/confluence/pages/viewpage.action?pageId=160925159

Change-Id: Icbc6dc69923927758bf734bd0b348c2bfa5729f0
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  Add SetPrivateData function 79/235279/1
Inki Dae [Wed, 3 Jun 2020 09:05:27 +0000 (18:05 +0900)]
Add SetPrivateData function

This function is needed so that the inference engine interface can pass
armnn-specific data to this backend before loading a model file.
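
A hedged sketch of what such an entry point might look like; the parameter
layout, member and error codes are assumptions, not the actual interface:

    // Receives backend-specific data from the inference engine interface
    // before a model file is loaded.
    int InferenceARMNN::SetPrivateData(void *data)
    {
        if (data == nullptr)
            return INFERENCE_ENGINE_ERROR_INVALID_PARAMETER;

        mPrivateData = data;   // e.g. an armnn-specific option block
        return INFERENCE_ENGINE_ERROR_NONE;
    }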

Change-Id: I162967bcf0bc7ddb6735a85c0381420183b4830b
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  packaging: build only on arm and aarch64 57/233457/1 accepted/tizen/unified/20200529.124301 submit/tizen/20200515.015336 submit/tizen/20200528.065520
Inki Dae [Fri, 15 May 2020 01:49:57 +0000 (10:49 +0900)]
packaging: build only on arm and aarch64

ARMNN is an NN runtime library for ARM systems, so do not build
it on other architectures.

Change-Id: I5cc692abe635b331c7efae9f912b94ea47f2e035
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  fix build error on aarch64 54/233454/1 submit/tizen/20200515.014005
Inki Dae [Fri, 15 May 2020 01:33:28 +0000 (10:33 +0900)]
fix build error on aarch64

On aarch64, size_t is a 64-bit type, so make sure to use the correct
format specifier. Also some minor cleanup.
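
For example:

    #include <cstddef>
    #include <cstdio>

    int main()
    {
        std::size_t tensor_size = 150528;

        // Mismatched on aarch64, where size_t is 64-bit:
        //     printf("tensor size = %u\n", tensor_size);
        // Portable form using the size_t-specific specifier:
        printf("tensor size = %zu\n", tensor_size);
        return 0;
    }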

Change-Id: I30e3147aaba6e4a9cc1dc87f1fd540054a9a204f
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  change enumeration values to new ones submit/tizen/20200515.011345
Inki Dae [Fri, 17 Apr 2020 07:47:45 +0000 (16:47 +0900)]
change enumeration values to new ones

Some enumeration values of inference-engine-interface have been
updated, so change them accordingly.

Change-Id: Ia057060061027647873fde09911a0f17fb12e830
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  packaging: add armnn build requirement
Inki Dae [Thu, 9 Apr 2020 06:06:11 +0000 (15:06 +0900)]
packaging: add armnn build requirement

Change-Id: I04c4116253ee5576eb96df4edf1bf98af943c0f6
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  Drop GetInferenceResult function sandbox/inki.dae/working
Inki Dae [Mon, 24 Feb 2020 05:47:07 +0000 (14:47 +0900)]
Drop GetInferenceResult function

Change-Id: Iaecaebe2eb1219a4903c06ff06dde3eb7e69e683
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  Add more logs and fallback device
Inki Dae [Mon, 24 Feb 2020 05:17:05 +0000 (14:17 +0900)]
Add more logs and fallback device

Change-Id: Ia3f50b639790edbd47ffb7ec4edc42eb3090f4c2
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  Fix typo
Inki Dae [Thu, 20 Feb 2020 01:39:40 +0000 (10:39 +0900)]
Fix typo

Change-Id: I7aa4e6733f1743bbe6d3ac1a3f65468e6ad78c0e
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  Use ARMNN internal data type instead of specific one
Inki Dae [Wed, 19 Feb 2020 08:01:56 +0000 (17:01 +0900)]
Use ARMNN internal data type instead of specific one

Change-Id: I75393f94465b2d3638f66ccc2207cb60655751db
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  Drop dead functions
Inki Dae [Tue, 18 Feb 2020 04:20:01 +0000 (13:20 +0900)]
Drop dead functions

The AllocateTensorBuffer and ReleaseTensorBuffer functions aren't needed
anymore, so drop them.

Change-Id: I0d3f0a7080bb3c668e903d8116569dca811e652a
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  Fix build warnings
Inki Dae [Tue, 18 Feb 2020 04:17:01 +0000 (13:17 +0900)]
Fix build warnings

Change-Id: I64bfaed5fb6b197835065ac9e0715c6d9ffd8b39
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  Consider various NN model formats
Inki Dae [Tue, 18 Feb 2020 04:12:36 +0000 (13:12 +0900)]
Consider various NN model formats

As of now, only the TFLite model format is supported; others will be
supported later.
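
A simplified sketch of the dispatch this implies (the format enum value and
member names are illustrative; armnnTfLiteParser::ITfLiteParser::Create() is
the ARMNN TFLite parser factory):

    int ret = 0;

    switch (model_format) {
    case INFERENCE_MODEL_TFLITE:
        mParser = armnnTfLiteParser::ITfLiteParser::Create();
        break;
    default:
        // Other formats (ONNX, Caffe, ...) will be wired up later.
        ret = -1;
        break;
    }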

Change-Id: Ie255875032bfc6051da7a14913564bb9548c69b7
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  Add multiple input and output tensors support
Inki Dae [Tue, 18 Feb 2020 02:37:41 +0000 (11:37 +0900)]
Add multiple input and output tensors support

Change-Id: Ibde983e23f3ba208b29494ece236675213aa52ef
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  Add user desired input and output layer support
Inki Dae [Tue, 18 Feb 2020 01:56:36 +0000 (10:56 +0900)]
Add user desired input and output layer support

Change-Id: I33216ac409f8087d166154ff57b1cec4b2a80879
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  Receive output tensor buffers for the inference
Inki Dae [Fri, 14 Feb 2020 04:24:09 +0000 (13:24 +0900)]
Receive output tensor buffers for the inference

This patch receives tensor buffers allocated by the upper layer
and uses them for the inference.

For this, it implements the GetOutputTensorBuffers callback.
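
A rough sketch of how buffers handed in by the upper layer become ARMNN
output tensors (the member names and buffer layout are assumptions):

    armnn::OutputTensors outputTensors;

    for (size_t i = 0; i < buffers.size(); ++i) {
        // mOutputBindingInfo[i] pairs a layer binding id with its TensorInfo.
        const armnn::BindingPointInfo &info = mOutputBindingInfo[i];
        outputTensors.emplace_back(
            info.first, armnn::Tensor(info.second, buffers[i].buffer));
    }

    // The buffers are then filled in during EnqueueWorkload().
    mRuntime->EnqueueWorkload(mNetworkId, inputTensors, outputTensors);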

Change-Id: Ic8a3978205b38de0b1b93cd61b6fec4e98c62f3a
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  Rename input/output binding members
Inki Dae [Thu, 13 Feb 2020 08:34:09 +0000 (17:34 +0900)]
Rename input/output binding members

Change-Id: I8fc122e1eb5c42bdc444930b40015884b5080648
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  Use input tensor buffers coming from upper layer
Inki Dae [Thu, 13 Feb 2020 07:32:33 +0000 (16:32 +0900)]
Use input tensor buffers coming from upper layer

This patch makes the armnn backend use input tensor buffers
coming from the Inference layer for the inference.

For this, it adds several new callbacks and drops unnecessary ones.

Change-Id: Id922a1ecef70e36e781c66f363995a261713b6b0
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  Use a generic channel value
Inki Dae [Wed, 12 Feb 2020 04:33:45 +0000 (13:33 +0900)]
Use a generic channel value

Now, the Inference layer will pass a generic channel value
instead of an OpenCV-specific one.

Change-Id: Ibfabe6d3d93cb72cd7ff68479da83cd06f386246
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  Receive model file format from inference engine interface
Inki Dae [Tue, 11 Feb 2020 06:29:51 +0000 (15:29 +0900)]
Receive model file format from inference engine interface

Change-Id: I9a63c2d214221eb56049a111d26e1123f7e7c58c
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  Implement GetBackendCapacity callback
Inki Dae [Mon, 10 Feb 2020 06:28:58 +0000 (15:28 +0900)]
Implement GetBackendCapacity callback

This patch implements the GetBackendCapacity callback and
provides the device types on which ARMNN can run inference.
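
A hedged sketch of what the callback might report; the capacity structure and
target flags are assumptions based on the inference-engine-interface naming:

    int InferenceARMNN::GetBackendCapacity(inference_engine_capacity *capacity)
    {
        if (capacity == nullptr)
            return INFERENCE_ENGINE_ERROR_INVALID_PARAMETER;

        // ARMNN can run inference on the CPU (CpuRef/CpuAcc) and GPU (GpuAcc).
        capacity->supported_accel_devices =
            INFERENCE_TARGET_CPU | INFERENCE_TARGET_GPU;

        return INFERENCE_ENGINE_ERROR_NONE;
    }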

Change-Id: If67f53f8e0932ec997dd3ba03bf065dc64056627
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  Consider one more inference target devices
Inki Dae [Fri, 7 Feb 2020 06:43:13 +0000 (15:43 +0900)]
Consider one more inference target devices

Change-Id: I51b9a04f60e4b4fe873d8d8a1fda6a122532ce31
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  Drop SetInputTensorParam/SetOutputTensorParam callbacks
Inki Dae [Wed, 5 Feb 2020 09:31:21 +0000 (18:31 +0900)]
Drop SetInputTensorParam/SetOutputTensorParam callbacks

Change-Id: I1ee785034aa06b19fdba260c87f9081a7d312952
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  Add logs to print out a given target device
Inki Dae [Wed, 5 Feb 2020 08:29:30 +0000 (17:29 +0900)]
Add logs to print out a given target device

Change-Id: Ie85dfbc9e731e0eeceab6fc869c4edfdaea3d3b7
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  code refactoring to Inference initialization process
Inki Dae [Wed, 5 Feb 2020 07:18:05 +0000 (16:18 +0900)]
code refactoring to Inference initialization process

This patch uses the new Load callback and constructor,
and replaces the Init/DeInit callbacks with BindBackend/UnbindBackend
callbacks.

Change-Id: Iec12fc15a3754ee312e6352201497fa9f78ac4d5
Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  add initial code
Inki Dae [Wed, 22 Jan 2020 09:28:14 +0000 (18:28 +0900)]
add initial code

Signed-off-by: Inki Dae <inki.dae@samsung.com>
5 years ago  Initial empty repository master
Tizen Infrastructure [Wed, 22 Jan 2020 09:10:28 +0000 (09:10 +0000)]
Initial empty repository