fix NNTrainer inference issue 66/266466/6
author    Inki Dae <inki.dae@samsung.com>
          Fri, 12 Nov 2021 09:41:41 +0000 (18:41 +0900)
committer Inki Dae <inki.dae@samsung.com>
          Wed, 17 Nov 2021 10:57:59 +0000 (19:57 +0900)
commit    a8639468bd02c6d1ba0896cccf16e867e8448363
tree      2c5990c0fe97736c203f4fa824b399b64584f6e3
parent    79adeca56b75e9e02e8cceb8b3bbf65695d861a3
fix NNTrainer inference issue

[Version] : 0.2.0-0
[Issue type] : bug fix

Fixed NNTrainer inference issue.

The NNTrainer backend requires ML tensor info for the input and output
tensors when loading a given model. This patch therefore creates the
tensor info before requesting the model load, and passes both info
handles to the ml_single_open function as arguments.

Change-Id: I4e444b7b1d87c37249ddf2ac8b7c56aa7119602a
Signed-off-by: Inki Dae <inki.dae@samsung.com>
packaging/inference-engine-mlapi.spec
src/inference_engine_mlapi.cpp
src/inference_engine_mlapi_private.h