Apply a new policy for inference engine backend path 48/255748/5
author     Inki Dae <inki.dae@samsung.com>
           Thu, 18 Mar 2021 12:04:25 +0000 (21:04 +0900)
committer  Inki Dae <inki.dae@samsung.com>
           Wed, 24 Mar 2021 03:41:28 +0000 (12:41 +0900)
commit     d0d92d064e3b2ddccf66eb1e4f6913ccacc7d10c
tree       8d280e653aedead74d9cc68867eedc56713d90ce
parent     c1b6cab12c8cbf6b4292d3b6be814214f54bba0f
Apply a new policy for inference engine backend path

This patch applies a new policy for the inference engine backend path.

The inference engine interface framework is responsible for deciding which
inference engine API framework - internal or ML Single API - will be used
for the inference backend the user requests.

At runtime, the inference engine interface framework loads an ini file -
/etc/inference/inference_engine_backend_path.ini - and parses it
to check which API framework the current platform wants to use for a given
backend.
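
For reference, a configuration entry might look like the sketch below. Only
the file path comes from this description; the section and key names are
illustrative assumptions, not the actual schema.

```ini
; /etc/inference/inference_engine_backend_path.ini
; (illustrative sketch; section and key names are assumptions)
[inference backend path]
; Route the ARMNN backend through ML Single API instead of the
; internal plugin.
ARMNN = MLAPI
```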

So this patch applies the rules below according to the user's requested backend.

    User                   INI configuration file    API framework
   ----------------------------------------------------------------
   ONE                              -                    MLAPI
   MLAPI                            -                    MLAPI
   CUSTOM device (NPU)              -                    MLAPI
   ARMNN                            -                   Internal
   ARMNN                          MLAPI                  MLAPI
   TFLITE                           -                   Internal
   TFLITE                         MLAPI                  MLAPI
   OPENCV                           -                   Internal
   ----------------------------------------------------------------

   Legend
   ------
          - : nothing declared.
   Internal : the internal plugin will be used.
      MLAPI : ML Single API will be used.
Change-Id: If2f310171671e911717792538cc6977c46f2bcd8
Signed-off-by: Inki Dae <inki.dae@samsung.com>
include/inference_engine_common_impl.h
src/inference_engine_common_impl.cpp