Apply a new policy for inference engine backend path
This patch applies a new policy for the inference engine backend path.
The inference engine interface framework is responsible for deciding which
inference engine API framework - internal or ML Single API - will be used
for a given user inference request.
At runtime, the inference engine interface framework loads an INI file -
/etc/inference/inference_engine_backend_path.ini - and parses it
to check which API framework the current platform wants to use for a given
backend.
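As a rough illustration, the INI file could look like the sketch below;
the section and key names here are assumptions for illustration only, not
the actual schema of the file:

```
; /etc/inference/inference_engine_backend_path.ini (illustrative sketch;
; section and key names are hypothetical)
[inference backend path]
ARMNN = MLAPI
TFLITE = MLAPI
```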
So this patch applies the policy below according to the user's inference request.
User                INI configuration file    API framework
---------------------------------------------------------------
ONE                 -                         MLAPI
MLAPI               -                         MLAPI
CUSTOM device(NPU)  -                         MLAPI
ARMNN               -                         Internal
ARMNN               MLAPI                     MLAPI
TFLITE              -                         Internal
TFLITE              MLAPI                     MLAPI
OPENCV              -                         Internal
---------------------------------------------------------------
Legends
-------
- : nothing declared.
Internal : internal plugin will be used.
MLAPI : ML Single API will be used.
Change-Id: If2f310171671e911717792538cc6977c46f2bcd8
Signed-off-by: Inki Dae <inki.dae@samsung.com>