[WIP. Tizen/API] Simple Single-Shot Inference Low-Level API
author MyungJoo Ham <myungjoo.ham@samsung.com>
Fri, 29 Mar 2019 05:00:35 +0000 (14:00 +0900)
committer jaeyun-jung <39614140+jaeyun-jung@users.noreply.github.com>
Mon, 3 Jun 2019 06:57:34 +0000 (15:57 +0900)
commit cea67de3abc4db00773b358a0bf379650eaeb420
tree 0235e4e3e391c742f34f88774432c85fa1a04bc0
parent 1afb43875b8895c7cc47f651f839a084fc372f05

This is quite similar to the CoreML/MLModel API, which is the
low-level API for iOS CoreML.

With this, application developers may invoke an "inference"
of a given model with a single data frame, without any
pre/post-processing.
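
For context, a minimal usage sketch of the intended single-shot flow
follows. The handle type, function names (single_open, single_invoke,
single_close), and buffer sizes are hypothetical placeholders to convey
the flow only; they are not the actual declarations in
tizen-api/include/nnstreamer-single.h.

    /* Illustrative sketch only: the symbols below are hypothetical and
     * may differ from what nnstreamer-single.h actually declares. */
    #include <stddef.h>

    typedef void *single_h;                 /* hypothetical opaque handle */
    extern int single_open (single_h *handle, const char *model_path);
    extern int single_invoke (single_h handle,
                              const void *input, size_t input_size,
                              void *output, size_t output_size);
    extern int single_close (single_h handle);

    int
    main (void)
    {
      single_h handle;
      static char input[3 * 224 * 224];     /* one input frame (assumed shape) */
      static float output[1001];            /* one output frame (assumed shape) */

      /* Open the model once; no pre/post-processing elements are involved. */
      if (single_open (&handle, "model.tflite") != 0)
        return 1;

      /* ... fill "input" with one data frame ... */

      /* Invoke inference for a single data frame. */
      if (single_invoke (handle, input, sizeof (input),
                         output, sizeof (output)) != 0) {
        single_close (handle);
        return 1;
      }

      single_close (handle);
      return 0;
    }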

The corresponding pipeline will be:

app_src --> tensor_filter --> tensor_sink

with a framerate of 0/1 (i.e., a single static frame).
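
The sketch below shows how such a pipeline could be built directly with
GStreamer's gst_parse_launch(). The element names follow NNStreamer
conventions (appsrc, tensor_filter, tensor_sink); the model path,
framework name, and tensor dimension are illustrative assumptions, not
values taken from this commit.

    #include <gst/gst.h>

    int
    main (int argc, char **argv)
    {
      GstElement *pipeline;
      GError *err = NULL;

      gst_init (&argc, &argv);

      /* framerate=0/1 marks the stream as a single static frame. */
      pipeline = gst_parse_launch (
          "appsrc name=srcx caps=\"other/tensor,"
          "dimension=(string)3:224:224:1,type=(string)uint8,"
          "framerate=(fraction)0/1\" ! "
          "tensor_filter framework=tensorflow-lite model=model.tflite ! "
          "tensor_sink name=sinkx",
          &err);
      if (!pipeline) {
        g_printerr ("Pipeline construction failed: %s\n",
            err ? err->message : "unknown");
        g_clear_error (&err);
        return 1;
      }

      gst_element_set_state (pipeline, GST_STATE_PLAYING);

      /* The application would then push one buffer into appsrc ("srcx")
       * and receive the inference result via tensor_sink's "new-data"
       * signal. */

      gst_element_set_state (pipeline, GST_STATE_NULL);
      gst_object_unref (pipeline);
      return 0;
    }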

Signed-off-by: MyungJoo Ham <myungjoo.ham@samsung.com>
tizen-api/doc/nnstreamer_doc.h
tizen-api/include/nnstreamer-single.h [new file with mode: 0644]
tizen-api/src/nnstreamer-single.c [new file with mode: 0644]