--- /dev/null
+# nnkit
+
+`nnkit` is a collection of neural network tools for our _nncc_ project.
+
+## nni
+
+`nni` is a command-line interface for interacting with existing inference
+engines or compiled artifacts.
+
+## How nni works
+
+`nni` first dynamically loads the `backend` and the pre/post `action`(s)
+specified on the command line. After loading the backend and actions, `nni`
+asks the `backend` to prepare itself. Once the backend is prepared, it exposes
+its internal state to `nni` (as `nnkit::TensorContext`).
+`nni` takes this state and passes it to the registered pre `action`(s).
+Each action may read tensor(s) (e.g. dump their content into a file)
+or manipulate their values (e.g. fill them with random values).
+`nni` then invokes the `backend` through its `run()` method.
+After the `backend` has run successfully, the post `action`(s) are invoked in
+the same way as the pre `action`(s), as a teardown step.
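+
+The overall flow can be summarized with the following minimal C++ sketch. The
+`Backend` and `Action` interfaces below are hypothetical stand-ins that mirror
+the description above; only `nnkit::TensorContext` is a real nnkit name, and
+the actual nnkit headers may differ.
+
+```
+#include <functional>
+#include <memory>
+#include <vector>
+
+namespace nnkit { struct TensorContext; } // exposed by the backend
+
+// Hypothetical plugin interfaces, for illustration only
+struct Action
+{
+  virtual ~Action() = default;
+  virtual void run(nnkit::TensorContext &) = 0;
+};
+
+struct Backend
+{
+  virtual ~Backend() = default;
+  virtual void prepare(const std::function<void(nnkit::TensorContext &)> &) = 0;
+  virtual void run() = 0;
+  virtual void teardown(const std::function<void(nnkit::TensorContext &)> &) = 0;
+};
+
+// What nni does conceptually, after loading the modules given on the command line
+void drive(Backend &backend,
+           const std::vector<std::unique_ptr<Action>> &pre_actions,
+           const std::vector<std::unique_ptr<Action>> &post_actions)
+{
+  // The backend prepares itself and exposes its tensors; each pre-action may
+  // read them (e.g. dump to a file) or fill them (e.g. with random values).
+  backend.prepare([&](nnkit::TensorContext &ctx) {
+    for (const auto &action : pre_actions) action->run(ctx);
+  });
+
+  // Run inference
+  backend.run();
+
+  // Post-actions see the tensors in the same way, as a teardown step
+  backend.teardown([&](nnkit::TensorContext &ctx) {
+    for (const auto &action : post_actions) action->run(ctx);
+  });
+}
+```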
+
+## Backends
+
+As of this writing, there are two backends:
+
+- Caffe as `libnnkit_caffe_backend.so`
+- Tensorflow Lite as `libnnkit_tflite_backend.so`
+
+# How to run inference with nni
+
+To run `nni`, we need to provide a backend module (with its argument(s), if
+required) and, optionally, `pre-` or `post-` action module(s).
+
+## Building backends
+
+Note: this section will be removed once it is no longer needed.
+
+You may need to pass extra options to build the backends.
+- The following steps may no longer be necessary once these `DOWNLOAD_XXX` options are turned on by default.
+```
+rm -rf build
+
+./nncc configure \
+-DDOWNLOAD_FARMHASH=1 -DDOWNLOAD_EIGEN=1 -DDOWNLOAD_GEMMLOWP=1 \
+-DDOWNLOAD_NEON2SSE=1 -DDOWNLOAD_FLATBUFFERS=1 -DDOWNLOAD_GFLAGS=1 \
+-DDOWNLOAD_TENSORFLOW=1
+
+./nncc build
+```
+
+## How to pass arguments
+
+The syntax is `--argument value`. The available arguments are as follows.
+- `--backend` [Backend module path]. Only one backend is needed.
+- `--backend-arg` [Backend argument]. Argument(s) for the backend.
+- `--pre` [Pre-Action module path]. Multiple pre-actions can be given.
+- `--pre-arg` [Pre-Action argument]. Argument(s) for the immediately preceding pre-action.
+- `--post` [Post-Action module path]. Multiple post-actions can be given.
+- `--post-arg` [Post-Action argument]. Argument(s) for the immediately preceding post-action.
+
+For example,
+```
+nni \
+--backend ./path/to/backend --backend-arg arg1 --backend-arg arg2 \
+--pre ./path/to/preA --pre-arg arg1preA --pre-arg arg2preA \
+--pre ./path/to/preB --pre-arg arg1preB --pre-arg arg2preB \
+--post ./path/to/postA --post-arg arg1postA
+```
+
+This will run
+- the backend `./path/to/backend` with arguments `arg1 arg2`, along with
+ - pre-action `./path/to/preA` with arguments `arg1preA arg2preA`,
+ - pre-action `./path/to/preB` with arguments `arg1preB arg2preB`, and
+ - post-action `./path/to/postA` with the argument `arg1postA`
+
+## Running with tflite backend
+
+```
+cd build
+
+contrib/nnkit/tools/nni/nni \
+--backend ./contrib/nnkit/backends/tflite/libnnkit_tflite_backend.so \
+--backend-arg inceptionv3_non_slim_2015.tflite
+```
+
+You can download the `inceptionv3_non_slim_2015.tflite` model from
+[here](https://storage.googleapis.com/download.tensorflow.org/models/tflite/inception_v3_2015_2017_11_10.zip).
+
+## Dump HDF5
+
+You can dump the input and output tensors to HDF5 files with the `HDF5_export_action` action module.
+
+```
+cd build
+
+contrib/nnkit/tools/nni/nni \
+--backend ./contrib/nnkit/backends/tflite/libnnkit_tflite_backend.so \
+--backend-arg inceptionv3_non_slim_2015.tflite \
+--pre ./contrib/nnkit/actions/HDF5/libnnkit_HDF5_export_action.so \
+--pre-arg ./pre.hdf5 \
+--post ./contrib/nnkit/actions/HDF5/libnnkit_HDF5_export_action.so \
+--post-arg ./post.hdf5
+```
+
+This will dump `pre.hdf5` and `post.hdf5` files containing the input and output
+tensors of the `inceptionv3_non_slim_2015.tflite` model.