### Convert tensorflow pb file to nnpackage
Follow the [compiler guide](https://github.com/Samsung/ONE/blob/master/docs/nncc/v1.0.0/tutorial.md) to generate an nnpackage from a tensorflow pb file.
### Convert tflite file to nnpackage
Please see [model2nnpkg](https://github.com/Samsung/ONE/tree/master/tools/nnpackage_tool/model2nnpkg) for converting a tflite model file.
## Build app with NNFW API

Here are the basic steps to build an app with the [NNFW C API](https://github.com/Samsung/ONE/blob/master/runtime/onert/api/include/nnfw.h).
1) Initialize nnfw_session

```cpp
nnfw_session *session = nullptr;
nnfw_create_session(&session);
```
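The snippets in this guide omit error handling for brevity. Every `nnfw_*` call returns an `NNFW_STATUS` code, so a small checking helper can catch failures early. A minimal sketch (the enum values below are stand-ins so the example is self-contained; the real definitions come from `nnfw.h`):

```cpp
#include <cstdio>
#include <cstdlib>

// Stand-in for the NNFW_STATUS enum from nnfw.h, included only so this
// sketch compiles on its own; NNFW_STATUS_NO_ERROR (0) means success.
typedef enum { NNFW_STATUS_NO_ERROR = 0, NNFW_STATUS_ERROR = 1 } NNFW_STATUS;

// Evaluate an nnfw_* call, print the failing expression, and abort on error.
#define NNFW_CHECK(expr)                                       \
  do                                                           \
  {                                                            \
    NNFW_STATUS status = (expr);                               \
    if (status != NNFW_STATUS_NO_ERROR)                        \
    {                                                          \
      std::fprintf(stderr, "%s failed (%d)\n", #expr, status); \
      std::exit(1);                                            \
    }                                                          \
  } while (0)
```

With this in place, each step can be wrapped as `NNFW_CHECK(nnfw_create_session(&session));`.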
2) Load nnpackage

```cpp
nnfw_load_model_from_file(session, nnpackage_path);
```
3) (Optional) Assign a specific backend to operations

```cpp
// Use the acl_neon backend for CONV_2D and acl_cl otherwise.
// Note that the default backend is acl_cl.
nnfw_set_op_backend(session, "CONV_2D", "acl_neon");
```
4) Compile the model

```cpp
nnfw_prepare(session);
```
5) Prepare Input/Output

```cpp
// Prepare input. Here we just allocate dummy input arrays.
std::vector<float> input;
nnfw_tensorinfo ti;
nnfw_input_tensorinfo(session, 0, &ti); // get the first input's info
uint32_t input_elements = num_elems(&ti);
input.resize(input_elements);
// TODO: Please add initialization for your input.
nnfw_set_input(session, 0, ti.dtype, input.data(), sizeof(float) * input_elements);

// Prepare output
std::vector<float> output;
nnfw_output_tensorinfo(session, 0, &ti); // get the first output's info
uint32_t output_elements = num_elems(&ti);
output.resize(output_elements);
nnfw_set_output(session, 0, ti.dtype, output.data(), sizeof(float) * output_elements);
```

6) Run inference

```cpp
nnfw_run(session);
```
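The `num_elems` helper used above is not part of the NNFW API; it is typically defined in the app itself as the product of a tensor's dimensions. A plausible sketch (the `tensorinfo` struct here is a self-contained stand-in for `nnfw_tensorinfo` from `nnfw.h`, whose real definition also carries a `dtype` field):

```cpp
#include <cstdint>

// Stand-in for nnfw_tensorinfo, shown only so this sketch compiles alone.
struct tensorinfo
{
  int32_t rank;    // number of dimensions
  int32_t dims[6]; // NNFW_MAX_RANK is 6 in nnfw.h
};

// Element count of a tensor: the product of all its dimensions.
// A rank-0 (scalar) tensor yields 1.
uint64_t num_elems(const tensorinfo *ti)
{
  uint64_t n = 1;
  for (int32_t i = 0; i < ti->rank; ++i)
    n *= static_cast<uint64_t>(ti->dims[i]);
  return n;
}
```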
## Run Inference with app on the target devices

Reference app: [minimal app](https://github.com/Samsung/ONE/blob/master/runtime/onert/sample/minimal)

```
$ ./minimal path_to_nnpackage_directory
```