There are many tools related to TfLite, but they are inconvenient to use directly because they live in different locations and take many parameters. tflkit was created to make it easier to run the frequently used tools from scripts. The functions provided in this directory use existing tools rather than reimplementing them, so additional modifications may be needed depending on the TensorFlow version or other external factors. The functions provided here will be expanded gradually.
The scripts here use TensorFlow's tools, so you need an environment in which TensorFlow can be built.
Running the scripts within this tutorial requires:
* [Install Bazel](https://docs.bazel.build/versions/master/install.html), the build tool used to compile TensorFlow.

Initially, no external packages are installed in this project. Therefore, before running these scripts, you should install the associated packages by running the following command once.
TensorFlow provides the `summarize_graph` tool to inspect a model and make guesses about likely input and output nodes, as well as other information that is useful for debugging. For more information, see the [Inspecting Graphs](https://github.com/tensorflow/tensorflow/tree/9590c4c32dd4346ea5c35673336f5912c6072bf2/tensorflow/tools/graph_transforms#inspecting-graphs) page.
```
$ bazel build tensorflow/tools/graph_transforms:summarize_graph
$ bazel-bin/tensorflow/tools/graph_transforms/summarize_graph --in_graph=<pb file>
```
```
$ ./summarize_pb.sh <pb file>
```
The results are shown below:
```
$ ./summarize_pb.sh inception_v3.pb
name=InceptionV3/Predictions/Reshape_1, op=Reshape
```
## Summarize TfLite model
```
$ ./summarize_tflite.sh <tflite file>
```
The results are shown below:
```
$ ./summarize_tflite.sh inception_v3.tflite

Main model input tensors: [317]
Main model output tensors: [316]

Operator 0: CONV_2D (instrs: 39,073,760, cycls: 39,073,760)
  Fused Activation: RELU
  Input Tensors[317, 0, 5]
    Tensor 317 : buffer 183 | Empty  | FLOAT32 | Shape [1, 299, 299, 3] (b'input')
    Tensor   0 : buffer 205 | Filled | FLOAT32 | Shape [32, 3, 3, 3] (b'InceptionV3/Conv2d_1a_3x3/weights')
    Tensor   5 : buffer  52 | Filled | FLOAT32 | Shape [32] (b'InceptionV3/InceptionV3/Conv2d_1a_3x3/Conv2D_bias')
  Output Tensors[6]
    Tensor   6 : buffer 285 | Empty  | FLOAT32 | Shape [1, 149, 149, 32] (b'InceptionV3/InceptionV3/Conv2d_1a_3x3/Relu')

Operator 125: SOFTMAX (instrs: 4,003, cycls: 4,003)
  Input Tensors[225]
    Tensor 225 : buffer 142 | Empty  | FLOAT32 | Shape [1, 1001] (b'InceptionV3/Logits/SpatialSqueeze')
  Output Tensors[316]
    Tensor 316 : buffer  53 | Empty  | FLOAT32 | Shape [1, 1001] (b'InceptionV3/Predictions/Reshape_1')

Number of all operator types: 6
  CONV_2D         : 95 (instrs: 11,435,404,777)
  MAX_POOL_2D     :  4 (instrs: 12,755,516)
  AVERAGE_POOL_2D : 10 (instrs: 36,305,334)
  CONCATENATION   : 15 (instrs: 0)
  RESHAPE         :  1 (instrs: ???)
  SOFTMAX         :  1 (instrs: 4,003)
Number of all operators : 126 (total instrs: 11,484,469,630)
```
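The per-type totals at the end of this summary are plain aggregations over the operator list. As a minimal sketch of that aggregation (with made-up records standing in for the model's operators, not the real parser):

```python
from collections import Counter

# Hypothetical (op_type, instruction_count) records, standing in for the
# operators the summary tool walks in the .tflite model.
ops = [("CONV_2D", 39_073_760), ("CONV_2D", 25_000_000), ("SOFTMAX", 4_003)]

type_counts = Counter(op for op, _ in ops)   # occurrences per operator type
type_instrs = Counter()                      # instruction totals per type
for op, instrs in ops:
    type_instrs[op] += instrs

print("Number of all operator types:", len(type_counts))
for op, count in type_counts.items():
    print(f"  {op} : {count} (instrs: {type_instrs[op]:,})")
print(f"Number of all operators : {sum(type_counts.values())} "
      f"(total instrs: {sum(type_instrs.values()):,})")
```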
## Convert a TensorFlow model into TfLite model
TensorFlow provides several conversion paths. In Python, the [TFLiteConverter](https://www.tensorflow.org/api_docs/python/tf/lite/TFLiteConverter) class converts a TensorFlow GraphDef or SavedModel into `output_format` using TOCO. The `output_format` can be `TFLITE` or `GRAPHVIZ_DOT`; the default is `TFLITE`. There is also a Python command line interface for running TOCO, [tflite_convert](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/python/tflite_convert.py), which converts a TensorFlow GraphDef or SavedModel into `TFLITE` or `GRAPHVIZ_DOT` format just like [TFLiteConverter](https://www.tensorflow.org/api_docs/python/tf/lite/TFLiteConverter). Both approaches also support converting a TensorFlow Keras model into `output_format`, and both are implemented on top of TOCO.
The tflkit uses the [tflite_convert](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/python/tflite_convert.py) Python command line interface to convert a TensorFlow model into a TfLite model. It only supports converting a TensorFlow GraphDef file into a `TFLITE` format file. The tool can also create individual `TFLITE` files for different input shapes; when converting to multiple `TFLITE` files, the string `NAME` must appear in `TFLITE_PATH`, and it will be replaced by each entry listed in the `NAME` environment variable. This tool requires an information file as a parameter; there is an [example file](convert.template) for the convert information. The `--tensorflow_path` and `--tensorflow_version` options can change the TensorFlow location. By default, it uses the `externals/tensorflow` directory.
* GRAPHDEF_PATH : Full filepath of the file containing the frozen TensorFlow GraphDef.
* TFLITE_PATH : Full filepath of the output TfLite model. If the optional `NAME` environment variable is used, the file name must include the `NAME` string. (ex. `[...]/inception_v3_NAME.tflite`)
* INPUT : Names of the input arrays, comma-separated.
* INPUT_SHAPE : Shapes corresponding to `INPUT`, colon-separated. For the creation of individual `TFLITE` files for different input shapes, space-separated.
* OUTPUT : Names of the output arrays, comma-separated.
* NAME (Optional) : Names of the individual `TFLITE` files for different input shapes, space-separated.
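The information file is just a set of shell-style `KEY="value"` lines that the scripts can source directly. For illustration, here is a rough sketch of reading such a file from Python instead (a hypothetical helper, not part of tflkit; it does not expand shell variables such as `${PWD}`):

```python
def parse_info(text):
    """Parse shell-style KEY="value" lines into a dict (no variable expansion)."""
    info = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # skip blanks and comments
        key, _, value = line.partition("=")
        info[key.strip()] = value.strip().strip('"')
    return info

sample = '''
GRAPHDEF_PATH="/tmp/inception_v3.pb"
INPUT="input"
INPUT_SHAPE="1,299,299,3"
OUTPUT="InceptionV3/Predictions/Reshape_1"
'''
info = parse_info(sample)
print(info["INPUT"], info["INPUT_SHAPE"])  # -> input 1,299,299,3
```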
Usage (for example, [InceptionV3](https://storage.googleapis.com/download.tensorflow.org/models/tflite/model_zoo/upload_20180427/inception_v3_2018_04_27.tgz)):
```
$ wget https://storage.googleapis.com/download.tensorflow.org/models/tflite/model_zoo/upload_20180427/inception_v3_2018_04_27.tgz
$ tar xzvf inception_v3_2018_04_27.tgz ./inception_v3.pb
$ cat > convert.info << END
GRAPHDEF_PATH="${PWD}/inception_v3.pb"
TFLITE_PATH="${PWD}/inception_v3.tflite"
INPUT="input"
INPUT_SHAPE="1,299,299,3"
OUTPUT="InceptionV3/Predictions/Reshape_1"
END
$ ./tflite_convert.sh --info=./convert.info --tensorflow_version=1.12
```
Usage (for example, multiple `TFLITE` files):
```
$ cat > convert_multiple.info << END
GRAPHDEF_PATH="${PWD}/inception_v3.pb"
TFLITE_PATH="${PWD}/inception_v3_NAME.tflite"
INPUT="input"
INPUT_SHAPE="1,299,299,3 3,299,299,3"
OUTPUT="InceptionV3/Predictions/Reshape_1"
NAME="batch1 batch3"
END
$ ./tflite_convert.sh --info=./convert_multiple.info --tensorflow_version=1.12
inception_v3_batch1.tflite
inception_v3_batch3.tflite
```
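The multiple-file behavior amounts to pairing each `NAME` entry with the `INPUT_SHAPE` entry at the same position and substituting the literal string `NAME` in `TFLITE_PATH` per pair. A sketch of that pairing (illustrative only, not the actual script logic):

```python
tflite_path = "inception_v3_NAME.tflite"
names = "batch1 batch3".split()
shapes = "1,299,299,3 3,299,299,3".split()

# Replace the NAME placeholder once per entry; each result is converted
# with the input shape at the matching position.
outputs = [tflite_path.replace("NAME", n) for n in names]
for out, shape in zip(outputs, shapes):
    print(f"{out}  (input shape {shape})")
```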
## Optimize a TensorFlow model for inference
This [optimize tool](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/tools/optimize_for_inference.py) behaves like a transform tool, but it is specialized for preparing a trained TensorFlow graph for inference. It removes parts of the graph that are only needed for training. These include:
- Removing training-only operations like checkpoint saving.
- Stripping out parts of the graph that are never reached.
- Removing debug operations like CheckNumerics.
- Folding batch normalization ops into the pre-calculated weights.
- Fusing common operations into unified versions.

Both the input and the output of this tool are TensorFlow GraphDef files.
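"Stripping out parts of the graph that are never reached" is essentially a reachability walk backwards from the output nodes. A toy sketch over a dict-based graph (illustrative only; the real tool operates on GraphDef protos):

```python
# Each node maps to the nodes it takes input from (a stand-in for a GraphDef).
graph = {
    "input": [],
    "conv": ["input"],
    "relu": ["conv"],
    "summary_writer": ["conv"],  # training-only branch, never feeds the output
    "softmax": ["relu"],
}

def strip_unreachable(graph, outputs):
    """Keep only the nodes reachable from the outputs via input edges."""
    keep, stack = set(), list(outputs)
    while stack:
        node = stack.pop()
        if node not in keep:
            keep.add(node)
            stack.extend(graph[node])
    return {name: ins for name, ins in graph.items() if name in keep}

pruned = strip_unreachable(graph, ["softmax"])
print(sorted(pruned))  # summary_writer is gone
```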
The [optimize_for_inference.sh](optimize_for_inference.sh) file invokes the TensorFlow [optimize tool](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/tools/optimize_for_inference.py). This tool requires an optimize information file as a parameter; here is an [example file](optimize.template) for this tool. The information file needs the `INPUT` and `OUTPUT` array names. The [summarize_pb.sh](summarize_pb.sh) file will help you determine the `INPUT` and `OUTPUT` array names. The `--tensorflow_path` option can change the TensorFlow location. By default, it uses the `externals/tensorflow` directory.
Optimize information:
* GRAPHDEF_PATH : Full filepath of the file containing the frozen TensorFlow GraphDef.
* OPTIMIZE_PATH : Full filepath to save the output optimized graph.
* INPUT : Names of the input arrays, comma-separated.
* OUTPUT : Names of the output arrays, comma-separated.
Usage (for example, [InceptionV3](https://storage.googleapis.com/download.tensorflow.org/models/tflite/model_zoo/upload_20180427/inception_v3_2018_04_27.tgz)):
```
$ wget https://storage.googleapis.com/download.tensorflow.org/models/tflite/model_zoo/upload_20180427/inception_v3_2018_04_27.tgz
$ tar xzvf inception_v3_2018_04_27.tgz ./inception_v3.pb
$ cat > optimize.info << END
GRAPHDEF_PATH="${PWD}/inception_v3.pb"
OPTIMIZE_PATH="${PWD}/inception_v3.optimize.pb"
INPUT="input"
OUTPUT="InceptionV3/Predictions/Reshape_1"
END
$ ./optimize_for_inference.sh --info=./optimize.info
$ ls *.pb
inception_v3.optimize.pb  inception_v3.pb
```
## Transform a TensorFlow graph
A trained TensorFlow model can be transformed in several ways before deploying it in production. The [Graph Transform Tool](https://github.com/tensorflow/tensorflow/tree/master/tensorflow/tools/graph_transforms#graph-transform-tool) supports this. Many transform options are available in this tool; for more information on them, please see [this page](https://github.com/tensorflow/tensorflow/tree/master/tensorflow/tools/graph_transforms#transform-reference). Both the input and the output of this tool are TensorFlow GraphDef files.
The [transform_graph.sh](transform_graph.sh) file transforms a TensorFlow GraphDef using various transform options. This tool requires a transform information file as a parameter, and the transform options are described in that file. There is an [example file](transform.template) for this tool. The information file needs the `INPUT` and `OUTPUT` array names. The [summarize_pb.sh](summarize_pb.sh) file will help you determine the `INPUT` and `OUTPUT` array names. The `--tensorflow_path` option can change the TensorFlow location. By default, it uses the `externals/tensorflow` directory.
Transform information:
* GRAPHDEF_PATH : Full filepath of the file containing the frozen TensorFlow GraphDef.
* TRANSFORM_PATH : Full filepath of the output TensorFlow GraphDef.
* INPUT : Names of the input arrays, comma-separated.
* OUTPUT : Names of the output arrays, comma-separated.
* TRANSFORM_OPTIONS : Names of the transform options, space-separated. By default, it includes the following options:
  * strip_unused_nodes : Removes all nodes not used in calculating the layers given in the `OUTPUT` array, fed by the `INPUT` array.
  * remove_nodes : Removes the nodes with the given names from the graph.
    * `Identity` nodes are not necessary in an inference graph. If an `Identity` node is needed in the inference graph, this tool does not remove it.
    * `CheckNumerics` is useful during training but is not necessary in an inference graph.
  * fold_constants : Replaces the sub-graphs that always evaluate to constant expressions with those constants. Since this optimization is always executed at run-time after the graph is loaded, folding it ahead of time doesn't help latency, but it can simplify the graph and thus make further processing easier.
  * fold_batch_norms : Scans the graph for any channel-wise multiplies immediately after convolutions and multiplies the convolution's weights by the `Mul` constant instead, so the multiply can be omitted at inference time. It should be run after `fold_constants`.
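The idea behind `fold_batch_norms` can be shown with plain numbers: a channel-wise multiply that follows a convolution is equivalent to scaling that channel's weights ahead of time, so the separate `Mul` op can be dropped. A toy arithmetic sketch (illustrative only; the real tool rewrites GraphDef nodes):

```python
# Stand-in for conv weights: two output channels, three weights each.
weights = [[1.0, 2.0, 3.0],
           [4.0, 5.0, 6.0]]
# Channel-wise Mul constant (e.g. produced by batch normalization).
scale = [0.5, 2.0]

# Folding: pre-multiply each channel's weights by its scale factor,
# so conv(x, folded) == conv(x, weights) * scale without a Mul op.
folded = [[w * s for w in channel] for channel, s in zip(weights, scale)]
print(folded)  # -> [[0.5, 1.0, 1.5], [8.0, 10.0, 12.0]]
```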
Usage (for example, [InceptionV3](https://storage.googleapis.com/download.tensorflow.org/models/tflite/model_zoo/upload_20180427/inception_v3_2018_04_27.tgz)):
```
$ wget https://storage.googleapis.com/download.tensorflow.org/models/tflite/model_zoo/upload_20180427/inception_v3_2018_04_27.tgz
$ tar xzvf inception_v3_2018_04_27.tgz ./inception_v3.pb
$ cat > transform.info << END
GRAPHDEF_PATH="${PWD}/inception_v3.pb"
TRANSFORM_PATH="${PWD}/inception_v3.transform.pb"
INPUT="input"
OUTPUT="InceptionV3/Predictions/Reshape_1"
TRANSFORM_OPTIONS="strip_unused_nodes(type=float, shape=\"1,299,299,3\") remove_nodes(op=Identity, op=CheckNumerics) fold_constants(ignore_errors=true) fold_batch_norms"
END
$ ./transform_graph.sh --info=./transform.info
$ ls *.pb
inception_v3.pb  inception_v3.transform.pb
```
## Freeze a TensorFlow model
TensorFlow provides methods to save and restore models, and each method stores the related files in a different way. Here are two common ways to save a model before freezing it.
1. Use [tf.train.Saver](https://www.tensorflow.org/guide/saved_model#save_and_restore_variables)
   This creates a `MetaGraphDef` file and checkpoint files that contain the saved variables. Saving this way will result in the following files in the exported directory:
   ```
   checkpoint  model.ckpt.data-00000-of-00001  model.ckpt.index  model.ckpt.meta
   ```
2. Use [SavedModel](https://www.tensorflow.org/guide/saved_model#build_and_load_a_savedmodel)
   This is the easiest way to create a saved model. Saving this way will result in the following files in the exported directory:
   ```
   $ ls /tmp/saved_model/
   saved_model.pb  variables
   $ tree /tmp/saved_model/
   /tmp/saved_model/
   ├── saved_model.pb
   └── variables
       ├── variables.data-00000-of-00001
       └── variables.index
   ```
The [freeze_graph](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/tools/freeze_graph.py) tool receives these files as input parameters and combines the stored variables with the standalone GraphDef to generate a frozen GraphDef file.
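Conceptually, freezing substitutes a constant (holding the checkpoint's saved value) for every variable node, producing one self-contained graph. A toy sketch of that substitution with dict stand-ins (not real GraphDef or checkpoint handling):

```python
# Stand-ins: nodes are (op, payload) pairs; the checkpoint maps variable
# names to their saved values.
graph = {
    "v1": ("Variable", None),
    "const0": ("Const", 1.0),
    "add": ("Add", ["v1", "const0"]),
}
checkpoint = {"v1": [0.0, 0.0, 0.0]}

def freeze(graph, checkpoint):
    """Replace each Variable node with a Const holding its checkpoint value."""
    frozen = {}
    for name, (op, payload) in graph.items():
        if op == "Variable":
            frozen[name] = ("Const", checkpoint[name])
        else:
            frozen[name] = (op, payload)
    return frozen

frozen = freeze(graph, checkpoint)
print(frozen["v1"])  # -> ('Const', [0.0, 0.0, 0.0])
```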
The tflkit provides a simple way to create a frozen graph using the [freeze_graph](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/tools/freeze_graph.py) tool. This tool requires an information file as a parameter. There is an [example file](freeze.template) for the freeze tool. Either `SAVED_MODEL` or `META_GRAPH` must be declared, and `META_GRAPH` is always used together with `CKPT_PATH`. The `--tensorflow_path` option can change the TensorFlow location. By default, it uses the `externals/tensorflow` directory.
* SAVED_MODEL : Full directory path containing the TensorFlow `SavedModel` file and variables.
* META_GRAPH : Full filepath of the file containing the TensorFlow `MetaGraphDef`.
* CKPT_PATH : Full filepath of the file containing the TensorFlow variables. (ex. [...]/*.ckpt)
* FROZEN_PATH : Full filepath to save the output frozen graph.
* OUTPUT : Names of the output arrays, comma-separated.
Usage (for example, `tf.train.Saver`):
```
$ cat > sample_saver.py << END
import tensorflow as tf

# Create some variables.
v1 = tf.get_variable("v1", shape=[3], initializer = tf.zeros_initializer)
v2 = tf.get_variable("v2", shape=[5], initializer = tf.zeros_initializer)

inc_v1 = v1.assign(v1+1)
dec_v2 = v2.assign(v2-1)

# Add an op to initialize the variables.
init_op = tf.global_variables_initializer()

# Add ops to save and restore all the variables.
saver = tf.train.Saver()

# Later, launch the model, initialize the variables, do some work, and save the
# variables to disk.
with tf.Session() as sess:
  sess.run(init_op)
  # Do some work with the model.
  inc_v1.op.run()
  dec_v2.op.run()
  # Save the variables to disk.
  save_path = saver.save(sess, "/tmp/saver/model.ckpt")
  print("Model saved in path: %s" % save_path)
END
$ python sample_saver.py
$ ls /tmp/saver/
checkpoint  model.ckpt.data-00000-of-00001  model.ckpt.index  model.ckpt.meta
$ cat > freeze_saver.info << END
META_GRAPH="/tmp/saver/model.ckpt.meta"
CKPT_PATH="/tmp/saver/model.ckpt"
FROZEN_PATH="/tmp/saver/model.frozen.pb"
END
$ ./freeze_graph.sh --info=./freeze_saver.info
$ ls /tmp/saver/*.frozen.pb
/tmp/saver/model.frozen.pb
```
Usage (for example, `SavedModel`):
```
$ cat > sample_saved_model.py << END
import tensorflow as tf

# Create some variables.
v1 = tf.get_variable("v1", shape=[3], initializer = tf.zeros_initializer)
v2 = tf.get_variable("v2", shape=[5], initializer = tf.zeros_initializer)

inc_v1 = v1.assign(v1+1)
dec_v2 = v2.assign(v2-1)

# Add an op to initialize the variables.
init_op = tf.global_variables_initializer()

# Later, launch the model, initialize the variables, do some work, and save the
# variables to disk.
with tf.Session() as sess:
  sess.run(init_op)
  # Do some work with the model.
  inc_v1.op.run()
  dec_v2.op.run()
  # Save the variables to disk.
  tf.saved_model.simple_save(sess, "/tmp/saved_model", inputs={'v1':v1}, outputs={'v2':v2})
END
$ python sample_saved_model.py
$ ls /tmp/saved_model/
saved_model.pb  variables
$ cat > freeze_saved_model.info << END
SAVED_MODEL="/tmp/saved_model"
FROZEN_PATH="/tmp/saved_model/model.frozen.pb"
END
$ ./freeze_graph.sh --info=./freeze_saved_model.info
$ ls /tmp/saved_model/
model.frozen.pb  saved_model.pb  variables
$ ls /tmp/saved_model/*.frozen.pb
/tmp/saved_model/model.frozen.pb
```