Add optimize_for_inference script into tflkit (#3546)
author Jiyoung Yun/Motion Control Lab (SR)/Engineer/Samsung Electronics <jy910.yun@samsung.com>
Wed, 14 Nov 2018 00:46:35 +0000 (09:46 +0900)
committer Saehie Park/Motion Control Lab (SR)/Principal Engineer/Samsung Electronics <saehie.park@samsung.com>
Wed, 14 Nov 2018 00:46:35 +0000 (09:46 +0900)
* Add optimize_for_inference script into tflkit

optimize_for_inference tool optimizes a TensorFlow GraphDef for inference.

Signed-off-by: Jiyoung Yun <jy910.yun@samsung.com>
* Add detailed information to README.md

Signed-off-by: Jiyoung Yun <jy910.yun@samsung.com>
tools/tflkit/README.md
tools/tflkit/info/optimize.template [new file with mode: 0644]
tools/tflkit/optimize_for_inference.sh [new file with mode: 0755]

index c0c7c1a..c1202bd 100644 (file)
@@ -161,3 +161,40 @@ $ ls *.tflite
 inception_v3_batch1.tflite
 inception_v3_batch3.tflite
 ```
+
+## Optimize a TensorFlow model for inference
+
+### TensorFlow
+
+This [optimize tool](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/tools/optimize_for_inference.py) behaves like a transform tool, but it is specialized for preparing a trained TensorFlow graph for inference. It removes the parts of a graph that are only needed for training. These include:
+  - Removing training-only operations like checkpoint saving.
+  - Stripping out parts of the graph that are never reached.
+  - Removing debug operations like CheckNumerics.
+  - Folding batch normalization ops into the pre-calculated weights.
+  - Fusing common operations into unified versions.
+
+Both the input and the output of this tool are TensorFlow GraphDef files.
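+
+The tool can also be invoked directly from a TensorFlow source checkout via Bazel. A minimal sketch, assuming Bazel is installed and the paths below are only examples:
+```
+$ cd externals/tensorflow
+$ bazel run tensorflow/python/tools:optimize_for_inference -- \
+  --input=/tmp/inception_v3.pb \
+  --output=/tmp/inception_v3.optimize.pb \
+  --frozen_graph=True \
+  --input_names=input \
+  --output_names=InceptionV3/Predictions/Reshape_1 \
+  --toco_compatible=True
+```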
+
+### with tflkit
+
+The [optimize_for_inference.sh](optimize_for_inference.sh) script invokes the TensorFlow [optimize tool](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/tools/optimize_for_inference.py). This script requires an optimize information file as a parameter. Here is an [example file](info/optimize.template) for this tool. The information file needs the `INPUT` and `OUTPUT` array names. The [summarize_pb.sh](summarize_pb.sh) script will help you determine the `INPUT` and `OUTPUT` array names. The `--tensorflow_path` option can change the TensorFlow location. By default, it uses the `externals/tensorflow` directory.
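+
+For example, to point the script at a different TensorFlow checkout (the path below is only an example):
+```
+$ ./optimize_for_inference.sh --info=./optimize.info --tensorflow_path=/home/user/tensorflow
+```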
+
+Optimize information:
+  * GRAPHDEF_PATH : Full filepath of the file containing the frozen TensorFlow GraphDef.
+  * OPTIMIZE_PATH : Full filepath to save the output optimized graph.
+  * INPUT : Names of the input arrays, comma-separated.
+  * OUTPUT : Names of the output arrays, comma-separated.
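+
+Since the script `source`s the information file as a plain shell fragment, you can sanity-check it before running (the file name below is only an example):
+```
+$ source ./optimize.info
+$ for v in GRAPHDEF_PATH OPTIMIZE_PATH INPUT OUTPUT; do
+>   [ -n "${!v}" ] || echo "$v is missing"
+> done
+```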
+
+Usage (for example, [InceptionV3](https://storage.googleapis.com/download.tensorflow.org/models/tflite/model_zoo/upload_20180427/inception_v3_2018_04_27.tgz)):
+```
+$ wget https://storage.googleapis.com/download.tensorflow.org/models/tflite/model_zoo/upload_20180427/inception_v3_2018_04_27.tgz
+$ tar xzvf inception_v3_2018_04_27.tgz ./inception_v3.pb
+$ cat > optimize.info << END
+GRAPHDEF_PATH="${PWD}/inception_v3.pb"
+OPTIMIZE_PATH="${PWD}/inception_v3.optimize.pb"
+INPUT="input"
+OUTPUT="InceptionV3/Predictions/Reshape_1"
+END
+$ ./optimize_for_inference.sh --info=./optimize.info
+$ ls *.pb
+inception_v3.optimize.pb  inception_v3.pb
+```
diff --git a/tools/tflkit/info/optimize.template b/tools/tflkit/info/optimize.template
new file mode 100644 (file)
index 0000000..5293a03
--- /dev/null
@@ -0,0 +1,4 @@
+GRAPHDEF_PATH=
+OPTIMIZE_PATH=
+INPUT=
+OUTPUT=
diff --git a/tools/tflkit/optimize_for_inference.sh b/tools/tflkit/optimize_for_inference.sh
new file mode 100755 (executable)
index 0000000..ef6e529
--- /dev/null
@@ -0,0 +1,90 @@
+#!/bin/bash
+
+usage()
+{
+  echo "usage : $0"
+  echo "       --info=Information file"
+  echo "       --tensorflow_path=TensorFlow path (Use externals/tensorflow by default)"
+}
+
+SCRIPT_PATH="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
+
+TF_DIR="${SCRIPT_PATH}/../../externals/tensorflow"
+
+for i in "$@"
+do
+  case $i in
+    --info=*)
+      INFO=${i#*=}
+      ;;
+    --tensorflow_path=*)
+      TF_DIR=${i#*=}
+      ;;
+    -h|--help)
+      usage
+      exit 0
+      ;;
+    *)
+      usage
+      exit 1
+      ;;
+  esac
+  shift
+done
+
+if [ -z "$INFO" ]; then
+  echo "INFO is unset or set to the empty string"
+  usage
+  exit 1
+fi
+if [ -z "$TF_DIR" ]; then
+  echo "tensorflow_path is unset or set to the empty string"
+  usage
+  exit 1
+fi
+
+if [ ! -x "$(command -v bazel)" ]; then
+  echo "Cannot find bazel. Please install bazel."
+  exit 1
+fi
+
+source "$INFO"
+
+if [ -z "$GRAPHDEF_PATH" ]; then
+  echo "GRAPHDEF_PATH is unset or set to the empty string"
+  echo "Update the $INFO file"
+  exit 1
+fi
+if [ -z "$OPTIMIZE_PATH" ]; then
+  echo "OPTIMIZE_PATH is unset or set to the empty string"
+  echo "Update the $INFO file"
+  exit 1
+fi
+if [ -z "$INPUT" ]; then
+  echo "INPUT is unset or set to the empty string"
+  echo "Update the $INFO file"
+  exit 1
+fi
+if [ -z "$OUTPUT" ]; then
+  echo "OUTPUT is unset or set to the empty string"
+  echo "Update the $INFO file"
+  exit 1
+fi
+
+CUR_DIR=$(pwd)
+{
+  echo "Enter $TF_DIR"
+  pushd "$TF_DIR" > /dev/null
+
+  bazel run tensorflow/python/tools:optimize_for_inference -- \
+  --input="$GRAPHDEF_PATH" \
+  --output="$OPTIMIZE_PATH" \
+  --frozen_graph=True \
+  --input_names="$INPUT" \
+  --output_names="$OUTPUT" \
+  --toco_compatible=True
+
+  popd > /dev/null
+
+  echo "OUTPUT FILE : $OPTIMIZE_PATH"
+}