From ea91195771f4b9d7cbe81707929db6ecf8d46025 Mon Sep 17 00:00:00 2001
From: =?utf8?q?=EC=9C=A4=EC=A7=80=EC=98=81/=EB=8F=99=EC=9E=91=EC=A0=9C?=
 =?utf8?q?=EC=96=B4Lab=28SR=29/Engineer/=EC=82=BC=EC=84=B1=EC=A0=84?=
 =?utf8?q?=EC=9E=90?=
Date: Wed, 14 Nov 2018 09:46:35 +0900
Subject: [PATCH] Add optimize_for_inference script into tflkit (#3546)

* Add optimize_for_inference script into tflkit

The optimize_for_inference tool optimizes a TensorFlow GraphDef for
inference.

Signed-off-by: Jiyoung Yun

* Add detailed information to README.md

Signed-off-by: Jiyoung Yun
---
 tools/tflkit/README.md                 | 37 ++++++++++++++
 tools/tflkit/info/optimize.template    |  4 ++
 tools/tflkit/optimize_for_inference.sh | 90 ++++++++++++++++++++++++++++++++++
 3 files changed, 131 insertions(+)
 create mode 100644 tools/tflkit/info/optimize.template
 create mode 100755 tools/tflkit/optimize_for_inference.sh

diff --git a/tools/tflkit/README.md b/tools/tflkit/README.md
index c0c7c1a..c1202bd 100644
--- a/tools/tflkit/README.md
+++ b/tools/tflkit/README.md
@@ -161,3 +161,40 @@
 $ ls *.tflite
 inception_v3_batch1.tflite inception_v3_batch3.tflite
 ```
+
+## Optimize a TensorFlow model for inference
+
+### TensorFlow
+
+This [optimize tool](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/tools/optimize_for_inference.py) behaves like a transform tool, but it is specialized for preparing a trained TensorFlow graph for inference. It removes the parts of a graph that are only needed for training. These include:
+ - Removing training-only operations like checkpoint saving.
+ - Stripping out parts of the graph that are never reached.
+ - Removing debug operations like CheckNumerics.
+ - Folding batch normalization ops into the pre-calculated weights.
+ - Fusing common operations into unified versions.
+Both the input and the output of this tool are TensorFlow GraphDef files.
+
+### with tflkit
+
+The [optimize_for_inference.sh](optimize_for_inference.sh) script invokes the TensorFlow [optimize tool](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/tools/optimize_for_inference.py). It requires an optimize information file as a parameter. Here is an [example file](info/optimize.template) for this tool. The information file needs the `INPUT` and `OUTPUT` array names. The [summarize_pb.sh](summarize_pb.sh) script will help you determine the `INPUT` and `OUTPUT` array names. The `--tensorflow_path` option changes the TensorFlow location; by default, it uses the `externals/tensorflow` directory.
+
+Optimize information:
+ * GRAPHDEF_PATH : Full filepath of the file containing the frozen TensorFlow GraphDef.
+ * OPTIMIZE_PATH : Full filepath to save the output optimized graph.
+ * INPUT : Names of the input arrays, comma-separated.
+ * OUTPUT : Names of the output arrays, comma-separated.
+
+Usage (for example, [InceptionV3](https://storage.googleapis.com/download.tensorflow.org/models/tflite/model_zoo/upload_20180427/inception_v3_2018_04_27.tgz)):
+```
+$ wget https://storage.googleapis.com/download.tensorflow.org/models/tflite/model_zoo/upload_20180427/inception_v3_2018_04_27.tgz
+$ tar xzvf inception_v3_2018_04_27.tgz ./inception_v3.pb
+$ cat > optimize.info << END
+GRAPHDEF_PATH="${PWD}/inception_v3.pb"
+OPTIMIZE_PATH="${PWD}/inception_v3.optimize.pb"
+INPUT="input"
+OUTPUT="InceptionV3/Predictions/Reshape_1"
+END
+$ ./optimize_for_inference.sh --info=./optimize.info
+$ ls *.pb
+inception_v3.optimize.pb  inception_v3.pb
+```
diff --git a/tools/tflkit/info/optimize.template b/tools/tflkit/info/optimize.template
new file mode 100644
index 0000000..5293a03
--- /dev/null
+++ b/tools/tflkit/info/optimize.template
@@ -0,0 +1,4 @@
+GRAPHDEF_PATH=
+OPTIMIZE_PATH=
+INPUT=
+OUTPUT=
diff --git a/tools/tflkit/optimize_for_inference.sh b/tools/tflkit/optimize_for_inference.sh
new file mode 100755
index 0000000..ef6e529
--- /dev/null
+++ b/tools/tflkit/optimize_for_inference.sh
@@ -0,0 +1,90 @@
+#!/bin/bash
+
+usage()
+{
+  echo "usage : $0"
+  echo "       --info=Information file"
+  echo "       --tensorflow_path=TensorFlow path (Use externals/tensorflow by default)"
+}
+
+SCRIPT_PATH="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
+
+TF_DIR="${SCRIPT_PATH}/../../externals/tensorflow"
+
+for i in "$@"
+do
+  case $i in
+    --info=*)
+      INFO=${i#*=}
+      ;;
+    --tensorflow_path=*)
+      TF_DIR=${i#*=}
+      ;;
+    -h|--help)
+      usage
+      exit 0
+      ;;
+    *)
+      usage
+      exit 1
+      ;;
+  esac
+  shift
+done
+
+if [ -z "$INFO" ]; then
+  echo "INFO is unset or set to the empty string"
+  usage
+  exit 1
+fi
+if [ -z "$TF_DIR" ]; then
+  echo "tensorflow_path is unset or set to the empty string"
+  usage
+  exit 1
+fi
+
+if [ ! -x "$(command -v bazel)" ]; then
+  echo "Cannot find bazel. Please install bazel."
+  exit 1
+fi
+
+source "$INFO"
+
+if [ -z "$GRAPHDEF_PATH" ]; then
+  echo "GRAPHDEF_PATH is unset or set to the empty string"
+  echo "Update the $INFO file"
+  exit 1
+fi
+if [ -z "$OPTIMIZE_PATH" ]; then
+  echo "OPTIMIZE_PATH is unset or set to the empty string"
+  echo "Update the $INFO file"
+  exit 1
+fi
+if [ -z "$INPUT" ]; then
+  echo "INPUT is unset or set to the empty string"
+  echo "Update the $INFO file"
+  exit 1
+fi
+if [ -z "$OUTPUT" ]; then
+  echo "OUTPUT is unset or set to the empty string"
+  echo "Update the $INFO file"
+  exit 1
+fi
+
+CUR_DIR=$(pwd)
+{
+  echo "Enter $TF_DIR"
+  pushd "$TF_DIR" > /dev/null
+
+  bazel run tensorflow/python/tools:optimize_for_inference -- \
+    --input="$GRAPHDEF_PATH" \
+    --output="$OPTIMIZE_PATH" \
+    --frozen_graph=True \
+    --input_names="$INPUT" \
+    --output_names="$OUTPUT" \
+    --toco_compatible=True
+
+  popd
+
+  echo "OUTPUT FILE : $OPTIMIZE_PATH"
+}
-- 
2.7.4
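The script's info-file handling above follows a simple pattern: `source` the file, then fail fast if any required variable is unset or empty. This can be sketched standalone as below; the paths and array names are hypothetical placeholders, not values from a real model:

```shell
#!/bin/bash
# Write a hypothetical info file in the same format as info/optimize.template.
cat > /tmp/optimize.info << 'END'
GRAPHDEF_PATH="/tmp/model.pb"
OPTIMIZE_PATH="/tmp/model.optimize.pb"
INPUT="input"
OUTPUT="output"
END

# Source it, then check every required variable, as optimize_for_inference.sh does.
source /tmp/optimize.info
for var in GRAPHDEF_PATH OPTIMIZE_PATH INPUT OUTPUT; do
  if [ -z "${!var}" ]; then        # ${!var} is bash indirect expansion
    echo "$var is unset or set to the empty string"
    exit 1
  fi
done
echo "all required variables set"
```

Using a sourced shell fragment as the config format keeps the script dependency-free, at the cost of executing whatever the info file contains, so it should only be used with trusted files.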