-# Saving and Restoring
+# Save and Restore
-This document explains how to save and restore
-@{$variables$variables} and models.
+The @{tf.train.Saver} class provides methods to save and restore models. The
+@{tf.saved_model.simple_save} function is an easy way to build a
+@{tf.saved_model$saved model} suitable for serving.
+[Estimators](/programmers_guide/estimators) automatically save and restore
+variables in the `model_dir`.
-Important: TensorFlow model files are code. Be careful with untrusted code.
-See [Using TensorFlow Securely](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/SECURITY.md)
-for details.
-
-## Saving and restoring variables
-
-A TensorFlow variable provides the best way to represent shared, persistent
-state manipulated by your program. (See @{$variables$Variables} for details.)
-This section explains how to save and restore variables.
-Note that Estimators automatically saves and restores variables
-(in the `model_dir`).
+## Save and restore variables
-The `tf.train.Saver` class provides methods for saving and restoring models.
-The `tf.train.Saver` constructor adds `save` and `restore` ops to the graph
-for all, or a specified list, of the variables in the graph. The `Saver`
-object provides methods to run these ops, specifying paths for the checkpoint
-files to write to or read from.
+TensorFlow @{$variables} are the best way to represent shared, persistent state
+manipulated by your program. The `tf.train.Saver` constructor adds `save` and
+`restore` ops to the graph for all, or a specified list, of the variables in the
+graph. The `Saver` object provides methods to run these ops, specifying paths
+for the checkpoint files to write to or read from.
-The saver will restore all variables already defined in your model. If you're
+`Saver` restores all variables already defined in your model. If you're
loading a model without knowing how to build its graph (for example, if you're
writing a generic program to load models), then read the
[Overview of saving and restoring models](#models) section
later in this document.
-TensorFlow saves variables in binary **checkpoint files** that,
-roughly speaking, map variable names to tensor values.
-
+TensorFlow saves variables in binary *checkpoint files* that map variable
+names to tensor values.
+Caution: TensorFlow model files are code. Be careful with untrusted code.
+See [Using TensorFlow Securely](https://github.com/tensorflow/tensorflow/blob/master/SECURITY.md)
+for details.
-### Saving variables
+### Save variables
Create a `Saver` with `tf.train.Saver()` to manage all variables in the
model. For example, the following snippet demonstrates how to call the
print("Model saved in path: %s" % save_path)
```
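+
+To check what a checkpoint contains, you can list the variables it stores. A
+minimal sketch, assuming the `/tmp/model.ckpt` prefix from the snippet above;
+`tf.train.list_variables` yields (name, shape) pairs:
+
+```python
+import tensorflow as tf
+
+# Print every variable stored under the checkpoint prefix.
+for name, shape in tf.train.list_variables("/tmp/model.ckpt"):
+  print(name, shape)
+```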
-
-
-### Restoring variables
+### Restore variables
The `tf.train.Saver` object not only saves variables to checkpoint files, it
also restores variables. Note that when you restore variables you do not have
print("v2 : %s" % v2.eval())
```
-Notes:
-
-* There is not a physical file called "/tmp/model.ckpt". It is the **prefix**
- of filenames created for the checkpoint. Users only interact with the
- prefix instead of physical checkpoint files.
+Note: There is not a physical file called `/tmp/model.ckpt`. It is the *prefix*
+of the filenames created for the checkpoint. Users interact only with the
+prefix, not with the physical checkpoint files.
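+
+Because training usually writes a series of checkpoints into one directory, a
+common task is finding the most recent prefix. A minimal sketch using
+@{tf.train.latest_checkpoint}, assuming the `saver` and `sess` objects from the
+snippet above and checkpoints written under `/tmp`:
+
+```python
+# Returns the newest checkpoint prefix in the directory, or None.
+ckpt_prefix = tf.train.latest_checkpoint("/tmp")
+if ckpt_prefix:
+  saver.restore(sess, ckpt_prefix)
+```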
-
-### Choosing which variables to save and restore
+### Choose variables to save and restore
If you do not pass any arguments to `tf.train.Saver()`, the saver handles all
variables in the graph. Each variable is saved under the name that was passed
<a name="models"></a>
-## Overview of saving and restoring models
+## Save and restore models
+
+Use `SavedModel` to save and load your model: variables, the graph, and the
+graph's metadata. This is a language-neutral, recoverable, hermetic
+serialization format that enables higher-level systems and tools to produce,
+consume, and transform TensorFlow models. TensorFlow provides several ways to
+interact with `SavedModel`, including the @{tf.saved_model} APIs,
+@{tf.estimator.Estimator}, and a command-line interface.
+
-When you want to save and load variables, the graph, and the
-graph's metadata--basically, when you want to save or restore
-your model--we recommend using SavedModel.
-**SavedModel** is a language-neutral, recoverable, hermetic
-serialization format. SavedModel enables higher-level systems
-and tools to produce, consume, and transform TensorFlow models.
-TensorFlow provides several mechanisms for interacting with
-SavedModel, including tf.saved_model APIs, Estimator APIs and a CLI.
+## Build and load a SavedModel
+### Simple save
-## APIs to build and load a SavedModel
+The easiest way to create a `SavedModel` is to use the @{tf.saved_model.simple_save}
+function:
-This section focuses on the APIs for building and loading a SavedModel,
-particularly when using lower-level TensorFlow APIs.
+```python
+tf.saved_model.simple_save(session,
+                           export_dir,
+                           inputs={"x": x, "y": y},
+                           outputs={"z": z})
+```
+This configures the `SavedModel` so it can be loaded by
+[TensorFlow Serving](/serving/serving_basic) and supports the
+[Predict API](https://github.com/tensorflow/serving/blob/master/tensorflow_serving/apis/predict.proto).
+To access the classify, regress, or multi-inference APIs, use the manual
+`SavedModel` builder APIs or an @{tf.estimator.Estimator}.
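+
+For context, the following is a minimal end-to-end sketch. The graph,
+placeholder names, and the `/tmp/simple_saved_model` export directory are
+illustrative only:
+
+```python
+import tensorflow as tf
+
+x = tf.placeholder(tf.float32, shape=[None, 3], name="x")
+y = tf.placeholder(tf.float32, shape=[None, 3], name="y")
+z = tf.add(x, y, name="z")
+
+with tf.Session() as session:
+  session.run(tf.global_variables_initializer())
+  tf.saved_model.simple_save(session,
+                             "/tmp/simple_saved_model",
+                             inputs={"x": x, "y": y},
+                             outputs={"z": z})
+```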
-### Building a SavedModel
+### Manually build a SavedModel
-We provide a Python implementation of the SavedModel
-@{tf.saved_model.builder$builder}.
-The `SavedModelBuilder` class provides functionality to
+If your use case isn't covered by @{tf.saved_model.simple_save}, use the manual
+@{tf.saved_model.builder$builder APIs} to create a `SavedModel`.
+
+The @{tf.saved_model.builder.SavedModelBuilder} class provides functionality to
save multiple `MetaGraphDef`s. A **MetaGraph** is a dataflow graph, plus
its associated variables, assets, and signatures. A **`MetaGraphDef`**
is the protocol buffer representation of a MetaGraph. A **signature** is
```
-### Loading a SavedModel in Python
+### Load a SavedModel in Python
The Python version of the SavedModel
@{tf.saved_model.loader$loader}
```
-### Loading a SavedModel in C++
+### Load a SavedModel in C++
The C++ version of the SavedModel
[loader](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/cc/saved_model/loader.h)
&bundle);
```
-### Loading and Serving a SavedModel in TensorFlow Serving
+### Load and serve a SavedModel in TensorFlow Serving
You can easily load and serve a SavedModel with the TensorFlow Serving Model
Server binary. See [instructions](https://www.tensorflow.org/serving/setup#installing_using_apt-get)
* Serve the model from a local server and request predictions.
-### Preparing serving inputs
+### Prepare serving inputs
During training, an @{$premade_estimators#input_fn$`input_fn()`} ingests data
and prepares it for use by the model. At serving time, similarly, a
By contrast, the *output* portion of the signature is determined by the model.
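+
+To make the input side concrete, here is a minimal sketch of a
+`serving_input_receiver_fn` that accepts serialized `tf.Example` protos; the
+`feature_spec` below is a stand-in for your model's own feature definitions:
+
+```python
+import tensorflow as tf
+
+# Stand-in feature definitions; replace with your model's features.
+feature_spec = {"x": tf.FixedLenFeature([3], tf.float32)}
+
+def serving_input_receiver_fn():
+  # The server feeds serialized tf.Example protos into this placeholder.
+  serialized = tf.placeholder(dtype=tf.string, shape=[None],
+                              name="input_example_tensor")
+  receiver_tensors = {"examples": serialized}
+  features = tf.parse_example(serialized, feature_spec)
+  return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)
+```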
-### Performing the export
+### Perform the export
To export your trained Estimator, call
@{tf.estimator.Estimator.export_savedmodel} with the export base path and
> Note: It is your responsibility to garbage-collect old exports.
> Otherwise, successive exports will accumulate under `export_dir_base`.
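+
+The call itself is short. A minimal sketch, assuming a trained `estimator`, the
+`serving_input_receiver_fn` above, and an illustrative `/tmp/exports` base
+path:
+
+```python
+# Writes a new timestamped SavedModel directory under the base path
+# and returns its path.
+export_dir = estimator.export_savedmodel(
+    "/tmp/exports", serving_input_receiver_fn)
+```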
-### Specifying the outputs of a custom model
+### Specify the outputs of a custom model
When writing a custom `model_fn`, you must populate the `export_outputs` element
of the @{tf.estimator.EstimatorSpec} return value. This is a dict of
does not specify one.
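+
+For example, a custom `model_fn` might expose a single prediction signature as
+follows; the `scores` tensor and the `"predict"` key are illustrative:
+
+```python
+# Inside a custom model_fn, for tf.estimator.ModeKeys.PREDICT:
+export_outputs = {
+    "predict": tf.estimator.export.PredictOutput({"scores": scores})
+}
+return tf.estimator.EstimatorSpec(
+    mode=mode, predictions={"scores": scores}, export_outputs=export_outputs)
+```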
-### Serving the exported model locally
+### Serve the exported model locally
For local deployment, you can serve your model using
[TensorFlow Serving](https://github.com/tensorflow/serving), an open-source project that loads a
Now you have a server listening for inference requests via gRPC on port 9000!
-### Requesting predictions from a local server
+### Request predictions from a local server
The server responds to gRPC requests according to the
[PredictionService](https://github.com/tensorflow/serving/blob/master/tensorflow_serving/apis/prediction_service.proto#L15)
expressions) and then fetching the output.
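+
+A rough sketch of such a client follows; it assumes the server from the
+previous section on port 9000, a model exported under the name `my_model`, and
+an input tensor keyed `"x"`. The exact client API depends on your TensorFlow
+Serving version:
+
+```python
+import tensorflow as tf
+from grpc.beta import implementations
+from tensorflow_serving.apis import predict_pb2
+from tensorflow_serving.apis import prediction_service_pb2
+
+channel = implementations.insecure_channel("localhost", 9000)
+stub = prediction_service_pb2.beta_create_PredictionService_stub(channel)
+
+request = predict_pb2.PredictRequest()
+request.model_spec.name = "my_model"
+request.inputs["x"].CopyFrom(
+    tf.contrib.util.make_tensor_proto([[1.0, 2.0, 3.0]]))
+result = stub.Predict(request, 10.0)  # 10-second timeout
+print(result)
+```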
-### Installing the SavedModel CLI
+### Install the SavedModel CLI
Broadly speaking, you can install TensorFlow in either of the following
two ways:
`<input_key>=[{"age":[22,24],"education":["BS","MS"]}]`
```
-#### Save Output
+#### Save output
-By default, the SavedModel CLI writes output to stdout. If a directory is
-passed to `--outdir` option, the outputs will be saved as npy files named after
+By default, the SavedModel CLI writes output to stdout. If a directory is
+passed to the `--outdir` option, the outputs are saved as `.npy` files named after
Use `--overwrite` to overwrite existing output files.
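+
+The written files are ordinary NumPy arrays, so they can be reloaded for
+inspection; the path below is illustrative:
+
+```python
+import numpy as np
+
+# Load one of the outputs written via --outdir.
+out = np.load("/tmp/out/y.npy")
+print(out.shape, out.dtype)
+```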
-#### TensorFlow Debugger (tfdbg) Integration
+#### TensorFlow debugger (tfdbg) integration
-If `--tf_debug` option is set, the SavedModel CLI will use the
-TensorFlow Debugger (tfdbg) to watch the intermediate Tensors and runtime
+If the `--tf_debug` option is set, the SavedModel CLI uses the
+TensorFlow Debugger (tfdbg) to watch the intermediate Tensors and runtime
Each graph is associated with a specific set of tags, which enables
identification during a load or restore operation.
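+
+For instance, a graph saved with the serving tag can be retrieved by passing
+the same tag at load time. A minimal sketch, assuming an export under the
+illustrative `/tmp/simple_saved_model` directory:
+
+```python
+import tensorflow as tf
+
+with tf.Session(graph=tf.Graph()) as sess:
+  # The tag set must match the tags the MetaGraphDef was saved with.
+  tf.saved_model.loader.load(
+      sess, [tf.saved_model.tag_constants.SERVING], "/tmp/simple_saved_model")
+```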
-
-
-