Checkpointable: Handle Optimizer non-slot variables
author    Allen Lavoie <allenl@google.com>
          Wed, 28 Feb 2018 21:55:35 +0000 (13:55 -0800)
committer TensorFlower Gardener <gardener@tensorflow.org>
          Wed, 28 Feb 2018 22:00:00 +0000 (14:00 -0800)
commit    a72ece230eb46c1afcb96c52dc5ae6ceabdeaf25
tree      87717cdf42cc12d5f487a5b01dba017ac655493f
parent    9d6c5a06638262f6815717c682fab29ba3524282
Checkpointable: Handle Optimizer non-slot variables

Overrides the Checkpointable dependency-gathering logic to key Optimizer dependencies to the current graph.

Moves my Checkpointable Adam prototype out of contrib.

Right now there is no check that all loading happens in the same graph. This would be easy enough to add (save a Graph ID with the _Checkpoint object), but it's not clear to me that it's useful; doing deferred restoration in whichever graph the variable is created in seems reasonable. (Let me know if you disagree.)
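The per-graph keying of non-slot variables can be sketched in plain Python. This is an illustrative sketch only; `GraphStandIn`, `OptimizerSketch`, and its method names are hypothetical and do not reflect TensorFlow's actual Optimizer/Checkpointable API:

```python
class GraphStandIn:
    """Hypothetical stand-in for a TensorFlow graph object."""
    pass


class OptimizerSketch:
    """Illustrates keying non-slot variables (e.g. Adam's beta1_power
    accumulator) to the graph they were created in, so that each graph
    gets its own copy and dependency gathering only reports the copies
    belonging to the current graph."""

    def __init__(self):
        # Maps (variable_name, graph) -> value.
        self._non_slot_dict = {}

    def get_or_create_non_slot(self, name, graph, initial_value):
        # Create the non-slot variable lazily, once per (name, graph) pair.
        key = (name, graph)
        if key not in self._non_slot_dict:
            self._non_slot_dict[key] = initial_value
        return self._non_slot_dict[key]

    def checkpoint_dependencies(self, current_graph):
        # Overridden dependency gathering: report only the non-slot
        # variables that belong to the current graph.
        return {name: value
                for (name, graph), value in self._non_slot_dict.items()
                if graph is current_graph}


# Two graphs each get an independent beta1_power value.
g1, g2 = GraphStandIn(), GraphStandIn()
opt = OptimizerSketch()
opt.get_or_create_non_slot("beta1_power", g1, 0.9)
opt.get_or_create_non_slot("beta1_power", g2, 0.9)
print(opt.checkpoint_dependencies(g1))  # {'beta1_power': 0.9}
```

Deferred restoration then naturally targets whichever graph the variable is created in, since the lookup is keyed on that graph.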

PiperOrigin-RevId: 187378372
16 files changed:
tensorflow/contrib/eager/python/checkpointable_utils_test.py
tensorflow/python/ops/variables.py
tensorflow/python/training/checkpointable.py
tensorflow/python/training/optimizer.py
tensorflow/tools/api/golden/tensorflow.train.-adadelta-optimizer.pbtxt
tensorflow/tools/api/golden/tensorflow.train.-adagrad-d-a-optimizer.pbtxt
tensorflow/tools/api/golden/tensorflow.train.-adagrad-optimizer.pbtxt
tensorflow/tools/api/golden/tensorflow.train.-adam-optimizer.pbtxt
tensorflow/tools/api/golden/tensorflow.train.-ftrl-optimizer.pbtxt
tensorflow/tools/api/golden/tensorflow.train.-gradient-descent-optimizer.pbtxt
tensorflow/tools/api/golden/tensorflow.train.-momentum-optimizer.pbtxt
tensorflow/tools/api/golden/tensorflow.train.-optimizer.pbtxt
tensorflow/tools/api/golden/tensorflow.train.-proximal-adagrad-optimizer.pbtxt
tensorflow/tools/api/golden/tensorflow.train.-proximal-gradient-descent-optimizer.pbtxt
tensorflow/tools/api/golden/tensorflow.train.-r-m-s-prop-optimizer.pbtxt
tensorflow/tools/api/golden/tensorflow.train.-sync-replicas-optimizer.pbtxt