Autograd using torchscript (#14604)
author    Ailing Zhang <ailzhang@fb.com>
Wed, 19 Dec 2018 02:56:06 +0000 (18:56 -0800)
committer Facebook Github Bot <facebook-github-bot@users.noreply.github.com>
Wed, 19 Dec 2018 03:10:57 +0000 (19:10 -0800)
commit    6ab2e7442ddaa1aee9fc5cb936b4bcca03f2c121
tree      8eba5b25dbc5a580bfe47bd15be04faa0af2d456
parent    4928c7641540c21e40e845c9383681b16093bf1f
Autograd using torchscript (#14604)

Summary:
This PR enables autodiff to use the forward/backward graphs compiled from Python code, instead of symbolic gradients (which modify the original graph directly).
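
To illustrate the idea (a minimal sketch; the names and the exact forward/backward convention here are illustrative assumptions, not the literal contents of this PR): a gradient is authored as TorchScript source that the JIT compiles into a forward graph returning the result plus saved context, and a backward graph consuming that context. Autodiff can then splice in those compiled graphs where it previously emitted symbolic gradient nodes.

```cpp
#include <string>

// Hypothetical sketch: a gradient for aten::mul written as TorchScript.
// The JIT compiles this source into a forward graph (result plus context
// saved for backward) and a backward graph; the exact convention in
// symbolic_script.cpp may differ.
static const std::string mul_gradient_source = R"(
def forward(self, other):
    # Save the inputs that backward needs, alongside the result.
    return self * other, (self, other)

def backward(ctx, grad_output):
    self, other = ctx
    # d(self * other)/d(self) = other; d(self * other)/d(other) = self
    return grad_output * other, grad_output * self
)";
```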

For now, we put the map in a separate .h file while we wait for the native_functions.yaml and derivatives.yaml merge; it should ideally move into native_functions.yaml eventually.
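
As a rough sketch of the shape of that map (keys and the entry format are illustrative assumptions, not the PR's exact contents):

```cpp
#include <string>
#include <unordered_map>

// Hypothetical sketch of the schema -> TorchScript-source map kept in the
// standalone symbolic_script.h/.cpp pair for now. A lookup miss would let
// autodiff fall back to its existing symbolic gradient path.
static const std::unordered_map<std::string, std::string> symbolic_scripts{
    {"aten::tanh(Tensor self) -> Tensor",
     R"(
def forward(self):
    output = self.tanh()
    return output, (output,)

def backward(ctx, grad_output):
    output, = ctx
    # d(tanh(x))/dx = 1 - tanh(x)^2, reusing the saved forward output.
    return grad_output * (1 - output * output)
)"},
};
```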

This PR should be enough to unblock us for now; we can start writing gradients for ATen functions in Python.

Differential Revision: D13494635

Pulled By: ailzhang

fbshipit-source-id: f8d51a15243ac46afd09d930c573ccdfcd9fdaaf
21 files changed:
aten/src/ATen/core/Tensor.h
aten/src/ATen/core/TensorMethods.h
aten/src/ATen/core/Type.h
aten/src/ATen/core/aten_interned_strings.h
aten/src/ATen/native/TensorShape.cpp
aten/src/ATen/native/native_functions.yaml
test/cpp/jit/tests.h
test/expect/TestFuser.test_lstm_cuda-backward.expect
test/expect/TestFuser.test_milstm_cuda-backward.expect
test/expect/TestJit.test_cpp_cuda.expect
tools/build_variables.py
torch/CMakeLists.txt
torch/_tensor_docs.py
torch/csrc/jit/autodiff.cpp
torch/csrc/jit/operator.cpp
torch/csrc/jit/operator.h
torch/csrc/jit/passes/dead_code_elimination.cpp
torch/csrc/jit/passes/lower_tuples.cpp
torch/csrc/jit/passes/lower_tuples.h
torch/csrc/jit/symbolic_script.cpp [new file with mode: 0644]
torch/csrc/jit/symbolic_script.h [new file with mode: 0644]