Revert existing no_grad_embedding_renorm_ from aten (#14639)
author Elias Ellison <eellison@fb.com>
Sat, 1 Dec 2018 00:53:55 +0000 (16:53 -0800)
committer Facebook Github Bot <facebook-github-bot@users.noreply.github.com>
Sat, 1 Dec 2018 00:57:51 +0000 (16:57 -0800)
commit 404ad939e5988d022e32101d5cb48fff23036288
tree 22a4bfaf572c0046fcddc5169225ef70b718b28c
parent aeb38cfcea0debfb44bd06f06e2d826e637968db
Revert existing no_grad_embedding_renorm_ from aten (#14639)

Summary:
Remove no_grad_embedding_renorm_ from aten. Setting the derivatives of the inputs to false has different semantics from calling with no_grad(), because it will not error if an input is modified in place and then has its grad accessed.

Instead, register a custom op that wraps the renorm in a NoGradGuard.
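For reference, a minimal sketch of what such a custom op could look like, assuming the torch::RegisterOperators custom-op API and torch::NoGradGuard; the operator name, function name, and registration details below are illustrative, not the exact ones from this PR:

```cpp
#include <torch/script.h>
#include <torch/torch.h>

// Illustrative wrapper: renormalize the rows of `weight` selected by `input`
// in place, with autograd recording disabled for the duration of the call.
torch::Tensor no_grad_embedding_renorm(
    torch::Tensor weight,
    torch::Tensor input,
    double max_norm,
    double norm_type) {
  torch::NoGradGuard no_grad;  // same effect as running under torch.no_grad()
  return at::embedding_renorm_(weight, input, max_norm, norm_type);
}

// Hypothetical registration; the real op is registered in
// torch/csrc/jit/register_special_ops.cpp.
static auto registry = torch::RegisterOperators(
    "custom::no_grad_embedding_renorm_", &no_grad_embedding_renorm);
```

The Python side (torch/nn/functional.py) can then call this registered op in place of the removed aten function, so the in-place renorm stays invisible to autograd.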
Pull Request resolved: https://github.com/pytorch/pytorch/pull/14639

Differential Revision: D13285604

Pulled By: eellison

fbshipit-source-id: c7d343fe8f22e369669e92799f167674f124ffe7
aten/src/ATen/core/aten_interned_strings.h
aten/src/ATen/native/Embedding.cpp
aten/src/ATen/native/cuda/Embedding.cu
aten/src/ATen/native/native_functions.yaml
test/test_jit.py
tools/autograd/derivatives.yaml
torch/csrc/jit/register_special_ops.cpp
torch/jit/__init__.py
torch/nn/functional.py