Support Embedding + EmbeddingBag in Script (#14415)
authorElias Ellison <eellison@fb.com>
Wed, 28 Nov 2018 18:50:26 +0000 (10:50 -0800)
committerFacebook Github Bot <facebook-github-bot@users.noreply.github.com>
Wed, 28 Nov 2018 18:52:30 +0000 (10:52 -0800)
commit7749804099b8a64aea4bf91e298a20976f9b10ad
tree290de4c378fdba73e94a1d13e67e8adcf9165abf
parentc32debb9165b46c3f10a42e0465e7d8e23ed11db
Support Embedding + EmbeddingBag in Script (#14415)

Summary:
Add support for Embedding and EmbeddingBag in script. Both functions use with torch.no_grad(), which we don't have any plans to support in the near future. To work around this, I added an embedding_renorm function without derivatives.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/14415
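
As a hedged illustration of what this change enables (not code from the PR itself), the sketch below scripts a small module that uses nn.Embedding with max_norm set, which exercises the renorm path described above; the module and names here are hypothetical and assume a PyTorch build that includes this support:

```python
import torch
import torch.nn as nn

class TextEncoder(nn.Module):
    """Hypothetical module: an Embedding with max_norm, which internally
    renormalizes rows under no_grad (the path this PR works around)."""
    def __init__(self, vocab_size: int, dim: int):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim, max_norm=1.0)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        return self.embed(tokens)

# With this change, the module can be compiled to TorchScript.
scripted = torch.jit.script(TextEncoder(100, 16))
out = scripted(torch.tensor([1, 2, 3]))
print(out.shape)
```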

Reviewed By: wanchaol

Differential Revision: D13219647

Pulled By: eellison

fbshipit-source-id: c90706aa6fbd48686eb10f3efdb65844be7b8717
aten/src/ATen/core/aten_interned_strings.h
aten/src/ATen/native/Embedding.cpp
aten/src/ATen/native/cuda/Embedding.cu
aten/src/ATen/native/native_functions.yaml
test/test_jit.py
tools/autograd/derivatives.yaml
torch/csrc/jit/script/init.cpp
torch/jit/__init__.py
torch/nn/functional.py
torch/nn/modules/sparse.py