Import MultiheadAttention to PyTorch (#18334)
authorGuanheng Zhang <zhangguanheng@fb.com>
Thu, 11 Apr 2019 15:04:32 +0000 (08:04 -0700)
committerFacebook Github Bot <facebook-github-bot@users.noreply.github.com>
Thu, 11 Apr 2019 15:07:30 +0000 (08:07 -0700)
commit4b20fc826da7d15c67ab632d95b0a22ea55daa84
tree75688d2a9eba7fbfeff79101f2c59cdccdac5802
parentb6f130aa709030c0cf46cbd5082a3bbdfea59a8b
Import MultiheadAttention to PyTorch (#18334)

Summary:
Import MultiheadAttention into the core PyTorch framework.
Users can now import MultiheadAttention directly from torch.nn.
See "Attention Is All You Need" (Vaswani et al., 2017) for more details on the multi-head attention function.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/18334

Differential Revision: D14577966

Pulled By: zhangguanheng66

fbshipit-source-id: 756c0deff623f3780651d9f9a70ce84516c806d3
test/test_nn.py
torch/nn/modules/__init__.py
torch/nn/modules/activation.py
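
As a minimal sketch of what this change enables: after this commit, `MultiheadAttention` is importable from `torch.nn` and can be called on `(seq_len, batch, embed_dim)`-shaped query/key/value tensors. The dimensions below are arbitrary illustration values, not from the commit.

```python
import torch
import torch.nn as nn

# MultiheadAttention is now available directly from torch.nn.
embed_dim, num_heads = 16, 4
mha = nn.MultiheadAttention(embed_dim, num_heads)

# Default input layout is (seq_len, batch, embed_dim).
seq_len, batch = 5, 2
query = torch.randn(seq_len, batch, embed_dim)
key = torch.randn(seq_len, batch, embed_dim)
value = torch.randn(seq_len, batch, embed_dim)

# Returns the attended output and the (batch-averaged) attention weights.
attn_output, attn_weights = mha(query, key, value)
print(attn_output.shape)   # (seq_len, batch, embed_dim)
print(attn_weights.shape)  # (batch, seq_len, seq_len)
```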