Allow TransformerEncoder and TransformerDecoder to accept 0-dim batch sized tensors...
authorSameer Deshmukh <sameer.deshmukh93@gmail.com>
Fri, 13 Aug 2021 23:08:01 +0000 (16:08 -0700)
committerFacebook GitHub Bot <facebook-github-bot@users.noreply.github.com>
Fri, 13 Aug 2021 23:11:57 +0000 (16:11 -0700)
commit809e1e7457b011f3e6528304d9f8461b1cb3475b
tree87e999ba829ccb5559c44f514256e7283412194f
parentab7a47298039da3b60b4e33cbb70c5fc06e1a315
Allow TransformerEncoder and TransformerDecoder to accept 0-dim batch sized tensors. (#62800)

Summary:
This PR fixes part of https://github.com/pytorch/pytorch/issues/12013, which is summarized concretely in https://github.com/pytorch/pytorch/issues/38115.

This PR allows TransformerEncoder and TransformerDecoder (along with the inner `Layer` classes) to accept inputs whose batch dimension has size 0.
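
A minimal sketch (not taken from the PR) of the behavior this enables, assuming the default (seq, batch, feature) layout and illustrative hyperparameters (d_model=8, nhead=2, num_layers=2):

```python
import torch
import torch.nn as nn

# Build a small encoder; the hyperparameters here are illustrative only.
encoder_layer = nn.TransformerEncoderLayer(d_model=8, nhead=2)
encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)

# Input with an empty batch dimension: (seq_len, batch, d_model) = (5, 0, 8).
src = torch.randn(5, 0, 8)

# With this change, the forward pass succeeds and preserves the empty batch.
out = encoder(src)
print(out.shape)  # torch.Size([5, 0, 8])
```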

Pull Request resolved: https://github.com/pytorch/pytorch/pull/62800

Reviewed By: VitalyFedyunin

Differential Revision: D30303240

Pulled By: jbschlosser

fbshipit-source-id: 8f8082a6f2a9f9d7ce0b22a942d286d5db62bd12
test/test_nn.py
torch/nn/functional.py