ENH Adds no_batch_dim for NLLLoss (#62651)
author Thomas J. Fan <thomasjpfan@gmail.com>
Tue, 24 Aug 2021 15:26:21 +0000 (08:26 -0700)
committer Facebook GitHub Bot <facebook-github-bot@users.noreply.github.com>
Tue, 24 Aug 2021 15:27:27 +0000 (08:27 -0700)
commit 2ca2761f3c448e58d587b440d459d328ae8efc95
tree e85b1df7edfc8ad4a523eeadc5a2c8dec6caa276
parent d3be02d1004aa84cb2e001a2d72a04356d4d4cd3
ENH Adds no_batch_dim for NLLLoss (#62651)

Summary:
Towards https://github.com/pytorch/pytorch/issues/60585
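A minimal sketch of the behavior this change enables, assuming the standard `torch.nn.NLLLoss` API: with no-batch-dim support, the module accepts an unbatched input of shape `(C,)` with a scalar class target, in addition to the usual batched `(N, C)` input with `(N,)` targets.

```python
import torch
from torch import nn

loss_fn = nn.NLLLoss()  # default reduction="mean"

# Batched usage: input of shape (N, C) log-probabilities, target of shape (N,)
batched_input = torch.log_softmax(torch.randn(3, 5), dim=1)
batched_target = torch.tensor([1, 0, 4])
batched_loss = loss_fn(batched_input, batched_target)

# Unbatched usage (what no_batch_dim enables): input of shape (C,),
# target is a scalar class index
input_ = torch.log_softmax(torch.randn(5), dim=0)
target = torch.tensor(2)
loss = loss_fn(input_, target)

# For a single unbatched sample, the mean-reduced loss is just the
# negative log-probability of the target class
assert torch.allclose(loss, -input_[target])
```

Both calls return a 0-dim tensor under the default `"mean"` reduction; the unbatched path treats the input as a single sample rather than requiring a leading batch dimension of 1.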

Pull Request resolved: https://github.com/pytorch/pytorch/pull/62651

Reviewed By: VitalyFedyunin

Differential Revision: D30303340

Pulled By: jbschlosser

fbshipit-source-id: 7ab478cf63bf6cd1f850cad5fd101e74a2cfe3f5
aten/src/ATen/native/LossNLL.cpp
aten/src/ATen/native/cuda/Loss.cu
torch/nn/modules/loss.py
torch/testing/_internal/common_nn.py