fix double backward for half softmax/logsoftmax (#17330)
author Natalia Gimelshein <ngimelshein@nvidia.com>
Thu, 21 Feb 2019 22:35:20 +0000 (14:35 -0800)
committer Facebook Github Bot <facebook-github-bot@users.noreply.github.com>
Thu, 21 Feb 2019 22:58:45 +0000 (14:58 -0800)
commit 5fa78303edcfcc841c30ef27462f80904b3c79a2
tree e013d6e13a975ca57f9fc44c8324a96a5bfcd3e7
parent 9101dfc57ccb6b6931b4e80233bbc64d9080d2e8

Summary:
Fix for #17261. SsnL, do you have tests for this in your other PR? If not, I'll add them here. The example from #17261 no longer errors out (and the same holds for log_softmax); see the sketch below.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/17330
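
For context, a minimal sketch of the failing pattern (the exact repro in #17261 may differ; the shapes and the loss used here are illustrative assumptions, and a CUDA device is required since half softmax is a GPU path):

    import torch
    import torch.nn.functional as F

    # Illustrative sketch, not the exact code from #17261:
    # double backward through softmax on a half-precision tensor.
    x = torch.randn(4, 8, device="cuda", dtype=torch.half, requires_grad=True)
    y = F.softmax(x, dim=1)   # same pattern applies to F.log_softmax
    loss = (y * y).sum()

    # First backward, keeping the graph so we can differentiate again.
    (grad_x,) = torch.autograd.grad(loss, x, create_graph=True)

    # Second backward: this is the step that previously errored for half inputs.
    grad_x.sum().backward()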

Differential Revision: D14171529

Pulled By: soumith

fbshipit-source-id: ee925233feb1b44ef9f1d757db59ca3601aadef2
test/test_nn.py
tools/autograd/derivatives.yaml