Fix for double backwards tests (#18190)
author Igor Fedan <ifedan@fb.com>
Mon, 1 Apr 2019 19:30:29 +0000 (12:30 -0700)
committer Facebook Github Bot <facebook-github-bot@users.noreply.github.com>
Mon, 1 Apr 2019 19:33:30 +0000 (12:33 -0700)
commit d6c269c33e73f47cd5f244d7ec989629f1fe592e
tree 36128ae26fd43a2c64fba51e97b72369cdbfd234
parent be01c907972367fc421acf86dd825caf662e448a

Summary:
If none of the outputs requires grad, we don't actually run the gradgrad check; instead, we verify that their numerical gradients are 0.
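
A minimal sketch of that fallback, assuming hypothetical helper names (numerical_grad_is_zero, gradgradcheck_sketch); the actual change lives in torch/autograd/gradcheck.py and test/test_autograd.py:

import torch

def numerical_grad_is_zero(fn, inp, eps=1e-6, atol=1e-5):
    # Hypothetical helper: central-difference estimate of d fn(inp) / d inp,
    # checked against zero. Illustration only, not the real gradcheck code.
    flat = inp.detach().clone().view(-1)
    for i in range(flat.numel()):
        orig = flat[i].item()
        flat[i] = orig + eps
        out_plus = fn(flat.view_as(inp)).double().sum()
        flat[i] = orig - eps
        out_minus = fn(flat.view_as(inp)).double().sum()
        flat[i] = orig
        if abs((out_plus - out_minus).item()) / (2 * eps) > atol:
            return False
    return True

def gradgradcheck_sketch(fn, inp):
    out = fn(inp)
    if not out.requires_grad:
        # No differentiable outputs: rather than skipping, assert that the
        # numerical gradient is zero, as the summary above describes.
        return numerical_grad_is_zero(fn, inp)
    # Otherwise, fall through to the usual double-backward check (omitted).
    ...

# Example: argmax produces an integer output that never requires grad,
# so the zero-gradient path is taken.
x = torch.randn(4, requires_grad=True)
assert gradgradcheck_sketch(lambda t: t.argmax(), x)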
Pull Request resolved: https://github.com/pytorch/pytorch/pull/18190

Differential Revision: D14563388

Pulled By: ifedan

fbshipit-source-id: a4eb94c9eb60f14dbe6986cd8cef1fe78a7bc839
test/test_autograd.py
torch/autograd/gradcheck.py