Fix sum_to behavior with zero dimensions (#15796)
author Richard Zou <zou3519@gmail.com>
Tue, 8 Jan 2019 20:34:43 +0000 (12:34 -0800)
committer Facebook Github Bot <facebook-github-bot@users.noreply.github.com>
Tue, 8 Jan 2019 21:19:54 +0000 (13:19 -0800)
commit 196eee6ccd9bbab46d712c863265a433832b8f9c
tree c7186703686ec066f9dfc0324bcfd89671ca127b
parent 734eb31035c45079a3b4a6a2aac28fb7a28773cf
Fix sum_to behavior with zero dimensions (#15796)

Summary:
Fixes #15223.

This fixes an autograd bug where backprop either fails or produces
gradients of incorrect sizes when tensors with zero-sized dimensions are
involved.

Previously, when summing a gradient down to a target size in autograd, we only
reduced along dimensions whose size was greater than 1. This is incorrect: we
must also reduce along dimensions of size 0, producing a tensor of size 1 in
that dimension, which is then viewed to the correct shape.
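A minimal Python sketch of the reduction rule described above (this is not the
actual ATen implementation in aten/src/ATen/ExpandUtils.h; the helper name
sum_to_sketch and its exact loop structure are illustrative assumptions):

    import torch

    def sum_to_sketch(tensor, shape):
        # Illustrative sketch only: reduce every broadcasted dimension,
        # including size-0 dimensions, so the result can be viewed to `shape`.
        leading = tensor.dim() - len(shape)
        reduce_dims = list(range(leading))  # broadcasted leading dimensions
        for i in range(leading, tensor.dim()):
            # Reduce whenever the target size is 1 and the actual size differs;
            # this covers size 0 as well as sizes greater than 1.
            if shape[i - leading] == 1 and tensor.size(i) != 1:
                reduce_dims.append(i)
        if reduce_dims:
            tensor = tensor.sum(dim=reduce_dims, keepdim=True)
        return tensor.view(shape) if leading > 0 else tensor

    # Example: summing a (2, 0, 4) gradient down to a (1, 4) input shape.
    grad = torch.zeros(2, 0, 4)
    assert sum_to_sketch(grad, (1, 4)).shape == (1, 4)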
Pull Request resolved: https://github.com/pytorch/pytorch/pull/15796

Differential Revision: D13593199

Pulled By: zou3519

fbshipit-source-id: 2e2acac34943a9b7fabadc10c9efd4f66db298fd
aten/src/ATen/ExpandUtils.h
test/test_autograd.py