Fix precision issue with expansion that prefers 'probs' over 'logits' (#18614)
author Ahmad Salim Al-Sibahi <ahmad@di.ku.dk>
Fri, 5 Apr 2019 19:45:37 +0000 (12:45 -0700)
committer Facebook Github Bot <facebook-github-bot@users.noreply.github.com>
Fri, 5 Apr 2019 20:07:01 +0000 (13:07 -0700)
commit 8e1e29124de99c01d08a2e2c02455c72335a971d
tree 2d8afa113bfa1faba5e156579f410708227586f9
parent b90cbb841d98aeae435b1d02b1ff211e028cdf5b
Fix precision issue with expansion that prefers 'probs' over 'logits' (#18614)

Summary:
I have observed that sometimes both `probs` and `logits` were present in `__dict__`, but expansion chose to copy `probs`, which loses precision compared to `logits`. This matters especially when training (Bayesian) neural networks or performing other kinds of optimization, since the loss is heavily affected.
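
A hedged illustration of the issue (a sketch against the public `torch.distributions` API, not the patch itself): once `.probs` has been read on a distribution constructed from `logits`, both tensors sit in `__dict__`, and recovering `logits` from the derived `probs` is lossy.

```python
import torch
from torch.distributions import Bernoulli

# Construct from logits; reading .probs afterwards caches a derived `probs`
# tensor, so both parameterizations end up in d.__dict__.
logits = torch.tensor([-30.0, 0.0, 30.0])
d = Bernoulli(logits=logits)
_ = d.probs

# Recovering logits from the cached probs is lossy: sigmoid(30.) rounds to 1.0
# in float32, so the round trip degenerates to inf.
round_trip = torch.log(d.probs) - torch.log1p(-d.probs)
print(round_trip)  # roughly tensor([-30., 0., inf])
print(d.logits)    # tensor([-30., 0., 30.]) -- original values intact

# With this fix, expand() copies the original `logits` (when present) instead
# of the derived `probs`, so the expanded distribution keeps full precision.
expanded = d.expand(torch.Size([2, 3]))
print(expanded.logits)
```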
Pull Request resolved: https://github.com/pytorch/pytorch/pull/18614

Differential Revision: D14793486

Pulled By: ezyang

fbshipit-source-id: d4ff5e34fbb4021ea9de9f58af09a7de00d80a63
torch/distributions/bernoulli.py
torch/distributions/binomial.py
torch/distributions/categorical.py
torch/distributions/geometric.py
torch/distributions/negative_binomial.py
torch/distributions/relaxed_bernoulli.py