pow scalar exponent / base autodiff, fusion (#19324)
authorThomas Viehmann <tv.code@beamnet.de>
Fri, 19 Apr 2019 00:52:33 +0000 (17:52 -0700)
committerFacebook Github Bot <facebook-github-bot@users.noreply.github.com>
Fri, 19 Apr 2019 00:58:35 +0000 (17:58 -0700)
commitb9291f55bb5e16e5acb9c8e9e2cb69bac56e8ab1
treeb3bfbb629ca4d5542379174ea5848b6d4eb87c93
parentb4fa979a37749acf5d7d4e7cb30cf7ac4f5fae95

Summary:
Fixes: #19253

Fixing pow(Tensor, float) is straightforward.
The breakage for pow(float, Tensor) is a bit more subtle to trigger, and fixing it requires `torch.log` (`math.log` did not work) from the newly merged #19115. (Thanks ngimel for pointing out that this has landed.)
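The two cases need different backward formulas, which is why pow(float, Tensor) is the subtler one: its gradient contains a log of the base. A minimal stdlib-only sketch of the math (not the actual symbolic_script.cpp code; function names here are illustrative), checked against finite differences:

```python
import math

def pow_tensor_scalar_grad(x, p):
    # pow(Tensor, float): d/dx x**p = p * x**(p-1)
    return p * x ** (p - 1)

def pow_scalar_tensor_grad(b, x):
    # pow(float, Tensor): d/dx b**x = b**x * log(b)
    # This log(b) term is the reason the symbolic backward needs a
    # log that works inside the traced graph (torch.log), rather
    # than Python-level math.log.
    return b ** x * math.log(b)

# Finite-difference sanity check of both formulas.
eps = 1e-6
x, p, b = 2.0, 3.0, 2.0
fd_tensor_scalar = ((x + eps) ** p - (x - eps) ** p) / (2 * eps)
fd_scalar_tensor = (b ** (x + eps) - b ** (x - eps)) / (2 * eps)
assert abs(fd_tensor_scalar - pow_tensor_scalar_grad(x, p)) < 1e-4
assert abs(fd_scalar_tensor - pow_scalar_tensor_grad(b, x)) < 1e-4
```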
Pull Request resolved: https://github.com/pytorch/pytorch/pull/19324

Differential Revision: D15003531

Pulled By: ailzhang

fbshipit-source-id: 8b22138fa27a43806b82886fb3a7b557bbb5a865
test/test_jit.py
torch/csrc/jit/passes/graph_fuser.cpp
torch/csrc/jit/symbolic_script.cpp