AD support for adaptive_avg_pool2d (#15459)
authorAiling Zhang <ailzhang@fb.com>
Fri, 21 Dec 2018 23:32:44 +0000 (15:32 -0800)
committerFacebook Github Bot <facebook-github-bot@users.noreply.github.com>
Fri, 21 Dec 2018 23:38:24 +0000 (15:38 -0800)
Summary:
This adds AD support for adaptive_avg_pool2d, which is necessary for resnet50 in pytorch/vision:master. cc: soumith asuhan dlibenzi

apaszke I saw the autodiff bug you fixed in #15403; since it doesn't prevent this PR from passing, I'll leave it for your PR to fix. :)
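
A minimal sketch (not part of this PR; the shapes and pooled size are illustrative assumptions) of how the new symbolic gradient can be exercised: script a function that calls adaptive_avg_pool2d and compare its gradient against eager mode.

    import torch
    import torch.nn.functional as F

    @torch.jit.script
    def pooled_sum(x):
        # inside TorchScript, torch.adaptive_avg_pool2d resolves to the aten op
        # that this PR teaches the autodiff pass to differentiate
        return torch.adaptive_avg_pool2d(x, [1, 1]).sum()

    x = torch.randn(2, 3, 8, 8, requires_grad=True)
    pooled_sum(x).backward()
    grad_scripted = x.grad.clone()

    x.grad = None
    F.adaptive_avg_pool2d(x, (1, 1)).sum().backward()
    print(torch.allclose(grad_scripted, x.grad))  # expected: True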
Pull Request resolved: https://github.com/pytorch/pytorch/pull/15459

Differential Revision: D13534732

Pulled By: ailzhang

fbshipit-source-id: 4e48b93e35d5ecfe7bd64b6a132a55b07843f206

test/test_jit.py
torch/csrc/jit/symbolic_script.cpp

index ac5d387..4104249 100644
--- a/test/test_jit.py
+++ b/test/test_jit.py
@@ -9642,6 +9642,7 @@ EXCLUDE_SCRIPT_MODULES = {
 
 DISABLE_AUTODIFF_SUBGRAPH_INLINING = {
     'test_nn_avg_pool2d',
+    'test_nn_adaptive_avg_pool2d',
     'test_nn_log_softmax',
     'test_nn_threshold',
     'test_nn_nll_loss',
index dd670ed..12e45b5 100644
--- a/torch/csrc/jit/symbolic_script.cpp
+++ b/torch/csrc/jit/symbolic_script.cpp
@@ -13,6 +13,14 @@ namespace torch { namespace jit {
                 grad_other = (grad_output * self).sum_to_size(other.size())
                 return grad_self, grad_other
             return self * other, backward
+
+        def adaptive_avg_pool2d(self,
+                                output_size: List[int]):
+            def backward(grad_output):
+                grad_self = torch.adaptive_avg_pool2d_backward(grad_output, self)
+                return grad_self, None
+
+            return torch.adaptive_avg_pool2d(self, output_size), backward
       )"
     };
     std::unordered_map<std::string, GradientPair> schema_to_graphs;
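
For intuition, the backward added above simply calls adaptive_avg_pool2d_backward, which spreads each output gradient uniformly over the input elements that were averaged into that output (and returns None for output_size, which has no gradient). A minimal eager-mode sketch of that behaviour, assuming an input whose size is evenly divided by the output size:

    import torch
    import torch.nn.functional as F

    # 4x4 input pooled to 2x2: each output element averages one 2x2 block
    x = torch.randn(1, 1, 4, 4, requires_grad=True)
    F.adaptive_avg_pool2d(x, (2, 2)).sum().backward()
    # every input element feeds exactly one 2x2 region, so with an upstream
    # gradient of ones each input element receives 1/4
    print(torch.allclose(x.grad, torch.full_like(x, 0.25)))  # expected: True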