Revert D30745961: [DDP] Remove self.modules_params
author Howard Huang <howardhuang@fb.com>
Thu, 9 Sep 2021 15:20:40 +0000 (08:20 -0700)
committer Facebook GitHub Bot <facebook-github-bot@users.noreply.github.com>
Thu, 9 Sep 2021 15:23:14 +0000 (08:23 -0700)
Test Plan: revert-hammer

Differential Revision:
D30745961 (https://github.com/pytorch/pytorch/commit/8c095102948c9601792a884dad56da5085c51bee)

Original commit changeset: 32d102502570

fbshipit-source-id: 59f7cc50d369b6cc2856cf4ebd0f58b96202336d

torch/nn/parallel/distributed.py

index 734d42c..a1e24b1 100644
@@ -734,6 +734,9 @@ class DistributedDataParallel(Module, Joinable):
             for replica in modules_and_parameters
         ]
 
+        # The following modules_params and modules_buffers are used for
+        # param/buffer sync in _sync_params.
+        self.modules_params = [list(self._get_parameters(self.module))]
         self._assign_modules_buffers()
 
         return parameters, expect_sparse_gradient
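For context, the restored line collects every parameter of the wrapped module into a per-replica list so that a later sync step can iterate over them. The sketch below illustrates that bookkeeping in plain Python; `ToyModule`, `get_parameters`, and `sync_params` are illustrative stand-ins for the real `nn.Module` / DDP internals (`_get_parameters`, `_sync_params`), not the actual PyTorch code:

```python
# Illustrative stand-ins for the nn.Module / DDP internals; not the real code.

class ToyModule:
    """A tiny module holding parameters and submodules (stand-in for nn.Module)."""
    def __init__(self, params, children=()):
        self._params = list(params)       # plain floats standing in for tensors
        self._children = list(children)

    def parameters(self):
        # Depth-first traversal, mirroring nn.Module.parameters()
        yield from self._params
        for child in self._children:
            yield from child.parameters()


def get_parameters(module):
    """Flatten all parameters of a module tree (cf. DDP's _get_parameters)."""
    return list(module.parameters())


def sync_params(modules_params, src_params):
    """Overwrite each replica's parameter values with the source values,
    loosely mimicking the rank-0 broadcast done during param sync."""
    for replica_params in modules_params:
        for i, value in enumerate(src_params):
            replica_params[i] = value
    return modules_params


child = ToyModule([3.0])
root = ToyModule([1.0, 2.0], children=[child])

# Single-process DDP keeps one replica: a list containing one parameter list,
# analogous to self.modules_params = [list(self._get_parameters(self.module))].
modules_params = [get_parameters(root)]
print(modules_params)  # [[1.0, 2.0, 3.0]]
```

Keeping this flattened list cached on `self` means the sync path does not have to re-walk the module tree on every call; the revert restores that cache alongside the existing `_assign_modules_buffers()` bookkeeping for buffers.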