[DDP] Remove self.modules_params (#64473)
authorRohan Varma <rvarm1@fb.com>
Thu, 9 Sep 2021 02:13:33 +0000 (19:13 -0700)
committerFacebook GitHub Bot <facebook-github-bot@users.noreply.github.com>
Thu, 9 Sep 2021 02:16:13 +0000 (19:16 -0700)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/64473

Unused after SPMD support was deprecated.
ghstack-source-id: 137526305

Test Plan: CI

Reviewed By: zhaojuanmao

Differential Revision: D30745961

fbshipit-source-id: 32d102502570291e01579e5b47a6d74dc71013bb

torch/nn/parallel/distributed.py

index a1e24b1..734d42c 100644 (file)
@@ -734,9 +734,6 @@ class DistributedDataParallel(Module, Joinable):
             for replica in modules_and_parameters
         ]
 
-        # The following modules_params and modules_buffers are used for
-        # param/buffer sync in _sync_params.
-        self.modules_params = [list(self._get_parameters(self.module))]
         self._assign_modules_buffers()
 
         return parameters, expect_sparse_gradient
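
For context, the removed attribute cached `[list(self._get_parameters(self.module))]` at construction time; the same flat parameter list can be recomputed on demand from the wrapped module, so the cache is redundant. A minimal sketch of that idea, using hypothetical stand-in classes (not the real `torch.nn.Module`/`Parameter`):

```python
# Hypothetical stand-ins for torch.nn classes, to illustrate the removed cache.

class Param:
    """Stand-in for torch.nn.Parameter."""
    def __init__(self, name):
        self.name = name

class Module:
    """Minimal stand-in for torch.nn.Module with a recursive parameters()."""
    def __init__(self, params=(), children=()):
        self._params = list(params)
        self._children = list(children)

    def parameters(self):
        # Yield own parameters, then recurse into child modules,
        # mirroring how torch.nn.Module.parameters() flattens the tree.
        for p in self._params:
            yield p
        for child in self._children:
            yield from child.parameters()

# Build a tiny module tree and recover the flat parameter list lazily,
# instead of caching it in self.modules_params at wrapper-construction time.
child = Module(params=[Param("child.weight")])
root = Module(params=[Param("root.weight"), Param("root.bias")],
              children=[child])

modules_params = [list(root.parameters())]  # what the removed attribute held
print([p.name for p in modules_params[0]])
```

Because the list is cheap to rebuild from `module.parameters()` whenever it is needed, dropping the eagerly-built cache simplifies `DistributedDataParallel` without changing behavior.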