Turn off layer norm in jit symbolic differentiation (#63816)
author    Xiaodong Wang <xdwang@fb.com>
Tue, 24 Aug 2021 22:45:59 +0000 (15:45 -0700)
committer Facebook GitHub Bot <facebook-github-bot@users.noreply.github.com>
Tue, 24 Aug 2021 22:47:13 +0000 (15:47 -0700)
commit    6d58c830072eda05786a336c252b624c105a6fac
tree      15112f23b9471c88017a2e6742f66f3fc081195d
parent    41ffec07ce8abfffd3f4f450ae442a8f5982a074
Turn off layer norm in jit symbolic differentiation (#63816)

Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/63816
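Context: this change removes the hand-written symbolic gradient for layer_norm from torch/csrc/jit/runtime/symbolic_script.cpp, so scripted graphs fall back to autograd's native layer_norm backward instead of a fused differentiable subgraph. A minimal sketch of how one might observe the effect (torch.jit.last_executed_optimized_graph is a real but debugging-oriented helper; the claim about graph contents is an illustration, not part of this diff):

```python
import torch
import torch.nn.functional as F

@torch.jit.script
def fn(x: torch.Tensor) -> torch.Tensor:
    return F.layer_norm(x, [8])

x = torch.randn(4, 8, requires_grad=True)
# Run a few iterations so the profiling executor specializes the graph
# and decides how to differentiate it.
for _ in range(3):
    fn(x).sum().backward()

# With the symbolic gradient turned off, layer_norm should no longer be
# absorbed into a prim::DifferentiableGraph node; autograd handles backward.
print(torch.jit.last_executed_optimized_graph())
```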

Test Plan:
Confirmed this can rescue the NE (normalized entropy) regression:

https://www.internalfb.com/mast/job/torchx_xdwang-SparseNNApplication_72cf593d

Reviewed By: ngimel

Differential Revision: D30498746

fbshipit-source-id: 4a387f32ee2f70685de6104459c7f21bfbddc187
torch/csrc/jit/runtime/symbolic_script.cpp
torch/testing/_internal/jit_metaprogramming_utils.py
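For reference, gradient formulas in symbolic_script.cpp are embedded as TorchScript source strings that pair a forward computation with a backward closure. The sketch below is only illustrative of that format; the signature and the use of torch.native_layer_norm_backward are assumptions, not the actual removed entry:

```python
# Illustrative shape of a symbolic_script.cpp entry (hypothetical, not the
# real diff): the forward returns its result plus a backward closure.
def layer_norm(input, normalized_shape: List[int], weight, bias, eps: float):
    output, mean, rstd = torch.native_layer_norm(
        input, normalized_shape, weight, bias, eps)

    def backward(grad_output):
        # Assumed signature; the mask selects which gradients to compute.
        grad_input, grad_weight, grad_bias = torch.native_layer_norm_backward(
            grad_output, input, normalized_shape, mean, rstd, weight, bias,
            [True, True, True])
        return grad_input, None, grad_weight, grad_bias, None

    return output, backward
```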