DOC Improve documentation for LayerNorm (#63144)
author AspenStars <AspenStars@outlook.com>
Fri, 13 Aug 2021 13:40:41 +0000 (06:40 -0700)
committer Facebook GitHub Bot <facebook-github-bot@users.noreply.github.com>
Fri, 13 Aug 2021 14:04:40 +0000 (07:04 -0700)
commit 72bc6dc8c31f977036195536292170ac9132751d
tree 12324ac677f8bb8cd608f13c9c68c92f20ff618c
parent aa665e1ab820594359ea653c96ce2ce5e5977327
DOC Improve documentation for LayerNorm (#63144)

Summary:
As noted in this [commit](https://github.com/pytorch/pytorch/pull/59178/commits/7026995f3ca253fbc19bf511d53f48f861799a4a) and this [issue comment](https://github.com/pytorch/pytorch/pull/59178#issuecomment-897485295), [Line 134](https://github.com/deniskokarev/pytorch/blob/47e286d024c183cb26a464447b34fde88b80d17d/torch/nn/modules/normalization.py#L134) overwrites the `embedding` variable, which causes an error when `nn.LayerNorm` is instantiated.
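
For context, the pre-fix docstring example looked roughly like this (reconstructed here for illustration, using the variable names from the linked lines):
```
batch, sentence_length, embedding = 20, 5, 10
# "embedding" is rebound from the int 10 to a tensor, shadowing the dimension value
embedding = torch.randn(batch, sentence_length, embedding)
# nn.LayerNorm now receives a Tensor instead of an int/shape, so construction fails
layer_norm = nn.LayerNorm(embedding)
```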

I suggest renaming `embedding` on [Line 133](https://github.com/deniskokarev/pytorch/blob/47e286d024c183cb26a464447b34fde88b80d17d/torch/nn/modules/normalization.py#L133) to `embedding_dim`.

The final example is:
```
batch, sentence_length, embedding_dim = 20, 5, 10
embedding = torch.randn(batch, sentence_length, embedding_dim)
layer_norm = nn.LayerNorm(embedding_dim)
```
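
For reference, a minimal, self-contained version of the renamed example (the imports, activation call, and shape check are added here for illustration and are not part of the docstring itself):
```
import torch
import torch.nn as nn

batch, sentence_length, embedding_dim = 20, 5, 10
embedding = torch.randn(batch, sentence_length, embedding_dim)

# Normalize over the last dimension (embedding_dim); no name collision now
layer_norm = nn.LayerNorm(embedding_dim)
output = layer_norm(embedding)

print(output.shape)  # torch.Size([20, 5, 10])
```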

Fixes #59178

Pull Request resolved: https://github.com/pytorch/pytorch/pull/63144

Reviewed By: bdhirsh

Differential Revision: D30288778

Pulled By: jbschlosser

fbshipit-source-id: e74b11430e302dae5661bf6e830ee5ac6c1838c4
torch/nn/modules/normalization.py