From: Bharat123rox
Date: Wed, 17 Apr 2019 11:31:46 +0000 (-0700)
Subject: Add rst entry for nn.MultiheadAttention (#19346)
X-Git-Tag: accepted/tizen/6.5/unified/20211028.231830~200
X-Git-Url: http://review.tizen.org/git/?a=commitdiff_plain;h=3fcee4875c4383bff78917284875f180ad41903c;p=platform%2Fupstream%2Fpytorch.git

Add rst entry for nn.MultiheadAttention (#19346)

Summary:
Fix #19259 by adding the missing `autoclass` entry for `nn.MultiheadAttention` from [here](https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/activation.py#L676)

Pull Request resolved: https://github.com/pytorch/pytorch/pull/19346

Differential Revision: D14971426

Pulled By: soumith

fbshipit-source-id: ceaaa8ea4618c38fa2bff139e7fa0d6c9ea193ea
---

diff --git a/docs/source/nn.rst b/docs/source/nn.rst
index 08b3358..1d4bc3e 100644
--- a/docs/source/nn.rst
+++ b/docs/source/nn.rst
@@ -308,6 +308,11 @@ Non-linear activations (weighted sum, nonlinearity)
 .. autoclass:: LogSigmoid
     :members:
 
+:hidden:`MultiheadAttention`
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+.. autoclass:: MultiheadAttention
+    :members:
+
 :hidden:`PReLU`
 ~~~~~~~~~~~~~~~
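
For context, the class this commit documents can be exercised as below. This is a minimal sketch, assuming the `torch.nn.MultiheadAttention` interface as of this commit's era, where inputs use the `(seq_len, batch, embed_dim)` layout and the returned attention weights are averaged over heads; the dimension sizes chosen here are illustrative.

```python
import torch
import torch.nn as nn

# Illustrative sizes (assumptions, not from the commit itself).
embed_dim, num_heads = 16, 4
attn = nn.MultiheadAttention(embed_dim, num_heads)

seq_len, batch = 5, 2
# Self-attention: query, key, and value all come from the same sequence.
query = torch.randn(seq_len, batch, embed_dim)
key = torch.randn(seq_len, batch, embed_dim)
value = torch.randn(seq_len, batch, embed_dim)

attn_output, attn_weights = attn(query, key, value)
print(attn_output.shape)   # same layout as the query: (seq_len, batch, embed_dim)
print(attn_weights.shape)  # head-averaged weights: (batch, seq_len, seq_len)
```

The `autoclass` directive added by the diff makes these call signatures and shapes render in the generated nn docs.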