From 3fcee4875c4383bff78917284875f180ad41903c Mon Sep 17 00:00:00 2001
From: Bharat123rox
Date: Wed, 17 Apr 2019 04:31:46 -0700
Subject: [PATCH] Add rst entry for nn.MultiheadAttention (#19346)

Summary:
Fix #19259 by adding the missing `autoclass` entry for `nn.MultiheadAttention` from [here](https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/activation.py#L676)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/19346

Differential Revision: D14971426

Pulled By: soumith

fbshipit-source-id: ceaaa8ea4618c38fa2bff139e7fa0d6c9ea193ea
---
 docs/source/nn.rst | 5 +++++
 1 file changed, 5 insertions(+)

diff --git a/docs/source/nn.rst b/docs/source/nn.rst
index 08b3358..1d4bc3e 100644
--- a/docs/source/nn.rst
+++ b/docs/source/nn.rst
@@ -308,6 +308,11 @@ Non-linear activations (weighted sum, nonlinearity)
 .. autoclass:: LogSigmoid
     :members:
 
+:hidden:`MultiheadAttention`
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+.. autoclass:: MultiheadAttention
+    :members:
+
 :hidden:`PReLU`
 ~~~~~~~~~~~~~~~
-- 
2.7.4
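
The patch above only adds a Sphinx `autoclass` entry for `nn.MultiheadAttention`; the module itself is unchanged. For context, a minimal pure-Python sketch of the single-head scaled dot-product attention that multi-head attention builds on (the function name and list-based tensor representation are illustrative, not part of the patch or of PyTorch's API):

```python
import math

def scaled_dot_product_attention(query, key, value):
    """Single-head scaled dot-product attention over plain lists.

    query: list of L_q vectors; key and value: lists of L_k vectors.
    Returns (output, weights): output has L_q vectors with the same
    dimension as the value vectors; weights is an L_q x L_k matrix of
    attention probabilities (each row sums to 1).
    """
    d = len(key[0])           # key dimension, used for the 1/sqrt(d) scale
    scale = 1.0 / math.sqrt(d)
    d_v = len(value[0])
    output, weights = [], []
    for q in query:
        # Scaled dot products between this query and every key.
        scores = [scale * sum(qi * ki for qi, ki in zip(q, k)) for k in key]
        # Softmax, shifted by the row max for numerical stability.
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        total = sum(exps)
        probs = [e / total for e in exps]
        weights.append(probs)
        # Output row is the probability-weighted sum of the value vectors.
        output.append([sum(p * v[j] for p, v in zip(probs, value))
                       for j in range(d_v)])
    return output, weights
```

`nn.MultiheadAttention` runs several such heads in parallel on learned linear projections of the query, key, and value, then concatenates and re-projects the per-head outputs; the class documented by the new `autoclass` entry exposes exactly that via its `forward(query, key, value)` signature.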