Use at::ones instead of torch::ones in non-differentiable example (#15527)
author Alexander Rodin <rodin.alexander@gmail.com>
Wed, 26 Dec 2018 05:43:38 +0000 (21:43 -0800)
committer Facebook Github Bot <facebook-github-bot@users.noreply.github.com>
Wed, 26 Dec 2018 05:50:17 +0000 (21:50 -0800)
Summary:
There was a typo in the C++ docs: the example of a non-differentiable tensor used `torch::ones` where `at::ones` was intended.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/15527

Differential Revision: D13547858

Pulled By: soumith

fbshipit-source-id: 1f5250206ca6e13b1b1443869b1e1c837a756cb5

docs/cpp/source/index.rst

index 572ed03..f620ff4 100644
@@ -69,7 +69,7 @@ a taste of this interface:
 The ``at::Tensor`` class in ATen is not differentiable by default. To add the
 differentiability of tensors the autograd API provides, you must use tensor
 factory functions from the `torch::` namespace instead of the `at` namespace.
-For example, while a tensor created with `torch::ones` will not be differentiable,
+For example, while a tensor created with `at::ones` will not be differentiable,
 a tensor created with `torch::ones` will be.
 
 C++ Frontend
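
For illustration, a minimal C++ sketch of the behavior the corrected sentence describes, assuming a standard libtorch setup (the file layout and values here are illustrative, not part of the commit):

    #include <torch/torch.h>
    #include <iostream>

    int main() {
      // at:: factory functions produce plain ATen tensors with no autograd
      // metadata; requires_grad() reports false.
      at::Tensor a = at::ones({2, 2});
      std::cout << "at::ones requires_grad: " << a.requires_grad() << "\n";  // 0

      // torch:: factory functions accept autograd options; with
      // torch::requires_grad() the tensor participates in autograd.
      torch::Tensor b = torch::ones({2, 2}, torch::requires_grad());
      std::cout << "torch::ones requires_grad: " << b.requires_grad() << "\n";  // 1

      // Gradients flow back to b through ordinary tensor operations.
      auto loss = (b * b).sum();
      loss.backward();
      std::cout << b.grad() << "\n";  // a 2x2 tensor of 2s
    }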