From 60b13d1f719f556264739e59697981262316a5cc Mon Sep 17 00:00:00 2001
From: Alexander Rodin
Date: Tue, 25 Dec 2018 21:43:38 -0800
Subject: [PATCH] Use at::ones instead of torch::ones in non-differentiable
 example (#15527)

Summary: There was a typo in the C++ docs.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/15527

Differential Revision: D13547858

Pulled By: soumith

fbshipit-source-id: 1f5250206ca6e13b1b1443869b1e1c837a756cb5
---
 docs/cpp/source/index.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/cpp/source/index.rst b/docs/cpp/source/index.rst
index 572ed03..f620ff420 100644
--- a/docs/cpp/source/index.rst
+++ b/docs/cpp/source/index.rst
@@ -69,7 +69,7 @@ a taste of this interface:
 The ``at::Tensor`` class in ATen is not differentiable by default. To add the
 differentiability of tensors the autograd API provides, you must use tensor
 factory functions from the `torch::` namespace instead of the `at` namespace.
-For example, while a tensor created with `torch::ones` will not be differentiable,
+For example, while a tensor created with `at::ones` will not be differentiable,
 a tensor created with `torch::ones` will be.
 
 C++ Frontend
-- 
2.7.4