Use move to avoid copying (#17188)
author    Michael Liu <michaeluw@fb.com>
          Tue, 19 Feb 2019 15:11:27 +0000 (07:11 -0800)
committer Facebook Github Bot <facebook-github-bot@users.noreply.github.com>
          Tue, 19 Feb 2019 15:14:27 +0000 (07:14 -0800)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/17188

Using flag "-Wreturn-std-move", compiler can identify the cases where a copy
operation
is performed when a move operation would have been available. Wrapped return
statement with std::move to fix.

For some reason, these files were not automatically modded. With D14115372
we should be able to turn on the compile flag.
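
For context, here is a minimal sketch of the pattern the warning targets; the Base/Derived types are hypothetical and not taken from this patch. Pre-C++20, when a returned local's type differs from the function's return type (for example a derived class returned as its base), the implicit move does not apply and the return copies unless std::move is written explicitly:

    #include <utility>
    #include <vector>

    struct Base {
      std::vector<int> data;
    };

    struct Derived : Base {};

    Base make_copy() {
      Derived d;
      return d;             // clang -Wreturn-std-move: the Base subobject is copied
    }

    Base make_move() {
      Derived d;
      return std::move(d);  // moves instead; no deep copy of `data`
    }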

Reviewed By: soumith

Differential Revision: D14115786

fbshipit-source-id: e763b92eecbe4468027fc141d029618d1e9f280b

torch/csrc/autograd/VariableTypeManual.cpp

index 26c2b8b..c0a801f 100644
@@ -341,7 +341,7 @@ Tensor VariableType::detach(const Tensor & self) const {
   if (jit::tracer::isTracing()) {
     jit::tracer::addOutput(node, result);
   }
-  return result;
+  return std::move(result);
 }
 
 Tensor & VariableType::detach_(Tensor & self) const {
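
Note that the std::move is only warranted because the types differ here (the local result is presumably a Variable while the declared return type is Tensor). When the local already has the function's return type, adding std::move is the opposite mistake: it defeats copy elision, and Clang flags it with -Wpessimizing-move. A contrast sketch with a made-up Buffer type:

    #include <utility>
    #include <vector>

    struct Buffer {
      std::vector<char> bytes;
    };

    Buffer same_type() {
      Buffer b;
      return b;              // fine: NRVO, or an implicit move; no std::move needed
    }

    Buffer pessimized() {
      Buffer b;
      return std::move(b);   // clang -Wpessimizing-move: prevents copy elision
    }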