Tensor reinitialization codemod - 5/5 (#15884)
authorJerry Zhang <jerryzh@fb.com>
Fri, 11 Jan 2019 00:24:34 +0000 (16:24 -0800)
committerFacebook Github Bot <facebook-github-bot@users.noreply.github.com>
Fri, 11 Jan 2019 00:32:26 +0000 (16:32 -0800)
commit890568a018b9043878547cf22465d23270731a6f
tree6dc797788b92ec148c684e181fce1aebb0b5dea4
parente46e572b30881051ef0ee062331017fab9a8b59d

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/15884

Codemod generated with clangr shard mode, 25 files per diff.
To eliminate partially initialized Tensors, we split the initialization of local Tensor variables into two steps: first declare an uninitialized Tensor, then
call `ReinitializeTensor` to initialize it.
Motivation: https://github.com/pytorch/pytorch/pull/12407
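The two-step pattern can be sketched as follows (an illustrative snippet, not taken from this diff; it assumes the `ReinitializeTensor(Tensor*, at::IntArrayRef, at::TensorOptions)` helper from caffe2/core/tensor.h, and the variable and shape names are hypothetical):

```cpp
// Before the codemod: a local/member Tensor was constructed with a
// device option up front, which could leave a partially initialized
// Tensor if allocation was deferred or the shape was not yet known.
//   Tensor buffer_{CPU};

// After the codemod, step 1: declare an uninitialized Tensor.
Tensor buffer_;  // holds no storage yet

// Step 2: initialize it explicitly at the point of use.
// ReinitializeTensor reallocates only when the shape or options change,
// so calling it on every operator Run() is cheap in the steady state.
ReinitializeTensor(&buffer_, {N, C, H, W}, at::dtype<float>().device(CPU));
```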

Reviewed By: hyuen

Differential Revision: D13586737

fbshipit-source-id: dc8e49e9f29505b8898bb19f84c1a983f2d811ab
13 files changed:
caffe2/quantization/server/fully_connected_fake_lowp_op.cc
caffe2/quantization/server/fully_connected_fake_lowp_op.h
caffe2/sgd/lars_op.h
caffe2/sgd/yellowfin_op.h
caffe2/utils/math.h
caffe2/utils/math_test.cc
caffe2/video/video_input_op.h
modules/detectron/group_spatial_softmax_op.cu
modules/detectron/group_spatial_softmax_op.h
modules/detectron/sigmoid_cross_entropy_loss_op.cu
modules/detectron/sigmoid_cross_entropy_loss_op.h
modules/detectron/softmax_focal_loss_op.cu
modules/detectron/softmax_focal_loss_op.h