[fx2trt] fix elementwise op converter where one operand is a literal with a different dtype (#65004)
author Shiyan Deng <dsy842974287@fb.com>
Tue, 14 Sep 2021 19:25:45 +0000 (12:25 -0700)
committer Facebook GitHub Bot <facebook-github-bot@users.noreply.github.com>
Tue, 14 Sep 2021 19:27:37 +0000 (12:27 -0700)
commit 57eda692192ed4bcd291f09bf5b2e409c3031db6
tree f809991079f44ceef5c528057e727694ae6ecb21
parent 3727baea6f181f35ed83772abb960ba264d50f12
[fx2trt] fix elementwise op converter where one operand is a literal with a different dtype (#65004)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/65004

If we have code like `torch.add(x, 1)` where `x` is a float tensor, conversion would fall apart: we currently add a constant layer with int32 dtype for `1`, but we actually need float dtype.
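
For illustration (a hypothetical repro module, not part of this diff), a pattern like the following used to break during conversion:

```python
import torch


class AddOne(torch.nn.Module):  # hypothetical repro, not from the diff
    def forward(self, x):
        # x is a float32 tensor; the literal 1 previously became an
        # int32 TensorRT constant, so the elementwise add saw
        # mismatched operand dtypes and conversion failed.
        return torch.add(x, 1)
```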

This diff adds an arg to `get_trt_tensor` that specifies the dtype of the constant layer we create.
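
A rough sketch of the idea (the actual signature and helpers in acc_ops_converters.py may differ; the `trt`/`np` usage here is illustrative):

```python
import numpy as np
import tensorrt as trt  # assumes the TensorRT Python bindings


def get_trt_tensor(network, input_val, name, dtype=None):
    """Return a TensorRT ITensor for input_val.

    When input_val is a Python scalar, build a constant layer for it.
    The new `dtype` arg lets the caller override the default (int32 for
    ints) so the constant matches the other operand's dtype.
    """
    if isinstance(input_val, (int, float)):
        if dtype is None:
            dtype = np.int32 if isinstance(input_val, int) else np.float32
        arr = np.array([input_val], dtype=dtype)
        layer = network.add_constant(arr.shape, trt.Weights(arr))
        layer.name = name
        return layer.get_output(0)
    return input_val  # already an ITensor
```

With this, an elementwise converter can pass the dtype of the tensor operand, e.g. `dtype=np.float32` when the other operand is a float tensor, so both inputs to the elementwise layer agree.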

Also, start adding docstrings to functions.

Reviewed By: yinghai

Differential Revision: D30852156

fbshipit-source-id: 650ce72d2794093a4616e640ea503dcc1c6b2bc4
torch/fx/experimental/fx2trt/converters/acc_ops_converters.py