Bf16 matmul (#64619)
author haozhe.zhu <haozhe.zhu@intel.com>
Fri, 17 Sep 2021 16:52:47 +0000 (09:52 -0700)
committer Facebook GitHub Bot <facebook-github-bot@users.noreply.github.com>
Fri, 17 Sep 2021 17:31:56 +0000 (10:31 -0700)
commit 752a8202303089386a3973dee753dc78d2a657e2
tree 3d29061c7b120304317eee8c31b8403b91f09581
parent f9bf144a0c5e3627f5fafb256cebf1f02152ab0c
Bf16 matmul (#64619)

Summary:
Re-create the PR to fix https://github.com/pytorch/pytorch/pull/61891.

Drop support for addbmm.
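(Context: this change routes CPU bfloat16 matmul through mkldnn/oneDNN, per the files touched below. As a minimal illustration of bf16 numerics, not the actual oneDNN kernel, the sketch below emulates bfloat16 rounding (float32 with the mantissa cut to 7 bits, round-to-nearest-even) and accumulates the product in float32; `to_bf16` and `bf16_matmul` are hypothetical helper names.)

```python
import struct

def to_bf16(x: float) -> float:
    # Quantize a float32 value to bfloat16 (round-to-nearest-even on the
    # discarded low 16 bits) and return it as the equivalent float.
    bits = struct.unpack('<I', struct.pack('<f', x))[0]
    rounding_bias = 0x7FFF + ((bits >> 16) & 1)
    bits = (bits + rounding_bias) & 0xFFFF0000
    return struct.unpack('<f', struct.pack('<I', bits))[0]

def bf16_matmul(a, b):
    # Naive matmul over nested lists: inputs are quantized to bf16,
    # the dot products accumulate in float32 precision (a sketch of the
    # usual bf16 GEMM convention, not the oneDNN implementation).
    n, k, m = len(a), len(a[0]), len(b[0])
    aq = [[to_bf16(v) for v in row] for row in a]
    bq = [[to_bf16(v) for v in row] for row in b]
    out = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            s = 0.0
            for t in range(k):
                s += aq[i][t] * bq[t][j]
            out[i][j] = s
    return out
```

Values exactly representable in bf16 (small integers, powers of two) pass through unchanged; other inputs lose low mantissa bits before the multiply, which is the accuracy trade-off the mkldnn path accepts for speed.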

Pull Request resolved: https://github.com/pytorch/pytorch/pull/64619

Reviewed By: jbschlosser

Differential Revision: D30902995

Pulled By: VitalyFedyunin

fbshipit-source-id: dc318d73adff8f6974c9752d0d097e69276f8206
aten/src/ATen/native/Blas.cpp
aten/src/ATen/native/LinearAlgebra.cpp
aten/src/ATen/native/mkldnn/Matmul.cpp [new file with mode: 0644]
aten/src/ATen/native/mkldnn/Matmul.h [new file with mode: 0644]
aten/src/ATen/native/mkldnn/Utils.h
test/test_linalg.py
tools/build_variables.bzl
torch/testing/_internal/common_methods_invocations.py