Fix alignment issues for Fake BFP16 fp32 -> bfp16 rounding routines (#18321)
authorJianyu Huang <jianyuhuang@fb.com>
Fri, 22 Mar 2019 19:28:04 +0000 (12:28 -0700)
committerFacebook Github Bot <facebook-github-bot@users.noreply.github.com>
Fri, 22 Mar 2019 19:41:58 +0000 (12:41 -0700)
commit18a6781f5758aed7f910a49e62f65f20f18c8f81
tree8c48f1c5dbdb27f0d76a5d95a33d5a5b06161ff9
parent6e0cbc7f3141e2df2cb7e5e2dfb09033fc2bb735

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/18321

As title: fix alignment issues in the fake BFP16 fp32 -> bfp16 rounding routines in fully_connected_fake_lowp_op_avx2.cc.

Reviewed By: jspark1105

Differential Revision: D14575512

fbshipit-source-id: 0e33cdab54b1aef8b67f0b4c366692c5dbdf631d
caffe2/quantization/server/fully_connected_fake_lowp_op_avx2.cc