[X86][AVX] Decode constant bits from insert_subvector(c1, c2, c3)
author	Simon Pilgrim <llvm-dev@redking.me.uk>
	Sat, 15 Jun 2019 17:05:24 +0000 (17:05 +0000)
committer	Simon Pilgrim <llvm-dev@redking.me.uk>
	Sat, 15 Jun 2019 17:05:24 +0000 (17:05 +0000)
commit	990f3ceb67673eaed9bc9d76e720683a70b831e4
tree	2ed5cb4e6094e8eb2a52d63dd29144e2f09f8489
parent	5dd61974f94f20515436e864fc123751267f6a88
[X86][AVX] Decode constant bits from insert_subvector(c1, c2, c3)

This pattern mostly arises when SimplifyDemandedVectorElts reduces a vector to insert_subvector(undef, c1, 0).

llvm-svn: 363499
llvm/lib/Target/X86/X86ISelLowering.cpp
llvm/test/CodeGen/X86/avx512-shuffles/partial_permute.ll
llvm/test/CodeGen/X86/vector-shuffle-combining-avx.ll
llvm/test/CodeGen/X86/vector-shuffle-combining-avx512bw.ll
llvm/test/CodeGen/X86/vector-shuffle-combining-xop.ll