Richard Biener [Thu, 19 Mar 2020 09:19:24 +0000 (10:19 +0100)]
ipa/94217 simplify offsetted address build
This avoids using build_ref_for_offset and build_fold_addr_expr,
where a type mixup easily results in something that is not IP invariant.
2020-03-19 Richard Biener <rguenther@suse.de>
PR ipa/94217
* ipa-cp.c (ipa_get_jf_ancestor_result): Avoid build_fold_addr_expr
and build_ref_for_offset.
Richard Biener [Thu, 19 Mar 2020 09:15:52 +0000 (10:15 +0100)]
middle-end/94216 fix another build_fold_addr_expr use
2020-03-19 Richard Biener <rguenther@suse.de>
PR middle-end/94216
* fold-const.c (fold_binary_loc): Avoid using
build_fold_addr_expr when we really want an ADDR_EXPR.
* g++.dg/torture/pr94216.C: New testcase.
GCC Administrator [Thu, 19 Mar 2020 00:16:20 +0000 (00:16 +0000)]
Daily bump.
Jonathan Wakely [Wed, 18 Mar 2020 23:19:12 +0000 (23:19 +0000)]
libstdc++: Fix is_trivially_constructible (PR 94033)
This attempts to make is_nothrow_constructible more robust (and
efficient to compile) by not depending on is_constructible. Instead the
__is_constructible intrinsic is used directly. The helper class
__is_nt_constructible_impl, which checks whether the construction is
non-throwing, now takes a bool template parameter that is substituted with
the result of the intrinsic. This fixes the reported bug by not using
the already-instantiated (and incorrect) value of std::is_constructible.
I don't think it really fixes the problem in general, because
std::is_nothrow_constructible itself could already have been
instantiated in a context where it gives the wrong result. A proper fix
needs to be done in the compiler.
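A minimal, self-contained sketch of the approach (illustrative only, not the
libstdc++ source; it relies on GCC's __is_constructible intrinsic and uses
unreserved names in place of the library's __-prefixed ones):

    #include <type_traits>
    #include <utility>

    // The intrinsic result is passed in as a bool template parameter, so
    // computing the noexcept part never instantiates std::is_constructible
    // (whose already-memoized value may be wrong).
    template<bool /* constructible */, typename T, typename... Args>
      struct nt_constructible_impl : std::false_type { };

    template<typename T, typename... Args>
      struct nt_constructible_impl<true, T, Args...>
      : std::bool_constant<noexcept(T(std::declval<Args>()...))> { };

    template<typename T, typename... Args>
      using nothrow_constructible_impl
        = nt_constructible_impl<__is_constructible(T, Args...), T, Args...>;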
PR libstdc++/94033
* include/std/type_traits (__is_nt_default_constructible_atom): Remove.
(__is_nt_default_constructible_impl): Remove.
(__is_nothrow_default_constructible_impl): Remove.
(__is_nt_constructible_impl): Add bool template parameter. Adjust
partial specializations.
(__is_nothrow_constructible_impl): Replace class template with alias
template.
(is_nothrow_default_constructible): Derive from alias template
__is_nothrow_constructible_impl instead of
__is_nothrow_default_constructible_impl.
* testsuite/20_util/is_nothrow_constructible/94003.cc: New test.
Segher Boessenkool [Wed, 18 Mar 2020 21:58:45 +0000 (21:58 +0000)]
rs6000: Add back some w* constraints (PR91886)
In May and June last year I deleted many of our (vector) constraints.
We can now just use "wa" for those, together with some other
conditions, which can be per alternative using the "enabled" attribute
(which in turn primarily uses the "isa" attribute).
But, it turns out that Clang implements some of those constraints as
well, and at least musl uses some of them. It is easy for us to add
those constraints back (as undocumented aliases to "wa", which always
did mean the same thing for valid inline assembler code), so do that.
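As an illustration, a hypothetical inline-assembler fragment of the kind that
still names the old letters (the opcode and operands are only an example;
"wd" now accepts exactly the same VSX registers as "wa"):

    /* Compile with VSX enabled, e.g. -mvsx.  */
    __vector double
    vabs (__vector double x)
    {
      __vector double r;
      __asm__ ("xvabsdp %x0,%x1" : "=wa" (r) : "wd" (x));
      return r;
    }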
gcc/
* config/rs6000/constraints.md (wd, wf, wi, ws, ww): New undocumented
aliases for "wa".
Jeff Law [Wed, 18 Mar 2020 22:07:28 +0000 (16:07 -0600)]
Complete change to resolve pr90275.
PR rtl-optimization/90275
* cse.c (cse_insn): Delete no-op register moves too.
Martin Sebor [Wed, 18 Mar 2020 20:47:29 +0000 (14:47 -0600)]
PR ipa/92799 - ICE on a weakref function definition followed by a declaration
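A hypothetical reduction of the construct involved (illustrative only, not
the committed attr-weakref-5.c test): attaching weakref to a function that is
also defined is now diagnosed, and all effects of the attribute are dropped,
instead of causing an ICE.

    static __attribute__ ((weakref ("bar"))) void foo (void) { }  /* weakref on a definition */
    static void foo (void);                                       /* followed by a declaration */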
gcc/testsuite/ChangeLog:
PR ipa/92799
* gcc.dg/attr-weakref-5.c: New test.
gcc/ChangeLog:
PR ipa/92799
* cgraphunit.c (process_function_and_variable_attributes): Also
complain about weakref function definitions and drop all effects
of the attribute.
Srinath Parvathaneni [Wed, 18 Mar 2020 19:16:32 +0000 (19:16 +0000)]
[ARM][GCC][8/5x]: Remaining MVE store intrinsics which store a halfword, word or doubleword to memory.
This patch supports the following MVE ACLE store intrinsics, which store a halfword, word or doubleword to memory.
vstrdq_scatter_base_p_s64, vstrdq_scatter_base_p_u64, vstrdq_scatter_base_s64, vstrdq_scatter_base_u64, vstrdq_scatter_offset_p_s64, vstrdq_scatter_offset_p_u64, vstrdq_scatter_offset_s64, vstrdq_scatter_offset_u64, vstrdq_scatter_shifted_offset_p_s64,
vstrdq_scatter_shifted_offset_p_u64, vstrdq_scatter_shifted_offset_s64,
vstrdq_scatter_shifted_offset_u64, vstrhq_scatter_offset_f16, vstrhq_scatter_offset_p_f16, vstrhq_scatter_shifted_offset_f16, vstrhq_scatter_shifted_offset_p_f16,
vstrwq_scatter_base_f32, vstrwq_scatter_base_p_f32, vstrwq_scatter_offset_f32, vstrwq_scatter_offset_p_f32, vstrwq_scatter_offset_p_s32, vstrwq_scatter_offset_p_u32, vstrwq_scatter_offset_s32, vstrwq_scatter_offset_u32, vstrwq_scatter_shifted_offset_f32,
vstrwq_scatter_shifted_offset_p_f32, vstrwq_scatter_shifted_offset_p_s32,
vstrwq_scatter_shifted_offset_p_u32, vstrwq_scatter_shifted_offset_s32,
vstrwq_scatter_shifted_offset_u32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
In this patch a new predicate "Ri" is defined to check that the immediate is in the range +/-1016 and a multiple of 8.
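As an illustration, a minimal usage sketch of one of the new intrinsics (not
one of the added tests; the signature follows the MVE ACLE specification [1],
and the immediate obeys the new "Ri" restriction):

    #include "arm_mve.h"

    void
    scatter_dwords (uint64x2_t base, int64x2_t value)
    {
      /* Store the two 64-bit lanes of 'value' at addresses base[i] + 504;
         the immediate must be a multiple of 8 in the range +/-1016.  */
      vstrdq_scatter_base_s64 (base, 504, value);
    }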
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm_mve.h (vstrdq_scatter_base_p_s64): Define macro.
(vstrdq_scatter_base_p_u64): Likewise.
(vstrdq_scatter_base_s64): Likewise.
(vstrdq_scatter_base_u64): Likewise.
(vstrdq_scatter_offset_p_s64): Likewise.
(vstrdq_scatter_offset_p_u64): Likewise.
(vstrdq_scatter_offset_s64): Likewise.
(vstrdq_scatter_offset_u64): Likewise.
(vstrdq_scatter_shifted_offset_p_s64): Likewise.
(vstrdq_scatter_shifted_offset_p_u64): Likewise.
(vstrdq_scatter_shifted_offset_s64): Likewise.
(vstrdq_scatter_shifted_offset_u64): Likewise.
(vstrhq_scatter_offset_f16): Likewise.
(vstrhq_scatter_offset_p_f16): Likewise.
(vstrhq_scatter_shifted_offset_f16): Likewise.
(vstrhq_scatter_shifted_offset_p_f16): Likewise.
(vstrwq_scatter_base_f32): Likewise.
(vstrwq_scatter_base_p_f32): Likewise.
(vstrwq_scatter_offset_f32): Likewise.
(vstrwq_scatter_offset_p_f32): Likewise.
(vstrwq_scatter_offset_p_s32): Likewise.
(vstrwq_scatter_offset_p_u32): Likewise.
(vstrwq_scatter_offset_s32): Likewise.
(vstrwq_scatter_offset_u32): Likewise.
(vstrwq_scatter_shifted_offset_f32): Likewise.
(vstrwq_scatter_shifted_offset_p_f32): Likewise.
(vstrwq_scatter_shifted_offset_p_s32): Likewise.
(vstrwq_scatter_shifted_offset_p_u32): Likewise.
(vstrwq_scatter_shifted_offset_s32): Likewise.
(vstrwq_scatter_shifted_offset_u32): Likewise.
(__arm_vstrdq_scatter_base_p_s64): Define intrinsic.
(__arm_vstrdq_scatter_base_p_u64): Likewise.
(__arm_vstrdq_scatter_base_s64): Likewise.
(__arm_vstrdq_scatter_base_u64): Likewise.
(__arm_vstrdq_scatter_offset_p_s64): Likewise.
(__arm_vstrdq_scatter_offset_p_u64): Likewise.
(__arm_vstrdq_scatter_offset_s64): Likewise.
(__arm_vstrdq_scatter_offset_u64): Likewise.
(__arm_vstrdq_scatter_shifted_offset_p_s64): Likewise.
(__arm_vstrdq_scatter_shifted_offset_p_u64): Likewise.
(__arm_vstrdq_scatter_shifted_offset_s64): Likewise.
(__arm_vstrdq_scatter_shifted_offset_u64): Likewise.
(__arm_vstrwq_scatter_offset_p_s32): Likewise.
(__arm_vstrwq_scatter_offset_p_u32): Likewise.
(__arm_vstrwq_scatter_offset_s32): Likewise.
(__arm_vstrwq_scatter_offset_u32): Likewise.
(__arm_vstrwq_scatter_shifted_offset_p_s32): Likewise.
(__arm_vstrwq_scatter_shifted_offset_p_u32): Likewise.
(__arm_vstrwq_scatter_shifted_offset_s32): Likewise.
(__arm_vstrwq_scatter_shifted_offset_u32): Likewise.
(__arm_vstrhq_scatter_offset_f16): Likewise.
(__arm_vstrhq_scatter_offset_p_f16): Likewise.
(__arm_vstrhq_scatter_shifted_offset_f16): Likewise.
(__arm_vstrhq_scatter_shifted_offset_p_f16): Likewise.
(__arm_vstrwq_scatter_base_f32): Likewise.
(__arm_vstrwq_scatter_base_p_f32): Likewise.
(__arm_vstrwq_scatter_offset_f32): Likewise.
(__arm_vstrwq_scatter_offset_p_f32): Likewise.
(__arm_vstrwq_scatter_shifted_offset_f32): Likewise.
(__arm_vstrwq_scatter_shifted_offset_p_f32): Likewise.
(vstrhq_scatter_offset): Define polymorphic variant.
(vstrhq_scatter_offset_p): Likewise.
(vstrhq_scatter_shifted_offset): Likewise.
(vstrhq_scatter_shifted_offset_p): Likewise.
(vstrwq_scatter_base): Likewise.
(vstrwq_scatter_base_p): Likewise.
(vstrwq_scatter_offset): Likewise.
(vstrwq_scatter_offset_p): Likewise.
(vstrwq_scatter_shifted_offset): Likewise.
(vstrwq_scatter_shifted_offset_p): Likewise.
(vstrdq_scatter_base_p): Likewise.
(vstrdq_scatter_base): Likewise.
(vstrdq_scatter_offset_p): Likewise.
(vstrdq_scatter_offset): Likewise.
(vstrdq_scatter_shifted_offset_p): Likewise.
(vstrdq_scatter_shifted_offset): Likewise.
* config/arm/arm_mve_builtins.def (STRSBS): Use builtin qualifier.
(STRSBS_P): Likewise.
(STRSBU): Likewise.
(STRSBU_P): Likewise.
(STRSS): Likewise.
(STRSS_P): Likewise.
(STRSU): Likewise.
(STRSU_P): Likewise.
* config/arm/constraints.md (Ri): Define.
* config/arm/mve.md (VSTRDSBQ): Define iterator.
(VSTRDSOQ): Likewise.
(VSTRDSSOQ): Likewise.
(VSTRWSOQ): Likewise.
(VSTRWSSOQ): Likewise.
(mve_vstrdq_scatter_base_p_<supf>v2di): Define RTL pattern.
(mve_vstrdq_scatter_base_<supf>v2di): Likewise.
(mve_vstrdq_scatter_offset_p_<supf>v2di): Likewise.
(mve_vstrdq_scatter_offset_<supf>v2di): Likewise.
(mve_vstrdq_scatter_shifted_offset_p_<supf>v2di): Likewise.
(mve_vstrdq_scatter_shifted_offset_<supf>v2di): Likewise.
(mve_vstrhq_scatter_offset_fv8hf): Likewise.
(mve_vstrhq_scatter_offset_p_fv8hf): Likewise.
(mve_vstrhq_scatter_shifted_offset_fv8hf): Likewise.
(mve_vstrhq_scatter_shifted_offset_p_fv8hf): Likewise.
(mve_vstrwq_scatter_base_fv4sf): Likewise.
(mve_vstrwq_scatter_base_p_fv4sf): Likewise.
(mve_vstrwq_scatter_offset_fv4sf): Likewise.
(mve_vstrwq_scatter_offset_p_fv4sf): Likewise.
(mve_vstrwq_scatter_offset_p_<supf>v4si): Likewise.
(mve_vstrwq_scatter_offset_<supf>v4si): Likewise.
(mve_vstrwq_scatter_shifted_offset_fv4sf): Likewise.
(mve_vstrwq_scatter_shifted_offset_p_fv4sf): Likewise.
(mve_vstrwq_scatter_shifted_offset_p_<supf>v4si): Likewise.
(mve_vstrwq_scatter_shifted_offset_<supf>v4si): Likewise.
* config/arm/predicates.md (Ri): Define predicate to check immediate
is in the range +/-1016 and a multiple of 8.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_p_s64.c: New test.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_p_u64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_s64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_u64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_offset_p_s64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_offset_p_u64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_offset_s64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_offset_u64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_shifted_offset_p_s64.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_shifted_offset_p_u64.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_shifted_offset_s64.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_shifted_offset_u64.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_p_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_f16.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_p_f16.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_p_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_offset_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_offset_p_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_offset_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_offset_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_offset_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_offset_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_shifted_offset_f32.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_shifted_offset_p_f32.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_shifted_offset_p_s32.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_shifted_offset_p_u32.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_shifted_offset_s32.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_shifted_offset_u32.c:
Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 19:08:29 +0000 (19:08 +0000)]
[ARM][GCC][7/5x]: MVE store intrinsics which store a byte, halfword or word to memory.
This patch supports the following MVE ACLE store intrinsics, which store a byte, halfword or word to memory.
vst1q_f32, vst1q_f16, vst1q_s8, vst1q_s32, vst1q_s16, vst1q_u8, vst1q_u32, vst1q_u16, vstrhq_f16, vstrhq_scatter_offset_s32, vstrhq_scatter_offset_s16, vstrhq_scatter_offset_u32, vstrhq_scatter_offset_u16, vstrhq_scatter_offset_p_s32, vstrhq_scatter_offset_p_s16, vstrhq_scatter_offset_p_u32, vstrhq_scatter_offset_p_u16, vstrhq_scatter_shifted_offset_s32,
vstrhq_scatter_shifted_offset_s16, vstrhq_scatter_shifted_offset_u32,
vstrhq_scatter_shifted_offset_u16, vstrhq_scatter_shifted_offset_p_s32,
vstrhq_scatter_shifted_offset_p_s16, vstrhq_scatter_shifted_offset_p_u32,
vstrhq_scatter_shifted_offset_p_u16, vstrhq_s32, vstrhq_s16, vstrhq_u32, vstrhq_u16, vstrhq_p_f16, vstrhq_p_s32, vstrhq_p_s16, vstrhq_p_u32, vstrhq_p_u16, vstrwq_f32, vstrwq_s32, vstrwq_u32, vstrwq_p_f32, vstrwq_p_s32, vstrwq_p_u32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
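As an illustration, a minimal sketch of a contiguous and a predicated store
from this group (not one of the added tests; signatures follow the MVE ACLE
specification [1]):

    #include "arm_mve.h"

    void
    store_halfwords (int16_t *addr, int16x8_t value, mve_pred16_t p)
    {
      vst1q_s16 (addr, value);        /* store all eight halfwords */
      vstrhq_p_s16 (addr, value, p);  /* store only the lanes whose predicate bit is set */
    }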
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm_mve.h (vst1q_f32): Define macro.
(vst1q_f16): Likewise.
(vst1q_s8): Likewise.
(vst1q_s32): Likewise.
(vst1q_s16): Likewise.
(vst1q_u8): Likewise.
(vst1q_u32): Likewise.
(vst1q_u16): Likewise.
(vstrhq_f16): Likewise.
(vstrhq_scatter_offset_s32): Likewise.
(vstrhq_scatter_offset_s16): Likewise.
(vstrhq_scatter_offset_u32): Likewise.
(vstrhq_scatter_offset_u16): Likewise.
(vstrhq_scatter_offset_p_s32): Likewise.
(vstrhq_scatter_offset_p_s16): Likewise.
(vstrhq_scatter_offset_p_u32): Likewise.
(vstrhq_scatter_offset_p_u16): Likewise.
(vstrhq_scatter_shifted_offset_s32): Likewise.
(vstrhq_scatter_shifted_offset_s16): Likewise.
(vstrhq_scatter_shifted_offset_u32): Likewise.
(vstrhq_scatter_shifted_offset_u16): Likewise.
(vstrhq_scatter_shifted_offset_p_s32): Likewise.
(vstrhq_scatter_shifted_offset_p_s16): Likewise.
(vstrhq_scatter_shifted_offset_p_u32): Likewise.
(vstrhq_scatter_shifted_offset_p_u16): Likewise.
(vstrhq_s32): Likewise.
(vstrhq_s16): Likewise.
(vstrhq_u32): Likewise.
(vstrhq_u16): Likewise.
(vstrhq_p_f16): Likewise.
(vstrhq_p_s32): Likewise.
(vstrhq_p_s16): Likewise.
(vstrhq_p_u32): Likewise.
(vstrhq_p_u16): Likewise.
(vstrwq_f32): Likewise.
(vstrwq_s32): Likewise.
(vstrwq_u32): Likewise.
(vstrwq_p_f32): Likewise.
(vstrwq_p_s32): Likewise.
(vstrwq_p_u32): Likewise.
(__arm_vst1q_s8): Define intrinsic.
(__arm_vst1q_s32): Likewise.
(__arm_vst1q_s16): Likewise.
(__arm_vst1q_u8): Likewise.
(__arm_vst1q_u32): Likewise.
(__arm_vst1q_u16): Likewise.
(__arm_vstrhq_scatter_offset_s32): Likewise.
(__arm_vstrhq_scatter_offset_s16): Likewise.
(__arm_vstrhq_scatter_offset_u32): Likewise.
(__arm_vstrhq_scatter_offset_u16): Likewise.
(__arm_vstrhq_scatter_offset_p_s32): Likewise.
(__arm_vstrhq_scatter_offset_p_s16): Likewise.
(__arm_vstrhq_scatter_offset_p_u32): Likewise.
(__arm_vstrhq_scatter_offset_p_u16): Likewise.
(__arm_vstrhq_scatter_shifted_offset_s32): Likewise.
(__arm_vstrhq_scatter_shifted_offset_s16): Likewise.
(__arm_vstrhq_scatter_shifted_offset_u32): Likewise.
(__arm_vstrhq_scatter_shifted_offset_u16): Likewise.
(__arm_vstrhq_scatter_shifted_offset_p_s32): Likewise.
(__arm_vstrhq_scatter_shifted_offset_p_s16): Likewise.
(__arm_vstrhq_scatter_shifted_offset_p_u32): Likewise.
(__arm_vstrhq_scatter_shifted_offset_p_u16): Likewise.
(__arm_vstrhq_s32): Likewise.
(__arm_vstrhq_s16): Likewise.
(__arm_vstrhq_u32): Likewise.
(__arm_vstrhq_u16): Likewise.
(__arm_vstrhq_p_s32): Likewise.
(__arm_vstrhq_p_s16): Likewise.
(__arm_vstrhq_p_u32): Likewise.
(__arm_vstrhq_p_u16): Likewise.
(__arm_vstrwq_s32): Likewise.
(__arm_vstrwq_u32): Likewise.
(__arm_vstrwq_p_s32): Likewise.
(__arm_vstrwq_p_u32): Likewise.
(__arm_vstrwq_p_f32): Likewise.
(__arm_vstrwq_f32): Likewise.
(__arm_vst1q_f32): Likewise.
(__arm_vst1q_f16): Likewise.
(__arm_vstrhq_f16): Likewise.
(__arm_vstrhq_p_f16): Likewise.
(vst1q): Define polymorphic variant.
(vstrhq): Likewise.
(vstrhq_p): Likewise.
(vstrhq_scatter_offset_p): Likewise.
(vstrhq_scatter_offset): Likewise.
(vstrhq_scatter_shifted_offset_p): Likewise.
(vstrhq_scatter_shifted_offset): Likewise.
(vstrwq_p): Likewise.
(vstrwq): Likewise.
* config/arm/arm_mve_builtins.def (STRS): Use builtin qualifier.
(STRS_P): Likewise.
(STRSS): Likewise.
(STRSS_P): Likewise.
(STRSU): Likewise.
(STRSU_P): Likewise.
(STRU): Likewise.
(STRU_P): Likewise.
* config/arm/mve.md (VST1Q): Define iterator.
(VSTRHSOQ): Likewise.
(VSTRHSSOQ): Likewise.
(VSTRHQ): Likewise.
(VSTRWQ): Likewise.
(mve_vstrhq_fv8hf): Define RTL pattern.
(mve_vstrhq_p_fv8hf): Likewise.
(mve_vstrhq_p_<supf><mode>): Likewise.
(mve_vstrhq_scatter_offset_p_<supf><mode>): Likewise.
(mve_vstrhq_scatter_offset_<supf><mode>): Likewise.
(mve_vstrhq_scatter_shifted_offset_p_<supf><mode>): Likewise.
(mve_vstrhq_scatter_shifted_offset_<supf><mode>): Likewise.
(mve_vstrhq_<supf><mode>): Likewise.
(mve_vstrwq_fv4sf): Likewise.
(mve_vstrwq_p_fv4sf): Likewise.
(mve_vstrwq_p_<supf>v4si): Likewise.
(mve_vstrwq_<supf>v4si): Likewise.
(mve_vst1q_f<mode>): Define expand.
(mve_vst1q_<supf><mode>): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vst1q_f16.c: New test.
* gcc.target/arm/mve/intrinsics/vst1q_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst1q_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst1q_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst1q_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst1q_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst1q_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst1q_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_p_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_p_s16.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_p_s32.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_p_u16.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_p_u32.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_s16.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_s32.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_u16.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_u32.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_p_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_u32.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 18:58:48 +0000 (18:58 +0000)]
[ARM][GCC][6/5x]: Remaining MVE load intrinsics which load a halfword, word or doubleword from memory.
This patch supports the remaining MVE ACLE load intrinsics, which load a halfword,
word or doubleword from memory.
vldrdq_gather_base_s64, vldrdq_gather_base_u64, vldrdq_gather_base_z_s64,
vldrdq_gather_base_z_u64, vldrdq_gather_offset_s64, vldrdq_gather_offset_u64,
vldrdq_gather_offset_z_s64, vldrdq_gather_offset_z_u64, vldrdq_gather_shifted_offset_s64,
vldrdq_gather_shifted_offset_u64, vldrdq_gather_shifted_offset_z_s64,
vldrdq_gather_shifted_offset_z_u64, vldrhq_gather_offset_f16, vldrhq_gather_offset_z_f16,
vldrhq_gather_shifted_offset_f16, vldrhq_gather_shifted_offset_z_f16, vldrwq_gather_base_f32,
vldrwq_gather_base_z_f32, vldrwq_gather_offset_f32, vldrwq_gather_offset_s32,
vldrwq_gather_offset_u32, vldrwq_gather_offset_z_f32, vldrwq_gather_offset_z_s32,
vldrwq_gather_offset_z_u32, vldrwq_gather_shifted_offset_f32, vldrwq_gather_shifted_offset_s32,
vldrwq_gather_shifted_offset_u32, vldrwq_gather_shifted_offset_z_f32,
vldrwq_gather_shifted_offset_z_s32, vldrwq_gather_shifted_offset_z_u32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
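As an illustration, a minimal sketch of a word gather load and its
zero-predicated form from this group (not one of the added tests; signatures
follow the MVE ACLE specification [1]):

    #include "arm_mve.h"

    int32x4_t
    gather_words (int32_t const *base, uint32x4_t offsets)
    {
      /* Each lane i loads the 32-bit word at (char *) base + offsets[i].  */
      return vldrwq_gather_offset_s32 (base, offsets);
    }

    int32x4_t
    gather_words_z (int32_t const *base, uint32x4_t offsets, mve_pred16_t p)
    {
      /* As above, but lanes with a false predicate bit are filled with zeroes.  */
      return vldrwq_gather_offset_z_s32 (base, offsets, p);
    }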
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm_mve.h (vld1q_s8): Define macro.
(vld1q_s32): Likewise.
(vld1q_s16): Likewise.
(vld1q_u8): Likewise.
(vld1q_u32): Likewise.
(vld1q_u16): Likewise.
(vldrhq_gather_offset_s32): Likewise.
(vldrhq_gather_offset_s16): Likewise.
(vldrhq_gather_offset_u32): Likewise.
(vldrhq_gather_offset_u16): Likewise.
(vldrhq_gather_offset_z_s32): Likewise.
(vldrhq_gather_offset_z_s16): Likewise.
(vldrhq_gather_offset_z_u32): Likewise.
(vldrhq_gather_offset_z_u16): Likewise.
(vldrhq_gather_shifted_offset_s32): Likewise.
(vldrhq_gather_shifted_offset_s16): Likewise.
(vldrhq_gather_shifted_offset_u32): Likewise.
(vldrhq_gather_shifted_offset_u16): Likewise.
(vldrhq_gather_shifted_offset_z_s32): Likewise.
(vldrhq_gather_shifted_offset_z_s16): Likewise.
(vldrhq_gather_shifted_offset_z_u32): Likewise.
(vldrhq_gather_shifted_offset_z_u16): Likewise.
(vldrhq_s32): Likewise.
(vldrhq_s16): Likewise.
(vldrhq_u32): Likewise.
(vldrhq_u16): Likewise.
(vldrhq_z_s32): Likewise.
(vldrhq_z_s16): Likewise.
(vldrhq_z_u32): Likewise.
(vldrhq_z_u16): Likewise.
(vldrwq_s32): Likewise.
(vldrwq_u32): Likewise.
(vldrwq_z_s32): Likewise.
(vldrwq_z_u32): Likewise.
(vld1q_f32): Likewise.
(vld1q_f16): Likewise.
(vldrhq_f16): Likewise.
(vldrhq_z_f16): Likewise.
(vldrwq_f32): Likewise.
(vldrwq_z_f32): Likewise.
(__arm_vld1q_s8): Define intrinsic.
(__arm_vld1q_s32): Likewise.
(__arm_vld1q_s16): Likewise.
(__arm_vld1q_u8): Likewise.
(__arm_vld1q_u32): Likewise.
(__arm_vld1q_u16): Likewise.
(__arm_vldrhq_gather_offset_s32): Likewise.
(__arm_vldrhq_gather_offset_s16): Likewise.
(__arm_vldrhq_gather_offset_u32): Likewise.
(__arm_vldrhq_gather_offset_u16): Likewise.
(__arm_vldrhq_gather_offset_z_s32): Likewise.
(__arm_vldrhq_gather_offset_z_s16): Likewise.
(__arm_vldrhq_gather_offset_z_u32): Likewise.
(__arm_vldrhq_gather_offset_z_u16): Likewise.
(__arm_vldrhq_gather_shifted_offset_s32): Likewise.
(__arm_vldrhq_gather_shifted_offset_s16): Likewise.
(__arm_vldrhq_gather_shifted_offset_u32): Likewise.
(__arm_vldrhq_gather_shifted_offset_u16): Likewise.
(__arm_vldrhq_gather_shifted_offset_z_s32): Likewise.
(__arm_vldrhq_gather_shifted_offset_z_s16): Likewise.
(__arm_vldrhq_gather_shifted_offset_z_u32): Likewise.
(__arm_vldrhq_gather_shifted_offset_z_u16): Likewise.
(__arm_vldrhq_s32): Likewise.
(__arm_vldrhq_s16): Likewise.
(__arm_vldrhq_u32): Likewise.
(__arm_vldrhq_u16): Likewise.
(__arm_vldrhq_z_s32): Likewise.
(__arm_vldrhq_z_s16): Likewise.
(__arm_vldrhq_z_u32): Likewise.
(__arm_vldrhq_z_u16): Likewise.
(__arm_vldrwq_s32): Likewise.
(__arm_vldrwq_u32): Likewise.
(__arm_vldrwq_z_s32): Likewise.
(__arm_vldrwq_z_u32): Likewise.
(__arm_vld1q_f32): Likewise.
(__arm_vld1q_f16): Likewise.
(__arm_vldrwq_f32): Likewise.
(__arm_vldrwq_z_f32): Likewise.
(__arm_vldrhq_z_f16): Likewise.
(__arm_vldrhq_f16): Likewise.
(vld1q): Define polymorphic variant.
(vldrhq_gather_offset): Likewise.
(vldrhq_gather_offset_z): Likewise.
(vldrhq_gather_shifted_offset): Likewise.
(vldrhq_gather_shifted_offset_z): Likewise.
* config/arm/arm_mve_builtins.def (LDRU): Use builtin qualifier.
(LDRS): Likewise.
(LDRU_Z): Likewise.
(LDRS_Z): Likewise.
(LDRGU_Z): Likewise.
(LDRGU): Likewise.
(LDRGS_Z): Likewise.
(LDRGS): Likewise.
* config/arm/mve.md (MVE_H_ELEM): Define mode iterator.
(V_sz_elem1): Likewise.
(VLD1Q): Define iterator.
(VLDRHGOQ): Likewise.
(VLDRHGSOQ): Likewise.
(VLDRHQ): Likewise.
(VLDRWQ): Likewise.
(mve_vldrhq_fv8hf): Define RTL pattern.
(mve_vldrhq_gather_offset_<supf><mode>): Likewise.
(mve_vldrhq_gather_offset_z_<supf><mode>): Likewise.
(mve_vldrhq_gather_shifted_offset_<supf><mode>): Likewise.
(mve_vldrhq_gather_shifted_offset_z_<supf><mode>): Likewise.
(mve_vldrhq_<supf><mode>): Likewise.
(mve_vldrhq_z_fv8hf): Likewise.
(mve_vldrhq_z_<supf><mode>): Likewise.
(mve_vldrwq_fv4sf): Likewise.
(mve_vldrwq_<supf>v4si): Likewise.
(mve_vldrwq_z_fv4sf): Likewise.
(mve_vldrwq_z_<supf>v4si): Likewise.
(mve_vld1q_f<mode>): Define RTL expand pattern.
(mve_vld1q_<supf><mode>): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vld1q_f16.c: New test.
* gcc.target/arm/mve/intrinsics/vld1q_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_z_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_z_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_z_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_z_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_z_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_z_u32.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 18:48:05 +0000 (18:48 +0000)]
[ARM][GCC][5/5x]: MVE ACLE load intrinsics which load a byte, halfword, or word from memory.
This patch supports the following MVE ACLE load intrinsics which load a byte, halfword,
or word from memory.
vld1q_s8, vld1q_s32, vld1q_s16, vld1q_u8, vld1q_u32, vld1q_u16, vldrhq_gather_offset_s32,
vldrhq_gather_offset_s16, vldrhq_gather_offset_u32, vldrhq_gather_offset_u16,
vldrhq_gather_offset_z_s32, vldrhq_gather_offset_z_s16, vldrhq_gather_offset_z_u32,
vldrhq_gather_offset_z_u16, vldrhq_gather_shifted_offset_s32, vldrwq_f32, vldrwq_z_f32,
vldrhq_gather_shifted_offset_s16, vldrhq_gather_shifted_offset_u32,
vldrhq_gather_shifted_offset_u16, vldrhq_gather_shifted_offset_z_s32,
vldrhq_gather_shifted_offset_z_s16, vldrhq_gather_shifted_offset_z_u32,
vldrhq_gather_shifted_offset_z_u16, vldrhq_s32, vldrhq_s16, vldrhq_u32, vldrhq_u16,
vldrhq_z_s32, vldrhq_z_s16, vldrhq_z_u32, vldrhq_z_u16, vldrwq_s32, vldrwq_u32,
vldrwq_z_s32, vldrwq_z_u32, vld1q_f32, vld1q_f16, vldrhq_f16, vldrhq_z_f16.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
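As an illustration, a minimal sketch of a contiguous load and its
zero-predicated form from this group (not one of the added tests; signatures
follow the MVE ACLE specification [1]):

    #include "arm_mve.h"

    uint16x8_t
    load_halfwords (uint16_t const *addr)
    {
      return vld1q_u16 (addr);        /* load eight contiguous halfwords */
    }

    uint16x8_t
    load_halfwords_z (uint16_t const *addr, mve_pred16_t p)
    {
      return vldrhq_z_u16 (addr, p);  /* false-predicated lanes are filled with zeroes */
    }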
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm_mve.h (vld1q_s8): Define macro.
(vld1q_s32): Likewise.
(vld1q_s16): Likewise.
(vld1q_u8): Likewise.
(vld1q_u32): Likewise.
(vld1q_u16): Likewise.
(vldrhq_gather_offset_s32): Likewise.
(vldrhq_gather_offset_s16): Likewise.
(vldrhq_gather_offset_u32): Likewise.
(vldrhq_gather_offset_u16): Likewise.
(vldrhq_gather_offset_z_s32): Likewise.
(vldrhq_gather_offset_z_s16): Likewise.
(vldrhq_gather_offset_z_u32): Likewise.
(vldrhq_gather_offset_z_u16): Likewise.
(vldrhq_gather_shifted_offset_s32): Likewise.
(vldrhq_gather_shifted_offset_s16): Likewise.
(vldrhq_gather_shifted_offset_u32): Likewise.
(vldrhq_gather_shifted_offset_u16): Likewise.
(vldrhq_gather_shifted_offset_z_s32): Likewise.
(vldrhq_gather_shifted_offset_z_s16): Likewise.
(vldrhq_gather_shifted_offset_z_u32): Likewise.
(vldrhq_gather_shifted_offset_z_u16): Likewise.
(vldrhq_s32): Likewise.
(vldrhq_s16): Likewise.
(vldrhq_u32): Likewise.
(vldrhq_u16): Likewise.
(vldrhq_z_s32): Likewise.
(vldrhq_z_s16): Likewise.
(vldrhq_z_u32): Likewise.
(vldrhq_z_u16): Likewise.
(vldrwq_s32): Likewise.
(vldrwq_u32): Likewise.
(vldrwq_z_s32): Likewise.
(vldrwq_z_u32): Likewise.
(vld1q_f32): Likewise.
(vld1q_f16): Likewise.
(vldrhq_f16): Likewise.
(vldrhq_z_f16): Likewise.
(vldrwq_f32): Likewise.
(vldrwq_z_f32): Likewise.
(__arm_vld1q_s8): Define intrinsic.
(__arm_vld1q_s32): Likewise.
(__arm_vld1q_s16): Likewise.
(__arm_vld1q_u8): Likewise.
(__arm_vld1q_u32): Likewise.
(__arm_vld1q_u16): Likewise.
(__arm_vldrhq_gather_offset_s32): Likewise.
(__arm_vldrhq_gather_offset_s16): Likewise.
(__arm_vldrhq_gather_offset_u32): Likewise.
(__arm_vldrhq_gather_offset_u16): Likewise.
(__arm_vldrhq_gather_offset_z_s32): Likewise.
(__arm_vldrhq_gather_offset_z_s16): Likewise.
(__arm_vldrhq_gather_offset_z_u32): Likewise.
(__arm_vldrhq_gather_offset_z_u16): Likewise.
(__arm_vldrhq_gather_shifted_offset_s32): Likewise.
(__arm_vldrhq_gather_shifted_offset_s16): Likewise.
(__arm_vldrhq_gather_shifted_offset_u32): Likewise.
(__arm_vldrhq_gather_shifted_offset_u16): Likewise.
(__arm_vldrhq_gather_shifted_offset_z_s32): Likewise.
(__arm_vldrhq_gather_shifted_offset_z_s16): Likewise.
(__arm_vldrhq_gather_shifted_offset_z_u32): Likewise.
(__arm_vldrhq_gather_shifted_offset_z_u16): Likewise.
(__arm_vldrhq_s32): Likewise.
(__arm_vldrhq_s16): Likewise.
(__arm_vldrhq_u32): Likewise.
(__arm_vldrhq_u16): Likewise.
(__arm_vldrhq_z_s32): Likewise.
(__arm_vldrhq_z_s16): Likewise.
(__arm_vldrhq_z_u32): Likewise.
(__arm_vldrhq_z_u16): Likewise.
(__arm_vldrwq_s32): Likewise.
(__arm_vldrwq_u32): Likewise.
(__arm_vldrwq_z_s32): Likewise.
(__arm_vldrwq_z_u32): Likewise.
(__arm_vld1q_f32): Likewise.
(__arm_vld1q_f16): Likewise.
(__arm_vldrwq_f32): Likewise.
(__arm_vldrwq_z_f32): Likewise.
(__arm_vldrhq_z_f16): Likewise.
(__arm_vldrhq_f16): Likewise.
(vld1q): Define polymorphic variant.
(vldrhq_gather_offset): Likewise.
(vldrhq_gather_offset_z): Likewise.
(vldrhq_gather_shifted_offset): Likewise.
(vldrhq_gather_shifted_offset_z): Likewise.
* config/arm/arm_mve_builtins.def (LDRU): Use builtin qualifier.
(LDRS): Likewise.
(LDRU_Z): Likewise.
(LDRS_Z): Likewise.
(LDRGU_Z): Likewise.
(LDRGU): Likewise.
(LDRGS_Z): Likewise.
(LDRGS): Likewise.
* config/arm/mve.md (MVE_H_ELEM): Define mode iterator.
(V_sz_elem1): Likewise.
(VLD1Q): Define iterator.
(VLDRHGOQ): Likewise.
(VLDRHGSOQ): Likewise.
(VLDRHQ): Likewise.
(VLDRWQ): Likewise.
(mve_vldrhq_fv8hf): Define RTL pattern.
(mve_vldrhq_gather_offset_<supf><mode>): Likewise.
(mve_vldrhq_gather_offset_z_<supf><mode>): Likewise.
(mve_vldrhq_gather_shifted_offset_<supf><mode>): Likewise.
(mve_vldrhq_gather_shifted_offset_z_<supf><mode>): Likewise.
(mve_vldrhq_<supf><mode>): Likewise.
(mve_vldrhq_z_fv8hf): Likewise.
(mve_vldrhq_z_<supf><mode>): Likewise.
(mve_vldrwq_fv4sf): Likewise.
(mve_vldrwq_<supf>v4si): Likewise.
(mve_vldrwq_z_fv4sf): Likewise.
(mve_vldrwq_z_<supf>v4si): Likewise.
(mve_vld1q_f<mode>): Define RTL expand pattern.
(mve_vld1q_<supf><mode>): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vld1q_f16.c: New test.
* gcc.target/arm/mve/intrinsics/vld1q_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_z_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_z_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_z_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_z_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_z_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_z_u32.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 18:35:17 +0000 (18:35 +0000)]
[ARM][GCC][4/5x]: MVE load intrinsics with the zero (_z) suffix.
This patch supports the following MVE ACLE load intrinsics with the zero (_z) suffix.
* ``_z`` (zero) indicates that false-predicated lanes are filled with zeroes; these variants are only used for load instructions.
vldrbq_gather_offset_z_s16, vldrbq_gather_offset_z_u8, vldrbq_gather_offset_z_s32, vldrbq_gather_offset_z_u16, vldrbq_gather_offset_z_u32, vldrbq_gather_offset_z_s8, vldrbq_z_s16, vldrbq_z_u8, vldrbq_z_s8, vldrbq_z_s32, vldrbq_z_u16, vldrbq_z_u32, vldrwq_gather_base_z_u32, vldrwq_gather_base_z_s32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
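As an illustration, a minimal sketch of one of the new zero-predicated
gathers (not one of the added tests; the signature follows the MVE ACLE
specification [1]):

    #include "arm_mve.h"

    uint8x16_t
    gather_bytes_z (uint8_t const *base, uint8x16_t offsets, mve_pred16_t p)
    {
      /* Each active lane i loads the byte at base + offsets[i]; lanes with a
         false predicate bit are filled with zeroes.  */
      return vldrbq_gather_offset_z_u8 (base, offsets, p);
    }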
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm-builtins.c (LDRGBS_Z_QUALIFIERS): Define builtin
qualifier.
(LDRGBU_Z_QUALIFIERS): Likewise.
(LDRGS_Z_QUALIFIERS): Likewise.
(LDRGU_Z_QUALIFIERS): Likewise.
(LDRS_Z_QUALIFIERS): Likewise.
(LDRU_Z_QUALIFIERS): Likewise.
* config/arm/arm_mve.h (vldrbq_gather_offset_z_s16): Define macro.
(vldrbq_gather_offset_z_u8): Likewise.
(vldrbq_gather_offset_z_s32): Likewise.
(vldrbq_gather_offset_z_u16): Likewise.
(vldrbq_gather_offset_z_u32): Likewise.
(vldrbq_gather_offset_z_s8): Likewise.
(vldrbq_z_s16): Likewise.
(vldrbq_z_u8): Likewise.
(vldrbq_z_s8): Likewise.
(vldrbq_z_s32): Likewise.
(vldrbq_z_u16): Likewise.
(vldrbq_z_u32): Likewise.
(vldrwq_gather_base_z_u32): Likewise.
(vldrwq_gather_base_z_s32): Likewise.
(__arm_vldrbq_gather_offset_z_s8): Define intrinsic.
(__arm_vldrbq_gather_offset_z_s32): Likewise.
(__arm_vldrbq_gather_offset_z_s16): Likewise.
(__arm_vldrbq_gather_offset_z_u8): Likewise.
(__arm_vldrbq_gather_offset_z_u32): Likewise.
(__arm_vldrbq_gather_offset_z_u16): Likewise.
(__arm_vldrbq_z_s8): Likewise.
(__arm_vldrbq_z_s32): Likewise.
(__arm_vldrbq_z_s16): Likewise.
(__arm_vldrbq_z_u8): Likewise.
(__arm_vldrbq_z_u32): Likewise.
(__arm_vldrbq_z_u16): Likewise.
(__arm_vldrwq_gather_base_z_s32): Likewise.
(__arm_vldrwq_gather_base_z_u32): Likewise.
(vldrbq_gather_offset_z): Define polymorphic variant.
* config/arm/arm_mve_builtins.def (LDRGBS_Z_QUALIFIERS): Use builtin
qualifier.
(LDRGBU_Z_QUALIFIERS): Likewise.
(LDRGS_Z_QUALIFIERS): Likewise.
(LDRGU_Z_QUALIFIERS): Likewise.
(LDRS_Z_QUALIFIERS): Likewise.
(LDRU_Z_QUALIFIERS): Likewise.
* config/arm/mve.md (mve_vldrbq_gather_offset_z_<supf><mode>): Define
RTL pattern.
(mve_vldrbq_z_<supf><mode>): Likewise.
(mve_vldrwq_gather_base_z_<supf>v4si): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_z_s16.c: New test.
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_z_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_z_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_z_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_z_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_z_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_z_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_z_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_z_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_z_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_z_u32.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 18:22:21 +0000 (18:22 +0000)]
[ARM][GCC][3/5x]: MVE store intrinsics with predicated suffix.
This patch supports the following MVE ACLE store intrinsics with the predicated (_p) suffix.
vstrbq_p_s8, vstrbq_p_s32, vstrbq_p_s16, vstrbq_p_u8, vstrbq_p_u32, vstrbq_p_u16, vstrbq_scatter_offset_p_s8, vstrbq_scatter_offset_p_s32, vstrbq_scatter_offset_p_s16, vstrbq_scatter_offset_p_u8, vstrbq_scatter_offset_p_u32, vstrbq_scatter_offset_p_u16, vstrwq_scatter_base_p_s32, vstrwq_scatter_base_p_u32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
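As an illustration, a minimal sketch of one of the new predicated stores (not
one of the added tests; the signature follows the MVE ACLE specification [1]):

    #include "arm_mve.h"

    void
    store_bytes_p (int8_t *addr, int8x16_t value, mve_pred16_t p)
    {
      /* Only lanes whose predicate bit is set are written to memory.  */
      vstrbq_p_s8 (addr, value, p);
    }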
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm-builtins.c (STRS_P_QUALIFIERS): Define builtin
qualifier.
(STRU_P_QUALIFIERS): Likewise.
(STRSU_P_QUALIFIERS): Likewise.
(STRSS_P_QUALIFIERS): Likewise.
(STRSBS_P_QUALIFIERS): Likewise.
(STRSBU_P_QUALIFIERS): Likewise.
* config/arm/arm_mve.h (vstrbq_p_s8): Define macro.
(vstrbq_p_s32): Likewise.
(vstrbq_p_s16): Likewise.
(vstrbq_p_u8): Likewise.
(vstrbq_p_u32): Likewise.
(vstrbq_p_u16): Likewise.
(vstrbq_scatter_offset_p_s8): Likewise.
(vstrbq_scatter_offset_p_s32): Likewise.
(vstrbq_scatter_offset_p_s16): Likewise.
(vstrbq_scatter_offset_p_u8): Likewise.
(vstrbq_scatter_offset_p_u32): Likewise.
(vstrbq_scatter_offset_p_u16): Likewise.
(vstrwq_scatter_base_p_s32): Likewise.
(vstrwq_scatter_base_p_u32): Likewise.
(__arm_vstrbq_p_s8): Define intrinsic.
(__arm_vstrbq_p_s32): Likewise.
(__arm_vstrbq_p_s16): Likewise.
(__arm_vstrbq_p_u8): Likewise.
(__arm_vstrbq_p_u32): Likewise.
(__arm_vstrbq_p_u16): Likewise.
(__arm_vstrbq_scatter_offset_p_s8): Likewise.
(__arm_vstrbq_scatter_offset_p_s32): Likewise.
(__arm_vstrbq_scatter_offset_p_s16): Likewise.
(__arm_vstrbq_scatter_offset_p_u8): Likewise.
(__arm_vstrbq_scatter_offset_p_u32): Likewise.
(__arm_vstrbq_scatter_offset_p_u16): Likewise.
(__arm_vstrwq_scatter_base_p_s32): Likewise.
(__arm_vstrwq_scatter_base_p_u32): Likewise.
(vstrbq_p): Define polymorphic variant.
(vstrbq_scatter_offset_p): Likewise.
(vstrwq_scatter_base_p): Likewise.
* config/arm/arm_mve_builtins.def (STRS_P_QUALIFIERS): Use builtin
qualifier.
(STRU_P_QUALIFIERS): Likewise.
(STRSU_P_QUALIFIERS): Likewise.
(STRSS_P_QUALIFIERS): Likewise.
(STRSBS_P_QUALIFIERS): Likewise.
(STRSBU_P_QUALIFIERS): Likewise.
* config/arm/mve.md (mve_vstrbq_scatter_offset_p_<supf><mode>): Define
RTL pattern.
(mve_vstrwq_scatter_base_p_<supf>v4si): Likewise.
(mve_vstrbq_p_<supf><mode>): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vstrbq_p_s16.c: New test.
* gcc.target/arm/mve/intrinsics/vstrbq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_p_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_p_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_p_u32.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 18:13:53 +0000 (18:13 +0000)]
[ARM][GCC][2/5x]: MVE load intrinsics.
This patch supports the following MVE ACLE load intrinsics.
vldrbq_gather_offset_u8, vldrbq_gather_offset_s8, vldrbq_s8, vldrbq_u8, vldrbq_gather_offset_u16, vldrbq_gather_offset_s16, vldrbq_s16, vldrbq_u16, vldrbq_gather_offset_u32, vldrbq_gather_offset_s32, vldrbq_s32, vldrbq_u32, vldrwq_gather_base_s32, vldrwq_gather_base_u32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
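As an illustration, a minimal sketch of one of the new gather loads (not one
of the added tests; the signature follows the MVE ACLE specification [1]):

    #include "arm_mve.h"

    int32x4_t
    gather_from_bases (uint32x4_t addrs)
    {
      /* Each lane i loads the 32-bit word at address addrs[i] + 4.  */
      return vldrwq_gather_base_s32 (addrs, 4);
    }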
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm-builtins.c (LDRGU_QUALIFIERS): Define builtin
qualifier.
(LDRGS_QUALIFIERS): Likewise.
(LDRS_QUALIFIERS): Likewise.
(LDRU_QUALIFIERS): Likewise.
(LDRGBS_QUALIFIERS): Likewise.
(LDRGBU_QUALIFIERS): Likewise.
* config/arm/arm_mve.h (vldrbq_gather_offset_u8): Define macro.
(vldrbq_gather_offset_s8): Likewise.
(vldrbq_s8): Likewise.
(vldrbq_u8): Likewise.
(vldrbq_gather_offset_u16): Likewise.
(vldrbq_gather_offset_s16): Likewise.
(vldrbq_s16): Likewise.
(vldrbq_u16): Likewise.
(vldrbq_gather_offset_u32): Likewise.
(vldrbq_gather_offset_s32): Likewise.
(vldrbq_s32): Likewise.
(vldrbq_u32): Likewise.
(vldrwq_gather_base_s32): Likewise.
(vldrwq_gather_base_u32): Likewise.
(__arm_vldrbq_gather_offset_u8): Define intrinsic.
(__arm_vldrbq_gather_offset_s8): Likewise.
(__arm_vldrbq_s8): Likewise.
(__arm_vldrbq_u8): Likewise.
(__arm_vldrbq_gather_offset_u16): Likewise.
(__arm_vldrbq_gather_offset_s16): Likewise.
(__arm_vldrbq_s16): Likewise.
(__arm_vldrbq_u16): Likewise.
(__arm_vldrbq_gather_offset_u32): Likewise.
(__arm_vldrbq_gather_offset_s32): Likewise.
(__arm_vldrbq_s32): Likewise.
(__arm_vldrbq_u32): Likewise.
(__arm_vldrwq_gather_base_s32): Likewise.
(__arm_vldrwq_gather_base_u32): Likewise.
(vldrbq_gather_offset): Define polymorphic variant.
* config/arm/arm_mve_builtins.def (LDRGU_QUALIFIERS): Use builtin
qualifier.
(LDRGS_QUALIFIERS): Likewise.
(LDRS_QUALIFIERS): Likewise.
(LDRU_QUALIFIERS): Likewise.
(LDRGBS_QUALIFIERS): Likewise.
(LDRGBU_QUALIFIERS): Likewise.
* config/arm/mve.md (VLDRBGOQ): Define iterator.
(VLDRBQ): Likewise.
(VLDRWGBQ): Likewise.
(mve_vldrbq_gather_offset_<supf><mode>): Define RTL pattern.
(mve_vldrbq_<supf><mode>): Likewise.
(mve_vldrwq_gather_base_<supf>v4si): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_s16.c: New test.
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_u32.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 17:54:30 +0000 (17:54 +0000)]
[ARM][GCC][1/5x]: MVE store intrinsics.
This patch supports the following MVE ACLE store intrinsics.
vstrbq_scatter_offset_s8, vstrbq_scatter_offset_s32, vstrbq_scatter_offset_s16, vstrbq_scatter_offset_u8, vstrbq_scatter_offset_u32, vstrbq_scatter_offset_u16, vstrbq_s8, vstrbq_s32, vstrbq_s16, vstrbq_u8, vstrbq_u32, vstrbq_u16, vstrwq_scatter_base_s32, vstrwq_scatter_base_u32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
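As an illustration, a minimal sketch of one of the new scatter stores (not
one of the added tests; the signature follows the MVE ACLE specification [1]):

    #include "arm_mve.h"

    void
    scatter_bytes (int8_t *base, uint8x16_t offsets, int8x16_t value)
    {
      /* Lane i of 'value' is stored to base + offsets[i].  */
      vstrbq_scatter_offset_s8 (base, offsets, value);
    }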
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm-builtins.c (STRS_QUALIFIERS): Define builtin qualifier.
(STRU_QUALIFIERS): Likewise.
(STRSS_QUALIFIERS): Likewise.
(STRSU_QUALIFIERS): Likewise.
(STRSBS_QUALIFIERS): Likewise.
(STRSBU_QUALIFIERS): Likewise.
* config/arm/arm_mve.h (vstrbq_s8): Define macro.
(vstrbq_u8): Likewise.
(vstrbq_u16): Likewise.
(vstrbq_scatter_offset_s8): Likewise.
(vstrbq_scatter_offset_u8): Likewise.
(vstrbq_scatter_offset_u16): Likewise.
(vstrbq_s16): Likewise.
(vstrbq_u32): Likewise.
(vstrbq_scatter_offset_s16): Likewise.
(vstrbq_scatter_offset_u32): Likewise.
(vstrbq_s32): Likewise.
(vstrbq_scatter_offset_s32): Likewise.
(vstrwq_scatter_base_s32): Likewise.
(vstrwq_scatter_base_u32): Likewise.
(__arm_vstrbq_scatter_offset_s8): Define intrinsic.
(__arm_vstrbq_scatter_offset_s32): Likewise.
(__arm_vstrbq_scatter_offset_s16): Likewise.
(__arm_vstrbq_scatter_offset_u8): Likewise.
(__arm_vstrbq_scatter_offset_u32): Likewise.
(__arm_vstrbq_scatter_offset_u16): Likewise.
(__arm_vstrbq_s8): Likewise.
(__arm_vstrbq_s32): Likewise.
(__arm_vstrbq_s16): Likewise.
(__arm_vstrbq_u8): Likewise.
(__arm_vstrbq_u32): Likewise.
(__arm_vstrbq_u16): Likewise.
(__arm_vstrwq_scatter_base_s32): Likewise.
(__arm_vstrwq_scatter_base_u32): Likewise.
(vstrbq): Define polymorphic variant.
(vstrbq_scatter_offset): Likewise.
(vstrwq_scatter_base): Likewise.
* config/arm/arm_mve_builtins.def (STRS_QUALIFIERS): Use builtin
qualifier.
(STRU_QUALIFIERS): Likewise.
(STRSS_QUALIFIERS): Likewise.
(STRSU_QUALIFIERS): Likewise.
(STRSBS_QUALIFIERS): Likewise.
(STRSBU_QUALIFIERS): Likewise.
* config/arm/mve.md (MVE_B_ELEM): Define mode attribute iterator.
(VSTRWSBQ): Define iterators.
(VSTRBSOQ): Likewise.
(VSTRBQ): Likewise.
(mve_vstrbq_<supf><mode>): Define RTL pattern.
(mve_vstrbq_scatter_offset_<supf><mode>): Likewise.
(mve_vstrwq_scatter_base_<supf>v4si): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vstrbq_s16.c: New test.
* gcc.target/arm/mve/intrinsics/vstrbq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_u32.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 17:16:21 +0000 (17:16 +0000)]
[ARM][GCC][4/4x]: MVE intrinsics with quaternary operands.
This patch supports the following MVE ACLE intrinsics with quaternary operands.
vabdq_m_f32, vabdq_m_f16, vaddq_m_f32, vaddq_m_f16, vaddq_m_n_f32, vaddq_m_n_f16, vandq_m_f32, vandq_m_f16, vbicq_m_f32, vbicq_m_f16, vbrsrq_m_n_f32, vbrsrq_m_n_f16, vcaddq_rot270_m_f32, vcaddq_rot270_m_f16, vcaddq_rot90_m_f32, vcaddq_rot90_m_f16, vcmlaq_m_f32, vcmlaq_m_f16, vcmlaq_rot180_m_f32, vcmlaq_rot180_m_f16, vcmlaq_rot270_m_f32, vcmlaq_rot270_m_f16, vcmlaq_rot90_m_f32, vcmlaq_rot90_m_f16, vcmulq_m_f32, vcmulq_m_f16, vcmulq_rot180_m_f32, vcmulq_rot180_m_f16, vcmulq_rot270_m_f32, vcmulq_rot270_m_f16, vcmulq_rot90_m_f32, vcmulq_rot90_m_f16, vcvtq_m_n_s32_f32, vcvtq_m_n_s16_f16, vcvtq_m_n_u32_f32, vcvtq_m_n_u16_f16, veorq_m_f32, veorq_m_f16, vfmaq_m_f32, vfmaq_m_f16, vfmaq_m_n_f32, vfmaq_m_n_f16, vfmasq_m_n_f32, vfmasq_m_n_f16, vfmsq_m_f32, vfmsq_m_f16, vmaxnmq_m_f32, vmaxnmq_m_f16, vminnmq_m_f32, vminnmq_m_f16, vmulq_m_f32, vmulq_m_f16, vmulq_m_n_f32, vmulq_m_n_f16, vornq_m_f32, vornq_m_f16, vorrq_m_f32, vorrq_m_f16, vsubq_m_f32, vsubq_m_f16, vsubq_m_n_f32, vsubq_m_n_f16.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
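As an illustration, a minimal sketch of one of the new merging-predicated
(_m) intrinsics (not one of the added tests; the signature follows the MVE
ACLE specification [1]):

    #include "arm_mve.h"

    float32x4_t
    add_masked (float32x4_t inactive, float32x4_t a, float32x4_t b, mve_pred16_t p)
    {
      /* Lanes whose predicate bit is false take their value from 'inactive'.  */
      return vaddq_m_f32 (inactive, a, b, p);
    }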
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm_mve.h (vabdq_m_f32): Define macro.
(vabdq_m_f16): Likewise.
(vaddq_m_f32): Likewise.
(vaddq_m_f16): Likewise.
(vaddq_m_n_f32): Likewise.
(vaddq_m_n_f16): Likewise.
(vandq_m_f32): Likewise.
(vandq_m_f16): Likewise.
(vbicq_m_f32): Likewise.
(vbicq_m_f16): Likewise.
(vbrsrq_m_n_f32): Likewise.
(vbrsrq_m_n_f16): Likewise.
(vcaddq_rot270_m_f32): Likewise.
(vcaddq_rot270_m_f16): Likewise.
(vcaddq_rot90_m_f32): Likewise.
(vcaddq_rot90_m_f16): Likewise.
(vcmlaq_m_f32): Likewise.
(vcmlaq_m_f16): Likewise.
(vcmlaq_rot180_m_f32): Likewise.
(vcmlaq_rot180_m_f16): Likewise.
(vcmlaq_rot270_m_f32): Likewise.
(vcmlaq_rot270_m_f16): Likewise.
(vcmlaq_rot90_m_f32): Likewise.
(vcmlaq_rot90_m_f16): Likewise.
(vcmulq_m_f32): Likewise.
(vcmulq_m_f16): Likewise.
(vcmulq_rot180_m_f32): Likewise.
(vcmulq_rot180_m_f16): Likewise.
(vcmulq_rot270_m_f32): Likewise.
(vcmulq_rot270_m_f16): Likewise.
(vcmulq_rot90_m_f32): Likewise.
(vcmulq_rot90_m_f16): Likewise.
(vcvtq_m_n_s32_f32): Likewise.
(vcvtq_m_n_s16_f16): Likewise.
(vcvtq_m_n_u32_f32): Likewise.
(vcvtq_m_n_u16_f16): Likewise.
(veorq_m_f32): Likewise.
(veorq_m_f16): Likewise.
(vfmaq_m_f32): Likewise.
(vfmaq_m_f16): Likewise.
(vfmaq_m_n_f32): Likewise.
(vfmaq_m_n_f16): Likewise.
(vfmasq_m_n_f32): Likewise.
(vfmasq_m_n_f16): Likewise.
(vfmsq_m_f32): Likewise.
(vfmsq_m_f16): Likewise.
(vmaxnmq_m_f32): Likewise.
(vmaxnmq_m_f16): Likewise.
(vminnmq_m_f32): Likewise.
(vminnmq_m_f16): Likewise.
(vmulq_m_f32): Likewise.
(vmulq_m_f16): Likewise.
(vmulq_m_n_f32): Likewise.
(vmulq_m_n_f16): Likewise.
(vornq_m_f32): Likewise.
(vornq_m_f16): Likewise.
(vorrq_m_f32): Likewise.
(vorrq_m_f16): Likewise.
(vsubq_m_f32): Likewise.
(vsubq_m_f16): Likewise.
(vsubq_m_n_f32): Likewise.
(vsubq_m_n_f16): Likewise.
(__attribute__): Likewise.
(__arm_vabdq_m_f32): Likewise.
(__arm_vabdq_m_f16): Likewise.
(__arm_vaddq_m_f32): Likewise.
(__arm_vaddq_m_f16): Likewise.
(__arm_vaddq_m_n_f32): Likewise.
(__arm_vaddq_m_n_f16): Likewise.
(__arm_vandq_m_f32): Likewise.
(__arm_vandq_m_f16): Likewise.
(__arm_vbicq_m_f32): Likewise.
(__arm_vbicq_m_f16): Likewise.
(__arm_vbrsrq_m_n_f32): Likewise.
(__arm_vbrsrq_m_n_f16): Likewise.
(__arm_vcaddq_rot270_m_f32): Likewise.
(__arm_vcaddq_rot270_m_f16): Likewise.
(__arm_vcaddq_rot90_m_f32): Likewise.
(__arm_vcaddq_rot90_m_f16): Likewise.
(__arm_vcmlaq_m_f32): Likewise.
(__arm_vcmlaq_m_f16): Likewise.
(__arm_vcmlaq_rot180_m_f32): Likewise.
(__arm_vcmlaq_rot180_m_f16): Likewise.
(__arm_vcmlaq_rot270_m_f32): Likewise.
(__arm_vcmlaq_rot270_m_f16): Likewise.
(__arm_vcmlaq_rot90_m_f32): Likewise.
(__arm_vcmlaq_rot90_m_f16): Likewise.
(__arm_vcmulq_m_f32): Likewise.
(__arm_vcmulq_m_f16): Likewise.
(__arm_vcmulq_rot180_m_f32): Define intrinsic.
(__arm_vcmulq_rot180_m_f16): Likewise.
(__arm_vcmulq_rot270_m_f32): Likewise.
(__arm_vcmulq_rot270_m_f16): Likewise.
(__arm_vcmulq_rot90_m_f32): Likewise.
(__arm_vcmulq_rot90_m_f16): Likewise.
(__arm_vcvtq_m_n_s32_f32): Likewise.
(__arm_vcvtq_m_n_s16_f16): Likewise.
(__arm_vcvtq_m_n_u32_f32): Likewise.
(__arm_vcvtq_m_n_u16_f16): Likewise.
(__arm_veorq_m_f32): Likewise.
(__arm_veorq_m_f16): Likewise.
(__arm_vfmaq_m_f32): Likewise.
(__arm_vfmaq_m_f16): Likewise.
(__arm_vfmaq_m_n_f32): Likewise.
(__arm_vfmaq_m_n_f16): Likewise.
(__arm_vfmasq_m_n_f32): Likewise.
(__arm_vfmasq_m_n_f16): Likewise.
(__arm_vfmsq_m_f32): Likewise.
(__arm_vfmsq_m_f16): Likewise.
(__arm_vmaxnmq_m_f32): Likewise.
(__arm_vmaxnmq_m_f16): Likewise.
(__arm_vminnmq_m_f32): Likewise.
(__arm_vminnmq_m_f16): Likewise.
(__arm_vmulq_m_f32): Likewise.
(__arm_vmulq_m_f16): Likewise.
(__arm_vmulq_m_n_f32): Likewise.
(__arm_vmulq_m_n_f16): Likewise.
(__arm_vornq_m_f32): Likewise.
(__arm_vornq_m_f16): Likewise.
(__arm_vorrq_m_f32): Likewise.
(__arm_vorrq_m_f16): Likewise.
(__arm_vsubq_m_f32): Likewise.
(__arm_vsubq_m_f16): Likewise.
(__arm_vsubq_m_n_f32): Likewise.
(__arm_vsubq_m_n_f16): Likewise.
(vabdq_m): Define polymorphic variant.
(vaddq_m): Likewise.
(vaddq_m_n): Likewise.
(vandq_m): Likewise.
(vbicq_m): Likewise.
(vbrsrq_m_n): Likewise.
(vcaddq_rot270_m): Likewise.
(vcaddq_rot90_m): Likewise.
(vcmlaq_m): Likewise.
(vcmlaq_rot180_m): Likewise.
(vcmlaq_rot270_m): Likewise.
(vcmlaq_rot90_m): Likewise.
(vcmulq_m): Likewise.
(vcmulq_rot180_m): Likewise.
(vcmulq_rot270_m): Likewise.
(vcmulq_rot90_m): Likewise.
(veorq_m): Likewise.
(vfmaq_m): Likewise.
(vfmaq_m_n): Likewise.
(vfmasq_m_n): Likewise.
(vfmsq_m): Likewise.
(vmaxnmq_m): Likewise.
(vminnmq_m): Likewise.
(vmulq_m): Likewise.
(vmulq_m_n): Likewise.
(vornq_m): Likewise.
(vsubq_m): Likewise.
(vsubq_m_n): Likewise.
(vorrq_m): Likewise.
* config/arm/arm_mve_builtins.def (QUADOP_NONE_NONE_NONE_IMM_UNONE): Use
builtin qualifier.
(QUADOP_NONE_NONE_NONE_NONE_UNONE): Likewise.
(QUADOP_UNONE_UNONE_NONE_IMM_UNONE): Likewise.
* config/arm/mve.md (mve_vabdq_m_f<mode>): Define RTL pattern.
(mve_vaddq_m_f<mode>): Likewise.
(mve_vaddq_m_n_f<mode>): Likewise.
(mve_vandq_m_f<mode>): Likewise.
(mve_vbicq_m_f<mode>): Likewise.
(mve_vbrsrq_m_n_f<mode>): Likewise.
(mve_vcaddq_rot270_m_f<mode>): Likewise.
(mve_vcaddq_rot90_m_f<mode>): Likewise.
(mve_vcmlaq_m_f<mode>): Likewise.
(mve_vcmlaq_rot180_m_f<mode>): Likewise.
(mve_vcmlaq_rot270_m_f<mode>): Likewise.
(mve_vcmlaq_rot90_m_f<mode>): Likewise.
(mve_vcmulq_m_f<mode>): Likewise.
(mve_vcmulq_rot180_m_f<mode>): Likewise.
(mve_vcmulq_rot270_m_f<mode>): Likewise.
(mve_vcmulq_rot90_m_f<mode>): Likewise.
(mve_veorq_m_f<mode>): Likewise.
(mve_vfmaq_m_f<mode>): Likewise.
(mve_vfmaq_m_n_f<mode>): Likewise.
(mve_vfmasq_m_n_f<mode>): Likewise.
(mve_vfmsq_m_f<mode>): Likewise.
(mve_vmaxnmq_m_f<mode>): Likewise.
(mve_vminnmq_m_f<mode>): Likewise.
(mve_vmulq_m_f<mode>): Likewise.
(mve_vmulq_m_n_f<mode>): Likewise.
(mve_vornq_m_f<mode>): Likewise.
(mve_vorrq_m_f<mode>): Likewise.
(mve_vsubq_m_f<mode>): Likewise.
(mve_vsubq_m_n_f<mode>): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vabdq_m_f16.c: New test.
* gcc.target/arm/mve/intrinsics/vabdq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot180_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot180_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot270_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot270_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot90_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot90_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot180_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot180_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot270_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot270_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot90_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot90_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_n_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_n_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_n_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_n_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmaq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmaq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmaq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmaq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmasq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmasq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmsq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmsq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_n_f32.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 17:06:58 +0000 (17:06 +0000)]
[ARM][GCC][3/4x]: MVE intrinsics with quaternary operands.
This patch supports the following MVE ACLE intrinsics with quaternary operands.
vmlaldavaq_p_s16, vmlaldavaq_p_s32, vmlaldavaq_p_u16, vmlaldavaq_p_u32, vmlaldavaxq_p_s16, vmlaldavaxq_p_s32, vmlaldavaxq_p_u16, vmlaldavaxq_p_u32, vmlsldavaq_p_s16, vmlsldavaq_p_s32, vmlsldavaxq_p_s16, vmlsldavaxq_p_s32, vmullbq_poly_m_p16, vmullbq_poly_m_p8, vmulltq_poly_m_p16, vmulltq_poly_m_p8, vqdmullbq_m_n_s16, vqdmullbq_m_n_s32, vqdmullbq_m_s16, vqdmullbq_m_s32, vqdmulltq_m_n_s16, vqdmulltq_m_n_s32, vqdmulltq_m_s16, vqdmulltq_m_s32, vqrshrnbq_m_n_s16, vqrshrnbq_m_n_s32, vqrshrnbq_m_n_u16, vqrshrnbq_m_n_u32, vqrshrntq_m_n_s16, vqrshrntq_m_n_s32, vqrshrntq_m_n_u16, vqrshrntq_m_n_u32, vqrshrunbq_m_n_s16, vqrshrunbq_m_n_s32, vqrshruntq_m_n_s16, vqrshruntq_m_n_s32, vqshrnbq_m_n_s16, vqshrnbq_m_n_s32, vqshrnbq_m_n_u16, vqshrnbq_m_n_u32, vqshrntq_m_n_s16, vqshrntq_m_n_s32, vqshrntq_m_n_u16, vqshrntq_m_n_u32, vqshrunbq_m_n_s16, vqshrunbq_m_n_s32, vqshruntq_m_n_s16, vqshruntq_m_n_s32, vrmlaldavhaq_p_s32, vrmlaldavhaq_p_u32, vrmlaldavhaxq_p_s32, vrmlsldavhaq_p_s32, vrmlsldavhaxq_p_s32, vrshrnbq_m_n_s16, vrshrnbq_m_n_s32, vrshrnbq_m_n_u16, vrshrnbq_m_n_u32, vrshrntq_m_n_s16, vrshrntq_m_n_s32, vrshrntq_m_n_u16, vrshrntq_m_n_u32, vshllbq_m_n_s16, vshllbq_m_n_s8, vshllbq_m_n_u16, vshllbq_m_n_u8, vshlltq_m_n_s16, vshlltq_m_n_s8, vshlltq_m_n_u16, vshlltq_m_n_u8, vshrnbq_m_n_s16, vshrnbq_m_n_s32, vshrnbq_m_n_u16, vshrnbq_m_n_u32, vshrntq_m_n_s16, vshrntq_m_n_s32, vshrntq_m_n_u16, vshrntq_m_n_u32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
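As a rough usage sketch (not part of this patch): the predicated quaternary forms take an accumulator or destination, the vector operands, an optional immediate, and an mve_pred16_t predicate as the last argument. The helper names below are made up for illustration, and the signatures and the one-bit-per-byte predicate encoding are assumed to follow the MVE ACLE referenced above; a toolchain targeting armv8.1-m.main+mve.fp is assumed.

  #include "arm_mve.h"

  /* Predicated long multiply-accumulate across lanes: lanes whose
     predicate bits are clear do not contribute to the 64-bit
     accumulator.  0x00ff enables the bottom two 32-bit lanes under
     the one-bit-per-byte predicate layout.  */
  int64_t acc_low_lanes (int64_t acc, int32x4_t a, int32x4_t b)
  {
    mve_pred16_t p = 0x00ff;
    return vmlaldavaq_p_s32 (acc, a, b, p);
  }

  /* Predicated saturating rounding narrowing shift right, writing the
     narrowed results into the bottom (even) 16-bit halves of 'dst';
     lanes masked off by the predicate keep their previous values.  */
  int16x8_t narrow_bottom (int16x8_t dst, int32x4_t a, mve_pred16_t p)
  {
    return vqrshrnbq_m_n_s32 (dst, a, 4, p);
  }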
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm-protos.h (arm_mve_immediate_check): Declare.
* config/arm/arm.c (arm_mve_immediate_check): Define function to check
the mode and integer value.
* config/arm/arm_mve.h (vmlaldavaq_p_s32): Define macro.
(vmlaldavaq_p_s16): Likewise.
(vmlaldavaq_p_u32): Likewise.
(vmlaldavaq_p_u16): Likewise.
(vmlaldavaxq_p_s32): Likewise.
(vmlaldavaxq_p_s16): Likewise.
(vmlaldavaxq_p_u32): Likewise.
(vmlaldavaxq_p_u16): Likewise.
(vmlsldavaq_p_s32): Likewise.
(vmlsldavaq_p_s16): Likewise.
(vmlsldavaxq_p_s32): Likewise.
(vmlsldavaxq_p_s16): Likewise.
(vmullbq_poly_m_p8): Likewise.
(vmullbq_poly_m_p16): Likewise.
(vmulltq_poly_m_p8): Likewise.
(vmulltq_poly_m_p16): Likewise.
(vqdmullbq_m_n_s32): Likewise.
(vqdmullbq_m_n_s16): Likewise.
(vqdmullbq_m_s32): Likewise.
(vqdmullbq_m_s16): Likewise.
(vqdmulltq_m_n_s32): Likewise.
(vqdmulltq_m_n_s16): Likewise.
(vqdmulltq_m_s32): Likewise.
(vqdmulltq_m_s16): Likewise.
(vqrshrnbq_m_n_s32): Likewise.
(vqrshrnbq_m_n_s16): Likewise.
(vqrshrnbq_m_n_u32): Likewise.
(vqrshrnbq_m_n_u16): Likewise.
(vqrshrntq_m_n_s32): Likewise.
(vqrshrntq_m_n_s16): Likewise.
(vqrshrntq_m_n_u32): Likewise.
(vqrshrntq_m_n_u16): Likewise.
(vqrshrunbq_m_n_s32): Likewise.
(vqrshrunbq_m_n_s16): Likewise.
(vqrshruntq_m_n_s32): Likewise.
(vqrshruntq_m_n_s16): Likewise.
(vqshrnbq_m_n_s32): Likewise.
(vqshrnbq_m_n_s16): Likewise.
(vqshrnbq_m_n_u32): Likewise.
(vqshrnbq_m_n_u16): Likewise.
(vqshrntq_m_n_s32): Likewise.
(vqshrntq_m_n_s16): Likewise.
(vqshrntq_m_n_u32): Likewise.
(vqshrntq_m_n_u16): Likewise.
(vqshrunbq_m_n_s32): Likewise.
(vqshrunbq_m_n_s16): Likewise.
(vqshruntq_m_n_s32): Likewise.
(vqshruntq_m_n_s16): Likewise.
(vrmlaldavhaq_p_s32): Likewise.
(vrmlaldavhaq_p_u32): Likewise.
(vrmlaldavhaxq_p_s32): Likewise.
(vrmlsldavhaq_p_s32): Likewise.
(vrmlsldavhaxq_p_s32): Likewise.
(vrshrnbq_m_n_s32): Likewise.
(vrshrnbq_m_n_s16): Likewise.
(vrshrnbq_m_n_u32): Likewise.
(vrshrnbq_m_n_u16): Likewise.
(vrshrntq_m_n_s32): Likewise.
(vrshrntq_m_n_s16): Likewise.
(vrshrntq_m_n_u32): Likewise.
(vrshrntq_m_n_u16): Likewise.
(vshllbq_m_n_s8): Likewise.
(vshllbq_m_n_s16): Likewise.
(vshllbq_m_n_u8): Likewise.
(vshllbq_m_n_u16): Likewise.
(vshlltq_m_n_s8): Likewise.
(vshlltq_m_n_s16): Likewise.
(vshlltq_m_n_u8): Likewise.
(vshlltq_m_n_u16): Likewise.
(vshrnbq_m_n_s32): Likewise.
(vshrnbq_m_n_s16): Likewise.
(vshrnbq_m_n_u32): Likewise.
(vshrnbq_m_n_u16): Likewise.
(vshrntq_m_n_s32): Likewise.
(vshrntq_m_n_s16): Likewise.
(vshrntq_m_n_u32): Likewise.
(vshrntq_m_n_u16): Likewise.
(__arm_vmlaldavaq_p_s32): Define intrinsic.
(__arm_vmlaldavaq_p_s16): Likewise.
(__arm_vmlaldavaq_p_u32): Likewise.
(__arm_vmlaldavaq_p_u16): Likewise.
(__arm_vmlaldavaxq_p_s32): Likewise.
(__arm_vmlaldavaxq_p_s16): Likewise.
(__arm_vmlaldavaxq_p_u32): Likewise.
(__arm_vmlaldavaxq_p_u16): Likewise.
(__arm_vmlsldavaq_p_s32): Likewise.
(__arm_vmlsldavaq_p_s16): Likewise.
(__arm_vmlsldavaxq_p_s32): Likewise.
(__arm_vmlsldavaxq_p_s16): Likewise.
(__arm_vmullbq_poly_m_p8): Likewise.
(__arm_vmullbq_poly_m_p16): Likewise.
(__arm_vmulltq_poly_m_p8): Likewise.
(__arm_vmulltq_poly_m_p16): Likewise.
(__arm_vqdmullbq_m_n_s32): Likewise.
(__arm_vqdmullbq_m_n_s16): Likewise.
(__arm_vqdmullbq_m_s32): Likewise.
(__arm_vqdmullbq_m_s16): Likewise.
(__arm_vqdmulltq_m_n_s32): Likewise.
(__arm_vqdmulltq_m_n_s16): Likewise.
(__arm_vqdmulltq_m_s32): Likewise.
(__arm_vqdmulltq_m_s16): Likewise.
(__arm_vqrshrnbq_m_n_s32): Likewise.
(__arm_vqrshrnbq_m_n_s16): Likewise.
(__arm_vqrshrnbq_m_n_u32): Likewise.
(__arm_vqrshrnbq_m_n_u16): Likewise.
(__arm_vqrshrntq_m_n_s32): Likewise.
(__arm_vqrshrntq_m_n_s16): Likewise.
(__arm_vqrshrntq_m_n_u32): Likewise.
(__arm_vqrshrntq_m_n_u16): Likewise.
(__arm_vqrshrunbq_m_n_s32): Likewise.
(__arm_vqrshrunbq_m_n_s16): Likewise.
(__arm_vqrshruntq_m_n_s32): Likewise.
(__arm_vqrshruntq_m_n_s16): Likewise.
(__arm_vqshrnbq_m_n_s32): Likewise.
(__arm_vqshrnbq_m_n_s16): Likewise.
(__arm_vqshrnbq_m_n_u32): Likewise.
(__arm_vqshrnbq_m_n_u16): Likewise.
(__arm_vqshrntq_m_n_s32): Likewise.
(__arm_vqshrntq_m_n_s16): Likewise.
(__arm_vqshrntq_m_n_u32): Likewise.
(__arm_vqshrntq_m_n_u16): Likewise.
(__arm_vqshrunbq_m_n_s32): Likewise.
(__arm_vqshrunbq_m_n_s16): Likewise.
(__arm_vqshruntq_m_n_s32): Likewise.
(__arm_vqshruntq_m_n_s16): Likewise.
(__arm_vrmlaldavhaq_p_s32): Likewise.
(__arm_vrmlaldavhaq_p_u32): Likewise.
(__arm_vrmlaldavhaxq_p_s32): Likewise.
(__arm_vrmlsldavhaq_p_s32): Likewise.
(__arm_vrmlsldavhaxq_p_s32): Likewise.
(__arm_vrshrnbq_m_n_s32): Likewise.
(__arm_vrshrnbq_m_n_s16): Likewise.
(__arm_vrshrnbq_m_n_u32): Likewise.
(__arm_vrshrnbq_m_n_u16): Likewise.
(__arm_vrshrntq_m_n_s32): Likewise.
(__arm_vrshrntq_m_n_s16): Likewise.
(__arm_vrshrntq_m_n_u32): Likewise.
(__arm_vrshrntq_m_n_u16): Likewise.
(__arm_vshllbq_m_n_s8): Likewise.
(__arm_vshllbq_m_n_s16): Likewise.
(__arm_vshllbq_m_n_u8): Likewise.
(__arm_vshllbq_m_n_u16): Likewise.
(__arm_vshlltq_m_n_s8): Likewise.
(__arm_vshlltq_m_n_s16): Likewise.
(__arm_vshlltq_m_n_u8): Likewise.
(__arm_vshlltq_m_n_u16): Likewise.
(__arm_vshrnbq_m_n_s32): Likewise.
(__arm_vshrnbq_m_n_s16): Likewise.
(__arm_vshrnbq_m_n_u32): Likewise.
(__arm_vshrnbq_m_n_u16): Likewise.
(__arm_vshrntq_m_n_s32): Likewise.
(__arm_vshrntq_m_n_s16): Likewise.
(__arm_vshrntq_m_n_u32): Likewise.
(__arm_vshrntq_m_n_u16): Likewise.
(vmullbq_poly_m): Define polymorphic variant.
(vmulltq_poly_m): Likewise.
(vshllbq_m): Likewise.
(vshrntq_m_n): Likewise.
(vshrnbq_m_n): Likewise.
(vshlltq_m_n): Likewise.
(vshllbq_m_n): Likewise.
(vrshrntq_m_n): Likewise.
(vrshrnbq_m_n): Likewise.
(vqshruntq_m_n): Likewise.
(vqshrunbq_m_n): Likewise.
(vqdmullbq_m_n): Likewise.
(vqdmullbq_m): Likewise.
(vqdmulltq_m_n): Likewise.
(vqdmulltq_m): Likewise.
(vqrshrnbq_m_n): Likewise.
(vqrshrntq_m_n): Likewise.
(vqrshrunbq_m_n): Likewise.
(vqrshruntq_m_n): Likewise.
(vqshrnbq_m_n): Likewise.
(vqshrntq_m_n): Likewise.
* config/arm/arm_mve_builtins.def (QUADOP_NONE_NONE_NONE_IMM_UNONE): Use
builtin qualifiers.
(QUADOP_NONE_NONE_NONE_NONE_UNONE): Likewise.
(QUADOP_UNONE_UNONE_NONE_IMM_UNONE): Likewise.
(QUADOP_UNONE_UNONE_UNONE_IMM_UNONE): Likewise.
(QUADOP_UNONE_UNONE_UNONE_UNONE_UNONE): Likewise.
* config/arm/mve.md (VMLALDAVAQ_P): Define iterator.
(VMLALDAVAXQ_P): Likewise.
(VQRSHRNBQ_M_N): Likewise.
(VQRSHRNTQ_M_N): Likewise.
(VQSHRNBQ_M_N): Likewise.
(VQSHRNTQ_M_N): Likewise.
(VRSHRNBQ_M_N): Likewise.
(VRSHRNTQ_M_N): Likewise.
(VSHLLBQ_M_N): Likewise.
(VSHLLTQ_M_N): Likewise.
(VSHRNBQ_M_N): Likewise.
(VSHRNTQ_M_N): Likewise.
(mve_vmlaldavaq_p_<supf><mode>): Define RTL pattern.
(mve_vmlaldavaxq_p_<supf><mode>): Likewise.
(mve_vqrshrnbq_m_n_<supf><mode>): Likewise.
(mve_vqrshrntq_m_n_<supf><mode>): Likewise.
(mve_vqshrnbq_m_n_<supf><mode>): Likewise.
(mve_vqshrntq_m_n_<supf><mode>): Likewise.
(mve_vrmlaldavhaq_p_sv4si): Likewise.
(mve_vrshrnbq_m_n_<supf><mode>): Likewise.
(mve_vrshrntq_m_n_<supf><mode>): Likewise.
(mve_vshllbq_m_n_<supf><mode>): Likewise.
(mve_vshlltq_m_n_<supf><mode>): Likewise.
(mve_vshrnbq_m_n_<supf><mode>): Likewise.
(mve_vshrntq_m_n_<supf><mode>): Likewise.
(mve_vmlsldavaq_p_s<mode>): Likewise.
(mve_vmlsldavaxq_p_s<mode>): Likewise.
(mve_vmullbq_poly_m_p<mode>): Likewise.
(mve_vmulltq_poly_m_p<mode>): Likewise.
(mve_vqdmullbq_m_n_s<mode>): Likewise.
(mve_vqdmullbq_m_s<mode>): Likewise.
(mve_vqdmulltq_m_n_s<mode>): Likewise.
(mve_vqdmulltq_m_s<mode>): Likewise.
(mve_vqrshrunbq_m_n_s<mode>): Likewise.
(mve_vqrshruntq_m_n_s<mode>): Likewise.
(mve_vqshrunbq_m_n_s<mode>): Likewise.
(mve_vqshruntq_m_n_s<mode>): Likewise.
(mve_vrmlaldavhaq_p_uv4si): Likewise.
(mve_vrmlaldavhaxq_p_sv4si): Likewise.
(mve_vrmlsldavhaq_p_sv4si): Likewise.
(mve_vrmlsldavhaxq_p_sv4si): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vmlaldavaq_p_s16.c: New test.
* gcc.target/arm/mve/intrinsics/vmlaldavaq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaxq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaxq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaxq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavaq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavaq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavaxq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavaxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_poly_m_p16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_poly_m_p8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_poly_m_p16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_poly_m_p8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmullbq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmullbq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmullbq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmullbq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulltq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulltq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulltq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulltq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrnbq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrnbq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrnbq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrnbq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrntq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrntq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrntq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrntq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrunbq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrunbq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshruntq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshruntq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrnbq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrnbq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrnbq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrnbq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrntq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrntq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrntq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrntq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrunbq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrunbq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshruntq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshruntq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlaldavhaq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlaldavhaq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlaldavhaxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlsldavhaq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlsldavhaxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrnbq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrnbq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrnbq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrnbq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrntq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrntq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrntq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrntq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshllbq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshllbq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshllbq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshllbq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlltq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlltq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlltq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlltq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrnbq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrnbq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrnbq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrnbq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrntq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrntq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrntq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrntq_m_n_u32.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 16:58:10 +0000 (16:58 +0000)]
[ARM][GCC][2/4x]: MVE intrinsics with quaternary operands.
This patch supports the following MVE ACLE intrinsics with quaternary operands.
vabdq_m_s8, vabdq_m_s32, vabdq_m_s16, vabdq_m_u8, vabdq_m_u32, vabdq_m_u16, vaddq_m_n_s8, vaddq_m_n_s32, vaddq_m_n_s16, vaddq_m_n_u8, vaddq_m_n_u32, vaddq_m_n_u16, vaddq_m_s8, vaddq_m_s32, vaddq_m_s16, vaddq_m_u8, vaddq_m_u32, vaddq_m_u16, vandq_m_s8, vandq_m_s32, vandq_m_s16, vandq_m_u8, vandq_m_u32, vandq_m_u16, vbicq_m_s8, vbicq_m_s32, vbicq_m_s16, vbicq_m_u8, vbicq_m_u32, vbicq_m_u16, vbrsrq_m_n_s8, vbrsrq_m_n_s32, vbrsrq_m_n_s16, vbrsrq_m_n_u8, vbrsrq_m_n_u32, vbrsrq_m_n_u16, vcaddq_rot270_m_s8, vcaddq_rot270_m_s32, vcaddq_rot270_m_s16, vcaddq_rot270_m_u8, vcaddq_rot270_m_u32, vcaddq_rot270_m_u16, vcaddq_rot90_m_s8, vcaddq_rot90_m_s32, vcaddq_rot90_m_s16, vcaddq_rot90_m_u8, vcaddq_rot90_m_u32, vcaddq_rot90_m_u16, veorq_m_s8, veorq_m_s32, veorq_m_s16, veorq_m_u8, veorq_m_u32, veorq_m_u16, vhaddq_m_n_s8, vhaddq_m_n_s32, vhaddq_m_n_s16, vhaddq_m_n_u8, vhaddq_m_n_u32, vhaddq_m_n_u16, vhaddq_m_s8, vhaddq_m_s32, vhaddq_m_s16, vhaddq_m_u8, vhaddq_m_u32, vhaddq_m_u16, vhcaddq_rot270_m_s8, vhcaddq_rot270_m_s32, vhcaddq_rot270_m_s16, vhcaddq_rot90_m_s8, vhcaddq_rot90_m_s32, vhcaddq_rot90_m_s16, vhsubq_m_n_s8, vhsubq_m_n_s32, vhsubq_m_n_s16, vhsubq_m_n_u8, vhsubq_m_n_u32, vhsubq_m_n_u16, vhsubq_m_s8, vhsubq_m_s32, vhsubq_m_s16, vhsubq_m_u8, vhsubq_m_u32, vhsubq_m_u16, vmaxq_m_s8, vmaxq_m_s32, vmaxq_m_s16, vmaxq_m_u8, vmaxq_m_u32, vmaxq_m_u16, vminq_m_s8, vminq_m_s32, vminq_m_s16, vminq_m_u8, vminq_m_u32, vminq_m_u16, vmladavaq_p_s8, vmladavaq_p_s32, vmladavaq_p_s16, vmladavaq_p_u8, vmladavaq_p_u32, vmladavaq_p_u16, vmladavaxq_p_s8, vmladavaxq_p_s32, vmladavaxq_p_s16, vmlaq_m_n_s8, vmlaq_m_n_s32, vmlaq_m_n_s16, vmlaq_m_n_u8, vmlaq_m_n_u32, vmlaq_m_n_u16, vmlasq_m_n_s8, vmlasq_m_n_s32, vmlasq_m_n_s16, vmlasq_m_n_u8, vmlasq_m_n_u32, vmlasq_m_n_u16, vmlsdavaq_p_s8, vmlsdavaq_p_s32, vmlsdavaq_p_s16, vmlsdavaxq_p_s8, vmlsdavaxq_p_s32, vmlsdavaxq_p_s16, vmulhq_m_s8, vmulhq_m_s32, vmulhq_m_s16, vmulhq_m_u8, vmulhq_m_u32, vmulhq_m_u16, vmullbq_int_m_s8, vmullbq_int_m_s32, vmullbq_int_m_s16, vmullbq_int_m_u8, vmullbq_int_m_u32, vmullbq_int_m_u16, vmulltq_int_m_s8, vmulltq_int_m_s32, vmulltq_int_m_s16, vmulltq_int_m_u8, vmulltq_int_m_u32, vmulltq_int_m_u16, vmulq_m_n_s8, vmulq_m_n_s32, vmulq_m_n_s16, vmulq_m_n_u8, vmulq_m_n_u32, vmulq_m_n_u16, vmulq_m_s8, vmulq_m_s32, vmulq_m_s16, vmulq_m_u8, vmulq_m_u32, vmulq_m_u16, vornq_m_s8, vornq_m_s32, vornq_m_s16, vornq_m_u8, vornq_m_u32, vornq_m_u16, vorrq_m_s8, vorrq_m_s32, vorrq_m_s16, vorrq_m_u8, vorrq_m_u32, vorrq_m_u16, vqaddq_m_n_s8, vqaddq_m_n_s32, vqaddq_m_n_s16, vqaddq_m_n_u8, vqaddq_m_n_u32, vqaddq_m_n_u16, vqaddq_m_s8, vqaddq_m_s32, vqaddq_m_s16, vqaddq_m_u8, vqaddq_m_u32, vqaddq_m_u16, vqdmladhq_m_s8, vqdmladhq_m_s32, vqdmladhq_m_s16, vqdmladhxq_m_s8, vqdmladhxq_m_s32, vqdmladhxq_m_s16, vqdmlahq_m_n_s8, vqdmlahq_m_n_s32, vqdmlahq_m_n_s16, vqdmlahq_m_n_u8, vqdmlahq_m_n_u32, vqdmlahq_m_n_u16, vqdmlsdhq_m_s8, vqdmlsdhq_m_s32, vqdmlsdhq_m_s16, vqdmlsdhxq_m_s8, vqdmlsdhxq_m_s32, vqdmlsdhxq_m_s16, vqdmulhq_m_n_s8, vqdmulhq_m_n_s32, vqdmulhq_m_n_s16, vqdmulhq_m_s8, vqdmulhq_m_s32, vqdmulhq_m_s16, vqrdmladhq_m_s8, vqrdmladhq_m_s32, vqrdmladhq_m_s16, vqrdmladhxq_m_s8, vqrdmladhxq_m_s32, vqrdmladhxq_m_s16, vqrdmlahq_m_n_s8, vqrdmlahq_m_n_s32, vqrdmlahq_m_n_s16, vqrdmlahq_m_n_u8, vqrdmlahq_m_n_u32, vqrdmlahq_m_n_u16, vqrdmlashq_m_n_s8, vqrdmlashq_m_n_s32, vqrdmlashq_m_n_s16, vqrdmlashq_m_n_u8, vqrdmlashq_m_n_u32, vqrdmlashq_m_n_u16, vqrdmlsdhq_m_s8, vqrdmlsdhq_m_s32, vqrdmlsdhq_m_s16, vqrdmlsdhxq_m_s8, vqrdmlsdhxq_m_s32, vqrdmlsdhxq_m_s16, vqrdmulhq_m_n_s8, 
vqrdmulhq_m_n_s32, vqrdmulhq_m_n_s16, vqrdmulhq_m_s8, vqrdmulhq_m_s32, vqrdmulhq_m_s16, vqrshlq_m_s8, vqrshlq_m_s32, vqrshlq_m_s16, vqrshlq_m_u8, vqrshlq_m_u32, vqrshlq_m_u16, vqshlq_m_n_s8, vqshlq_m_n_s32, vqshlq_m_n_s16, vqshlq_m_n_u8, vqshlq_m_n_u32, vqshlq_m_n_u16, vqshlq_m_s8, vqshlq_m_s32, vqshlq_m_s16, vqshlq_m_u8, vqshlq_m_u32, vqshlq_m_u16, vqsubq_m_n_s8, vqsubq_m_n_s32, vqsubq_m_n_s16, vqsubq_m_n_u8, vqsubq_m_n_u32, vqsubq_m_n_u16, vqsubq_m_s8, vqsubq_m_s32, vqsubq_m_s16, vqsubq_m_u8, vqsubq_m_u32, vqsubq_m_u16, vrhaddq_m_s8, vrhaddq_m_s32, vrhaddq_m_s16, vrhaddq_m_u8, vrhaddq_m_u32, vrhaddq_m_u16, vrmulhq_m_s8, vrmulhq_m_s32, vrmulhq_m_s16, vrmulhq_m_u8, vrmulhq_m_u32, vrmulhq_m_u16, vrshlq_m_s8, vrshlq_m_s32, vrshlq_m_s16, vrshlq_m_u8, vrshlq_m_u32, vrshlq_m_u16, vrshrq_m_n_s8, vrshrq_m_n_s32, vrshrq_m_n_s16, vrshrq_m_n_u8, vrshrq_m_n_u32, vrshrq_m_n_u16, vshlq_m_n_s8, vshlq_m_n_s32, vshlq_m_n_s16, vshlq_m_n_u8, vshlq_m_n_u32, vshlq_m_n_u16, vshrq_m_n_s8, vshrq_m_n_s32, vshrq_m_n_s16, vshrq_m_n_u8, vshrq_m_n_u32, vshrq_m_n_u16, vsliq_m_n_s8, vsliq_m_n_s32, vsliq_m_n_s16, vsliq_m_n_u8, vsliq_m_n_u32, vsliq_m_n_u16, vsubq_m_n_s8, vsubq_m_n_s32, vsubq_m_n_s16, vsubq_m_n_u8, vsubq_m_n_u32, vsubq_m_n_u16.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
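To make the operand ordering of the merging "_m" variants concrete, here is a small sketch (not part of this patch): the first argument is the inactive value copied into lanes masked off by the predicate, and the predicate comes last. The helper names are made up for illustration, and the signatures and the polymorphic resolution are assumed to follow the MVE ACLE referenced above.

  #include "arm_mve.h"

  /* Merging predicated add: lanes with their predicate bits set get
     a + b, the remaining lanes are copied from 'inactive'.  */
  int32x4_t add_where (int32x4_t inactive, int32x4_t a, int32x4_t b,
                       mve_pred16_t p)
  {
    return vaddq_m_s32 (inactive, a, b, p);
  }

  /* Predicated vector-by-scalar multiply using the polymorphic name;
     for these argument types it should resolve to vmulq_m_n_s16.  */
  int16x8_t scale_where (int16x8_t inactive, int16x8_t a, int16_t s,
                         mve_pred16_t p)
  {
    return vmulq_m (inactive, a, s, p);
  }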
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm_mve.h (vabdq_m_s8): Define macro.
(vabdq_m_s32): Likewise.
(vabdq_m_s16): Likewise.
(vabdq_m_u8): Likewise.
(vabdq_m_u32): Likewise.
(vabdq_m_u16): Likewise.
(vaddq_m_n_s8): Likewise.
(vaddq_m_n_s32): Likewise.
(vaddq_m_n_s16): Likewise.
(vaddq_m_n_u8): Likewise.
(vaddq_m_n_u32): Likewise.
(vaddq_m_n_u16): Likewise.
(vaddq_m_s8): Likewise.
(vaddq_m_s32): Likewise.
(vaddq_m_s16): Likewise.
(vaddq_m_u8): Likewise.
(vaddq_m_u32): Likewise.
(vaddq_m_u16): Likewise.
(vandq_m_s8): Likewise.
(vandq_m_s32): Likewise.
(vandq_m_s16): Likewise.
(vandq_m_u8): Likewise.
(vandq_m_u32): Likewise.
(vandq_m_u16): Likewise.
(vbicq_m_s8): Likewise.
(vbicq_m_s32): Likewise.
(vbicq_m_s16): Likewise.
(vbicq_m_u8): Likewise.
(vbicq_m_u32): Likewise.
(vbicq_m_u16): Likewise.
(vbrsrq_m_n_s8): Likewise.
(vbrsrq_m_n_s32): Likewise.
(vbrsrq_m_n_s16): Likewise.
(vbrsrq_m_n_u8): Likewise.
(vbrsrq_m_n_u32): Likewise.
(vbrsrq_m_n_u16): Likewise.
(vcaddq_rot270_m_s8): Likewise.
(vcaddq_rot270_m_s32): Likewise.
(vcaddq_rot270_m_s16): Likewise.
(vcaddq_rot270_m_u8): Likewise.
(vcaddq_rot270_m_u32): Likewise.
(vcaddq_rot270_m_u16): Likewise.
(vcaddq_rot90_m_s8): Likewise.
(vcaddq_rot90_m_s32): Likewise.
(vcaddq_rot90_m_s16): Likewise.
(vcaddq_rot90_m_u8): Likewise.
(vcaddq_rot90_m_u32): Likewise.
(vcaddq_rot90_m_u16): Likewise.
(veorq_m_s8): Likewise.
(veorq_m_s32): Likewise.
(veorq_m_s16): Likewise.
(veorq_m_u8): Likewise.
(veorq_m_u32): Likewise.
(veorq_m_u16): Likewise.
(vhaddq_m_n_s8): Likewise.
(vhaddq_m_n_s32): Likewise.
(vhaddq_m_n_s16): Likewise.
(vhaddq_m_n_u8): Likewise.
(vhaddq_m_n_u32): Likewise.
(vhaddq_m_n_u16): Likewise.
(vhaddq_m_s8): Likewise.
(vhaddq_m_s32): Likewise.
(vhaddq_m_s16): Likewise.
(vhaddq_m_u8): Likewise.
(vhaddq_m_u32): Likewise.
(vhaddq_m_u16): Likewise.
(vhcaddq_rot270_m_s8): Likewise.
(vhcaddq_rot270_m_s32): Likewise.
(vhcaddq_rot270_m_s16): Likewise.
(vhcaddq_rot90_m_s8): Likewise.
(vhcaddq_rot90_m_s32): Likewise.
(vhcaddq_rot90_m_s16): Likewise.
(vhsubq_m_n_s8): Likewise.
(vhsubq_m_n_s32): Likewise.
(vhsubq_m_n_s16): Likewise.
(vhsubq_m_n_u8): Likewise.
(vhsubq_m_n_u32): Likewise.
(vhsubq_m_n_u16): Likewise.
(vhsubq_m_s8): Likewise.
(vhsubq_m_s32): Likewise.
(vhsubq_m_s16): Likewise.
(vhsubq_m_u8): Likewise.
(vhsubq_m_u32): Likewise.
(vhsubq_m_u16): Likewise.
(vmaxq_m_s8): Likewise.
(vmaxq_m_s32): Likewise.
(vmaxq_m_s16): Likewise.
(vmaxq_m_u8): Likewise.
(vmaxq_m_u32): Likewise.
(vmaxq_m_u16): Likewise.
(vminq_m_s8): Likewise.
(vminq_m_s32): Likewise.
(vminq_m_s16): Likewise.
(vminq_m_u8): Likewise.
(vminq_m_u32): Likewise.
(vminq_m_u16): Likewise.
(vmladavaq_p_s8): Likewise.
(vmladavaq_p_s32): Likewise.
(vmladavaq_p_s16): Likewise.
(vmladavaq_p_u8): Likewise.
(vmladavaq_p_u32): Likewise.
(vmladavaq_p_u16): Likewise.
(vmladavaxq_p_s8): Likewise.
(vmladavaxq_p_s32): Likewise.
(vmladavaxq_p_s16): Likewise.
(vmlaq_m_n_s8): Likewise.
(vmlaq_m_n_s32): Likewise.
(vmlaq_m_n_s16): Likewise.
(vmlaq_m_n_u8): Likewise.
(vmlaq_m_n_u32): Likewise.
(vmlaq_m_n_u16): Likewise.
(vmlasq_m_n_s8): Likewise.
(vmlasq_m_n_s32): Likewise.
(vmlasq_m_n_s16): Likewise.
(vmlasq_m_n_u8): Likewise.
(vmlasq_m_n_u32): Likewise.
(vmlasq_m_n_u16): Likewise.
(vmlsdavaq_p_s8): Likewise.
(vmlsdavaq_p_s32): Likewise.
(vmlsdavaq_p_s16): Likewise.
(vmlsdavaxq_p_s8): Likewise.
(vmlsdavaxq_p_s32): Likewise.
(vmlsdavaxq_p_s16): Likewise.
(vmulhq_m_s8): Likewise.
(vmulhq_m_s32): Likewise.
(vmulhq_m_s16): Likewise.
(vmulhq_m_u8): Likewise.
(vmulhq_m_u32): Likewise.
(vmulhq_m_u16): Likewise.
(vmullbq_int_m_s8): Likewise.
(vmullbq_int_m_s32): Likewise.
(vmullbq_int_m_s16): Likewise.
(vmullbq_int_m_u8): Likewise.
(vmullbq_int_m_u32): Likewise.
(vmullbq_int_m_u16): Likewise.
(vmulltq_int_m_s8): Likewise.
(vmulltq_int_m_s32): Likewise.
(vmulltq_int_m_s16): Likewise.
(vmulltq_int_m_u8): Likewise.
(vmulltq_int_m_u32): Likewise.
(vmulltq_int_m_u16): Likewise.
(vmulq_m_n_s8): Likewise.
(vmulq_m_n_s32): Likewise.
(vmulq_m_n_s16): Likewise.
(vmulq_m_n_u8): Likewise.
(vmulq_m_n_u32): Likewise.
(vmulq_m_n_u16): Likewise.
(vmulq_m_s8): Likewise.
(vmulq_m_s32): Likewise.
(vmulq_m_s16): Likewise.
(vmulq_m_u8): Likewise.
(vmulq_m_u32): Likewise.
(vmulq_m_u16): Likewise.
(vornq_m_s8): Likewise.
(vornq_m_s32): Likewise.
(vornq_m_s16): Likewise.
(vornq_m_u8): Likewise.
(vornq_m_u32): Likewise.
(vornq_m_u16): Likewise.
(vorrq_m_s8): Likewise.
(vorrq_m_s32): Likewise.
(vorrq_m_s16): Likewise.
(vorrq_m_u8): Likewise.
(vorrq_m_u32): Likewise.
(vorrq_m_u16): Likewise.
(vqaddq_m_n_s8): Likewise.
(vqaddq_m_n_s32): Likewise.
(vqaddq_m_n_s16): Likewise.
(vqaddq_m_n_u8): Likewise.
(vqaddq_m_n_u32): Likewise.
(vqaddq_m_n_u16): Likewise.
(vqaddq_m_s8): Likewise.
(vqaddq_m_s32): Likewise.
(vqaddq_m_s16): Likewise.
(vqaddq_m_u8): Likewise.
(vqaddq_m_u32): Likewise.
(vqaddq_m_u16): Likewise.
(vqdmladhq_m_s8): Likewise.
(vqdmladhq_m_s32): Likewise.
(vqdmladhq_m_s16): Likewise.
(vqdmladhxq_m_s8): Likewise.
(vqdmladhxq_m_s32): Likewise.
(vqdmladhxq_m_s16): Likewise.
(vqdmlahq_m_n_s8): Likewise.
(vqdmlahq_m_n_s32): Likewise.
(vqdmlahq_m_n_s16): Likewise.
(vqdmlahq_m_n_u8): Likewise.
(vqdmlahq_m_n_u32): Likewise.
(vqdmlahq_m_n_u16): Likewise.
(vqdmlsdhq_m_s8): Likewise.
(vqdmlsdhq_m_s32): Likewise.
(vqdmlsdhq_m_s16): Likewise.
(vqdmlsdhxq_m_s8): Likewise.
(vqdmlsdhxq_m_s32): Likewise.
(vqdmlsdhxq_m_s16): Likewise.
(vqdmulhq_m_n_s8): Likewise.
(vqdmulhq_m_n_s32): Likewise.
(vqdmulhq_m_n_s16): Likewise.
(vqdmulhq_m_s8): Likewise.
(vqdmulhq_m_s32): Likewise.
(vqdmulhq_m_s16): Likewise.
(vqrdmladhq_m_s8): Likewise.
(vqrdmladhq_m_s32): Likewise.
(vqrdmladhq_m_s16): Likewise.
(vqrdmladhxq_m_s8): Likewise.
(vqrdmladhxq_m_s32): Likewise.
(vqrdmladhxq_m_s16): Likewise.
(vqrdmlahq_m_n_s8): Likewise.
(vqrdmlahq_m_n_s32): Likewise.
(vqrdmlahq_m_n_s16): Likewise.
(vqrdmlahq_m_n_u8): Likewise.
(vqrdmlahq_m_n_u32): Likewise.
(vqrdmlahq_m_n_u16): Likewise.
(vqrdmlashq_m_n_s8): Likewise.
(vqrdmlashq_m_n_s32): Likewise.
(vqrdmlashq_m_n_s16): Likewise.
(vqrdmlashq_m_n_u8): Likewise.
(vqrdmlashq_m_n_u32): Likewise.
(vqrdmlashq_m_n_u16): Likewise.
(vqrdmlsdhq_m_s8): Likewise.
(vqrdmlsdhq_m_s32): Likewise.
(vqrdmlsdhq_m_s16): Likewise.
(vqrdmlsdhxq_m_s8): Likewise.
(vqrdmlsdhxq_m_s32): Likewise.
(vqrdmlsdhxq_m_s16): Likewise.
(vqrdmulhq_m_n_s8): Likewise.
(vqrdmulhq_m_n_s32): Likewise.
(vqrdmulhq_m_n_s16): Likewise.
(vqrdmulhq_m_s8): Likewise.
(vqrdmulhq_m_s32): Likewise.
(vqrdmulhq_m_s16): Likewise.
(vqrshlq_m_s8): Likewise.
(vqrshlq_m_s32): Likewise.
(vqrshlq_m_s16): Likewise.
(vqrshlq_m_u8): Likewise.
(vqrshlq_m_u32): Likewise.
(vqrshlq_m_u16): Likewise.
(vqshlq_m_n_s8): Likewise.
(vqshlq_m_n_s32): Likewise.
(vqshlq_m_n_s16): Likewise.
(vqshlq_m_n_u8): Likewise.
(vqshlq_m_n_u32): Likewise.
(vqshlq_m_n_u16): Likewise.
(vqshlq_m_s8): Likewise.
(vqshlq_m_s32): Likewise.
(vqshlq_m_s16): Likewise.
(vqshlq_m_u8): Likewise.
(vqshlq_m_u32): Likewise.
(vqshlq_m_u16): Likewise.
(vqsubq_m_n_s8): Likewise.
(vqsubq_m_n_s32): Likewise.
(vqsubq_m_n_s16): Likewise.
(vqsubq_m_n_u8): Likewise.
(vqsubq_m_n_u32): Likewise.
(vqsubq_m_n_u16): Likewise.
(vqsubq_m_s8): Likewise.
(vqsubq_m_s32): Likewise.
(vqsubq_m_s16): Likewise.
(vqsubq_m_u8): Likewise.
(vqsubq_m_u32): Likewise.
(vqsubq_m_u16): Likewise.
(vrhaddq_m_s8): Likewise.
(vrhaddq_m_s32): Likewise.
(vrhaddq_m_s16): Likewise.
(vrhaddq_m_u8): Likewise.
(vrhaddq_m_u32): Likewise.
(vrhaddq_m_u16): Likewise.
(vrmulhq_m_s8): Likewise.
(vrmulhq_m_s32): Likewise.
(vrmulhq_m_s16): Likewise.
(vrmulhq_m_u8): Likewise.
(vrmulhq_m_u32): Likewise.
(vrmulhq_m_u16): Likewise.
(vrshlq_m_s8): Likewise.
(vrshlq_m_s32): Likewise.
(vrshlq_m_s16): Likewise.
(vrshlq_m_u8): Likewise.
(vrshlq_m_u32): Likewise.
(vrshlq_m_u16): Likewise.
(vrshrq_m_n_s8): Likewise.
(vrshrq_m_n_s32): Likewise.
(vrshrq_m_n_s16): Likewise.
(vrshrq_m_n_u8): Likewise.
(vrshrq_m_n_u32): Likewise.
(vrshrq_m_n_u16): Likewise.
(vshlq_m_n_s8): Likewise.
(vshlq_m_n_s32): Likewise.
(vshlq_m_n_s16): Likewise.
(vshlq_m_n_u8): Likewise.
(vshlq_m_n_u32): Likewise.
(vshlq_m_n_u16): Likewise.
(vshrq_m_n_s8): Likewise.
(vshrq_m_n_s32): Likewise.
(vshrq_m_n_s16): Likewise.
(vshrq_m_n_u8): Likewise.
(vshrq_m_n_u32): Likewise.
(vshrq_m_n_u16): Likewise.
(vsliq_m_n_s8): Likewise.
(vsliq_m_n_s32): Likewise.
(vsliq_m_n_s16): Likewise.
(vsliq_m_n_u8): Likewise.
(vsliq_m_n_u32): Likewise.
(vsliq_m_n_u16): Likewise.
(vsubq_m_n_s8): Likewise.
(vsubq_m_n_s32): Likewise.
(vsubq_m_n_s16): Likewise.
(vsubq_m_n_u8): Likewise.
(vsubq_m_n_u32): Likewise.
(vsubq_m_n_u16): Likewise.
(__arm_vabdq_m_s8): Define intrinsic.
(__arm_vabdq_m_s32): Likewise.
(__arm_vabdq_m_s16): Likewise.
(__arm_vabdq_m_u8): Likewise.
(__arm_vabdq_m_u32): Likewise.
(__arm_vabdq_m_u16): Likewise.
(__arm_vaddq_m_n_s8): Likewise.
(__arm_vaddq_m_n_s32): Likewise.
(__arm_vaddq_m_n_s16): Likewise.
(__arm_vaddq_m_n_u8): Likewise.
(__arm_vaddq_m_n_u32): Likewise.
(__arm_vaddq_m_n_u16): Likewise.
(__arm_vaddq_m_s8): Likewise.
(__arm_vaddq_m_s32): Likewise.
(__arm_vaddq_m_s16): Likewise.
(__arm_vaddq_m_u8): Likewise.
(__arm_vaddq_m_u32): Likewise.
(__arm_vaddq_m_u16): Likewise.
(__arm_vandq_m_s8): Likewise.
(__arm_vandq_m_s32): Likewise.
(__arm_vandq_m_s16): Likewise.
(__arm_vandq_m_u8): Likewise.
(__arm_vandq_m_u32): Likewise.
(__arm_vandq_m_u16): Likewise.
(__arm_vbicq_m_s8): Likewise.
(__arm_vbicq_m_s32): Likewise.
(__arm_vbicq_m_s16): Likewise.
(__arm_vbicq_m_u8): Likewise.
(__arm_vbicq_m_u32): Likewise.
(__arm_vbicq_m_u16): Likewise.
(__arm_vbrsrq_m_n_s8): Likewise.
(__arm_vbrsrq_m_n_s32): Likewise.
(__arm_vbrsrq_m_n_s16): Likewise.
(__arm_vbrsrq_m_n_u8): Likewise.
(__arm_vbrsrq_m_n_u32): Likewise.
(__arm_vbrsrq_m_n_u16): Likewise.
(__arm_vcaddq_rot270_m_s8): Likewise.
(__arm_vcaddq_rot270_m_s32): Likewise.
(__arm_vcaddq_rot270_m_s16): Likewise.
(__arm_vcaddq_rot270_m_u8): Likewise.
(__arm_vcaddq_rot270_m_u32): Likewise.
(__arm_vcaddq_rot270_m_u16): Likewise.
(__arm_vcaddq_rot90_m_s8): Likewise.
(__arm_vcaddq_rot90_m_s32): Likewise.
(__arm_vcaddq_rot90_m_s16): Likewise.
(__arm_vcaddq_rot90_m_u8): Likewise.
(__arm_vcaddq_rot90_m_u32): Likewise.
(__arm_vcaddq_rot90_m_u16): Likewise.
(__arm_veorq_m_s8): Likewise.
(__arm_veorq_m_s32): Likewise.
(__arm_veorq_m_s16): Likewise.
(__arm_veorq_m_u8): Likewise.
(__arm_veorq_m_u32): Likewise.
(__arm_veorq_m_u16): Likewise.
(__arm_vhaddq_m_n_s8): Likewise.
(__arm_vhaddq_m_n_s32): Likewise.
(__arm_vhaddq_m_n_s16): Likewise.
(__arm_vhaddq_m_n_u8): Likewise.
(__arm_vhaddq_m_n_u32): Likewise.
(__arm_vhaddq_m_n_u16): Likewise.
(__arm_vhaddq_m_s8): Likewise.
(__arm_vhaddq_m_s32): Likewise.
(__arm_vhaddq_m_s16): Likewise.
(__arm_vhaddq_m_u8): Likewise.
(__arm_vhaddq_m_u32): Likewise.
(__arm_vhaddq_m_u16): Likewise.
(__arm_vhcaddq_rot270_m_s8): Likewise.
(__arm_vhcaddq_rot270_m_s32): Likewise.
(__arm_vhcaddq_rot270_m_s16): Likewise.
(__arm_vhcaddq_rot90_m_s8): Likewise.
(__arm_vhcaddq_rot90_m_s32): Likewise.
(__arm_vhcaddq_rot90_m_s16): Likewise.
(__arm_vhsubq_m_n_s8): Likewise.
(__arm_vhsubq_m_n_s32): Likewise.
(__arm_vhsubq_m_n_s16): Likewise.
(__arm_vhsubq_m_n_u8): Likewise.
(__arm_vhsubq_m_n_u32): Likewise.
(__arm_vhsubq_m_n_u16): Likewise.
(__arm_vhsubq_m_s8): Likewise.
(__arm_vhsubq_m_s32): Likewise.
(__arm_vhsubq_m_s16): Likewise.
(__arm_vhsubq_m_u8): Likewise.
(__arm_vhsubq_m_u32): Likewise.
(__arm_vhsubq_m_u16): Likewise.
(__arm_vmaxq_m_s8): Likewise.
(__arm_vmaxq_m_s32): Likewise.
(__arm_vmaxq_m_s16): Likewise.
(__arm_vmaxq_m_u8): Likewise.
(__arm_vmaxq_m_u32): Likewise.
(__arm_vmaxq_m_u16): Likewise.
(__arm_vminq_m_s8): Likewise.
(__arm_vminq_m_s32): Likewise.
(__arm_vminq_m_s16): Likewise.
(__arm_vminq_m_u8): Likewise.
(__arm_vminq_m_u32): Likewise.
(__arm_vminq_m_u16): Likewise.
(__arm_vmladavaq_p_s8): Likewise.
(__arm_vmladavaq_p_s32): Likewise.
(__arm_vmladavaq_p_s16): Likewise.
(__arm_vmladavaq_p_u8): Likewise.
(__arm_vmladavaq_p_u32): Likewise.
(__arm_vmladavaq_p_u16): Likewise.
(__arm_vmladavaxq_p_s8): Likewise.
(__arm_vmladavaxq_p_s32): Likewise.
(__arm_vmladavaxq_p_s16): Likewise.
(__arm_vmlaq_m_n_s8): Likewise.
(__arm_vmlaq_m_n_s32): Likewise.
(__arm_vmlaq_m_n_s16): Likewise.
(__arm_vmlaq_m_n_u8): Likewise.
(__arm_vmlaq_m_n_u32): Likewise.
(__arm_vmlaq_m_n_u16): Likewise.
(__arm_vmlasq_m_n_s8): Likewise.
(__arm_vmlasq_m_n_s32): Likewise.
(__arm_vmlasq_m_n_s16): Likewise.
(__arm_vmlasq_m_n_u8): Likewise.
(__arm_vmlasq_m_n_u32): Likewise.
(__arm_vmlasq_m_n_u16): Likewise.
(__arm_vmlsdavaq_p_s8): Likewise.
(__arm_vmlsdavaq_p_s32): Likewise.
(__arm_vmlsdavaq_p_s16): Likewise.
(__arm_vmlsdavaxq_p_s8): Likewise.
(__arm_vmlsdavaxq_p_s32): Likewise.
(__arm_vmlsdavaxq_p_s16): Likewise.
(__arm_vmulhq_m_s8): Likewise.
(__arm_vmulhq_m_s32): Likewise.
(__arm_vmulhq_m_s16): Likewise.
(__arm_vmulhq_m_u8): Likewise.
(__arm_vmulhq_m_u32): Likewise.
(__arm_vmulhq_m_u16): Likewise.
(__arm_vmullbq_int_m_s8): Likewise.
(__arm_vmullbq_int_m_s32): Likewise.
(__arm_vmullbq_int_m_s16): Likewise.
(__arm_vmullbq_int_m_u8): Likewise.
(__arm_vmullbq_int_m_u32): Likewise.
(__arm_vmullbq_int_m_u16): Likewise.
(__arm_vmulltq_int_m_s8): Likewise.
(__arm_vmulltq_int_m_s32): Likewise.
(__arm_vmulltq_int_m_s16): Likewise.
(__arm_vmulltq_int_m_u8): Likewise.
(__arm_vmulltq_int_m_u32): Likewise.
(__arm_vmulltq_int_m_u16): Likewise.
(__arm_vmulq_m_n_s8): Likewise.
(__arm_vmulq_m_n_s32): Likewise.
(__arm_vmulq_m_n_s16): Likewise.
(__arm_vmulq_m_n_u8): Likewise.
(__arm_vmulq_m_n_u32): Likewise.
(__arm_vmulq_m_n_u16): Likewise.
(__arm_vmulq_m_s8): Likewise.
(__arm_vmulq_m_s32): Likewise.
(__arm_vmulq_m_s16): Likewise.
(__arm_vmulq_m_u8): Likewise.
(__arm_vmulq_m_u32): Likewise.
(__arm_vmulq_m_u16): Likewise.
(__arm_vornq_m_s8): Likewise.
(__arm_vornq_m_s32): Likewise.
(__arm_vornq_m_s16): Likewise.
(__arm_vornq_m_u8): Likewise.
(__arm_vornq_m_u32): Likewise.
(__arm_vornq_m_u16): Likewise.
(__arm_vorrq_m_s8): Likewise.
(__arm_vorrq_m_s32): Likewise.
(__arm_vorrq_m_s16): Likewise.
(__arm_vorrq_m_u8): Likewise.
(__arm_vorrq_m_u32): Likewise.
(__arm_vorrq_m_u16): Likewise.
(__arm_vqaddq_m_n_s8): Likewise.
(__arm_vqaddq_m_n_s32): Likewise.
(__arm_vqaddq_m_n_s16): Likewise.
(__arm_vqaddq_m_n_u8): Likewise.
(__arm_vqaddq_m_n_u32): Likewise.
(__arm_vqaddq_m_n_u16): Likewise.
(__arm_vqaddq_m_s8): Likewise.
(__arm_vqaddq_m_s32): Likewise.
(__arm_vqaddq_m_s16): Likewise.
(__arm_vqaddq_m_u8): Likewise.
(__arm_vqaddq_m_u32): Likewise.
(__arm_vqaddq_m_u16): Likewise.
(__arm_vqdmladhq_m_s8): Likewise.
(__arm_vqdmladhq_m_s32): Likewise.
(__arm_vqdmladhq_m_s16): Likewise.
(__arm_vqdmladhxq_m_s8): Likewise.
(__arm_vqdmladhxq_m_s32): Likewise.
(__arm_vqdmladhxq_m_s16): Likewise.
(__arm_vqdmlahq_m_n_s8): Likewise.
(__arm_vqdmlahq_m_n_s32): Likewise.
(__arm_vqdmlahq_m_n_s16): Likewise.
(__arm_vqdmlahq_m_n_u8): Likewise.
(__arm_vqdmlahq_m_n_u32): Likewise.
(__arm_vqdmlahq_m_n_u16): Likewise.
(__arm_vqdmlsdhq_m_s8): Likewise.
(__arm_vqdmlsdhq_m_s32): Likewise.
(__arm_vqdmlsdhq_m_s16): Likewise.
(__arm_vqdmlsdhxq_m_s8): Likewise.
(__arm_vqdmlsdhxq_m_s32): Likewise.
(__arm_vqdmlsdhxq_m_s16): Likewise.
(__arm_vqdmulhq_m_n_s8): Likewise.
(__arm_vqdmulhq_m_n_s32): Likewise.
(__arm_vqdmulhq_m_n_s16): Likewise.
(__arm_vqdmulhq_m_s8): Likewise.
(__arm_vqdmulhq_m_s32): Likewise.
(__arm_vqdmulhq_m_s16): Likewise.
(__arm_vqrdmladhq_m_s8): Likewise.
(__arm_vqrdmladhq_m_s32): Likewise.
(__arm_vqrdmladhq_m_s16): Likewise.
(__arm_vqrdmladhxq_m_s8): Likewise.
(__arm_vqrdmladhxq_m_s32): Likewise.
(__arm_vqrdmladhxq_m_s16): Likewise.
(__arm_vqrdmlahq_m_n_s8): Likewise.
(__arm_vqrdmlahq_m_n_s32): Likewise.
(__arm_vqrdmlahq_m_n_s16): Likewise.
(__arm_vqrdmlahq_m_n_u8): Likewise.
(__arm_vqrdmlahq_m_n_u32): Likewise.
(__arm_vqrdmlahq_m_n_u16): Likewise.
(__arm_vqrdmlashq_m_n_s8): Likewise.
(__arm_vqrdmlashq_m_n_s32): Likewise.
(__arm_vqrdmlashq_m_n_s16): Likewise.
(__arm_vqrdmlashq_m_n_u8): Likewise.
(__arm_vqrdmlashq_m_n_u32): Likewise.
(__arm_vqrdmlashq_m_n_u16): Likewise.
(__arm_vqrdmlsdhq_m_s8): Likewise.
(__arm_vqrdmlsdhq_m_s32): Likewise.
(__arm_vqrdmlsdhq_m_s16): Likewise.
(__arm_vqrdmlsdhxq_m_s8): Likewise.
(__arm_vqrdmlsdhxq_m_s32): Likewise.
(__arm_vqrdmlsdhxq_m_s16): Likewise.
(__arm_vqrdmulhq_m_n_s8): Likewise.
(__arm_vqrdmulhq_m_n_s32): Likewise.
(__arm_vqrdmulhq_m_n_s16): Likewise.
(__arm_vqrdmulhq_m_s8): Likewise.
(__arm_vqrdmulhq_m_s32): Likewise.
(__arm_vqrdmulhq_m_s16): Likewise.
(__arm_vqrshlq_m_s8): Likewise.
(__arm_vqrshlq_m_s32): Likewise.
(__arm_vqrshlq_m_s16): Likewise.
(__arm_vqrshlq_m_u8): Likewise.
(__arm_vqrshlq_m_u32): Likewise.
(__arm_vqrshlq_m_u16): Likewise.
(__arm_vqshlq_m_n_s8): Likewise.
(__arm_vqshlq_m_n_s32): Likewise.
(__arm_vqshlq_m_n_s16): Likewise.
(__arm_vqshlq_m_n_u8): Likewise.
(__arm_vqshlq_m_n_u32): Likewise.
(__arm_vqshlq_m_n_u16): Likewise.
(__arm_vqshlq_m_s8): Likewise.
(__arm_vqshlq_m_s32): Likewise.
(__arm_vqshlq_m_s16): Likewise.
(__arm_vqshlq_m_u8): Likewise.
(__arm_vqshlq_m_u32): Likewise.
(__arm_vqshlq_m_u16): Likewise.
(__arm_vqsubq_m_n_s8): Likewise.
(__arm_vqsubq_m_n_s32): Likewise.
(__arm_vqsubq_m_n_s16): Likewise.
(__arm_vqsubq_m_n_u8): Likewise.
(__arm_vqsubq_m_n_u32): Likewise.
(__arm_vqsubq_m_n_u16): Likewise.
(__arm_vqsubq_m_s8): Likewise.
(__arm_vqsubq_m_s32): Likewise.
(__arm_vqsubq_m_s16): Likewise.
(__arm_vqsubq_m_u8): Likewise.
(__arm_vqsubq_m_u32): Likewise.
(__arm_vqsubq_m_u16): Likewise.
(__arm_vrhaddq_m_s8): Likewise.
(__arm_vrhaddq_m_s32): Likewise.
(__arm_vrhaddq_m_s16): Likewise.
(__arm_vrhaddq_m_u8): Likewise.
(__arm_vrhaddq_m_u32): Likewise.
(__arm_vrhaddq_m_u16): Likewise.
(__arm_vrmulhq_m_s8): Likewise.
(__arm_vrmulhq_m_s32): Likewise.
(__arm_vrmulhq_m_s16): Likewise.
(__arm_vrmulhq_m_u8): Likewise.
(__arm_vrmulhq_m_u32): Likewise.
(__arm_vrmulhq_m_u16): Likewise.
(__arm_vrshlq_m_s8): Likewise.
(__arm_vrshlq_m_s32): Likewise.
(__arm_vrshlq_m_s16): Likewise.
(__arm_vrshlq_m_u8): Likewise.
(__arm_vrshlq_m_u32): Likewise.
(__arm_vrshlq_m_u16): Likewise.
(__arm_vrshrq_m_n_s8): Likewise.
(__arm_vrshrq_m_n_s32): Likewise.
(__arm_vrshrq_m_n_s16): Likewise.
(__arm_vrshrq_m_n_u8): Likewise.
(__arm_vrshrq_m_n_u32): Likewise.
(__arm_vrshrq_m_n_u16): Likewise.
(__arm_vshlq_m_n_s8): Likewise.
(__arm_vshlq_m_n_s32): Likewise.
(__arm_vshlq_m_n_s16): Likewise.
(__arm_vshlq_m_n_u8): Likewise.
(__arm_vshlq_m_n_u32): Likewise.
(__arm_vshlq_m_n_u16): Likewise.
(__arm_vshrq_m_n_s8): Likewise.
(__arm_vshrq_m_n_s32): Likewise.
(__arm_vshrq_m_n_s16): Likewise.
(__arm_vshrq_m_n_u8): Likewise.
(__arm_vshrq_m_n_u32): Likewise.
(__arm_vshrq_m_n_u16): Likewise.
(__arm_vsliq_m_n_s8): Likewise.
(__arm_vsliq_m_n_s32): Likewise.
(__arm_vsliq_m_n_s16): Likewise.
(__arm_vsliq_m_n_u8): Likewise.
(__arm_vsliq_m_n_u32): Likewise.
(__arm_vsliq_m_n_u16): Likewise.
(__arm_vsubq_m_n_s8): Likewise.
(__arm_vsubq_m_n_s32): Likewise.
(__arm_vsubq_m_n_s16): Likewise.
(__arm_vsubq_m_n_u8): Likewise.
(__arm_vsubq_m_n_u32): Likewise.
(__arm_vsubq_m_n_u16): Likewise.
(vqdmladhq_m): Define polymorphic variant.
(vqdmladhxq_m): Likewise.
(vqdmlsdhq_m): Likewise.
(vqdmlsdhxq_m): Likewise.
(vabdq_m): Likewise.
(vandq_m): Likewise.
(vbicq_m): Likewise.
(vbrsrq_m_n): Likewise.
(vcaddq_rot270_m): Likewise.
(vcaddq_rot90_m): Likewise.
(veorq_m): Likewise.
(vmaxq_m): Likewise.
(vminq_m): Likewise.
(vmladavaq_p): Likewise.
(vmlaq_m_n): Likewise.
(vmlasq_m_n): Likewise.
(vmulhq_m): Likewise.
(vmullbq_int_m): Likewise.
(vmulltq_int_m): Likewise.
(vornq_m): Likewise.
(vorrq_m): Likewise.
(vqdmlahq_m_n): Likewise.
(vqrdmlahq_m_n): Likewise.
(vqrdmlashq_m_n): Likewise.
(vqrshlq_m): Likewise.
(vqshlq_m_n): Likewise.
(vqshlq_m): Likewise.
(vrhaddq_m): Likewise.
(vrmulhq_m): Likewise.
(vrshlq_m): Likewise.
(vrshrq_m_n): Likewise.
(vshlq_m_n): Likewise.
(vshrq_m_n): Likewise.
(vsliq_m): Likewise.
(vaddq_m_n): Likewise.
(vaddq_m): Likewise.
(vhaddq_m_n): Likewise.
(vhaddq_m): Likewise.
(vhcaddq_rot270_m): Likewise.
(vhcaddq_rot90_m): Likewise.
(vhsubq_m): Likewise.
(vhsubq_m_n): Likewise.
(vmulq_m_n): Likewise.
(vmulq_m): Likewise.
(vqaddq_m_n): Likewise.
(vqaddq_m): Likewise.
(vqdmulhq_m_n): Likewise.
(vqdmulhq_m): Likewise.
(vsubq_m_n): Likewise.
(vsliq_m_n): Likewise.
(vqsubq_m_n): Likewise.
(vqsubq_m): Likewise.
(vqrdmulhq_m): Likewise.
(vqrdmulhq_m_n): Likewise.
(vqrdmlsdhxq_m): Likewise.
(vqrdmlsdhq_m): Likewise.
(vqrdmladhq_m): Likewise.
(vqrdmladhxq_m): Likewise.
(vmlsdavaxq_p): Likewise.
(vmlsdavaq_p): Likewise.
(vmladavaxq_p): Likewise.
* config/arm/arm_mve_builtins.def (QUADOP_NONE_NONE_NONE_IMM_UNONE): Use
builtin qualifier.
(QUADOP_NONE_NONE_NONE_NONE_UNONE): Likewise.
(QUADOP_UNONE_UNONE_UNONE_IMM_UNONE): Likewise.
(QUADOP_UNONE_UNONE_UNONE_NONE_UNONE): Likewise.
(QUADOP_UNONE_UNONE_UNONE_UNONE_UNONE): Likewise.
* config/arm/mve.md (VHSUBQ_M): Define iterator.
(VSLIQ_M_N): Likewise.
(VQRDMLAHQ_M_N): Likewise.
(VRSHLQ_M): Likewise.
(VMINQ_M): Likewise.
(VMULLBQ_INT_M): Likewise.
(VMULHQ_M): Likewise.
(VMULQ_M): Likewise.
(VHSUBQ_M_N): Likewise.
(VHADDQ_M_N): Likewise.
(VORRQ_M): Likewise.
(VRMULHQ_M): Likewise.
(VQADDQ_M): Likewise.
(VRSHRQ_M_N): Likewise.
(VQSUBQ_M_N): Likewise.
(VADDQ_M): Likewise.
(VORNQ_M): Likewise.
(VQDMLAHQ_M_N): Likewise.
(VRHADDQ_M): Likewise.
(VQSHLQ_M): Likewise.
(VANDQ_M): Likewise.
(VBICQ_M): Likewise.
(VSHLQ_M_N): Likewise.
(VCADDQ_ROT270_M): Likewise.
(VQRSHLQ_M): Likewise.
(VQADDQ_M_N): Likewise.
(VADDQ_M_N): Likewise.
(VMAXQ_M): Likewise.
(VQSUBQ_M): Likewise.
(VMLASQ_M_N): Likewise.
(VMLADAVAQ_P): Likewise.
(VBRSRQ_M_N): Likewise.
(VMULQ_M_N): Likewise.
(VCADDQ_ROT90_M): Likewise.
(VMULLTQ_INT_M): Likewise.
(VEORQ_M): Likewise.
(VSHRQ_M_N): Likewise.
(VSUBQ_M_N): Likewise.
(VHADDQ_M): Likewise.
(VABDQ_M): Likewise.
(VQRDMLASHQ_M_N): Likewise.
(VMLAQ_M_N): Likewise.
(VQSHLQ_M_N): Likewise.
(mve_vabdq_m_<supf><mode>): Define RTL pattern.
(mve_vaddq_m_n_<supf><mode>): Likewise.
(mve_vaddq_m_<supf><mode>): Likewise.
(mve_vandq_m_<supf><mode>): Likewise.
(mve_vbicq_m_<supf><mode>): Likewise.
(mve_vbrsrq_m_n_<supf><mode>): Likewise.
(mve_vcaddq_rot270_m_<supf><mode>): Likewise.
(mve_vcaddq_rot90_m_<supf><mode>): Likewise.
(mve_veorq_m_<supf><mode>): Likewise.
(mve_vhaddq_m_n_<supf><mode>): Likewise.
(mve_vhaddq_m_<supf><mode>): Likewise.
(mve_vhsubq_m_n_<supf><mode>): Likewise.
(mve_vhsubq_m_<supf><mode>): Likewise.
(mve_vmaxq_m_<supf><mode>): Likewise.
(mve_vminq_m_<supf><mode>): Likewise.
(mve_vmladavaq_p_<supf><mode>): Likewise.
(mve_vmlaq_m_n_<supf><mode>): Likewise.
(mve_vmlasq_m_n_<supf><mode>): Likewise.
(mve_vmulhq_m_<supf><mode>): Likewise.
(mve_vmullbq_int_m_<supf><mode>): Likewise.
(mve_vmulltq_int_m_<supf><mode>): Likewise.
(mve_vmulq_m_n_<supf><mode>): Likewise.
(mve_vmulq_m_<supf><mode>): Likewise.
(mve_vornq_m_<supf><mode>): Likewise.
(mve_vorrq_m_<supf><mode>): Likewise.
(mve_vqaddq_m_n_<supf><mode>): Likewise.
(mve_vqaddq_m_<supf><mode>): Likewise.
(mve_vqdmlahq_m_n_<supf><mode>): Likewise.
(mve_vqrdmlahq_m_n_<supf><mode>): Likewise.
(mve_vqrdmlashq_m_n_<supf><mode>): Likewise.
(mve_vqrshlq_m_<supf><mode>): Likewise.
(mve_vqshlq_m_n_<supf><mode>): Likewise.
(mve_vqshlq_m_<supf><mode>): Likewise.
(mve_vqsubq_m_n_<supf><mode>): Likewise.
(mve_vqsubq_m_<supf><mode>): Likewise.
(mve_vrhaddq_m_<supf><mode>): Likewise.
(mve_vrmulhq_m_<supf><mode>): Likewise.
(mve_vrshlq_m_<supf><mode>): Likewise.
(mve_vrshrq_m_n_<supf><mode>): Likewise.
(mve_vshlq_m_n_<supf><mode>): Likewise.
(mve_vshrq_m_n_<supf><mode>): Likewise.
(mve_vsliq_m_n_<supf><mode>): Likewise.
(mve_vsubq_m_n_<supf><mode>): Likewise.
(mve_vhcaddq_rot270_m_s<mode>): Likewise.
(mve_vhcaddq_rot90_m_s<mode>): Likewise.
(mve_vmladavaxq_p_s<mode>): Likewise.
(mve_vmlsdavaq_p_s<mode>): Likewise.
(mve_vmlsdavaxq_p_s<mode>): Likewise.
(mve_vqdmladhq_m_s<mode>): Likewise.
(mve_vqdmladhxq_m_s<mode>): Likewise.
(mve_vqdmlsdhq_m_s<mode>): Likewise.
(mve_vqdmlsdhxq_m_s<mode>): Likewise.
(mve_vqdmulhq_m_n_s<mode>): Likewise.
(mve_vqdmulhq_m_s<mode>): Likewise.
(mve_vqrdmladhq_m_s<mode>): Likewise.
(mve_vqrdmladhxq_m_s<mode>): Likewise.
(mve_vqrdmlsdhq_m_s<mode>): Likewise.
(mve_vqrdmlsdhxq_m_s<mode>): Likewise.
(mve_vqrdmulhq_m_n_s<mode>): Likewise.
(mve_vqrdmulhq_m_s<mode>): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vabdq_m_s16.c: New test.
* gcc.target/arm/mve/intrinsics/vabdq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabdq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabdq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabdq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabdq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot270_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot270_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot270_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot90_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot90_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot90_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_p_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaxq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaxq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaxq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaxq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhxq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhxq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhxq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhxq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhxq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhxq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulhq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulhq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulhq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulhq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulhq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulhq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhxq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhxq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhxq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhxq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhxq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhxq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmulhq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmulhq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmulhq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmulhq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmulhq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmulhq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_n_u8.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 16:47:31 +0000 (16:47 +0000)]
[ARM][GCC][1/4x]: MVE intrinsics with quaternary operands.
This patch supports the following MVE ACLE intrinsics with quaternary operands.
vsriq_m_n_s8, vsubq_m_s8, vsubq_x_s8, vcvtq_m_n_f16_u16, vcvtq_x_n_f16_u16,
vqshluq_m_n_s8, vabavq_p_s8, vsriq_m_n_u8, vshlq_m_u8, vshlq_x_u8, vsubq_m_u8,
vsubq_x_u8, vabavq_p_u8, vshlq_m_s8, vshlq_x_s8, vcvtq_m_n_f16_s16,
vcvtq_x_n_f16_s16, vsriq_m_n_s16, vsubq_m_s16, vsubq_x_s16, vcvtq_m_n_f32_u32,
vcvtq_x_n_f32_u32, vqshluq_m_n_s16, vabavq_p_s16, vsriq_m_n_u16,
vshlq_m_u16, vshlq_x_u16, vsubq_m_u16, vsubq_x_u16, vabavq_p_u16, vshlq_m_s16,
vshlq_x_s16, vcvtq_m_n_f32_s32, vcvtq_x_n_f32_s32, vsriq_m_n_s32, vsubq_m_s32,
vsubq_x_s32, vqshluq_m_n_s32, vabavq_p_s32, vsriq_m_n_u32, vshlq_m_u32,
vshlq_x_u32, vsubq_m_u32, vsubq_x_u32, vabavq_p_u32, vshlq_m_s32, vshlq_x_s32.
Please refer to the M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
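A minimal usage sketch (not part of the patch) of two of the new quaternary-operand intrinsics, assuming an MVE-enabled target such as -march=armv8.1-m.main+mve -mfloat-abi=hard; the argument values and the predicate are arbitrary:

  #include <arm_mve.h>

  int8x16_t
  foo (int8x16_t inactive, int8x16_t a, int8x16_t b, mve_pred16_t p)
  {
    /* Predicated subtraction: lanes where the predicate bit is clear keep
       the value from 'inactive', active lanes hold a - b.  */
    return vsubq_m_s8 (inactive, a, b, p);
  }

  uint32_t
  bar (uint32_t acc, int8x16_t a, int8x16_t b, mve_pred16_t p)
  {
    /* Predicated absolute-difference accumulation across the active lanes,
       added to the scalar accumulator 'acc'.  */
    return vabavq_p_s8 (acc, a, b, p);
  }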
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm-builtins.c (QUADOP_UNONE_UNONE_NONE_NONE_UNONE_QUALIFIERS):
Define builtin qualifier.
(QUADOP_NONE_NONE_NONE_NONE_UNONE_QUALIFIERS): Likewise.
(QUADOP_NONE_NONE_NONE_IMM_UNONE_QUALIFIERS): Likewise.
(QUADOP_UNONE_UNONE_UNONE_UNONE_UNONE_QUALIFIERS): Likewise.
(QUADOP_UNONE_UNONE_NONE_IMM_UNONE_QUALIFIERS): Likewise.
(QUADOP_NONE_NONE_UNONE_IMM_UNONE_QUALIFIERS): Likewise.
(QUADOP_UNONE_UNONE_UNONE_IMM_UNONE_QUALIFIERS): Likewise.
(QUADOP_UNONE_UNONE_UNONE_NONE_UNONE_QUALIFIERS): Likewise.
* config/arm/arm_mve.h (vsriq_m_n_s8): Define macro.
(vsubq_m_s8): Likewise.
(vcvtq_m_n_f16_u16): Likewise.
(vqshluq_m_n_s8): Likewise.
(vabavq_p_s8): Likewise.
(vsriq_m_n_u8): Likewise.
(vshlq_m_u8): Likewise.
(vsubq_m_u8): Likewise.
(vabavq_p_u8): Likewise.
(vshlq_m_s8): Likewise.
(vcvtq_m_n_f16_s16): Likewise.
(vsriq_m_n_s16): Likewise.
(vsubq_m_s16): Likewise.
(vcvtq_m_n_f32_u32): Likewise.
(vqshluq_m_n_s16): Likewise.
(vabavq_p_s16): Likewise.
(vsriq_m_n_u16): Likewise.
(vshlq_m_u16): Likewise.
(vsubq_m_u16): Likewise.
(vabavq_p_u16): Likewise.
(vshlq_m_s16): Likewise.
(vcvtq_m_n_f32_s32): Likewise.
(vsriq_m_n_s32): Likewise.
(vsubq_m_s32): Likewise.
(vqshluq_m_n_s32): Likewise.
(vabavq_p_s32): Likewise.
(vsriq_m_n_u32): Likewise.
(vshlq_m_u32): Likewise.
(vsubq_m_u32): Likewise.
(vabavq_p_u32): Likewise.
(vshlq_m_s32): Likewise.
(__arm_vsriq_m_n_s8): Define intrinsic.
(__arm_vsubq_m_s8): Likewise.
(__arm_vqshluq_m_n_s8): Likewise.
(__arm_vabavq_p_s8): Likewise.
(__arm_vsriq_m_n_u8): Likewise.
(__arm_vshlq_m_u8): Likewise.
(__arm_vsubq_m_u8): Likewise.
(__arm_vabavq_p_u8): Likewise.
(__arm_vshlq_m_s8): Likewise.
(__arm_vsriq_m_n_s16): Likewise.
(__arm_vsubq_m_s16): Likewise.
(__arm_vqshluq_m_n_s16): Likewise.
(__arm_vabavq_p_s16): Likewise.
(__arm_vsriq_m_n_u16): Likewise.
(__arm_vshlq_m_u16): Likewise.
(__arm_vsubq_m_u16): Likewise.
(__arm_vabavq_p_u16): Likewise.
(__arm_vshlq_m_s16): Likewise.
(__arm_vsriq_m_n_s32): Likewise.
(__arm_vsubq_m_s32): Likewise.
(__arm_vqshluq_m_n_s32): Likewise.
(__arm_vabavq_p_s32): Likewise.
(__arm_vsriq_m_n_u32): Likewise.
(__arm_vshlq_m_u32): Likewise.
(__arm_vsubq_m_u32): Likewise.
(__arm_vabavq_p_u32): Likewise.
(__arm_vshlq_m_s32): Likewise.
(__arm_vcvtq_m_n_f16_u16): Likewise.
(__arm_vcvtq_m_n_f16_s16): Likewise.
(__arm_vcvtq_m_n_f32_u32): Likewise.
(__arm_vcvtq_m_n_f32_s32): Likewise.
(vcvtq_m_n): Define polymorphic variant.
(vqshluq_m_n): Likewise.
(vshlq_m): Likewise.
(vsriq_m_n): Likewise.
(vsubq_m): Likewise.
(vabavq_p): Likewise.
* config/arm/arm_mve_builtins.def
(QUADOP_UNONE_UNONE_NONE_NONE_UNONE_QUALIFIERS): Use builtin qualifier.
(QUADOP_NONE_NONE_NONE_NONE_UNONE_QUALIFIERS): Likewise.
(QUADOP_NONE_NONE_NONE_IMM_UNONE_QUALIFIERS): Likewise.
(QUADOP_UNONE_UNONE_UNONE_UNONE_UNONE_QUALIFIERS): Likewise.
(QUADOP_UNONE_UNONE_NONE_IMM_UNONE_QUALIFIERS): Likewise.
(QUADOP_NONE_NONE_UNONE_IMM_UNONE_QUALIFIERS): Likewise.
(QUADOP_UNONE_UNONE_UNONE_IMM_UNONE_QUALIFIERS): Likewise.
(QUADOP_UNONE_UNONE_UNONE_NONE_UNONE_QUALIFIERS): Likewise.
* config/arm/mve.md (VABAVQ_P): Define iterator.
(VSHLQ_M): Likewise.
(VSRIQ_M_N): Likewise.
(VSUBQ_M): Likewise.
(VCVTQ_M_N_TO_F): Likewise.
(mve_vabavq_p_<supf><mode>): Define RTL pattern.
(mve_vqshluq_m_n_s<mode>): Likewise.
(mve_vshlq_m_<supf><mode>): Likewise.
(mve_vsriq_m_n_<supf><mode>): Likewise.
(mve_vsubq_m_<supf><mode>): Likewise.
(mve_vcvtq_m_n_to_f_<supf><mode>): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vabavq_p_s16.c: New test.
* gcc.target/arm/mve/intrinsics/vabavq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabavq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabavq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabavq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabavq_p_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_n_f16_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_n_f16_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_n_f32_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_n_f32_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshluq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshluq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshluq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_u8.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 16:37:18 +0000 (16:37 +0000)]
[ARM][GCC][3/3x]: MVE intrinsics with ternary operands.
This patch supports the following MVE ACLE intrinsics with ternary operands.
vrmlaldavhaxq_s32, vrmlsldavhaq_s32, vrmlsldavhaxq_s32, vaddlvaq_p_s32, vcvtbq_m_f16_f32, vcvtbq_m_f32_f16, vcvttq_m_f16_f32, vcvttq_m_f32_f16, vrev16q_m_s8, vrev32q_m_f16, vrmlaldavhq_p_s32, vrmlaldavhxq_p_s32, vrmlsldavhq_p_s32, vrmlsldavhxq_p_s32, vaddlvaq_p_u32, vrev16q_m_u8, vrmlaldavhq_p_u32, vmvnq_m_n_s16, vorrq_m_n_s16, vqrshrntq_n_s16, vqshrnbq_n_s16, vqshrntq_n_s16, vrshrnbq_n_s16, vrshrntq_n_s16, vshrnbq_n_s16, vshrntq_n_s16, vcmlaq_f16, vcmlaq_rot180_f16, vcmlaq_rot270_f16, vcmlaq_rot90_f16, vfmaq_f16, vfmaq_n_f16, vfmasq_n_f16, vfmsq_f16, vmlaldavaq_s16, vmlaldavaxq_s16, vmlsldavaq_s16, vmlsldavaxq_s16, vabsq_m_f16, vcvtmq_m_s16_f16, vcvtnq_m_s16_f16, vcvtpq_m_s16_f16, vcvtq_m_s16_f16, vdupq_m_n_f16, vmaxnmaq_m_f16, vmaxnmavq_p_f16, vmaxnmvq_p_f16, vminnmaq_m_f16, vminnmavq_p_f16, vminnmvq_p_f16, vmlaldavq_p_s16, vmlaldavxq_p_s16, vmlsldavq_p_s16, vmlsldavxq_p_s16, vmovlbq_m_s8, vmovltq_m_s8, vmovnbq_m_s16, vmovntq_m_s16, vnegq_m_f16, vpselq_f16, vqmovnbq_m_s16, vqmovntq_m_s16, vrev32q_m_s8, vrev64q_m_f16, vrndaq_m_f16, vrndmq_m_f16, vrndnq_m_f16, vrndpq_m_f16, vrndq_m_f16, vrndxq_m_f16, vcmpeqq_m_n_f16, vcmpgeq_m_f16, vcmpgeq_m_n_f16, vcmpgtq_m_f16, vcmpgtq_m_n_f16, vcmpleq_m_f16, vcmpleq_m_n_f16, vcmpltq_m_f16, vcmpltq_m_n_f16, vcmpneq_m_f16, vcmpneq_m_n_f16, vmvnq_m_n_u16, vorrq_m_n_u16, vqrshruntq_n_s16, vqshrunbq_n_s16, vqshruntq_n_s16, vcvtmq_m_u16_f16, vcvtnq_m_u16_f16, vcvtpq_m_u16_f16, vcvtq_m_u16_f16, vqmovunbq_m_s16, vqmovuntq_m_s16, vqrshrntq_n_u16, vqshrnbq_n_u16, vqshrntq_n_u16, vrshrnbq_n_u16, vrshrntq_n_u16, vshrnbq_n_u16, vshrntq_n_u16, vmlaldavaq_u16, vmlaldavaxq_u16, vmlaldavq_p_u16, vmlaldavxq_p_u16, vmovlbq_m_u8, vmovltq_m_u8, vmovnbq_m_u16, vmovntq_m_u16, vqmovnbq_m_u16, vqmovntq_m_u16, vrev32q_m_u8, vmvnq_m_n_s32, vorrq_m_n_s32, vqrshrntq_n_s32, vqshrnbq_n_s32, vqshrntq_n_s32, vrshrnbq_n_s32, vrshrntq_n_s32, vshrnbq_n_s32, vshrntq_n_s32, vcmlaq_f32, vcmlaq_rot180_f32, vcmlaq_rot270_f32, vcmlaq_rot90_f32, vfmaq_f32, vfmaq_n_f32, vfmasq_n_f32, vfmsq_f32, vmlaldavaq_s32, vmlaldavaxq_s32, vmlsldavaq_s32, vmlsldavaxq_s32, vabsq_m_f32, vcvtmq_m_s32_f32, vcvtnq_m_s32_f32, vcvtpq_m_s32_f32, vcvtq_m_s32_f32, vdupq_m_n_f32, vmaxnmaq_m_f32, vmaxnmavq_p_f32, vmaxnmvq_p_f32, vminnmaq_m_f32, vminnmavq_p_f32, vminnmvq_p_f32, vmlaldavq_p_s32, vmlaldavxq_p_s32, vmlsldavq_p_s32, vmlsldavxq_p_s32, vmovlbq_m_s16, vmovltq_m_s16, vmovnbq_m_s32, vmovntq_m_s32, vnegq_m_f32, vpselq_f32, vqmovnbq_m_s32, vqmovntq_m_s32, vrev32q_m_s16, vrev64q_m_f32, vrndaq_m_f32, vrndmq_m_f32, vrndnq_m_f32, vrndpq_m_f32, vrndq_m_f32, vrndxq_m_f32, vcmpeqq_m_n_f32, vcmpgeq_m_f32, vcmpgeq_m_n_f32, vcmpgtq_m_f32, vcmpgtq_m_n_f32, vcmpleq_m_f32, vcmpleq_m_n_f32, vcmpltq_m_f32, vcmpltq_m_n_f32, vcmpneq_m_f32, vcmpneq_m_n_f32, vmvnq_m_n_u32, vorrq_m_n_u32, vqrshruntq_n_s32, vqshrunbq_n_s32, vqshruntq_n_s32, vcvtmq_m_u32_f32, vcvtnq_m_u32_f32, vcvtpq_m_u32_f32, vcvtq_m_u32_f32, vqmovunbq_m_s32, vqmovuntq_m_s32, vqrshrntq_n_u32, vqshrnbq_n_u32, vqshrntq_n_u32, vrshrnbq_n_u32, vrshrntq_n_u32, vshrnbq_n_u32, vshrntq_n_u32, vmlaldavaq_u32, vmlaldavaxq_u32, vmlaldavq_p_u32, vmlaldavxq_p_u32, vmovlbq_m_u16, vmovltq_m_u16, vmovnbq_m_u32, vmovntq_m_u32, vqmovnbq_m_u32, vqmovntq_m_u32, vrev32q_m_u16.
Please refer to the M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
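A minimal usage sketch (not part of the patch) of two of the new ternary-operand intrinsics, assuming a target with MVE and its floating-point extension (e.g. -march=armv8.1-m.main+mve.fp -mfloat-abi=hard); the inputs are arbitrary:

  #include <arm_mve.h>

  float16x8_t
  foo (float16x8_t add, float16x8_t m1, float16x8_t m2)
  {
    /* Element-wise fused multiply-add: add + m1 * m2.  */
    return vfmaq_f16 (add, m1, m2);
  }

  int64_t
  bar (int16x8_t a, int16x8_t b, mve_pred16_t p)
  {
    /* Predicated multiply-accumulate long: sums a[i] * b[i] over the
       active lanes into a 64-bit result.  */
    return vmlaldavq_p_s16 (a, b, p);
  }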
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm_mve.h (vrmlaldavhaxq_s32): Define macro.
(vrmlsldavhaq_s32): Likewise.
(vrmlsldavhaxq_s32): Likewise.
(vaddlvaq_p_s32): Likewise.
(vcvtbq_m_f16_f32): Likewise.
(vcvtbq_m_f32_f16): Likewise.
(vcvttq_m_f16_f32): Likewise.
(vcvttq_m_f32_f16): Likewise.
(vrev16q_m_s8): Likewise.
(vrev32q_m_f16): Likewise.
(vrmlaldavhq_p_s32): Likewise.
(vrmlaldavhxq_p_s32): Likewise.
(vrmlsldavhq_p_s32): Likewise.
(vrmlsldavhxq_p_s32): Likewise.
(vaddlvaq_p_u32): Likewise.
(vrev16q_m_u8): Likewise.
(vrmlaldavhq_p_u32): Likewise.
(vmvnq_m_n_s16): Likewise.
(vorrq_m_n_s16): Likewise.
(vqrshrntq_n_s16): Likewise.
(vqshrnbq_n_s16): Likewise.
(vqshrntq_n_s16): Likewise.
(vrshrnbq_n_s16): Likewise.
(vrshrntq_n_s16): Likewise.
(vshrnbq_n_s16): Likewise.
(vshrntq_n_s16): Likewise.
(vcmlaq_f16): Likewise.
(vcmlaq_rot180_f16): Likewise.
(vcmlaq_rot270_f16): Likewise.
(vcmlaq_rot90_f16): Likewise.
(vfmaq_f16): Likewise.
(vfmaq_n_f16): Likewise.
(vfmasq_n_f16): Likewise.
(vfmsq_f16): Likewise.
(vmlaldavaq_s16): Likewise.
(vmlaldavaxq_s16): Likewise.
(vmlsldavaq_s16): Likewise.
(vmlsldavaxq_s16): Likewise.
(vabsq_m_f16): Likewise.
(vcvtmq_m_s16_f16): Likewise.
(vcvtnq_m_s16_f16): Likewise.
(vcvtpq_m_s16_f16): Likewise.
(vcvtq_m_s16_f16): Likewise.
(vdupq_m_n_f16): Likewise.
(vmaxnmaq_m_f16): Likewise.
(vmaxnmavq_p_f16): Likewise.
(vmaxnmvq_p_f16): Likewise.
(vminnmaq_m_f16): Likewise.
(vminnmavq_p_f16): Likewise.
(vminnmvq_p_f16): Likewise.
(vmlaldavq_p_s16): Likewise.
(vmlaldavxq_p_s16): Likewise.
(vmlsldavq_p_s16): Likewise.
(vmlsldavxq_p_s16): Likewise.
(vmovlbq_m_s8): Likewise.
(vmovltq_m_s8): Likewise.
(vmovnbq_m_s16): Likewise.
(vmovntq_m_s16): Likewise.
(vnegq_m_f16): Likewise.
(vpselq_f16): Likewise.
(vqmovnbq_m_s16): Likewise.
(vqmovntq_m_s16): Likewise.
(vrev32q_m_s8): Likewise.
(vrev64q_m_f16): Likewise.
(vrndaq_m_f16): Likewise.
(vrndmq_m_f16): Likewise.
(vrndnq_m_f16): Likewise.
(vrndpq_m_f16): Likewise.
(vrndq_m_f16): Likewise.
(vrndxq_m_f16): Likewise.
(vcmpeqq_m_n_f16): Likewise.
(vcmpgeq_m_f16): Likewise.
(vcmpgeq_m_n_f16): Likewise.
(vcmpgtq_m_f16): Likewise.
(vcmpgtq_m_n_f16): Likewise.
(vcmpleq_m_f16): Likewise.
(vcmpleq_m_n_f16): Likewise.
(vcmpltq_m_f16): Likewise.
(vcmpltq_m_n_f16): Likewise.
(vcmpneq_m_f16): Likewise.
(vcmpneq_m_n_f16): Likewise.
(vmvnq_m_n_u16): Likewise.
(vorrq_m_n_u16): Likewise.
(vqrshruntq_n_s16): Likewise.
(vqshrunbq_n_s16): Likewise.
(vqshruntq_n_s16): Likewise.
(vcvtmq_m_u16_f16): Likewise.
(vcvtnq_m_u16_f16): Likewise.
(vcvtpq_m_u16_f16): Likewise.
(vcvtq_m_u16_f16): Likewise.
(vqmovunbq_m_s16): Likewise.
(vqmovuntq_m_s16): Likewise.
(vqrshrntq_n_u16): Likewise.
(vqshrnbq_n_u16): Likewise.
(vqshrntq_n_u16): Likewise.
(vrshrnbq_n_u16): Likewise.
(vrshrntq_n_u16): Likewise.
(vshrnbq_n_u16): Likewise.
(vshrntq_n_u16): Likewise.
(vmlaldavaq_u16): Likewise.
(vmlaldavaxq_u16): Likewise.
(vmlaldavq_p_u16): Likewise.
(vmlaldavxq_p_u16): Likewise.
(vmovlbq_m_u8): Likewise.
(vmovltq_m_u8): Likewise.
(vmovnbq_m_u16): Likewise.
(vmovntq_m_u16): Likewise.
(vqmovnbq_m_u16): Likewise.
(vqmovntq_m_u16): Likewise.
(vrev32q_m_u8): Likewise.
(vmvnq_m_n_s32): Likewise.
(vorrq_m_n_s32): Likewise.
(vqrshrntq_n_s32): Likewise.
(vqshrnbq_n_s32): Likewise.
(vqshrntq_n_s32): Likewise.
(vrshrnbq_n_s32): Likewise.
(vrshrntq_n_s32): Likewise.
(vshrnbq_n_s32): Likewise.
(vshrntq_n_s32): Likewise.
(vcmlaq_f32): Likewise.
(vcmlaq_rot180_f32): Likewise.
(vcmlaq_rot270_f32): Likewise.
(vcmlaq_rot90_f32): Likewise.
(vfmaq_f32): Likewise.
(vfmaq_n_f32): Likewise.
(vfmasq_n_f32): Likewise.
(vfmsq_f32): Likewise.
(vmlaldavaq_s32): Likewise.
(vmlaldavaxq_s32): Likewise.
(vmlsldavaq_s32): Likewise.
(vmlsldavaxq_s32): Likewise.
(vabsq_m_f32): Likewise.
(vcvtmq_m_s32_f32): Likewise.
(vcvtnq_m_s32_f32): Likewise.
(vcvtpq_m_s32_f32): Likewise.
(vcvtq_m_s32_f32): Likewise.
(vdupq_m_n_f32): Likewise.
(vmaxnmaq_m_f32): Likewise.
(vmaxnmavq_p_f32): Likewise.
(vmaxnmvq_p_f32): Likewise.
(vminnmaq_m_f32): Likewise.
(vminnmavq_p_f32): Likewise.
(vminnmvq_p_f32): Likewise.
(vmlaldavq_p_s32): Likewise.
(vmlaldavxq_p_s32): Likewise.
(vmlsldavq_p_s32): Likewise.
(vmlsldavxq_p_s32): Likewise.
(vmovlbq_m_s16): Likewise.
(vmovltq_m_s16): Likewise.
(vmovnbq_m_s32): Likewise.
(vmovntq_m_s32): Likewise.
(vnegq_m_f32): Likewise.
(vpselq_f32): Likewise.
(vqmovnbq_m_s32): Likewise.
(vqmovntq_m_s32): Likewise.
(vrev32q_m_s16): Likewise.
(vrev64q_m_f32): Likewise.
(vrndaq_m_f32): Likewise.
(vrndmq_m_f32): Likewise.
(vrndnq_m_f32): Likewise.
(vrndpq_m_f32): Likewise.
(vrndq_m_f32): Likewise.
(vrndxq_m_f32): Likewise.
(vcmpeqq_m_n_f32): Likewise.
(vcmpgeq_m_f32): Likewise.
(vcmpgeq_m_n_f32): Likewise.
(vcmpgtq_m_f32): Likewise.
(vcmpgtq_m_n_f32): Likewise.
(vcmpleq_m_f32): Likewise.
(vcmpleq_m_n_f32): Likewise.
(vcmpltq_m_f32): Likewise.
(vcmpltq_m_n_f32): Likewise.
(vcmpneq_m_f32): Likewise.
(vcmpneq_m_n_f32): Likewise.
(vmvnq_m_n_u32): Likewise.
(vorrq_m_n_u32): Likewise.
(vqrshruntq_n_s32): Likewise.
(vqshrunbq_n_s32): Likewise.
(vqshruntq_n_s32): Likewise.
(vcvtmq_m_u32_f32): Likewise.
(vcvtnq_m_u32_f32): Likewise.
(vcvtpq_m_u32_f32): Likewise.
(vcvtq_m_u32_f32): Likewise.
(vqmovunbq_m_s32): Likewise.
(vqmovuntq_m_s32): Likewise.
(vqrshrntq_n_u32): Likewise.
(vqshrnbq_n_u32): Likewise.
(vqshrntq_n_u32): Likewise.
(vrshrnbq_n_u32): Likewise.
(vrshrntq_n_u32): Likewise.
(vshrnbq_n_u32): Likewise.
(vshrntq_n_u32): Likewise.
(vmlaldavaq_u32): Likewise.
(vmlaldavaxq_u32): Likewise.
(vmlaldavq_p_u32): Likewise.
(vmlaldavxq_p_u32): Likewise.
(vmovlbq_m_u16): Likewise.
(vmovltq_m_u16): Likewise.
(vmovnbq_m_u32): Likewise.
(vmovntq_m_u32): Likewise.
(vqmovnbq_m_u32): Likewise.
(vqmovntq_m_u32): Likewise.
(vrev32q_m_u16): Likewise.
(__arm_vrmlaldavhaxq_s32): Define intrinsic.
(__arm_vrmlsldavhaq_s32): Likewise.
(__arm_vrmlsldavhaxq_s32): Likewise.
(__arm_vaddlvaq_p_s32): Likewise.
(__arm_vrev16q_m_s8): Likewise.
(__arm_vrmlaldavhq_p_s32): Likewise.
(__arm_vrmlaldavhxq_p_s32): Likewise.
(__arm_vrmlsldavhq_p_s32): Likewise.
(__arm_vrmlsldavhxq_p_s32): Likewise.
(__arm_vaddlvaq_p_u32): Likewise.
(__arm_vrev16q_m_u8): Likewise.
(__arm_vrmlaldavhq_p_u32): Likewise.
(__arm_vmvnq_m_n_s16): Likewise.
(__arm_vorrq_m_n_s16): Likewise.
(__arm_vqrshrntq_n_s16): Likewise.
(__arm_vqshrnbq_n_s16): Likewise.
(__arm_vqshrntq_n_s16): Likewise.
(__arm_vrshrnbq_n_s16): Likewise.
(__arm_vrshrntq_n_s16): Likewise.
(__arm_vshrnbq_n_s16): Likewise.
(__arm_vshrntq_n_s16): Likewise.
(__arm_vmlaldavaq_s16): Likewise.
(__arm_vmlaldavaxq_s16): Likewise.
(__arm_vmlsldavaq_s16): Likewise.
(__arm_vmlsldavaxq_s16): Likewise.
(__arm_vmlaldavq_p_s16): Likewise.
(__arm_vmlaldavxq_p_s16): Likewise.
(__arm_vmlsldavq_p_s16): Likewise.
(__arm_vmlsldavxq_p_s16): Likewise.
(__arm_vmovlbq_m_s8): Likewise.
(__arm_vmovltq_m_s8): Likewise.
(__arm_vmovnbq_m_s16): Likewise.
(__arm_vmovntq_m_s16): Likewise.
(__arm_vqmovnbq_m_s16): Likewise.
(__arm_vqmovntq_m_s16): Likewise.
(__arm_vrev32q_m_s8): Likewise.
(__arm_vmvnq_m_n_u16): Likewise.
(__arm_vorrq_m_n_u16): Likewise.
(__arm_vqrshruntq_n_s16): Likewise.
(__arm_vqshrunbq_n_s16): Likewise.
(__arm_vqshruntq_n_s16): Likewise.
(__arm_vqmovunbq_m_s16): Likewise.
(__arm_vqmovuntq_m_s16): Likewise.
(__arm_vqrshrntq_n_u16): Likewise.
(__arm_vqshrnbq_n_u16): Likewise.
(__arm_vqshrntq_n_u16): Likewise.
(__arm_vrshrnbq_n_u16): Likewise.
(__arm_vrshrntq_n_u16): Likewise.
(__arm_vshrnbq_n_u16): Likewise.
(__arm_vshrntq_n_u16): Likewise.
(__arm_vmlaldavaq_u16): Likewise.
(__arm_vmlaldavaxq_u16): Likewise.
(__arm_vmlaldavq_p_u16): Likewise.
(__arm_vmlaldavxq_p_u16): Likewise.
(__arm_vmovlbq_m_u8): Likewise.
(__arm_vmovltq_m_u8): Likewise.
(__arm_vmovnbq_m_u16): Likewise.
(__arm_vmovntq_m_u16): Likewise.
(__arm_vqmovnbq_m_u16): Likewise.
(__arm_vqmovntq_m_u16): Likewise.
(__arm_vrev32q_m_u8): Likewise.
(__arm_vmvnq_m_n_s32): Likewise.
(__arm_vorrq_m_n_s32): Likewise.
(__arm_vqrshrntq_n_s32): Likewise.
(__arm_vqshrnbq_n_s32): Likewise.
(__arm_vqshrntq_n_s32): Likewise.
(__arm_vrshrnbq_n_s32): Likewise.
(__arm_vrshrntq_n_s32): Likewise.
(__arm_vshrnbq_n_s32): Likewise.
(__arm_vshrntq_n_s32): Likewise.
(__arm_vmlaldavaq_s32): Likewise.
(__arm_vmlaldavaxq_s32): Likewise.
(__arm_vmlsldavaq_s32): Likewise.
(__arm_vmlsldavaxq_s32): Likewise.
(__arm_vmlaldavq_p_s32): Likewise.
(__arm_vmlaldavxq_p_s32): Likewise.
(__arm_vmlsldavq_p_s32): Likewise.
(__arm_vmlsldavxq_p_s32): Likewise.
(__arm_vmovlbq_m_s16): Likewise.
(__arm_vmovltq_m_s16): Likewise.
(__arm_vmovnbq_m_s32): Likewise.
(__arm_vmovntq_m_s32): Likewise.
(__arm_vqmovnbq_m_s32): Likewise.
(__arm_vqmovntq_m_s32): Likewise.
(__arm_vrev32q_m_s16): Likewise.
(__arm_vmvnq_m_n_u32): Likewise.
(__arm_vorrq_m_n_u32): Likewise.
(__arm_vqrshruntq_n_s32): Likewise.
(__arm_vqshrunbq_n_s32): Likewise.
(__arm_vqshruntq_n_s32): Likewise.
(__arm_vqmovunbq_m_s32): Likewise.
(__arm_vqmovuntq_m_s32): Likewise.
(__arm_vqrshrntq_n_u32): Likewise.
(__arm_vqshrnbq_n_u32): Likewise.
(__arm_vqshrntq_n_u32): Likewise.
(__arm_vrshrnbq_n_u32): Likewise.
(__arm_vrshrntq_n_u32): Likewise.
(__arm_vshrnbq_n_u32): Likewise.
(__arm_vshrntq_n_u32): Likewise.
(__arm_vmlaldavaq_u32): Likewise.
(__arm_vmlaldavaxq_u32): Likewise.
(__arm_vmlaldavq_p_u32): Likewise.
(__arm_vmlaldavxq_p_u32): Likewise.
(__arm_vmovlbq_m_u16): Likewise.
(__arm_vmovltq_m_u16): Likewise.
(__arm_vmovnbq_m_u32): Likewise.
(__arm_vmovntq_m_u32): Likewise.
(__arm_vqmovnbq_m_u32): Likewise.
(__arm_vqmovntq_m_u32): Likewise.
(__arm_vrev32q_m_u16): Likewise.
(__arm_vcvtbq_m_f16_f32): Likewise.
(__arm_vcvtbq_m_f32_f16): Likewise.
(__arm_vcvttq_m_f16_f32): Likewise.
(__arm_vcvttq_m_f32_f16): Likewise.
(__arm_vrev32q_m_f16): Likewise.
(__arm_vcmlaq_f16): Likewise.
(__arm_vcmlaq_rot180_f16): Likewise.
(__arm_vcmlaq_rot270_f16): Likewise.
(__arm_vcmlaq_rot90_f16): Likewise.
(__arm_vfmaq_f16): Likewise.
(__arm_vfmaq_n_f16): Likewise.
(__arm_vfmasq_n_f16): Likewise.
(__arm_vfmsq_f16): Likewise.
(__arm_vabsq_m_f16): Likewise.
(__arm_vcvtmq_m_s16_f16): Likewise.
(__arm_vcvtnq_m_s16_f16): Likewise.
(__arm_vcvtpq_m_s16_f16): Likewise.
(__arm_vcvtq_m_s16_f16): Likewise.
(__arm_vdupq_m_n_f16): Likewise.
(__arm_vmaxnmaq_m_f16): Likewise.
(__arm_vmaxnmavq_p_f16): Likewise.
(__arm_vmaxnmvq_p_f16): Likewise.
(__arm_vminnmaq_m_f16): Likewise.
(__arm_vminnmavq_p_f16): Likewise.
(__arm_vminnmvq_p_f16): Likewise.
(__arm_vnegq_m_f16): Likewise.
(__arm_vpselq_f16): Likewise.
(__arm_vrev64q_m_f16): Likewise.
(__arm_vrndaq_m_f16): Likewise.
(__arm_vrndmq_m_f16): Likewise.
(__arm_vrndnq_m_f16): Likewise.
(__arm_vrndpq_m_f16): Likewise.
(__arm_vrndq_m_f16): Likewise.
(__arm_vrndxq_m_f16): Likewise.
(__arm_vcmpeqq_m_n_f16): Likewise.
(__arm_vcmpgeq_m_f16): Likewise.
(__arm_vcmpgeq_m_n_f16): Likewise.
(__arm_vcmpgtq_m_f16): Likewise.
(__arm_vcmpgtq_m_n_f16): Likewise.
(__arm_vcmpleq_m_f16): Likewise.
(__arm_vcmpleq_m_n_f16): Likewise.
(__arm_vcmpltq_m_f16): Likewise.
(__arm_vcmpltq_m_n_f16): Likewise.
(__arm_vcmpneq_m_f16): Likewise.
(__arm_vcmpneq_m_n_f16): Likewise.
(__arm_vcvtmq_m_u16_f16): Likewise.
(__arm_vcvtnq_m_u16_f16): Likewise.
(__arm_vcvtpq_m_u16_f16): Likewise.
(__arm_vcvtq_m_u16_f16): Likewise.
(__arm_vcmlaq_f32): Likewise.
(__arm_vcmlaq_rot180_f32): Likewise.
(__arm_vcmlaq_rot270_f32): Likewise.
(__arm_vcmlaq_rot90_f32): Likewise.
(__arm_vfmaq_f32): Likewise.
(__arm_vfmaq_n_f32): Likewise.
(__arm_vfmasq_n_f32): Likewise.
(__arm_vfmsq_f32): Likewise.
(__arm_vabsq_m_f32): Likewise.
(__arm_vcvtmq_m_s32_f32): Likewise.
(__arm_vcvtnq_m_s32_f32): Likewise.
(__arm_vcvtpq_m_s32_f32): Likewise.
(__arm_vcvtq_m_s32_f32): Likewise.
(__arm_vdupq_m_n_f32): Likewise.
(__arm_vmaxnmaq_m_f32): Likewise.
(__arm_vmaxnmavq_p_f32): Likewise.
(__arm_vmaxnmvq_p_f32): Likewise.
(__arm_vminnmaq_m_f32): Likewise.
(__arm_vminnmavq_p_f32): Likewise.
(__arm_vminnmvq_p_f32): Likewise.
(__arm_vnegq_m_f32): Likewise.
(__arm_vpselq_f32): Likewise.
(__arm_vrev64q_m_f32): Likewise.
(__arm_vrndaq_m_f32): Likewise.
(__arm_vrndmq_m_f32): Likewise.
(__arm_vrndnq_m_f32): Likewise.
(__arm_vrndpq_m_f32): Likewise.
(__arm_vrndq_m_f32): Likewise.
(__arm_vrndxq_m_f32): Likewise.
(__arm_vcmpeqq_m_n_f32): Likewise.
(__arm_vcmpgeq_m_f32): Likewise.
(__arm_vcmpgeq_m_n_f32): Likewise.
(__arm_vcmpgtq_m_f32): Likewise.
(__arm_vcmpgtq_m_n_f32): Likewise.
(__arm_vcmpleq_m_f32): Likewise.
(__arm_vcmpleq_m_n_f32): Likewise.
(__arm_vcmpltq_m_f32): Likewise.
(__arm_vcmpltq_m_n_f32): Likewise.
(__arm_vcmpneq_m_f32): Likewise.
(__arm_vcmpneq_m_n_f32): Likewise.
(__arm_vcvtmq_m_u32_f32): Likewise.
(__arm_vcvtnq_m_u32_f32): Likewise.
(__arm_vcvtpq_m_u32_f32): Likewise.
(__arm_vcvtq_m_u32_f32): Likewise.
(vcvtq_m): Define polymorphic variant.
(vabsq_m): Likewise.
(vcmlaq): Likewise.
(vcmlaq_rot180): Likewise.
(vcmlaq_rot270): Likewise.
(vcmlaq_rot90): Likewise.
(vcmpeqq_m_n): Likewise.
(vcmpgeq_m_n): Likewise.
(vrndxq_m): Likewise.
(vrndq_m): Likewise.
(vrndpq_m): Likewise.
(vcmpgtq_m_n): Likewise.
(vcmpgtq_m): Likewise.
(vcmpleq_m): Likewise.
(vcmpleq_m_n): Likewise.
(vcmpltq_m_n): Likewise.
(vcmpltq_m): Likewise.
(vcmpneq_m): Likewise.
(vcmpneq_m_n): Likewise.
(vcvtbq_m): Likewise.
(vcvttq_m): Likewise.
(vcvtmq_m): Likewise.
(vcvtnq_m): Likewise.
(vcvtpq_m): Likewise.
(vdupq_m_n): Likewise.
(vfmaq_n): Likewise.
(vfmaq): Likewise.
(vfmasq_n): Likewise.
(vfmsq): Likewise.
(vmaxnmaq_m): Likewise.
(vmaxnmavq_m): Likewise.
(vmaxnmvq_m): Likewise.
(vmaxnmavq_p): Likewise.
(vmaxnmvq_p): Likewise.
(vminnmaq_m): Likewise.
(vminnmavq_p): Likewise.
(vminnmvq_p): Likewise.
(vrndnq_m): Likewise.
(vrndaq_m): Likewise.
(vrndmq_m): Likewise.
(vrev64q_m): Likewise.
(vrev32q_m): Likewise.
(vpselq): Likewise.
(vnegq_m): Likewise.
(vcmpgeq_m): Likewise.
(vshrntq_n): Likewise.
(vrshrntq_n): Likewise.
(vmovlbq_m): Likewise.
(vmovnbq_m): Likewise.
(vmovntq_m): Likewise.
(vmvnq_m_n): Likewise.
(vmvnq_m): Likewise.
(vshrnbq_n): Likewise.
(vrshrnbq_n): Likewise.
(vqshruntq_n): Likewise.
(vrev16q_m): Likewise.
(vqshrunbq_n): Likewise.
(vqshrntq_n): Likewise.
(vqrshruntq_n): Likewise.
(vqrshrntq_n): Likewise.
(vqshrnbq_n): Likewise.
(vqmovuntq_m): Likewise.
(vqmovntq_m): Likewise.
(vqmovnbq_m): Likewise.
(vorrq_m_n): Likewise.
(vmovltq_m): Likewise.
(vqmovunbq_m): Likewise.
(vaddlvaq_p): Likewise.
(vmlaldavaq): Likewise.
(vmlaldavaxq): Likewise.
(vmlaldavq_p): Likewise.
(vmlaldavxq_p): Likewise.
(vmlsldavaq): Likewise.
(vmlsldavaxq): Likewise.
(vmlsldavq_p): Likewise.
(vmlsldavxq_p): Likewise.
(vrmlaldavhaxq): Likewise.
(vrmlaldavhq_p): Likewise.
(vrmlaldavhxq_p): Likewise.
(vrmlsldavhaq): Likewise.
(vrmlsldavhaxq): Likewise.
(vrmlsldavhq_p): Likewise.
(vrmlsldavhxq_p): Likewise.
* config/arm/arm_mve_builtins.def (TERNOP_NONE_NONE_IMM_UNONE): Use
builtin qualifier.
(TERNOP_NONE_NONE_NONE_IMM): Likewise.
(TERNOP_NONE_NONE_NONE_NONE): Likewise.
(TERNOP_NONE_NONE_NONE_UNONE): Likewise.
(TERNOP_UNONE_NONE_NONE_UNONE): Likewise.
(TERNOP_UNONE_UNONE_IMM_UNONE): Likewise.
(TERNOP_UNONE_UNONE_NONE_IMM): Likewise.
(TERNOP_UNONE_UNONE_NONE_UNONE): Likewise.
(TERNOP_UNONE_UNONE_UNONE_IMM): Likewise.
(TERNOP_UNONE_UNONE_UNONE_UNONE): Likewise.
* config/arm/mve.md (MVE_constraint3): Define mode attribute iterator.
(MVE_pred3): Likewise.
(MVE_constraint1): Likewise.
(MVE_pred1): Likewise.
(VMLALDAVQ_P): Define iterator.
(VQMOVNBQ_M): Likewise.
(VMOVLTQ_M): Likewise.
(VMOVNBQ_M): Likewise.
(VRSHRNTQ_N): Likewise.
(VORRQ_M_N): Likewise.
(VREV32Q_M): Likewise.
(VREV16Q_M): Likewise.
(VQRSHRNTQ_N): Likewise.
(VMOVNTQ_M): Likewise.
(VMOVLBQ_M): Likewise.
(VMLALDAVAQ): Likewise.
(VQSHRNBQ_N): Likewise.
(VSHRNBQ_N): Likewise.
(VRSHRNBQ_N): Likewise.
(VMLALDAVXQ_P): Likewise.
(VQMOVNTQ_M): Likewise.
(VMVNQ_M_N): Likewise.
(VQSHRNTQ_N): Likewise.
(VMLALDAVAXQ): Likewise.
(VSHRNTQ_N): Likewise.
(VCVTMQ_M): Likewise.
(VCVTNQ_M): Likewise.
(VCVTPQ_M): Likewise.
(VCVTQ_M_N_FROM_F): Likewise.
(VCVTQ_M_FROM_F): Likewise.
(VRMLALDAVHQ_P): Likewise.
(VADDLVAQ_P): Likewise.
(mve_vrndq_m_f<mode>): Define RTL pattern.
(mve_vabsq_m_f<mode>): Likewise.
(mve_vaddlvaq_p_<supf>v4si): Likewise.
(mve_vcmlaq_f<mode>): Likewise.
(mve_vcmlaq_rot180_f<mode>): Likewise.
(mve_vcmlaq_rot270_f<mode>): Likewise.
(mve_vcmlaq_rot90_f<mode>): Likewise.
(mve_vcmpeqq_m_n_f<mode>): Likewise.
(mve_vcmpgeq_m_f<mode>): Likewise.
(mve_vcmpgeq_m_n_f<mode>): Likewise.
(mve_vcmpgtq_m_f<mode>): Likewise.
(mve_vcmpgtq_m_n_f<mode>): Likewise.
(mve_vcmpleq_m_f<mode>): Likewise.
(mve_vcmpleq_m_n_f<mode>): Likewise.
(mve_vcmpltq_m_f<mode>): Likewise.
(mve_vcmpltq_m_n_f<mode>): Likewise.
(mve_vcmpneq_m_f<mode>): Likewise.
(mve_vcmpneq_m_n_f<mode>): Likewise.
(mve_vcvtbq_m_f16_f32v8hf): Likewise.
(mve_vcvtbq_m_f32_f16v4sf): Likewise.
(mve_vcvttq_m_f16_f32v8hf): Likewise.
(mve_vcvttq_m_f32_f16v4sf): Likewise.
(mve_vdupq_m_n_f<mode>): Likewise.
(mve_vfmaq_f<mode>): Likewise.
(mve_vfmaq_n_f<mode>): Likewise.
(mve_vfmasq_n_f<mode>): Likewise.
(mve_vfmsq_f<mode>): Likewise.
(mve_vmaxnmaq_m_f<mode>): Likewise.
(mve_vmaxnmavq_p_f<mode>): Likewise.
(mve_vmaxnmvq_p_f<mode>): Likewise.
(mve_vminnmaq_m_f<mode>): Likewise.
(mve_vminnmavq_p_f<mode>): Likewise.
(mve_vminnmvq_p_f<mode>): Likewise.
(mve_vmlaldavaq_<supf><mode>): Likewise.
(mve_vmlaldavaxq_<supf><mode>): Likewise.
(mve_vmlaldavq_p_<supf><mode>): Likewise.
(mve_vmlaldavxq_p_<supf><mode>): Likewise.
(mve_vmlsldavaq_s<mode>): Likewise.
(mve_vmlsldavaxq_s<mode>): Likewise.
(mve_vmlsldavq_p_s<mode>): Likewise.
(mve_vmlsldavxq_p_s<mode>): Likewise.
(mve_vmovlbq_m_<supf><mode>): Likewise.
(mve_vmovltq_m_<supf><mode>): Likewise.
(mve_vmovnbq_m_<supf><mode>): Likewise.
(mve_vmovntq_m_<supf><mode>): Likewise.
(mve_vmvnq_m_n_<supf><mode>): Likewise.
(mve_vnegq_m_f<mode>): Likewise.
(mve_vorrq_m_n_<supf><mode>): Likewise.
(mve_vpselq_f<mode>): Likewise.
(mve_vqmovnbq_m_<supf><mode>): Likewise.
(mve_vqmovntq_m_<supf><mode>): Likewise.
(mve_vqmovunbq_m_s<mode>): Likewise.
(mve_vqmovuntq_m_s<mode>): Likewise.
(mve_vqrshrntq_n_<supf><mode>): Likewise.
(mve_vqrshruntq_n_s<mode>): Likewise.
(mve_vqshrnbq_n_<supf><mode>): Likewise.
(mve_vqshrntq_n_<supf><mode>): Likewise.
(mve_vqshrunbq_n_s<mode>): Likewise.
(mve_vqshruntq_n_s<mode>): Likewise.
(mve_vrev32q_m_fv8hf): Likewise.
(mve_vrev32q_m_<supf><mode>): Likewise.
(mve_vrev64q_m_f<mode>): Likewise.
(mve_vrmlaldavhaxq_sv4si): Likewise.
(mve_vrmlaldavhxq_p_sv4si): Likewise.
(mve_vrmlsldavhaxq_sv4si): Likewise.
(mve_vrmlsldavhq_p_sv4si): Likewise.
(mve_vrmlsldavhxq_p_sv4si): Likewise.
(mve_vrndaq_m_f<mode>): Likewise.
(mve_vrndmq_m_f<mode>): Likewise.
(mve_vrndnq_m_f<mode>): Likewise.
(mve_vrndpq_m_f<mode>): Likewise.
(mve_vrndxq_m_f<mode>): Likewise.
(mve_vrshrnbq_n_<supf><mode>): Likewise.
(mve_vrshrntq_n_<supf><mode>): Likewise.
(mve_vshrnbq_n_<supf><mode>): Likewise.
(mve_vshrntq_n_<supf><mode>): Likewise.
(mve_vcvtmq_m_<supf><mode>): Likewise.
(mve_vcvtpq_m_<supf><mode>): Likewise.
(mve_vcvtnq_m_<supf><mode>): Likewise.
(mve_vcvtq_m_n_from_f_<supf><mode>): Likewise.
(mve_vrev16q_m_<supf>v16qi): Likewise.
(mve_vcvtq_m_from_f_<supf><mode>): Likewise.
(mve_vrmlaldavhq_p_<supf>v4si): Likewise.
(mve_vrmlsldavhaq_sv4si): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vabsq_m_f16.c: New test.
* gcc.target/arm/mve/intrinsics/vabsq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddlvaq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddlvaq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot180_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot180_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot270_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot270_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot90_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot90_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtbq_m_f16_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtbq_m_f32_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtmq_m_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtmq_m_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtmq_m_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtmq_m_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtnq_m_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtnq_m_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtnq_m_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtnq_m_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtpq_m_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtpq_m_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtpq_m_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtpq_m_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvttq_m_f16_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvttq_m_f32_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmaq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmaq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmaq_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmaq_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmasq_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmasq_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmsq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmsq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmaq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmaq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmavq_p_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmavq_p_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmvq_p_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmvq_p_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmaq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmaq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmavq_p_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmavq_p_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmvq_p_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmvq_p_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaxq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaxq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaxq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavxq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavxq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavxq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavaq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavaq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavaxq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavaxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavxq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovlbq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovlbq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovlbq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovlbq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovltq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovltq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovltq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovltq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovnbq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovnbq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovnbq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovnbq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovntq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovntq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovntq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovntq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vnegq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vnegq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vpselq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vpselq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovnbq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovnbq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovnbq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovnbq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovntq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovntq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovntq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovntq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovunbq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovunbq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovuntq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovuntq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrntq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrntq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrntq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrntq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshruntq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshruntq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrnbq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrnbq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrnbq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrnbq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrntq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrntq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrntq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrntq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrunbq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrunbq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshruntq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshruntq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev16q_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev16q_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev32q_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev32q_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev32q_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev32q_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev32q_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlaldavhaxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlaldavhq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlaldavhq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlaldavhxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlsldavhaq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlsldavhaxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlsldavhq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlsldavhxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndaq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndaq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndmq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndmq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndnq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndnq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndpq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndpq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndxq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndxq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrnbq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrnbq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrnbq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrnbq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrntq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrntq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrntq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrntq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrnbq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrnbq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrnbq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrnbq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrntq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrntq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrntq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrntq_n_u32.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 16:26:53 +0000 (16:26 +0000)]
[ARM][GCC][2/3x]: MVE intrinsics with ternary operands.
This patch supports the following MVE ACLE intrinsics with ternary operands.
vpselq_u8, vpselq_s8, vrev64q_m_u8, vqrdmlashq_n_u8, vqrdmlahq_n_u8, vqdmlahq_n_u8, vmvnq_m_u8, vmlasq_n_u8, vmlaq_n_u8, vmladavq_p_u8, vmladavaq_u8, vminvq_p_u8, vmaxvq_p_u8, vdupq_m_n_u8, vcmpneq_m_u8, vcmpneq_m_n_u8, vcmphiq_m_u8, vcmphiq_m_n_u8, vcmpeqq_m_u8, vcmpeqq_m_n_u8, vcmpcsq_m_u8, vcmpcsq_m_n_u8, vclzq_m_u8, vaddvaq_p_u8, vsriq_n_u8, vsliq_n_u8, vshlq_m_r_u8, vrshlq_m_n_u8, vqshlq_m_r_u8, vqrshlq_m_n_u8, vminavq_p_s8, vminaq_m_s8, vmaxavq_p_s8, vmaxaq_m_s8, vcmpneq_m_s8, vcmpneq_m_n_s8, vcmpltq_m_s8, vcmpltq_m_n_s8, vcmpleq_m_s8, vcmpleq_m_n_s8, vcmpgtq_m_s8, vcmpgtq_m_n_s8, vcmpgeq_m_s8, vcmpgeq_m_n_s8, vcmpeqq_m_s8, vcmpeqq_m_n_s8, vshlq_m_r_s8, vrshlq_m_n_s8, vrev64q_m_s8, vqshlq_m_r_s8, vqrshlq_m_n_s8, vqnegq_m_s8, vqabsq_m_s8, vnegq_m_s8, vmvnq_m_s8, vmlsdavxq_p_s8, vmlsdavq_p_s8, vmladavxq_p_s8, vmladavq_p_s8, vminvq_p_s8, vmaxvq_p_s8, vdupq_m_n_s8, vclzq_m_s8, vclsq_m_s8, vaddvaq_p_s8, vabsq_m_s8, vqrdmlsdhxq_s8, vqrdmlsdhq_s8, vqrdmlashq_n_s8, vqrdmlahq_n_s8, vqrdmladhxq_s8, vqrdmladhq_s8, vqdmlsdhxq_s8, vqdmlsdhq_s8, vqdmlahq_n_s8, vqdmladhxq_s8, vqdmladhq_s8, vmlsdavaxq_s8, vmlsdavaq_s8, vmlasq_n_s8, vmlaq_n_s8, vmladavaxq_s8, vmladavaq_s8, vsriq_n_s8, vsliq_n_s8, vpselq_u16, vpselq_s16, vrev64q_m_u16, vqrdmlashq_n_u16, vqrdmlahq_n_u16, vqdmlahq_n_u16, vmvnq_m_u16, vmlasq_n_u16, vmlaq_n_u16, vmladavq_p_u16, vmladavaq_u16, vminvq_p_u16, vmaxvq_p_u16, vdupq_m_n_u16, vcmpneq_m_u16, vcmpneq_m_n_u16, vcmphiq_m_u16, vcmphiq_m_n_u16, vcmpeqq_m_u16, vcmpeqq_m_n_u16, vcmpcsq_m_u16, vcmpcsq_m_n_u16, vclzq_m_u16, vaddvaq_p_u16, vsriq_n_u16, vsliq_n_u16, vshlq_m_r_u16, vrshlq_m_n_u16, vqshlq_m_r_u16, vqrshlq_m_n_u16, vminavq_p_s16, vminaq_m_s16, vmaxavq_p_s16, vmaxaq_m_s16, vcmpneq_m_s16, vcmpneq_m_n_s16, vcmpltq_m_s16, vcmpltq_m_n_s16, vcmpleq_m_s16, vcmpleq_m_n_s16, vcmpgtq_m_s16, vcmpgtq_m_n_s16, vcmpgeq_m_s16, vcmpgeq_m_n_s16, vcmpeqq_m_s16, vcmpeqq_m_n_s16, vshlq_m_r_s16, vrshlq_m_n_s16, vrev64q_m_s16, vqshlq_m_r_s16, vqrshlq_m_n_s16, vqnegq_m_s16, vqabsq_m_s16, vnegq_m_s16, vmvnq_m_s16, vmlsdavxq_p_s16, vmlsdavq_p_s16, vmladavxq_p_s16, vmladavq_p_s16, vminvq_p_s16, vmaxvq_p_s16, vdupq_m_n_s16, vclzq_m_s16, vclsq_m_s16, vaddvaq_p_s16, vabsq_m_s16, vqrdmlsdhxq_s16, vqrdmlsdhq_s16, vqrdmlashq_n_s16, vqrdmlahq_n_s16, vqrdmladhxq_s16, vqrdmladhq_s16, vqdmlsdhxq_s16, vqdmlsdhq_s16, vqdmlahq_n_s16, vqdmladhxq_s16, vqdmladhq_s16, vmlsdavaxq_s16, vmlsdavaq_s16, vmlasq_n_s16, vmlaq_n_s16, vmladavaxq_s16, vmladavaq_s16, vsriq_n_s16, vsliq_n_s16, vpselq_u32, vpselq_s32, vrev64q_m_u32, vqrdmlashq_n_u32, vqrdmlahq_n_u32, vqdmlahq_n_u32, vmvnq_m_u32, vmlasq_n_u32, vmlaq_n_u32, vmladavq_p_u32, vmladavaq_u32, vminvq_p_u32, vmaxvq_p_u32, vdupq_m_n_u32, vcmpneq_m_u32, vcmpneq_m_n_u32, vcmphiq_m_u32, vcmphiq_m_n_u32, vcmpeqq_m_u32, vcmpeqq_m_n_u32, vcmpcsq_m_u32, vcmpcsq_m_n_u32, vclzq_m_u32, vaddvaq_p_u32, vsriq_n_u32, vsliq_n_u32, vshlq_m_r_u32, vrshlq_m_n_u32, vqshlq_m_r_u32, vqrshlq_m_n_u32, vminavq_p_s32, vminaq_m_s32, vmaxavq_p_s32, vmaxaq_m_s32, vcmpneq_m_s32, vcmpneq_m_n_s32, vcmpltq_m_s32, vcmpltq_m_n_s32, vcmpleq_m_s32, vcmpleq_m_n_s32, vcmpgtq_m_s32, vcmpgtq_m_n_s32, vcmpgeq_m_s32, vcmpgeq_m_n_s32, vcmpeqq_m_s32, vcmpeqq_m_n_s32, vshlq_m_r_s32, vrshlq_m_n_s32, vrev64q_m_s32, vqshlq_m_r_s32, vqrshlq_m_n_s32, vqnegq_m_s32, vqabsq_m_s32, vnegq_m_s32, vmvnq_m_s32, vmlsdavxq_p_s32, vmlsdavq_p_s32, vmladavxq_p_s32, vmladavq_p_s32, vminvq_p_s32, vmaxvq_p_s32, vdupq_m_n_s32, vclzq_m_s32, vclsq_m_s32, vaddvaq_p_s32, vabsq_m_s32, vqrdmlsdhxq_s32, vqrdmlsdhq_s32, vqrdmlashq_n_s32, 
vqrdmlahq_n_s32, vqrdmladhxq_s32, vqrdmladhq_s32, vqdmlsdhxq_s32, vqdmlsdhq_s32, vqdmlahq_n_s32, vqdmladhxq_s32, vqdmladhq_s32, vmlsdavaxq_s32, vmlsdavaq_s32, vmlasq_n_s32, vmlaq_n_s32, vmladavaxq_s32, vmladavaq_s32, vsriq_n_s32, vsliq_n_s32, vpselq_u64, vpselq_s64.
Please refer to the M-profile Vector Extension (MVE) intrinsics documentation [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
In this patch, new constraints "Rc" and "Re" are added, which check that the constant is in the range 0 to 15 and 0 to 31, respectively.
Also, new predicates "mve_imm_15" and "mve_imm_31" are added to check the matching constraints Rc and Re, respectively.
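As an illustration (a minimal sketch only, not part of the patch; the build flags are assumptions), one of the new ternary intrinsics with an immediate operand could be used as follows; the immediate must lie in the range that the new constraint and predicate check (0 to 15 for this 16-bit variant):

  /* Assumed flags: -march=armv8.1-m.main+mve -mfloat-abi=hard -O2.  */
  #include <arm_mve.h>

  uint16x8_t
  shift_insert (uint16x8_t a, uint16x8_t b)
  {
    /* The third operand is a compile-time immediate in the range 0..15.  */
    return vsliq_n_u16 (a, b, 4);
  }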
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm_mve.h (vpselq_u8): Define macro.
(vpselq_s8): Likewise.
(vrev64q_m_u8): Likewise.
(vqrdmlashq_n_u8): Likewise.
(vqrdmlahq_n_u8): Likewise.
(vqdmlahq_n_u8): Likewise.
(vmvnq_m_u8): Likewise.
(vmlasq_n_u8): Likewise.
(vmlaq_n_u8): Likewise.
(vmladavq_p_u8): Likewise.
(vmladavaq_u8): Likewise.
(vminvq_p_u8): Likewise.
(vmaxvq_p_u8): Likewise.
(vdupq_m_n_u8): Likewise.
(vcmpneq_m_u8): Likewise.
(vcmpneq_m_n_u8): Likewise.
(vcmphiq_m_u8): Likewise.
(vcmphiq_m_n_u8): Likewise.
(vcmpeqq_m_u8): Likewise.
(vcmpeqq_m_n_u8): Likewise.
(vcmpcsq_m_u8): Likewise.
(vcmpcsq_m_n_u8): Likewise.
(vclzq_m_u8): Likewise.
(vaddvaq_p_u8): Likewise.
(vsriq_n_u8): Likewise.
(vsliq_n_u8): Likewise.
(vshlq_m_r_u8): Likewise.
(vrshlq_m_n_u8): Likewise.
(vqshlq_m_r_u8): Likewise.
(vqrshlq_m_n_u8): Likewise.
(vminavq_p_s8): Likewise.
(vminaq_m_s8): Likewise.
(vmaxavq_p_s8): Likewise.
(vmaxaq_m_s8): Likewise.
(vcmpneq_m_s8): Likewise.
(vcmpneq_m_n_s8): Likewise.
(vcmpltq_m_s8): Likewise.
(vcmpltq_m_n_s8): Likewise.
(vcmpleq_m_s8): Likewise.
(vcmpleq_m_n_s8): Likewise.
(vcmpgtq_m_s8): Likewise.
(vcmpgtq_m_n_s8): Likewise.
(vcmpgeq_m_s8): Likewise.
(vcmpgeq_m_n_s8): Likewise.
(vcmpeqq_m_s8): Likewise.
(vcmpeqq_m_n_s8): Likewise.
(vshlq_m_r_s8): Likewise.
(vrshlq_m_n_s8): Likewise.
(vrev64q_m_s8): Likewise.
(vqshlq_m_r_s8): Likewise.
(vqrshlq_m_n_s8): Likewise.
(vqnegq_m_s8): Likewise.
(vqabsq_m_s8): Likewise.
(vnegq_m_s8): Likewise.
(vmvnq_m_s8): Likewise.
(vmlsdavxq_p_s8): Likewise.
(vmlsdavq_p_s8): Likewise.
(vmladavxq_p_s8): Likewise.
(vmladavq_p_s8): Likewise.
(vminvq_p_s8): Likewise.
(vmaxvq_p_s8): Likewise.
(vdupq_m_n_s8): Likewise.
(vclzq_m_s8): Likewise.
(vclsq_m_s8): Likewise.
(vaddvaq_p_s8): Likewise.
(vabsq_m_s8): Likewise.
(vqrdmlsdhxq_s8): Likewise.
(vqrdmlsdhq_s8): Likewise.
(vqrdmlashq_n_s8): Likewise.
(vqrdmlahq_n_s8): Likewise.
(vqrdmladhxq_s8): Likewise.
(vqrdmladhq_s8): Likewise.
(vqdmlsdhxq_s8): Likewise.
(vqdmlsdhq_s8): Likewise.
(vqdmlahq_n_s8): Likewise.
(vqdmladhxq_s8): Likewise.
(vqdmladhq_s8): Likewise.
(vmlsdavaxq_s8): Likewise.
(vmlsdavaq_s8): Likewise.
(vmlasq_n_s8): Likewise.
(vmlaq_n_s8): Likewise.
(vmladavaxq_s8): Likewise.
(vmladavaq_s8): Likewise.
(vsriq_n_s8): Likewise.
(vsliq_n_s8): Likewise.
(vpselq_u16): Likewise.
(vpselq_s16): Likewise.
(vrev64q_m_u16): Likewise.
(vqrdmlashq_n_u16): Likewise.
(vqrdmlahq_n_u16): Likewise.
(vqdmlahq_n_u16): Likewise.
(vmvnq_m_u16): Likewise.
(vmlasq_n_u16): Likewise.
(vmlaq_n_u16): Likewise.
(vmladavq_p_u16): Likewise.
(vmladavaq_u16): Likewise.
(vminvq_p_u16): Likewise.
(vmaxvq_p_u16): Likewise.
(vdupq_m_n_u16): Likewise.
(vcmpneq_m_u16): Likewise.
(vcmpneq_m_n_u16): Likewise.
(vcmphiq_m_u16): Likewise.
(vcmphiq_m_n_u16): Likewise.
(vcmpeqq_m_u16): Likewise.
(vcmpeqq_m_n_u16): Likewise.
(vcmpcsq_m_u16): Likewise.
(vcmpcsq_m_n_u16): Likewise.
(vclzq_m_u16): Likewise.
(vaddvaq_p_u16): Likewise.
(vsriq_n_u16): Likewise.
(vsliq_n_u16): Likewise.
(vshlq_m_r_u16): Likewise.
(vrshlq_m_n_u16): Likewise.
(vqshlq_m_r_u16): Likewise.
(vqrshlq_m_n_u16): Likewise.
(vminavq_p_s16): Likewise.
(vminaq_m_s16): Likewise.
(vmaxavq_p_s16): Likewise.
(vmaxaq_m_s16): Likewise.
(vcmpneq_m_s16): Likewise.
(vcmpneq_m_n_s16): Likewise.
(vcmpltq_m_s16): Likewise.
(vcmpltq_m_n_s16): Likewise.
(vcmpleq_m_s16): Likewise.
(vcmpleq_m_n_s16): Likewise.
(vcmpgtq_m_s16): Likewise.
(vcmpgtq_m_n_s16): Likewise.
(vcmpgeq_m_s16): Likewise.
(vcmpgeq_m_n_s16): Likewise.
(vcmpeqq_m_s16): Likewise.
(vcmpeqq_m_n_s16): Likewise.
(vshlq_m_r_s16): Likewise.
(vrshlq_m_n_s16): Likewise.
(vrev64q_m_s16): Likewise.
(vqshlq_m_r_s16): Likewise.
(vqrshlq_m_n_s16): Likewise.
(vqnegq_m_s16): Likewise.
(vqabsq_m_s16): Likewise.
(vnegq_m_s16): Likewise.
(vmvnq_m_s16): Likewise.
(vmlsdavxq_p_s16): Likewise.
(vmlsdavq_p_s16): Likewise.
(vmladavxq_p_s16): Likewise.
(vmladavq_p_s16): Likewise.
(vminvq_p_s16): Likewise.
(vmaxvq_p_s16): Likewise.
(vdupq_m_n_s16): Likewise.
(vclzq_m_s16): Likewise.
(vclsq_m_s16): Likewise.
(vaddvaq_p_s16): Likewise.
(vabsq_m_s16): Likewise.
(vqrdmlsdhxq_s16): Likewise.
(vqrdmlsdhq_s16): Likewise.
(vqrdmlashq_n_s16): Likewise.
(vqrdmlahq_n_s16): Likewise.
(vqrdmladhxq_s16): Likewise.
(vqrdmladhq_s16): Likewise.
(vqdmlsdhxq_s16): Likewise.
(vqdmlsdhq_s16): Likewise.
(vqdmlahq_n_s16): Likewise.
(vqdmladhxq_s16): Likewise.
(vqdmladhq_s16): Likewise.
(vmlsdavaxq_s16): Likewise.
(vmlsdavaq_s16): Likewise.
(vmlasq_n_s16): Likewise.
(vmlaq_n_s16): Likewise.
(vmladavaxq_s16): Likewise.
(vmladavaq_s16): Likewise.
(vsriq_n_s16): Likewise.
(vsliq_n_s16): Likewise.
(vpselq_u32): Likewise.
(vpselq_s32): Likewise.
(vrev64q_m_u32): Likewise.
(vqrdmlashq_n_u32): Likewise.
(vqrdmlahq_n_u32): Likewise.
(vqdmlahq_n_u32): Likewise.
(vmvnq_m_u32): Likewise.
(vmlasq_n_u32): Likewise.
(vmlaq_n_u32): Likewise.
(vmladavq_p_u32): Likewise.
(vmladavaq_u32): Likewise.
(vminvq_p_u32): Likewise.
(vmaxvq_p_u32): Likewise.
(vdupq_m_n_u32): Likewise.
(vcmpneq_m_u32): Likewise.
(vcmpneq_m_n_u32): Likewise.
(vcmphiq_m_u32): Likewise.
(vcmphiq_m_n_u32): Likewise.
(vcmpeqq_m_u32): Likewise.
(vcmpeqq_m_n_u32): Likewise.
(vcmpcsq_m_u32): Likewise.
(vcmpcsq_m_n_u32): Likewise.
(vclzq_m_u32): Likewise.
(vaddvaq_p_u32): Likewise.
(vsriq_n_u32): Likewise.
(vsliq_n_u32): Likewise.
(vshlq_m_r_u32): Likewise.
(vrshlq_m_n_u32): Likewise.
(vqshlq_m_r_u32): Likewise.
(vqrshlq_m_n_u32): Likewise.
(vminavq_p_s32): Likewise.
(vminaq_m_s32): Likewise.
(vmaxavq_p_s32): Likewise.
(vmaxaq_m_s32): Likewise.
(vcmpneq_m_s32): Likewise.
(vcmpneq_m_n_s32): Likewise.
(vcmpltq_m_s32): Likewise.
(vcmpltq_m_n_s32): Likewise.
(vcmpleq_m_s32): Likewise.
(vcmpleq_m_n_s32): Likewise.
(vcmpgtq_m_s32): Likewise.
(vcmpgtq_m_n_s32): Likewise.
(vcmpgeq_m_s32): Likewise.
(vcmpgeq_m_n_s32): Likewise.
(vcmpeqq_m_s32): Likewise.
(vcmpeqq_m_n_s32): Likewise.
(vshlq_m_r_s32): Likewise.
(vrshlq_m_n_s32): Likewise.
(vrev64q_m_s32): Likewise.
(vqshlq_m_r_s32): Likewise.
(vqrshlq_m_n_s32): Likewise.
(vqnegq_m_s32): Likewise.
(vqabsq_m_s32): Likewise.
(vnegq_m_s32): Likewise.
(vmvnq_m_s32): Likewise.
(vmlsdavxq_p_s32): Likewise.
(vmlsdavq_p_s32): Likewise.
(vmladavxq_p_s32): Likewise.
(vmladavq_p_s32): Likewise.
(vminvq_p_s32): Likewise.
(vmaxvq_p_s32): Likewise.
(vdupq_m_n_s32): Likewise.
(vclzq_m_s32): Likewise.
(vclsq_m_s32): Likewise.
(vaddvaq_p_s32): Likewise.
(vabsq_m_s32): Likewise.
(vqrdmlsdhxq_s32): Likewise.
(vqrdmlsdhq_s32): Likewise.
(vqrdmlashq_n_s32): Likewise.
(vqrdmlahq_n_s32): Likewise.
(vqrdmladhxq_s32): Likewise.
(vqrdmladhq_s32): Likewise.
(vqdmlsdhxq_s32): Likewise.
(vqdmlsdhq_s32): Likewise.
(vqdmlahq_n_s32): Likewise.
(vqdmladhxq_s32): Likewise.
(vqdmladhq_s32): Likewise.
(vmlsdavaxq_s32): Likewise.
(vmlsdavaq_s32): Likewise.
(vmlasq_n_s32): Likewise.
(vmlaq_n_s32): Likewise.
(vmladavaxq_s32): Likewise.
(vmladavaq_s32): Likewise.
(vsriq_n_s32): Likewise.
(vsliq_n_s32): Likewise.
(vpselq_u64): Likewise.
(vpselq_s64): Likewise.
(__arm_vpselq_u8): Define intrinsic.
(__arm_vpselq_s8): Likewise.
(__arm_vrev64q_m_u8): Likewise.
(__arm_vqrdmlashq_n_u8): Likewise.
(__arm_vqrdmlahq_n_u8): Likewise.
(__arm_vqdmlahq_n_u8): Likewise.
(__arm_vmvnq_m_u8): Likewise.
(__arm_vmlasq_n_u8): Likewise.
(__arm_vmlaq_n_u8): Likewise.
(__arm_vmladavq_p_u8): Likewise.
(__arm_vmladavaq_u8): Likewise.
(__arm_vminvq_p_u8): Likewise.
(__arm_vmaxvq_p_u8): Likewise.
(__arm_vdupq_m_n_u8): Likewise.
(__arm_vcmpneq_m_u8): Likewise.
(__arm_vcmpneq_m_n_u8): Likewise.
(__arm_vcmphiq_m_u8): Likewise.
(__arm_vcmphiq_m_n_u8): Likewise.
(__arm_vcmpeqq_m_u8): Likewise.
(__arm_vcmpeqq_m_n_u8): Likewise.
(__arm_vcmpcsq_m_u8): Likewise.
(__arm_vcmpcsq_m_n_u8): Likewise.
(__arm_vclzq_m_u8): Likewise.
(__arm_vaddvaq_p_u8): Likewise.
(__arm_vsriq_n_u8): Likewise.
(__arm_vsliq_n_u8): Likewise.
(__arm_vshlq_m_r_u8): Likewise.
(__arm_vrshlq_m_n_u8): Likewise.
(__arm_vqshlq_m_r_u8): Likewise.
(__arm_vqrshlq_m_n_u8): Likewise.
(__arm_vminavq_p_s8): Likewise.
(__arm_vminaq_m_s8): Likewise.
(__arm_vmaxavq_p_s8): Likewise.
(__arm_vmaxaq_m_s8): Likewise.
(__arm_vcmpneq_m_s8): Likewise.
(__arm_vcmpneq_m_n_s8): Likewise.
(__arm_vcmpltq_m_s8): Likewise.
(__arm_vcmpltq_m_n_s8): Likewise.
(__arm_vcmpleq_m_s8): Likewise.
(__arm_vcmpleq_m_n_s8): Likewise.
(__arm_vcmpgtq_m_s8): Likewise.
(__arm_vcmpgtq_m_n_s8): Likewise.
(__arm_vcmpgeq_m_s8): Likewise.
(__arm_vcmpgeq_m_n_s8): Likewise.
(__arm_vcmpeqq_m_s8): Likewise.
(__arm_vcmpeqq_m_n_s8): Likewise.
(__arm_vshlq_m_r_s8): Likewise.
(__arm_vrshlq_m_n_s8): Likewise.
(__arm_vrev64q_m_s8): Likewise.
(__arm_vqshlq_m_r_s8): Likewise.
(__arm_vqrshlq_m_n_s8): Likewise.
(__arm_vqnegq_m_s8): Likewise.
(__arm_vqabsq_m_s8): Likewise.
(__arm_vnegq_m_s8): Likewise.
(__arm_vmvnq_m_s8): Likewise.
(__arm_vmlsdavxq_p_s8): Likewise.
(__arm_vmlsdavq_p_s8): Likewise.
(__arm_vmladavxq_p_s8): Likewise.
(__arm_vmladavq_p_s8): Likewise.
(__arm_vminvq_p_s8): Likewise.
(__arm_vmaxvq_p_s8): Likewise.
(__arm_vdupq_m_n_s8): Likewise.
(__arm_vclzq_m_s8): Likewise.
(__arm_vclsq_m_s8): Likewise.
(__arm_vaddvaq_p_s8): Likewise.
(__arm_vabsq_m_s8): Likewise.
(__arm_vqrdmlsdhxq_s8): Likewise.
(__arm_vqrdmlsdhq_s8): Likewise.
(__arm_vqrdmlashq_n_s8): Likewise.
(__arm_vqrdmlahq_n_s8): Likewise.
(__arm_vqrdmladhxq_s8): Likewise.
(__arm_vqrdmladhq_s8): Likewise.
(__arm_vqdmlsdhxq_s8): Likewise.
(__arm_vqdmlsdhq_s8): Likewise.
(__arm_vqdmlahq_n_s8): Likewise.
(__arm_vqdmladhxq_s8): Likewise.
(__arm_vqdmladhq_s8): Likewise.
(__arm_vmlsdavaxq_s8): Likewise.
(__arm_vmlsdavaq_s8): Likewise.
(__arm_vmlasq_n_s8): Likewise.
(__arm_vmlaq_n_s8): Likewise.
(__arm_vmladavaxq_s8): Likewise.
(__arm_vmladavaq_s8): Likewise.
(__arm_vsriq_n_s8): Likewise.
(__arm_vsliq_n_s8): Likewise.
(__arm_vpselq_u16): Likewise.
(__arm_vpselq_s16): Likewise.
(__arm_vrev64q_m_u16): Likewise.
(__arm_vqrdmlashq_n_u16): Likewise.
(__arm_vqrdmlahq_n_u16): Likewise.
(__arm_vqdmlahq_n_u16): Likewise.
(__arm_vmvnq_m_u16): Likewise.
(__arm_vmlasq_n_u16): Likewise.
(__arm_vmlaq_n_u16): Likewise.
(__arm_vmladavq_p_u16): Likewise.
(__arm_vmladavaq_u16): Likewise.
(__arm_vminvq_p_u16): Likewise.
(__arm_vmaxvq_p_u16): Likewise.
(__arm_vdupq_m_n_u16): Likewise.
(__arm_vcmpneq_m_u16): Likewise.
(__arm_vcmpneq_m_n_u16): Likewise.
(__arm_vcmphiq_m_u16): Likewise.
(__arm_vcmphiq_m_n_u16): Likewise.
(__arm_vcmpeqq_m_u16): Likewise.
(__arm_vcmpeqq_m_n_u16): Likewise.
(__arm_vcmpcsq_m_u16): Likewise.
(__arm_vcmpcsq_m_n_u16): Likewise.
(__arm_vclzq_m_u16): Likewise.
(__arm_vaddvaq_p_u16): Likewise.
(__arm_vsriq_n_u16): Likewise.
(__arm_vsliq_n_u16): Likewise.
(__arm_vshlq_m_r_u16): Likewise.
(__arm_vrshlq_m_n_u16): Likewise.
(__arm_vqshlq_m_r_u16): Likewise.
(__arm_vqrshlq_m_n_u16): Likewise.
(__arm_vminavq_p_s16): Likewise.
(__arm_vminaq_m_s16): Likewise.
(__arm_vmaxavq_p_s16): Likewise.
(__arm_vmaxaq_m_s16): Likewise.
(__arm_vcmpneq_m_s16): Likewise.
(__arm_vcmpneq_m_n_s16): Likewise.
(__arm_vcmpltq_m_s16): Likewise.
(__arm_vcmpltq_m_n_s16): Likewise.
(__arm_vcmpleq_m_s16): Likewise.
(__arm_vcmpleq_m_n_s16): Likewise.
(__arm_vcmpgtq_m_s16): Likewise.
(__arm_vcmpgtq_m_n_s16): Likewise.
(__arm_vcmpgeq_m_s16): Likewise.
(__arm_vcmpgeq_m_n_s16): Likewise.
(__arm_vcmpeqq_m_s16): Likewise.
(__arm_vcmpeqq_m_n_s16): Likewise.
(__arm_vshlq_m_r_s16): Likewise.
(__arm_vrshlq_m_n_s16): Likewise.
(__arm_vrev64q_m_s16): Likewise.
(__arm_vqshlq_m_r_s16): Likewise.
(__arm_vqrshlq_m_n_s16): Likewise.
(__arm_vqnegq_m_s16): Likewise.
(__arm_vqabsq_m_s16): Likewise.
(__arm_vnegq_m_s16): Likewise.
(__arm_vmvnq_m_s16): Likewise.
(__arm_vmlsdavxq_p_s16): Likewise.
(__arm_vmlsdavq_p_s16): Likewise.
(__arm_vmladavxq_p_s16): Likewise.
(__arm_vmladavq_p_s16): Likewise.
(__arm_vminvq_p_s16): Likewise.
(__arm_vmaxvq_p_s16): Likewise.
(__arm_vdupq_m_n_s16): Likewise.
(__arm_vclzq_m_s16): Likewise.
(__arm_vclsq_m_s16): Likewise.
(__arm_vaddvaq_p_s16): Likewise.
(__arm_vabsq_m_s16): Likewise.
(__arm_vqrdmlsdhxq_s16): Likewise.
(__arm_vqrdmlsdhq_s16): Likewise.
(__arm_vqrdmlashq_n_s16): Likewise.
(__arm_vqrdmlahq_n_s16): Likewise.
(__arm_vqrdmladhxq_s16): Likewise.
(__arm_vqrdmladhq_s16): Likewise.
(__arm_vqdmlsdhxq_s16): Likewise.
(__arm_vqdmlsdhq_s16): Likewise.
(__arm_vqdmlahq_n_s16): Likewise.
(__arm_vqdmladhxq_s16): Likewise.
(__arm_vqdmladhq_s16): Likewise.
(__arm_vmlsdavaxq_s16): Likewise.
(__arm_vmlsdavaq_s16): Likewise.
(__arm_vmlasq_n_s16): Likewise.
(__arm_vmlaq_n_s16): Likewise.
(__arm_vmladavaxq_s16): Likewise.
(__arm_vmladavaq_s16): Likewise.
(__arm_vsriq_n_s16): Likewise.
(__arm_vsliq_n_s16): Likewise.
(__arm_vpselq_u32): Likewise.
(__arm_vpselq_s32): Likewise.
(__arm_vrev64q_m_u32): Likewise.
(__arm_vqrdmlashq_n_u32): Likewise.
(__arm_vqrdmlahq_n_u32): Likewise.
(__arm_vqdmlahq_n_u32): Likewise.
(__arm_vmvnq_m_u32): Likewise.
(__arm_vmlasq_n_u32): Likewise.
(__arm_vmlaq_n_u32): Likewise.
(__arm_vmladavq_p_u32): Likewise.
(__arm_vmladavaq_u32): Likewise.
(__arm_vminvq_p_u32): Likewise.
(__arm_vmaxvq_p_u32): Likewise.
(__arm_vdupq_m_n_u32): Likewise.
(__arm_vcmpneq_m_u32): Likewise.
(__arm_vcmpneq_m_n_u32): Likewise.
(__arm_vcmphiq_m_u32): Likewise.
(__arm_vcmphiq_m_n_u32): Likewise.
(__arm_vcmpeqq_m_u32): Likewise.
(__arm_vcmpeqq_m_n_u32): Likewise.
(__arm_vcmpcsq_m_u32): Likewise.
(__arm_vcmpcsq_m_n_u32): Likewise.
(__arm_vclzq_m_u32): Likewise.
(__arm_vaddvaq_p_u32): Likewise.
(__arm_vsriq_n_u32): Likewise.
(__arm_vsliq_n_u32): Likewise.
(__arm_vshlq_m_r_u32): Likewise.
(__arm_vrshlq_m_n_u32): Likewise.
(__arm_vqshlq_m_r_u32): Likewise.
(__arm_vqrshlq_m_n_u32): Likewise.
(__arm_vminavq_p_s32): Likewise.
(__arm_vminaq_m_s32): Likewise.
(__arm_vmaxavq_p_s32): Likewise.
(__arm_vmaxaq_m_s32): Likewise.
(__arm_vcmpneq_m_s32): Likewise.
(__arm_vcmpneq_m_n_s32): Likewise.
(__arm_vcmpltq_m_s32): Likewise.
(__arm_vcmpltq_m_n_s32): Likewise.
(__arm_vcmpleq_m_s32): Likewise.
(__arm_vcmpleq_m_n_s32): Likewise.
(__arm_vcmpgtq_m_s32): Likewise.
(__arm_vcmpgtq_m_n_s32): Likewise.
(__arm_vcmpgeq_m_s32): Likewise.
(__arm_vcmpgeq_m_n_s32): Likewise.
(__arm_vcmpeqq_m_s32): Likewise.
(__arm_vcmpeqq_m_n_s32): Likewise.
(__arm_vshlq_m_r_s32): Likewise.
(__arm_vrshlq_m_n_s32): Likewise.
(__arm_vrev64q_m_s32): Likewise.
(__arm_vqshlq_m_r_s32): Likewise.
(__arm_vqrshlq_m_n_s32): Likewise.
(__arm_vqnegq_m_s32): Likewise.
(__arm_vqabsq_m_s32): Likewise.
(__arm_vnegq_m_s32): Likewise.
(__arm_vmvnq_m_s32): Likewise.
(__arm_vmlsdavxq_p_s32): Likewise.
(__arm_vmlsdavq_p_s32): Likewise.
(__arm_vmladavxq_p_s32): Likewise.
(__arm_vmladavq_p_s32): Likewise.
(__arm_vminvq_p_s32): Likewise.
(__arm_vmaxvq_p_s32): Likewise.
(__arm_vdupq_m_n_s32): Likewise.
(__arm_vclzq_m_s32): Likewise.
(__arm_vclsq_m_s32): Likewise.
(__arm_vaddvaq_p_s32): Likewise.
(__arm_vabsq_m_s32): Likewise.
(__arm_vqrdmlsdhxq_s32): Likewise.
(__arm_vqrdmlsdhq_s32): Likewise.
(__arm_vqrdmlashq_n_s32): Likewise.
(__arm_vqrdmlahq_n_s32): Likewise.
(__arm_vqrdmladhxq_s32): Likewise.
(__arm_vqrdmladhq_s32): Likewise.
(__arm_vqdmlsdhxq_s32): Likewise.
(__arm_vqdmlsdhq_s32): Likewise.
(__arm_vqdmlahq_n_s32): Likewise.
(__arm_vqdmladhxq_s32): Likewise.
(__arm_vqdmladhq_s32): Likewise.
(__arm_vmlsdavaxq_s32): Likewise.
(__arm_vmlsdavaq_s32): Likewise.
(__arm_vmlasq_n_s32): Likewise.
(__arm_vmlaq_n_s32): Likewise.
(__arm_vmladavaxq_s32): Likewise.
(__arm_vmladavaq_s32): Likewise.
(__arm_vsriq_n_s32): Likewise.
(__arm_vsliq_n_s32): Likewise.
(__arm_vpselq_u64): Likewise.
(__arm_vpselq_s64): Likewise.
(vcmpneq_m_n): Define polymorphic variant.
(vcmpneq_m): Likewise.
(vqrdmlsdhq): Likewise.
(vqrdmlsdhxq): Likewise.
(vqrshlq_m_n): Likewise.
(vqshlq_m_r): Likewise.
(vrev64q_m): Likewise.
(vrshlq_m_n): Likewise.
(vshlq_m_r): Likewise.
(vsliq_n): Likewise.
(vsriq_n): Likewise.
(vqrdmlashq_n): Likewise.
(vqrdmlahq): Likewise.
(vqrdmladhxq): Likewise.
(vqrdmladhq): Likewise.
(vqnegq_m): Likewise.
(vqdmlsdhxq): Likewise.
(vabsq_m): Likewise.
(vclsq_m): Likewise.
(vclzq_m): Likewise.
(vcmpgeq_m): Likewise.
(vcmpgeq_m_n): Likewise.
(vdupq_m_n): Likewise.
(vmaxaq_m): Likewise.
(vmlaq_n): Likewise.
(vmlasq_n): Likewise.
(vmvnq_m): Likewise.
(vnegq_m): Likewise.
(vpselq): Likewise.
(vqdmlahq_n): Likewise.
(vqrdmlahq_n): Likewise.
(vqdmlsdhq): Likewise.
(vqdmladhq): Likewise.
(vqabsq_m): Likewise.
(vminaq_m): Likewise.
(vrmlaldavhaq): Likewise.
(vmlsdavxq_p): Likewise.
(vmlsdavq_p): Likewise.
(vmlsdavaxq): Likewise.
(vmlsdavaq): Likewise.
(vaddvaq_p): Likewise.
(vcmpcsq_m_n): Likewise.
(vcmpcsq_m): Likewise.
(vcmpeqq_m_n): Likewise.
(vcmpeqq_m): Likewise.
(vmladavxq_p): Likewise.
(vmladavq_p): Likewise.
(vmladavaxq): Likewise.
(vmladavaq): Likewise.
(vminvq_p): Likewise.
(vminavq_p): Likewise.
(vmaxvq_p): Likewise.
(vmaxavq_p): Likewise.
(vcmpltq_m_n): Likewise.
(vcmpltq_m): Likewise.
(vcmpleq_m): Likewise.
(vcmpleq_m_n): Likewise.
(vcmphiq_m_n): Likewise.
(vcmphiq_m): Likewise.
(vcmpgtq_m_n): Likewise.
(vcmpgtq_m): Likewise.
* config/arm/arm_mve_builtins.def (TERNOP_NONE_NONE_NONE_IMM): Use
builtin qualifier.
(TERNOP_NONE_NONE_NONE_NONE): Likewise.
(TERNOP_NONE_NONE_NONE_UNONE): Likewise.
(TERNOP_UNONE_NONE_NONE_UNONE): Likewise.
(TERNOP_UNONE_UNONE_NONE_UNONE): Likewise.
(TERNOP_UNONE_UNONE_UNONE_IMM): Likewise.
(TERNOP_UNONE_UNONE_UNONE_UNONE): Likewise.
* config/arm/constraints.md (Rc): Define constraint to check constant is
in the range of 0 to 15.
(Re): Define constraint to check constant is in the range of 0 to 31.
* config/arm/mve.md (VADDVAQ_P): Define iterator.
(VCLZQ_M): Likewise.
(VCMPEQQ_M_N): Likewise.
(VCMPEQQ_M): Likewise.
(VCMPNEQ_M_N): Likewise.
(VCMPNEQ_M): Likewise.
(VDUPQ_M_N): Likewise.
(VMAXVQ_P): Likewise.
(VMINVQ_P): Likewise.
(VMLADAVAQ): Likewise.
(VMLADAVQ_P): Likewise.
(VMLAQ_N): Likewise.
(VMLASQ_N): Likewise.
(VMVNQ_M): Likewise.
(VPSELQ): Likewise.
(VQDMLAHQ_N): Likewise.
(VQRDMLAHQ_N): Likewise.
(VQRDMLASHQ_N): Likewise.
(VQRSHLQ_M_N): Likewise.
(VQSHLQ_M_R): Likewise.
(VREV64Q_M): Likewise.
(VRSHLQ_M_N): Likewise.
(VSHLQ_M_R): Likewise.
(VSLIQ_N): Likewise.
(VSRIQ_N): Likewise.
(mve_vabsq_m_s<mode>): Define RTL pattern.
(mve_vaddvaq_p_<supf><mode>): Likewise.
(mve_vclsq_m_s<mode>): Likewise.
(mve_vclzq_m_<supf><mode>): Likewise.
(mve_vcmpcsq_m_n_u<mode>): Likewise.
(mve_vcmpcsq_m_u<mode>): Likewise.
(mve_vcmpeqq_m_n_<supf><mode>): Likewise.
(mve_vcmpeqq_m_<supf><mode>): Likewise.
(mve_vcmpgeq_m_n_s<mode>): Likewise.
(mve_vcmpgeq_m_s<mode>): Likewise.
(mve_vcmpgtq_m_n_s<mode>): Likewise.
(mve_vcmpgtq_m_s<mode>): Likewise.
(mve_vcmphiq_m_n_u<mode>): Likewise.
(mve_vcmphiq_m_u<mode>): Likewise.
(mve_vcmpleq_m_n_s<mode>): Likewise.
(mve_vcmpleq_m_s<mode>): Likewise.
(mve_vcmpltq_m_n_s<mode>): Likewise.
(mve_vcmpltq_m_s<mode>): Likewise.
(mve_vcmpneq_m_n_<supf><mode>): Likewise.
(mve_vcmpneq_m_<supf><mode>): Likewise.
(mve_vdupq_m_n_<supf><mode>): Likewise.
(mve_vmaxaq_m_s<mode>): Likewise.
(mve_vmaxavq_p_s<mode>): Likewise.
(mve_vmaxvq_p_<supf><mode>): Likewise.
(mve_vminaq_m_s<mode>): Likewise.
(mve_vminavq_p_s<mode>): Likewise.
(mve_vminvq_p_<supf><mode>): Likewise.
(mve_vmladavaq_<supf><mode>): Likewise.
(mve_vmladavq_p_<supf><mode>): Likewise.
(mve_vmladavxq_p_s<mode>): Likewise.
(mve_vmlaq_n_<supf><mode>): Likewise.
(mve_vmlasq_n_<supf><mode>): Likewise.
(mve_vmlsdavq_p_s<mode>): Likewise.
(mve_vmlsdavxq_p_s<mode>): Likewise.
(mve_vmvnq_m_<supf><mode>): Likewise.
(mve_vnegq_m_s<mode>): Likewise.
(mve_vpselq_<supf><mode>): Likewise.
(mve_vqabsq_m_s<mode>): Likewise.
(mve_vqdmlahq_n_<supf><mode>): Likewise.
(mve_vqnegq_m_s<mode>): Likewise.
(mve_vqrdmladhq_s<mode>): Likewise.
(mve_vqrdmladhxq_s<mode>): Likewise.
(mve_vqrdmlahq_n_<supf><mode>): Likewise.
(mve_vqrdmlashq_n_<supf><mode>): Likewise.
(mve_vqrdmlsdhq_s<mode>): Likewise.
(mve_vqrdmlsdhxq_s<mode>): Likewise.
(mve_vqrshlq_m_n_<supf><mode>): Likewise.
(mve_vqshlq_m_r_<supf><mode>): Likewise.
(mve_vrev64q_m_<supf><mode>): Likewise.
(mve_vrshlq_m_n_<supf><mode>): Likewise.
(mve_vshlq_m_r_<supf><mode>): Likewise.
(mve_vsliq_n_<supf><mode>): Likewise.
(mve_vsriq_n_<supf><mode>): Likewise.
(mve_vqdmlsdhxq_s<mode>): Likewise.
(mve_vqdmlsdhq_s<mode>): Likewise.
(mve_vqdmladhxq_s<mode>): Likewise.
(mve_vqdmladhq_s<mode>): Likewise.
(mve_vmlsdavaxq_s<mode>): Likewise.
(mve_vmlsdavaq_s<mode>): Likewise.
(mve_vmladavaxq_s<mode>): Likewise.
* config/arm/predicates.md (mve_imm_15): Define predicate to check the
matching constraint Rc.
(mve_imm_31): Define predicate to check the matching constraint Re.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vabsq_m_s16.c: New test.
* gcc.target/arm/mve/intrinsics/vabsq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabsq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvaq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvaq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvaq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvaq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvaq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvaq_p_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclsq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclsq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclsq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpcsq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpcsq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpcsq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpcsq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpcsq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpcsq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmphiq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmphiq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmphiq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmphiq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmphiq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmphiq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxaq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxaq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxaq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxavq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxavq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxavq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxvq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxvq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxvq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxvq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxvq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxvq_p_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminaq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminaq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminaq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminavq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminavq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminavq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminvq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminvq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminvq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminvq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminvq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminvq_p_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaxq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaxq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavq_p_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavxq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavxq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaxq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaxq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavxq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavxq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vnegq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vnegq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vnegq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vpselq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vpselq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vpselq_s64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vpselq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vpselq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vpselq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vpselq_u64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vpselq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqabsq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqabsq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqabsq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhxq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhxq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhxq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhxq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqnegq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqnegq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqnegq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhxq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhxq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhxq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhxq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_r_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_r_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_r_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_r_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_r_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_r_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_r_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_r_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_r_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_r_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_r_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_r_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_n_u8.c: Likewise.
Tobias Burnus [Wed, 18 Mar 2020 15:28:08 +0000 (16:28 +0100)]
Fix libgomp.oacc-fortran/atomic_capture-1.f90
2020-03-18 Julian Brown <julian@codesourcery.com>
Tobias Burnus <tobias@codesourcery.com>
* testsuite/libgomp.oacc-fortran/atomic_capture-1.f90: Really make
it work concurrently.
David Malcolm [Wed, 11 Mar 2020 21:06:41 +0000 (17:06 -0400)]
analyzer: make summarized dumps more comprehensive
The previous implementation of summarized dumps within
region_model::dump_to_pp showed only the "top-level" keys within the
current frame and for globals, and thus didn't e.g. show the values
of fields of structs, or elements of arrays.
This patch rewrites it to gather a vec of representative path_vars
for all regions, using this to generate the dump, so that all expressible
lvalues ought to make it to the summarized dump.
gcc/analyzer/ChangeLog:
* region-model.cc: Include "stor-layout.h".
(region_model::dump_to_pp): Rather than calling
dump_summary_of_map on each of the current frame and the globals,
instead get a vec of representative path_vars for all regions,
and then dump a summary of all of them.
(region_model::dump_summary_of_map): Delete, rewriting into...
(region_model::dump_summary_of_rep_path_vars): ...this new
function, working on a vec of path_vars.
(region_model::set_value): New overload.
(region_model::get_representative_path_var): Rename
"parent_region" local to "parent_reg" and consolidate with other
local. Guard test for grandparent being stack on parent_reg being
non-NULL. Move handling for parent being an array_region to
within guard for parent_reg being non-NULL.
(selftest::make_test_compound_type): New function.
(selftest::test_dump_2): New selftest.
(selftest::test_dump_3): New selftest.
(selftest::test_stack_frames): Update expected output from
simplified dump to show "a" and "b" from parent frame and "y" in
child frame.
(selftest::analyzer_region_model_cc_tests): Call test_dump_2 and
test_dump_3.
* region-model.h (region_model::set_value): New overload decl.
(region_model::dump_summary_of_map): Delete.
(region_model::dump_summary_of_rep_path_vars): New.
David Malcolm [Tue, 10 Mar 2020 22:50:03 +0000 (18:50 -0400)]
analyzer: add test coverage for fixed ICE [PR94047]
PR analyzer/94047 reports an ICE, which turned out to be caused
by the erroneous use of TREE_TYPE on the view region's type
in region_model::get_representative_path_var that I introduced
in r10-7024-ge516294a1acb28aaaad44cfd583cc6a80354044e and
fixed in g:787477a226033e36be3f6d16b71be13dd917e982.
This patch adds a regression test for the ICE.
gcc/testsuite/ChangeLog:
PR analyzer/94047
* gcc.dg/analyzer/pr94047.c: New test.
David Malcolm [Tue, 17 Mar 2020 18:43:43 +0000 (14:43 -0400)]
analyzer: introduce noop_region_model_context
tentative_region_model_context and test_region_model_context are both
forced to implement numerous pure virtual vfuncs of the abstract
region_model_context.
This patch adds a noop_region_model_context which provides empty
implementations of all of region_model_context's pure virtual functions,
and subclasses the above classes from that, rather than from
region_model_context directly.
gcc/analyzer/ChangeLog:
* region-model.h (class noop_region_model_context): New subclass
of region_model_context.
(class tentative_region_model_context): Inherit from
noop_region_model_context rather than from region_model_context;
drop redundant vfunc implementations.
(class test_region_model_context): Likewise.
David Malcolm [Tue, 17 Mar 2020 14:25:14 +0000 (10:25 -0400)]
analyzer: tweaks to exploded_node ctor
I have followup work that touches this, so it's easiest to get this
cleanup in first.
gcc/analyzer/ChangeLog:
* engine.cc (exploded_node::exploded_node): Move implementation
here from header; accept point_and_state by const reference rather
than by value.
* exploded-graph.h (exploded_node::exploded_node): Pass
point_and_state by const reference rather than by value. Move
body to engine.cc.
Jonathan Wakely [Wed, 18 Mar 2020 12:55:29 +0000 (12:55 +0000)]
libstdc++: Fix compilation of <stop_token> with Clang
Clang 9 supports C++20 via -std=c++2a but doesn't support three-way
comparisons, so <stop_token> fails to compile. When the compiler doesn't
support default comparisons, this patch defines operator== and
operator!= for the _Stop_state_ref class. That is enough for the header
to be compiled with Clang. It allows operator== for stop_token and
stop_source to work, but not operator!= because that isn't explicitly
defined.
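For illustration only (a minimal sketch, not the libstdc++ testsuite), the kind of use that now compiles with Clang in -std=c++2a mode:

  #include <stop_token>

  // Equality on stop_token works once _Stop_state_ref has explicit
  // operator==/operator!=; operator!= on stop_token itself is still not
  // explicitly defined, as noted above.
  bool same_state(const std::stop_source& a, const std::stop_source& b)
  {
    return a.get_token() == b.get_token();
  }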
* include/std/stop_token (stop_token::_Stop_state_ref): Define
comparison operators explicitly if the compiler won't synthesize them.
Jonathan Wakely [Wed, 18 Mar 2020 12:55:29 +0000 (12:55 +0000)]
libstdc++: Fix compilation with released versions of Clang
Clang 9 supports C++20 via -std=c++2a but doesn't support Concepts, so
several of the new additions related to the Ranges library fail to
compile with -std=c++2a. The new definition of iterator_traits and the
definition of default_sentinel_t are guarded by __cpp_lib_concepts, so
check that in addition to __cplusplus > 201703L.
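The shape of the guard looks like this (a sketch of the pattern only, not the exact library code):

  #if __cplusplus > 201703L && __cpp_lib_concepts
  // Definitions that rely on concepts-based facilities such as
  // std::default_sentinel_t or std::iter_reference_t go here.
  #endif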
* include/bits/stl_algobase.h (__lexicographical_compare_aux): Check
__cpp_lib_concepts before using iter_reference_t.
* include/bits/stream_iterator.h (istream_iterator): Check
__cpp_lib_concepts before using default_sentinel_t.
* include/bits/streambuf_iterator.h (istreambuf_iterator): Likewise.
Andrew Stubbs [Tue, 17 Mar 2020 12:49:19 +0000 (12:49 +0000)]
amdgcn: Fix vector compare modes
The GCN VCC register has 64 CC values in one register, one bit for each
vector lane.
Previously we avoided problems with invalid optimizations by not declaring
a mode for the comparison operators, but it turns out that causes other
problems (and build warnings).
Instead, the optimization issues can be avoided by setting
STORE_FLAG_VALUE to -1, meaning that all the bits are significant.
(It would be better if we could set STORE_FLAG_VALUE according to the
known mask or vector size, but we can't.)
2020-03-18 Andrew Stubbs <ams@codesourcery.com>
gcc/
* config/gcn/gcn-valu.md (vec_cmp<mode>di): Set operand 1 to DImode.
(vec_cmp<mode>di_dup): Likewise.
* config/gcn/gcn.h (STORE_FLAG_VALUE): Set to -1.
Andrew Stubbs [Tue, 3 Mar 2020 17:36:49 +0000 (17:36 +0000)]
amdgcn: Add cond_add/sub/and/ior/xor for all vector modes
2020-03-18 Andrew Stubbs <ams@codesourcery.com>
gcc/
* config/gcn/gcn-valu.md (COND_MODE): Delete.
(COND_INT_MODE): Delete.
(cond_op): Add "mult".
(cond_<expander><mode>): Use VEC_ALLREG_MODE.
(cond_<expander><mode>): Use VEC_ALLREG_INT_MODE.
Nathan Sidwell [Wed, 18 Mar 2020 12:16:28 +0000 (05:16 -0700)]
PR c++/94147 - mangling of lambdas assigned to globals
This patch implements Jason's suggestion of pushing a lambda scope
when parsing a global variable initializer. That bit worked fine, but
happened to cause g++.dg/opt/dump1.C to not give any
used-but-not-defined warnings.
The reason was no_linkage_check, which considers any lambda that has
an extra-scope to have linkage. Which is technically correct. Except
that we think that all types that have linkage have external linkage.
Our representation of linkage and visibility is somewhat inaccurate,
particularly when it comes to types. We have TREE_PUBLIC,
DECL_EXTERNAL, DECL_VISIBILITY, DECL_COMDAT, DECL_NOT_REALLY_EXTERN.
It could really do with a thorough cleanup, but that won't be a simple
task.
The best I could come up with was seeing if the extra scope was a
VAR_DECL, and if that was TREE_PUBLIC and the var was inline (its
COMDATness is sadly not set at that point) or a template
instantiation, then the lambda had linkage. Otherwise it's as-if it
has no-linkage from the POV of compiler internals.
This is an ABI change (so we should document it), but it's changing
mangling from an unpredictable (in practice) counter, to something the
ABI defines. So I'm not concerned about mangling-changed warnings, or
preserving the broken mangling under some ABI selection flag. Code
that did this worked by accident within a single TU. It'll continue
to work by design there, and across TUs.
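A sketch of the affected situation (illustrative only, not the dump1.C testcase): a lambda in the initializer of a namespace-scope variable now takes that variable as its extra scope, so for an inline variable the closure type's mangling is the same in every TU instead of depending on a per-TU counter:

  // C++17 inline variable at namespace scope; its closure type must mangle
  // identically in every TU that includes this header.
  inline auto hook = [](int i) { return i + 1; };

  inline int call_hook(int x) { return hook(x); }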
* parser.c (cp_parser_init_declarator): Namespace-scope variables
provide a lambda scope.
* tree.c (no_linkage_check): Lambdas with a variable for extra
scope have a linkage from the variable.
Richard Biener [Wed, 18 Mar 2020 12:11:30 +0000 (13:11 +0100)]
middle-end/94206 fix memset folding to avoid types with padding
This makes sure that the store that a memset is folded to uses a type
covering all bits.
2020-03-18 Richard Biener <rguenther@suse.de>
PR middle-end/94206
* gimple-fold.c (gimple_fold_builtin_memset): Avoid using
partial int modes or not mode-precision integer types for
the store.
* gcc.dg/torture/pr94206.c: New testcase.
Jakub Jelinek [Wed, 18 Mar 2020 11:56:26 +0000 (12:56 +0100)]
Fix up duplicated duplicated words in comments
Another set of duplicated word fixes for things I've missed last time.
These include e.g. *.cc files I forgot about, or duplicated words at the start
or end of a line.
2020-03-18 Jakub Jelinek <jakub@redhat.com>
* asan.c (get_mem_refs_of_builtin_call): Fix up duplicated word issue
in a comment.
* config/arc/arc.c (frame_stack_add): Likewise.
* gimple-loop-versioning.cc (loop_versioning::analyze_arbitrary_term):
Likewise.
* ipa-predicate.c (predicate::remap_after_inlining): Likewise.
* tree-ssa-strlen.h (handle_printf_call): Likewise.
* tree-ssa-strlen.c (is_strlen_related_p): Likewise.
* optinfo-emit-json.cc (optrecord_json_writer::add_record): Likewise.
analyzer/
* sm-malloc.cc (malloc_state_machine::on_stmt): Fix up duplicated word
issue in a comment.
* region-model.cc (region_model::make_region_for_unexpected_tree_code,
region_model::delete_region_and_descendents): Likewise.
* engine.cc (class exploded_cluster): Likewise.
* diagnostic-manager.cc (class path_builder): Likewise.
cp/
* constraint.cc (resolve_function_concept_check, subsumes_constraints,
strictly_subsumes): Fix up duplicated word issue in a comment.
* coroutines.cc (build_init_or_final_await, captures_temporary):
Likewise.
* logic.cc (dnf_size_r, cnf_size_r): Likewise.
* pt.c (append_type_to_template_for_access_check): Likewise.
d/
* expr.cc (ExprVisitor::visit (CatAssignExp *)): Fix up duplicated
word issue in a comment.
* d-target.cc (Target::FPTypeProperties<T>::max): Likewise.
fortran/
* class.c (generate_finalization_wrapper): Fix up duplicated word
issue in a comment.
* trans-types.c (gfc_get_nodesc_array_type): Likewise.
Duan bo [Wed, 18 Mar 2020 10:18:39 +0000 (10:18 +0000)]
aarch64: Fix SYMBOL_TINY_GOT handling for ILP32 [PR94201]
The SYMBOL_TINY_GOT case in aarch64_load_symref_appropriately was
missing support for ILP32. This caused an ICE on the testcase.
2020-03-18 Duan bo <duanbo3@huawei.com>
gcc/
PR target/94201
* config/aarch64/aarch64.md (ldr_got_tiny): Delete.
(@ldr_got_tiny_<mode>): New pattern.
(ldr_got_tiny_sidi): Likewise.
* config/aarch64/aarch64.c (aarch64_load_symref_appropriately): Use
them to handle SYMBOL_TINY_GOT for ILP32.
gcc/testsuite/
PR target/94201
* gcc.target/aarch64/pr94201.c: New test.
Richard Sandiford [Mon, 16 Mar 2020 15:26:35 +0000 (15:26 +0000)]
aarch64: Treat p12-p15 as call-preserved in SVE PCS functions
Due to a stupid mistake that I can't really explain, I'd got the
treatment of p12-p15 mixed up when adding support for the SVE PCS.
The registers are supposed to be call-preserved rather than
call-clobbered.
The fix is simple, but it has quite a big effect on the PCS tests
(as it should!).
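For context, a sketch (not from the patch; assumed flags -march=armv8.2-a+sve) of a function that follows the SVE PCS, the ABI variant this fix affects; with the fix, p12-p15 are treated as call-preserved across calls to such functions:

  #include <arm_sve.h>

  /* SVE argument and return types make this an SVE PCS function.  */
  svint32_t
  add_masked (svbool_t pg, svint32_t x, svint32_t y)
  {
    return svadd_m (pg, x, y);
  }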
2020-03-18 Richard Sandiford <richard.sandiford@arm.com>
gcc/
* config/aarch64/aarch64.c (aarch64_sve_abi): Treat p12-p15 as
call-preserved for SVE PCS functions.
(aarch64_layout_frame): Cope with up to 12 predicate save slots.
Optimize the case in which there are no following vector save slots.
gcc/testsuite/
* gcc.target/aarch64/sve/acle/general/cpy_1.c: Leave gaps in the
check-function-bodies patterns for p15 to be saved.
* gcc.target/aarch64/sve/pcs/args_1.c (callee_pred): Expect two
predicates to be saved.
* gcc.target/aarch64/sve/pcs/saves_1_be_nowrap.c (test_1): Expect
p12-p15 to be saved and restored.
(test_2): Remove p12-p15 from the clobber list.
* gcc.target/aarch64/sve/pcs/saves_1_be_wrap.c (test_1): Expect
p12-p15 to be saved and restored.
(test_2): Remove p12-p15 from the clobber list.
* gcc.target/aarch64/sve/pcs/saves_1_le_nowrap.c (test_1): Expect
p12-p15 to be saved and restored.
(test_2): Remove p12-p15 from the clobber list.
* gcc.target/aarch64/sve/pcs/saves_1_le_wrap.c (test_1): Expect
p12-p15 to be saved and restored.
(test_2): Remove p12-p15 from the clobber list.
* gcc.target/aarch64/sve/pcs/saves_2_be_nowrap.c: Expect p12-p15
to be saved and restored.
* gcc.target/aarch64/sve/pcs/saves_2_be_wrap.c: Likewise.
* gcc.target/aarch64/sve/pcs/saves_2_le_nowrap.c: Likewise.
* gcc.target/aarch64/sve/pcs/saves_2_le_wrap.c: Likewise.
* gcc.target/aarch64/sve/pcs/saves_4_be.c: Likewise.
* gcc.target/aarch64/sve/pcs/saves_4_le.c: Likewise.
* gcc.target/aarch64/sve/pcs/saves_5_be.c: Likewise.
* gcc.target/aarch64/sve/pcs/saves_5_le.c: Likewise.
* gcc.target/aarch64/sve/pcs/stack_clash_1.c (test_1): Likewise.
(test_2): Remove p12-p15 from the clobber list.
* gcc.target/aarch64/sve/pcs/stack_clash_1_128.c (test_1): Expect
p12-p15 to be saved and restored.
(test_2): Remove p12-p15 from the clobber list.
* gcc.target/aarch64/sve/pcs/stack_clash_1_256.c (test_1): Expect
p12-p15 to be saved and restored.
(test_2): Remove p12-p15 from the clobber list.
(test_4): Expect only 16 bytes of stack to be allocated for the
predicate save slot.
* gcc.target/aarch64/sve/pcs/stack_clash_1_512.c (test_1): Expect
p12-p15 to be saved and restored.
(test_2): Remove p12-p15 from the clobber list.
(test_4): Expect only 16 bytes of stack to be allocated for the
predicate save slot.
* gcc.target/aarch64/sve/pcs/stack_clash_1_1024.c (test_1): Expect
p12-p15 to be saved and restored.
(test_2): Remove p12-p15 from the clobber list.
(test_4): Expect only 16 bytes of stack to be allocated for the
predicate save slot.
* gcc.target/aarch64/sve/pcs/stack_clash_1_2048.c (test_1): Expect
p12-p15 to be saved and restored.
(test_2): Remove p12-p15 from the clobber list.
(test_4): Expect only 32 bytes of stack to be allocated for the
predicate save slot.
* gcc.target/aarch64/sve/pcs/stack_clash_2_256.c: Use z16 rather
than p4 to create a vector-sized save slot.
* gcc.target/aarch64/sve/pcs/stack_clash_2_512.c: Likewise.
* gcc.target/aarch64/sve/pcs/stack_clash_2_1024.c: Likewise.
* gcc.target/aarch64/sve/pcs/stack_clash_2_2048.c: Likewise.
Tobias Burnus [Wed, 18 Mar 2020 11:07:54 +0000 (12:07 +0100)]
libgomp testsuite - disable long double for AMDGCN
* testsuite/libgomp.oacc-c++/firstprivate-mappings-1.C: Add
#define DO_LONG_DOUBLE; set to 1, except for nvidia + gcn.
* libgomp.oacc-c-c++-common/firstprivate-mappings-1.c: Likewise.
* g++.dg/goacc/firstprivate-mappings-1.C: Only set DO_LONG_DOUBLE if
not defined; update comments.
* c-c++-common/goacc/firstprivate-mappings-1.c: Likewise.
Richard Biener [Wed, 18 Mar 2020 08:13:17 +0000 (09:13 +0100)]
middle-end/94188 fix fold of addr expression generation
This adds a missing type conversion to build_fold_addr_expr and adjusts
fallout - build_fold_addr_expr was used as a convenience to build an
ADDR_EXPR but some callers do not expect the result to be simplified
to something else.
2020-03-18 Richard Biener <rguenther@suse.de>
PR middle-end/94188
* fold-const.c (build_fold_addr_expr): Convert address to
correct type.
* asan.c (maybe_create_ssa_name): Strip useless type conversions.
* gimple-fold.c (gimple_fold_stmt_to_constant_1): Use build1
to build the ADDR_EXPR which we don't really want to simplify.
* tree-ssa-dom.c (record_equivalences_from_stmt): Likewise.
* tree-ssa-loop-im.c (gather_mem_refs_stmt): Likewise.
* tree-ssa-forwprop.c (forward_propagate_addr_expr_1): Likewise.
(simplify_builtin_call): Strip useless type conversions.
* tree-ssa-strlen.c (new_strinfo): Likewise.
* gcc.dg/pr94188.c: New testcase.
Jakub Jelinek [Wed, 18 Mar 2020 07:53:23 +0000 (08:53 +0100)]
c++: Diagnose a deduction guide in a wrong scope [PR91759]
The following testcase is accepts-invalid since r7-6608-ga56c0ac08242269b.
Before that change we had this
"deduction guide %qD must be declared in the same scope as %qT"
diagnostic for it; after the change it is expected to be diagnosed
in set_decl_namespace at the not_found: label in there. On this testcase
nothing is diagnosed though, because set_decl_namespace isn't called at all,
as in_namespace is NULL.
The following patch restores the old warning but does it only in case we
don't call set_decl_namespace.
2020-03-18 Jakub Jelinek <jakub@redhat.com>
PR c++/91759
* decl.c (grokfndecl): Restore old diagnostics about deduction
guide declared in different scope if in_namespace is NULL_TREE.
* g++.dg/cpp1z/class-deduction72.C: New test.
Alexey Neyman [Sun, 15 Mar 2020 00:05:36 +0000 (17:05 -0700)]
dwarf: Generate DIEs for external variables with -g1 [93751]
-g1 is described in the manual as generating debug info for functions and
external variables. It does that for older debugging formats but not for
DWARF. This change brings DWARF in line with the rest of the debugging
formats and with the manual.
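A sketch of the effect (hypothetical example, not the PR testcase): compiling the following with just "gcc -g1 -c" now emits a DW_TAG_variable DIE for the public variable, as was already done for the function, while type information is still omitted at this level:

  int counter;   /* public variable: now gets a DIE at -g1 */

  int
  bump (void)    /* functions already got DIEs at -g1 */
  {
    return ++counter;
  }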
gcc/ChangeLog
2020-03-17 Alexey Neyman <stilor@att.net>
PR debug/93751
* dwarf2out.c (gen_decl_die): Proceed to generating the DIE if
the debug level is terse and the declaration is public. Do not
generate type info.
(dwarf2out_decl): Same.
(add_type_attribute): Return immediately if debug level is
terse.
Signed-off-by: Alexey Neyman <stilor@att.net>
Jason Merrill [Tue, 17 Mar 2020 09:45:02 +0000 (05:45 -0400)]
c++: Fix comment typo.
Jonathan Wakely [Wed, 18 Mar 2020 00:23:39 +0000 (00:23 +0000)]
libstdc++: Fix type-erasure in experimental::net::executor (PR 94203)
The _Tgt and _TgtImpl types that implement type-erasure didn't agree on
the virtual interface, so failed as soon as they were instantiated. With
Clang they failed even sooner. The interface was also dependent on
whether RTTI was enabled or not.
This patch fixes the broken virtual functions and makes the type work
without RTTI, by using a pointer to a specialization of a function
template (similar to the approaches in std::function and std::any).
The changes to the virtual functions would be an ABI change, except that
the previous code didn't even compile if instantiated. This is
experimental TS material anyway.
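A sketch of the previously-broken path (illustrative only; the names come from the Networking TS as implemented in <experimental/executor>, which is an assumption about the available API rather than a quote from the patch):

  #include <experimental/executor>

  namespace net = std::experimental::net;

  void demo ()
  {
    net::system_executor sys;
    net::executor ex (sys);               // type-erasing executor(Executor) ctor
    bool same = (ex == net::executor (sys));
    (void) same;
  }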
PR libstdc++/94203
* include/experimental/executor (executor::executor(Executor)): Call
make_shared directly instead of _M_create. Create _Tgt1 object.
(executor::executor(allocator_arg_t, const ProtoAlloc&, Executor)):
Call allocate_shared directly instead of _M_create. Create _Tgt2
object.
(executor::target_type): Add cast needed for new _Tgt interface.
(executor::target): Define when RTTI is disabled. Use _Tgt::_M_func.
(executor::_Tgt): Define the same interface whether RTTI is enabled or
not.
(executor::_Tgt::target_type, executor::_Tgt::target): Do not use
std::type_info in the interface.
(executor::_Tgt::_M_func): Add data member.
(executor::_TgtImpl): Replace with _Tgt1 and _Tgt2 class templates.
(executor::_Tgt1::_S_func): Define function to access target without
depending on RTTI.
(executor::_M_create): Remove.
(operator==, operator!=): Simplify comparisons for executor.
* include/experimental/socket (is_error_code_enum<socket_errc>):
Define specialization before use.
* testsuite/experimental/net/executor/1.cc: New test.
GCC Administrator [Wed, 18 Mar 2020 00:16:17 +0000 (00:16 +0000)]
Daily bump.
Uros Bizjak [Tue, 17 Mar 2020 22:01:04 +0000 (23:01 +0100)]
testsuite: Fix g++.dg/debug/dwarf2/const2b.C target selector
* g++.dg/debug/dwarf2/const2b.C (dg-do): Fix target selector.
Jakub Jelinek [Tue, 17 Mar 2020 21:32:34 +0000 (22:32 +0100)]
c: Handle C_TYPE_INCOMPLETE_VARS even for ENUMERAL_TYPEs [PR94172]
The following testcases ICE, because they contain extern variable
declarations with incomplete enum types that are later completed and after
that those variables are accessed. The ICEs are because the vars then may have
incorrect DECL_MODE etc., e.g. in the first case the var has SImode
DECL_MODE (the guessed mode for the enum), but the enum then actually has
DImode because its enumerators don't fit into unsigned int.
The following patch fixes it by using C_TYPE_INCOMPLETE_VARS not just on
incomplete struct/union types, but also incomplete enum types.
TYPE_VFIELD can't be used as it is TYPE_MIN_VALUE on ENUMERAL_TYPE,
thankfully TYPE_LANG_SLOT_1 has been used in the C FE only on
FUNCTION_TYPEs.
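A sketch of the problematic pattern (close in spirit to the new tests; the
exact contents of gcc.dg/pr94172-*.c are not reproduced here):

    enum E;                            /* incomplete enum (GNU extension)  */
    extern enum E var;                 /* DECL_MODE guessed as SImode here */

    enum E { BIG = 0x100000000ULL };   /* completion makes the enum DImode */

    unsigned long long
    get (void)
    {
      return var;                      /* previously used the stale mode   */
    }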
2020-03-17 Jakub Jelinek <jakub@redhat.com>
PR c/94172
* c-tree.h (C_TYPE_INCOMPLETE_VARS): Define to TYPE_LANG_SLOT_1
instead of TYPE_VFIELD, and support it on {RECORD,UNION,ENUMERAL}_TYPE.
(TYPE_ACTUAL_ARG_TYPES): Check that it is only used on FUNCTION_TYPEs.
* c-decl.c (pushdecl): Push C_TYPE_INCOMPLETE_VARS also to
ENUMERAL_TYPEs.
(finish_incomplete_vars): New function, moved from finish_struct. Use
relayout_decl instead of layout_decl.
(finish_struct): Remove obsolete comment about C_TYPE_INCOMPLETE_VARS
being TYPE_VFIELD. Use finish_incomplete_vars.
(finish_enum): Clear C_TYPE_INCOMPLETE_VARS. Call
finish_incomplete_vars.
* c-typeck.c (c_build_qualified_type): Clear C_TYPE_INCOMPLETE_VARS
also on ENUMERAL_TYPEs.
* gcc.dg/pr94172-1.c: New test.
* gcc.dg/pr94172-2.c: New test.
Jakub Jelinek [Tue, 17 Mar 2020 20:21:16 +0000 (21:21 +0100)]
c++: Fix parsing of invalid enum specifiers [PR90995]
The testcase shows some accepts-invalid (the ones without alignas) and
ice-on-invalid-code (the ones with alignas) cases.
If the enum doesn't have an underlying type and is not a definition,
the caller retries parsing it as an elaborated type specifier.
E.g. for enum struct S s it will then pedwarn that an elaborated type
specifier shouldn't have the struct/class keywords.
The problem is if the enum specifier is not followed by { when it has
an underlying type. In that case we have already called
cp_parser_parse_definitely to end the tentative parsing started at the
beginning of cp_parser_enum_specifier. But the
cp_parser_error (parser, "expected %<;%> or %<{%>");
doesn't emit any error because the whole function is called from yet another
tentative parse and the caller starts parsing the elaborated type
specifier where the cp_parser_enum_specifier stopped (i.e. after the
underlying type token(s)). The ultimate caller then commits the tentative
parsing (and even if it wouldn't, it wouldn't know what kind of error
to report). I think after seeing enum {,struct,class} : type not being
followed by { or ;, there is no reason not to report it right away, as it
can't be valid C++, which is what the patch does. Not sure if we shouldn't
also return error_mark_node instead of NULL_TREE, so that the caller doesn't
try to parse it as an elaborated type specifier (the patch doesn't do that
right now).
Furthermore, while reading the code, I've noticed that
parser->colon_corrects_to_scope_p is saved and set to false at the start
of the function, but not restored in some cases. I don't have a testcase
where this would be a problem, but it just seems wrong. Either we can in
the two spots replace return NULL_TREE; with { type = NULL_TREE; goto out; }
or we could perhaps abuse warning_sentinel or create a special class with
dtor to clean the flag up.
And lastly, I've fixed some formatting issues in the function while reading
it.
2020-03-17 Jakub Jelinek <jakub@redhat.com>
PR c++/90995
* parser.c (cp_parser_enum_specifier): Use temp_override for
parser->colon_corrects_to_scope_p, replace goto out with return.
If scoped enum or enum with underlying type is not followed by
{ or ;, call cp_parser_commit_to_tentative_parse before calling
cp_parser_error and make sure to return error_mark_node instead of
NULL_TREE. Formatting fixes.
* g++.dg/cpp0x/enum40.C: New test.
Richard Sandiford [Tue, 17 Mar 2020 15:39:42 +0000 (15:39 +0000)]
testsuite: Fix gcc.target/aarch64/advsimd-intrinsics/bfcvt-nosimd.c
2020-03-17 Richard Sandiford <richard.sandiford@arm.com>
gcc/testsuite/
* gcc.target/aarch64/advsimd-intrinsics/bfcvt-nosimd.c: Skip for
-fno-fat-lto-objects. Use tabs rather than spaces in the
check-function-bodies code.
Richard Sandiford [Tue, 17 Mar 2020 15:36:37 +0000 (15:36 +0000)]
aarch64: Fix bf16_v(ld|st)n.c failures for big-endian
gcc.target/aarch64/advsimd-intrinsics/bf16_vldn.c and
gcc.target/aarch64/advsimd-intrinsics/bf16_vstn.c were
failing for big-endian targets because the <Vmtype> in
aarch64_be_ld1<mode> and aarch64_be_st1<mode> had no
expansion for the bfloat16 modes.
2020-03-17 Richard Sandiford <richard.sandiford@arm.com>
gcc/
* config/aarch64/iterators.md (Vmtype): Handle V4BF and V8BF.
Ville Voutilainen [Tue, 17 Mar 2020 16:43:21 +0000 (18:43 +0200)]
Fix the ChangeLog after the __is_assignable/__is_constructible fix
Iain Sandoe [Tue, 17 Mar 2020 14:12:54 +0000 (14:12 +0000)]
coroutines, testsuite: Fix single test execution.
Invocations of coro-torture.exp for a single test, like 'coro-torture.exp=some-test.C', were
failing because DEFAULT_CXXFLAGS was undefined. Fixed by defining this
locally, if it has no pre-existing global value.
Srinath Parvathaneni [Tue, 17 Mar 2020 15:56:35 +0000 (15:56 +0000)]
[ARM][GCC][1/3x]: MVE intrinsics with ternary operands.
This patch supports the following MVE ACLE intrinsics with ternary operands.
vabavq_s8, vabavq_s16, vabavq_s32, vbicq_m_n_s16, vbicq_m_n_s32, vbicq_m_n_u16, vbicq_m_n_u32, vcmpeqq_m_f16, vcmpeqq_m_f32, vcvtaq_m_s16_f16, vcvtaq_m_u16_f16, vcvtaq_m_s32_f32, vcvtaq_m_u32_f32, vcvtq_m_f16_s16, vcvtq_m_f16_u16, vcvtq_m_f32_s32, vcvtq_m_f32_u32, vqrshrnbq_n_s16, vqrshrnbq_n_u16, vqrshrnbq_n_s32, vqrshrnbq_n_u32, vqrshrunbq_n_s16, vqrshrunbq_n_s32, vrmlaldavhaq_s32, vrmlaldavhaq_u32, vshlcq_s8, vshlcq_u8, vshlcq_s16, vshlcq_u16, vshlcq_s32, vshlcq_u32, vabavq_s8, vabavq_s16, vabavq_s32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
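A usage sketch (not from the patch) assuming an MVE-enabled target such as
-march=armv8.1-m.main+mve -mfloat-abi=hard; the function name is invented:

    #include "arm_mve.h"

    uint32_t
    abs_diff_accumulate (uint32_t acc, int8x16_t a, int8x16_t b)
    {
      /* Ternary intrinsic: accumulate the absolute differences of the
         lanes of A and B into the scalar ACC.  */
      return vabavq_s8 (acc, a, b);
    }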
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm-builtins.c (TERNOP_UNONE_UNONE_UNONE_IMM_QUALIFIERS):
Define qualifier for ternary operands.
(TERNOP_UNONE_UNONE_NONE_NONE_QUALIFIERS): Likewise.
(TERNOP_UNONE_NONE_UNONE_IMM_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_UNONE_IMM_QUALIFIERS): Likewise.
(TERNOP_UNONE_UNONE_NONE_IMM_QUALIFIERS): Likewise.
(TERNOP_UNONE_UNONE_NONE_UNONE_QUALIFIERS): Likewise.
(TERNOP_UNONE_UNONE_IMM_UNONE_QUALIFIERS): Likewise.
(TERNOP_UNONE_NONE_NONE_UNONE_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_NONE_IMM_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_NONE_UNONE_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_IMM_UNONE_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_UNONE_UNONE_QUALIFIERS): Likewise.
(TERNOP_UNONE_UNONE_UNONE_UNONE_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_NONE_NONE_QUALIFIERS): Likewise.
* config/arm/arm_mve.h (vabavq_s8): Define macro.
(vabavq_s16): Likewise.
(vabavq_s32): Likewise.
(vbicq_m_n_s16): Likewise.
(vbicq_m_n_s32): Likewise.
(vbicq_m_n_u16): Likewise.
(vbicq_m_n_u32): Likewise.
(vcmpeqq_m_f16): Likewise.
(vcmpeqq_m_f32): Likewise.
(vcvtaq_m_s16_f16): Likewise.
(vcvtaq_m_u16_f16): Likewise.
(vcvtaq_m_s32_f32): Likewise.
(vcvtaq_m_u32_f32): Likewise.
(vcvtq_m_f16_s16): Likewise.
(vcvtq_m_f16_u16): Likewise.
(vcvtq_m_f32_s32): Likewise.
(vcvtq_m_f32_u32): Likewise.
(vqrshrnbq_n_s16): Likewise.
(vqrshrnbq_n_u16): Likewise.
(vqrshrnbq_n_s32): Likewise.
(vqrshrnbq_n_u32): Likewise.
(vqrshrunbq_n_s16): Likewise.
(vqrshrunbq_n_s32): Likewise.
(vrmlaldavhaq_s32): Likewise.
(vrmlaldavhaq_u32): Likewise.
(vshlcq_s8): Likewise.
(vshlcq_u8): Likewise.
(vshlcq_s16): Likewise.
(vshlcq_u16): Likewise.
(vshlcq_s32): Likewise.
(vshlcq_u32): Likewise.
(vabavq_u8): Likewise.
(vabavq_u16): Likewise.
(vabavq_u32): Likewise.
(__arm_vabavq_s8): Define intrinsic.
(__arm_vabavq_s16): Likewise.
(__arm_vabavq_s32): Likewise.
(__arm_vabavq_u8): Likewise.
(__arm_vabavq_u16): Likewise.
(__arm_vabavq_u32): Likewise.
(__arm_vbicq_m_n_s16): Likewise.
(__arm_vbicq_m_n_s32): Likewise.
(__arm_vbicq_m_n_u16): Likewise.
(__arm_vbicq_m_n_u32): Likewise.
(__arm_vqrshrnbq_n_s16): Likewise.
(__arm_vqrshrnbq_n_u16): Likewise.
(__arm_vqrshrnbq_n_s32): Likewise.
(__arm_vqrshrnbq_n_u32): Likewise.
(__arm_vqrshrunbq_n_s16): Likewise.
(__arm_vqrshrunbq_n_s32): Likewise.
(__arm_vrmlaldavhaq_s32): Likewise.
(__arm_vrmlaldavhaq_u32): Likewise.
(__arm_vshlcq_s8): Likewise.
(__arm_vshlcq_u8): Likewise.
(__arm_vshlcq_s16): Likewise.
(__arm_vshlcq_u16): Likewise.
(__arm_vshlcq_s32): Likewise.
(__arm_vshlcq_u32): Likewise.
(__arm_vcmpeqq_m_f16): Likewise.
(__arm_vcmpeqq_m_f32): Likewise.
(__arm_vcvtaq_m_s16_f16): Likewise.
(__arm_vcvtaq_m_u16_f16): Likewise.
(__arm_vcvtaq_m_s32_f32): Likewise.
(__arm_vcvtaq_m_u32_f32): Likewise.
(__arm_vcvtq_m_f16_s16): Likewise.
(__arm_vcvtq_m_f16_u16): Likewise.
(__arm_vcvtq_m_f32_s32): Likewise.
(__arm_vcvtq_m_f32_u32): Likewise.
(vcvtaq_m): Define polymorphic variant.
(vcvtq_m): Likewise.
(vabavq): Likewise.
(vshlcq): Likewise.
(vbicq_m_n): Likewise.
(vqrshrnbq_n): Likewise.
(vqrshrunbq_n): Likewise.
* config/arm/arm_mve_builtins.def
(TERNOP_UNONE_UNONE_UNONE_IMM_QUALIFIERS): Use the builtin qualifier.
(TERNOP_UNONE_UNONE_NONE_NONE_QUALIFIERS): Likewise.
(TERNOP_UNONE_NONE_UNONE_IMM_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_UNONE_IMM_QUALIFIERS): Likewise.
(TERNOP_UNONE_UNONE_NONE_IMM_QUALIFIERS): Likewise.
(TERNOP_UNONE_UNONE_NONE_UNONE_QUALIFIERS): Likewise.
(TERNOP_UNONE_UNONE_IMM_UNONE_QUALIFIERS): Likewise.
(TERNOP_UNONE_NONE_NONE_UNONE_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_NONE_IMM_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_NONE_UNONE_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_IMM_UNONE_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_UNONE_UNONE_QUALIFIERS): Likewise.
(TERNOP_UNONE_UNONE_UNONE_UNONE_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_NONE_NONE_QUALIFIERS): Likewise.
* config/arm/mve.md (VBICQ_M_N): Define iterator.
(VCVTAQ_M): Likewise.
(VCVTQ_M_TO_F): Likewise.
(VQRSHRNBQ_N): Likewise.
(VABAVQ): Likewise.
(VSHLCQ): Likewise.
(VRMLALDAVHAQ): Likewise.
(mve_vbicq_m_n_<supf><mode>): Define RTL pattern.
(mve_vcmpeqq_m_f<mode>): Likewise.
(mve_vcvtaq_m_<supf><mode>): Likewise.
(mve_vcvtq_m_to_f_<supf><mode>): Likewise.
(mve_vqrshrnbq_n_<supf><mode>): Likewise.
(mve_vqrshrunbq_n_s<mode>): Likewise.
(mve_vrmlaldavhaq_<supf>v4si): Likewise.
(mve_vabavq_<supf><mode>): Likewise.
(mve_vshlcq_<supf><mode>): Likewise.
(mve_vshlcq_<supf><mode>): Likewise.
(mve_vshlcq_vec_<supf><mode>): Define RTL expand.
(mve_vshlcq_carry_<supf><mode>): Likewise.
gcc/testsuite/ChangeLog:
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vabavq_s16.c: New test.
* gcc.target/arm/mve/intrinsics/vabavq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabavq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabavq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabavq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabavq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtaq_m_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtaq_m_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtaq_m_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtaq_m_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_f16_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_f16_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_f32_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_f32_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrnbq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrnbq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrnbq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrnbq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrunbq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrunbq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlaldavhaq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlaldavhaq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlcq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlcq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlcq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlcq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlcq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlcq_u8.c: Likewise.
Srinath Parvathaneni [Tue, 17 Mar 2020 15:44:52 +0000 (15:44 +0000)]
[ARM][GCC][5/2x]: MVE intrinsics with binary operands.
This patch supports the following MVE ACLE intrinsics with binary operands.
vqmovntq_u16, vqmovnbq_u16, vmulltq_poly_p8, vmullbq_poly_p8, vmovntq_u16, vmovnbq_u16, vmlaldavxq_u16, vmlaldavq_u16, vqmovuntq_s16, vqmovunbq_s16, vshlltq_n_u8, vshllbq_n_u8, vorrq_n_u16, vbicq_n_u16, vcmpneq_n_f16, vcmpneq_f16, vcmpltq_n_f16, vcmpltq_f16, vcmpleq_n_f16, vcmpleq_f16, vcmpgtq_n_f16, vcmpgtq_f16, vcmpgeq_n_f16, vcmpgeq_f16, vcmpeqq_n_f16, vcmpeqq_f16, vsubq_f16, vqmovntq_s16, vqmovnbq_s16, vqdmulltq_s16, vqdmulltq_n_s16, vqdmullbq_s16, vqdmullbq_n_s16, vorrq_f16, vornq_f16, vmulq_n_f16, vmulq_f16, vmovntq_s16, vmovnbq_s16, vmlsldavxq_s16, vmlsldavq_s16, vmlaldavxq_s16, vmlaldavq_s16, vminnmvq_f16, vminnmq_f16, vminnmavq_f16, vminnmaq_f16, vmaxnmvq_f16, vmaxnmq_f16, vmaxnmavq_f16, vmaxnmaq_f16, veorq_f16, vcmulq_rot90_f16, vcmulq_rot270_f16, vcmulq_rot180_f16, vcmulq_f16, vcaddq_rot90_f16, vcaddq_rot270_f16, vbicq_f16, vandq_f16, vaddq_n_f16, vabdq_f16, vshlltq_n_s8, vshllbq_n_s8, vorrq_n_s16, vbicq_n_s16, vqmovntq_u32, vqmovnbq_u32, vmulltq_poly_p16, vmullbq_poly_p16, vmovntq_u32, vmovnbq_u32, vmlaldavxq_u32, vmlaldavq_u32, vqmovuntq_s32, vqmovunbq_s32, vshlltq_n_u16, vshllbq_n_u16, vorrq_n_u32, vbicq_n_u32, vcmpneq_n_f32, vcmpneq_f32, vcmpltq_n_f32, vcmpltq_f32, vcmpleq_n_f32, vcmpleq_f32, vcmpgtq_n_f32, vcmpgtq_f32, vcmpgeq_n_f32, vcmpgeq_f32, vcmpeqq_n_f32, vcmpeqq_f32, vsubq_f32, vqmovntq_s32, vqmovnbq_s32, vqdmulltq_s32, vqdmulltq_n_s32, vqdmullbq_s32, vqdmullbq_n_s32, vorrq_f32, vornq_f32, vmulq_n_f32, vmulq_f32, vmovntq_s32, vmovnbq_s32, vmlsldavxq_s32, vmlsldavq_s32, vmlaldavxq_s32, vmlaldavq_s32, vminnmvq_f32, vminnmq_f32, vminnmavq_f32, vminnmaq_f32, vmaxnmvq_f32, vmaxnmq_f32, vmaxnmavq_f32, vmaxnmaq_f32, veorq_f32, vcmulq_rot90_f32, vcmulq_rot270_f32, vcmulq_rot180_f32, vcmulq_f32, vcaddq_rot90_f32, vcaddq_rot270_f32, vbicq_f32, vandq_f32, vaddq_n_f32, vabdq_f32, vshlltq_n_s16, vshllbq_n_s16, vorrq_n_s32, vbicq_n_s32, vrmlaldavhq_u32, vctp8q_m, vctp64q_m, vctp32q_m, vctp16q_m, vaddlvaq_u32, vrmlsldavhxq_s32, vrmlsldavhq_s32, vrmlaldavhxq_s32, vrmlaldavhq_s32, vcvttq_f16_f32, vcvtbq_f16_f32, vaddlvaq_s32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
The above intrinsics are defined using the already defined builtin qualifiers BINOP_NONE_NONE_IMM, BINOP_NONE_NONE_NONE, BINOP_UNONE_NONE_NONE, BINOP_UNONE_UNONE_IMM, BINOP_UNONE_UNONE_NONE, BINOP_UNONE_UNONE_UNONE.
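A usage sketch (not from the patch) assuming an MVE-with-FP target such as
-march=armv8.1-m.main+mve.fp -mfloat-abi=hard; the function names are
invented:

    #include "arm_mve.h"

    int64_t
    dot16 (int16x8_t a, int16x8_t b)
    {
      /* Multiply-accumulate long across the vector: sum of a[i] * b[i].  */
      return vmlaldavq_s16 (a, b);
    }

    mve_pred16_t
    equal_f32 (float32x4_t a, float32x4_t b)
    {
      /* Lane-wise comparison producing a predicate mask.  */
      return vcmpeqq_f32 (a, b);
    }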
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm_mve.h (vqmovntq_u16): Define macro.
(vqmovnbq_u16): Likewise.
(vmulltq_poly_p8): Likewise.
(vmullbq_poly_p8): Likewise.
(vmovntq_u16): Likewise.
(vmovnbq_u16): Likewise.
(vmlaldavxq_u16): Likewise.
(vmlaldavq_u16): Likewise.
(vqmovuntq_s16): Likewise.
(vqmovunbq_s16): Likewise.
(vshlltq_n_u8): Likewise.
(vshllbq_n_u8): Likewise.
(vorrq_n_u16): Likewise.
(vbicq_n_u16): Likewise.
(vcmpneq_n_f16): Likewise.
(vcmpneq_f16): Likewise.
(vcmpltq_n_f16): Likewise.
(vcmpltq_f16): Likewise.
(vcmpleq_n_f16): Likewise.
(vcmpleq_f16): Likewise.
(vcmpgtq_n_f16): Likewise.
(vcmpgtq_f16): Likewise.
(vcmpgeq_n_f16): Likewise.
(vcmpgeq_f16): Likewise.
(vcmpeqq_n_f16): Likewise.
(vcmpeqq_f16): Likewise.
(vsubq_f16): Likewise.
(vqmovntq_s16): Likewise.
(vqmovnbq_s16): Likewise.
(vqdmulltq_s16): Likewise.
(vqdmulltq_n_s16): Likewise.
(vqdmullbq_s16): Likewise.
(vqdmullbq_n_s16): Likewise.
(vorrq_f16): Likewise.
(vornq_f16): Likewise.
(vmulq_n_f16): Likewise.
(vmulq_f16): Likewise.
(vmovntq_s16): Likewise.
(vmovnbq_s16): Likewise.
(vmlsldavxq_s16): Likewise.
(vmlsldavq_s16): Likewise.
(vmlaldavxq_s16): Likewise.
(vmlaldavq_s16): Likewise.
(vminnmvq_f16): Likewise.
(vminnmq_f16): Likewise.
(vminnmavq_f16): Likewise.
(vminnmaq_f16): Likewise.
(vmaxnmvq_f16): Likewise.
(vmaxnmq_f16): Likewise.
(vmaxnmavq_f16): Likewise.
(vmaxnmaq_f16): Likewise.
(veorq_f16): Likewise.
(vcmulq_rot90_f16): Likewise.
(vcmulq_rot270_f16): Likewise.
(vcmulq_rot180_f16): Likewise.
(vcmulq_f16): Likewise.
(vcaddq_rot90_f16): Likewise.
(vcaddq_rot270_f16): Likewise.
(vbicq_f16): Likewise.
(vandq_f16): Likewise.
(vaddq_n_f16): Likewise.
(vabdq_f16): Likewise.
(vshlltq_n_s8): Likewise.
(vshllbq_n_s8): Likewise.
(vorrq_n_s16): Likewise.
(vbicq_n_s16): Likewise.
(vqmovntq_u32): Likewise.
(vqmovnbq_u32): Likewise.
(vmulltq_poly_p16): Likewise.
(vmullbq_poly_p16): Likewise.
(vmovntq_u32): Likewise.
(vmovnbq_u32): Likewise.
(vmlaldavxq_u32): Likewise.
(vmlaldavq_u32): Likewise.
(vqmovuntq_s32): Likewise.
(vqmovunbq_s32): Likewise.
(vshlltq_n_u16): Likewise.
(vshllbq_n_u16): Likewise.
(vorrq_n_u32): Likewise.
(vbicq_n_u32): Likewise.
(vcmpneq_n_f32): Likewise.
(vcmpneq_f32): Likewise.
(vcmpltq_n_f32): Likewise.
(vcmpltq_f32): Likewise.
(vcmpleq_n_f32): Likewise.
(vcmpleq_f32): Likewise.
(vcmpgtq_n_f32): Likewise.
(vcmpgtq_f32): Likewise.
(vcmpgeq_n_f32): Likewise.
(vcmpgeq_f32): Likewise.
(vcmpeqq_n_f32): Likewise.
(vcmpeqq_f32): Likewise.
(vsubq_f32): Likewise.
(vqmovntq_s32): Likewise.
(vqmovnbq_s32): Likewise.
(vqdmulltq_s32): Likewise.
(vqdmulltq_n_s32): Likewise.
(vqdmullbq_s32): Likewise.
(vqdmullbq_n_s32): Likewise.
(vorrq_f32): Likewise.
(vornq_f32): Likewise.
(vmulq_n_f32): Likewise.
(vmulq_f32): Likewise.
(vmovntq_s32): Likewise.
(vmovnbq_s32): Likewise.
(vmlsldavxq_s32): Likewise.
(vmlsldavq_s32): Likewise.
(vmlaldavxq_s32): Likewise.
(vmlaldavq_s32): Likewise.
(vminnmvq_f32): Likewise.
(vminnmq_f32): Likewise.
(vminnmavq_f32): Likewise.
(vminnmaq_f32): Likewise.
(vmaxnmvq_f32): Likewise.
(vmaxnmq_f32): Likewise.
(vmaxnmavq_f32): Likewise.
(vmaxnmaq_f32): Likewise.
(veorq_f32): Likewise.
(vcmulq_rot90_f32): Likewise.
(vcmulq_rot270_f32): Likewise.
(vcmulq_rot180_f32): Likewise.
(vcmulq_f32): Likewise.
(vcaddq_rot90_f32): Likewise.
(vcaddq_rot270_f32): Likewise.
(vbicq_f32): Likewise.
(vandq_f32): Likewise.
(vaddq_n_f32): Likewise.
(vabdq_f32): Likewise.
(vshlltq_n_s16): Likewise.
(vshllbq_n_s16): Likewise.
(vorrq_n_s32): Likewise.
(vbicq_n_s32): Likewise.
(vrmlaldavhq_u32): Likewise.
(vctp8q_m): Likewise.
(vctp64q_m): Likewise.
(vctp32q_m): Likewise.
(vctp16q_m): Likewise.
(vaddlvaq_u32): Likewise.
(vrmlsldavhxq_s32): Likewise.
(vrmlsldavhq_s32): Likewise.
(vrmlaldavhxq_s32): Likewise.
(vrmlaldavhq_s32): Likewise.
(vcvttq_f16_f32): Likewise.
(vcvtbq_f16_f32): Likewise.
(vaddlvaq_s32): Likewise.
(__arm_vqmovntq_u16): Define intrinsic.
(__arm_vqmovnbq_u16): Likewise.
(__arm_vmulltq_poly_p8): Likewise.
(__arm_vmullbq_poly_p8): Likewise.
(__arm_vmovntq_u16): Likewise.
(__arm_vmovnbq_u16): Likewise.
(__arm_vmlaldavxq_u16): Likewise.
(__arm_vmlaldavq_u16): Likewise.
(__arm_vqmovuntq_s16): Likewise.
(__arm_vqmovunbq_s16): Likewise.
(__arm_vshlltq_n_u8): Likewise.
(__arm_vshllbq_n_u8): Likewise.
(__arm_vorrq_n_u16): Likewise.
(__arm_vbicq_n_u16): Likewise.
(__arm_vcmpneq_n_f16): Likewise.
(__arm_vcmpneq_f16): Likewise.
(__arm_vcmpltq_n_f16): Likewise.
(__arm_vcmpltq_f16): Likewise.
(__arm_vcmpleq_n_f16): Likewise.
(__arm_vcmpleq_f16): Likewise.
(__arm_vcmpgtq_n_f16): Likewise.
(__arm_vcmpgtq_f16): Likewise.
(__arm_vcmpgeq_n_f16): Likewise.
(__arm_vcmpgeq_f16): Likewise.
(__arm_vcmpeqq_n_f16): Likewise.
(__arm_vcmpeqq_f16): Likewise.
(__arm_vsubq_f16): Likewise.
(__arm_vqmovntq_s16): Likewise.
(__arm_vqmovnbq_s16): Likewise.
(__arm_vqdmulltq_s16): Likewise.
(__arm_vqdmulltq_n_s16): Likewise.
(__arm_vqdmullbq_s16): Likewise.
(__arm_vqdmullbq_n_s16): Likewise.
(__arm_vorrq_f16): Likewise.
(__arm_vornq_f16): Likewise.
(__arm_vmulq_n_f16): Likewise.
(__arm_vmulq_f16): Likewise.
(__arm_vmovntq_s16): Likewise.
(__arm_vmovnbq_s16): Likewise.
(__arm_vmlsldavxq_s16): Likewise.
(__arm_vmlsldavq_s16): Likewise.
(__arm_vmlaldavxq_s16): Likewise.
(__arm_vmlaldavq_s16): Likewise.
(__arm_vminnmvq_f16): Likewise.
(__arm_vminnmq_f16): Likewise.
(__arm_vminnmavq_f16): Likewise.
(__arm_vminnmaq_f16): Likewise.
(__arm_vmaxnmvq_f16): Likewise.
(__arm_vmaxnmq_f16): Likewise.
(__arm_vmaxnmavq_f16): Likewise.
(__arm_vmaxnmaq_f16): Likewise.
(__arm_veorq_f16): Likewise.
(__arm_vcmulq_rot90_f16): Likewise.
(__arm_vcmulq_rot270_f16): Likewise.
(__arm_vcmulq_rot180_f16): Likewise.
(__arm_vcmulq_f16): Likewise.
(__arm_vcaddq_rot90_f16): Likewise.
(__arm_vcaddq_rot270_f16): Likewise.
(__arm_vbicq_f16): Likewise.
(__arm_vandq_f16): Likewise.
(__arm_vaddq_n_f16): Likewise.
(__arm_vabdq_f16): Likewise.
(__arm_vshlltq_n_s8): Likewise.
(__arm_vshllbq_n_s8): Likewise.
(__arm_vorrq_n_s16): Likewise.
(__arm_vbicq_n_s16): Likewise.
(__arm_vqmovntq_u32): Likewise.
(__arm_vqmovnbq_u32): Likewise.
(__arm_vmulltq_poly_p16): Likewise.
(__arm_vmullbq_poly_p16): Likewise.
(__arm_vmovntq_u32): Likewise.
(__arm_vmovnbq_u32): Likewise.
(__arm_vmlaldavxq_u32): Likewise.
(__arm_vmlaldavq_u32): Likewise.
(__arm_vqmovuntq_s32): Likewise.
(__arm_vqmovunbq_s32): Likewise.
(__arm_vshlltq_n_u16): Likewise.
(__arm_vshllbq_n_u16): Likewise.
(__arm_vorrq_n_u32): Likewise.
(__arm_vbicq_n_u32): Likewise.
(__arm_vcmpneq_n_f32): Likewise.
(__arm_vcmpneq_f32): Likewise.
(__arm_vcmpltq_n_f32): Likewise.
(__arm_vcmpltq_f32): Likewise.
(__arm_vcmpleq_n_f32): Likewise.
(__arm_vcmpleq_f32): Likewise.
(__arm_vcmpgtq_n_f32): Likewise.
(__arm_vcmpgtq_f32): Likewise.
(__arm_vcmpgeq_n_f32): Likewise.
(__arm_vcmpgeq_f32): Likewise.
(__arm_vcmpeqq_n_f32): Likewise.
(__arm_vcmpeqq_f32): Likewise.
(__arm_vsubq_f32): Likewise.
(__arm_vqmovntq_s32): Likewise.
(__arm_vqmovnbq_s32): Likewise.
(__arm_vqdmulltq_s32): Likewise.
(__arm_vqdmulltq_n_s32): Likewise.
(__arm_vqdmullbq_s32): Likewise.
(__arm_vqdmullbq_n_s32): Likewise.
(__arm_vorrq_f32): Likewise.
(__arm_vornq_f32): Likewise.
(__arm_vmulq_n_f32): Likewise.
(__arm_vmulq_f32): Likewise.
(__arm_vmovntq_s32): Likewise.
(__arm_vmovnbq_s32): Likewise.
(__arm_vmlsldavxq_s32): Likewise.
(__arm_vmlsldavq_s32): Likewise.
(__arm_vmlaldavxq_s32): Likewise.
(__arm_vmlaldavq_s32): Likewise.
(__arm_vminnmvq_f32): Likewise.
(__arm_vminnmq_f32): Likewise.
(__arm_vminnmavq_f32): Likewise.
(__arm_vminnmaq_f32): Likewise.
(__arm_vmaxnmvq_f32): Likewise.
(__arm_vmaxnmq_f32): Likewise.
(__arm_vmaxnmavq_f32): Likewise.
(__arm_vmaxnmaq_f32): Likewise.
(__arm_veorq_f32): Likewise.
(__arm_vcmulq_rot90_f32): Likewise.
(__arm_vcmulq_rot270_f32): Likewise.
(__arm_vcmulq_rot180_f32): Likewise.
(__arm_vcmulq_f32): Likewise.
(__arm_vcaddq_rot90_f32): Likewise.
(__arm_vcaddq_rot270_f32): Likewise.
(__arm_vbicq_f32): Likewise.
(__arm_vandq_f32): Likewise.
(__arm_vaddq_n_f32): Likewise.
(__arm_vabdq_f32): Likewise.
(__arm_vshlltq_n_s16): Likewise.
(__arm_vshllbq_n_s16): Likewise.
(__arm_vorrq_n_s32): Likewise.
(__arm_vbicq_n_s32): Likewise.
(__arm_vrmlaldavhq_u32): Likewise.
(__arm_vctp8q_m): Likewise.
(__arm_vctp64q_m): Likewise.
(__arm_vctp32q_m): Likewise.
(__arm_vctp16q_m): Likewise.
(__arm_vaddlvaq_u32): Likewise.
(__arm_vrmlsldavhxq_s32): Likewise.
(__arm_vrmlsldavhq_s32): Likewise.
(__arm_vrmlaldavhxq_s32): Likewise.
(__arm_vrmlaldavhq_s32): Likewise.
(__arm_vcvttq_f16_f32): Likewise.
(__arm_vcvtbq_f16_f32): Likewise.
(__arm_vaddlvaq_s32): Likewise.
(vst4q): Define polymorphic variant.
(vrndxq): Likewise.
(vrndq): Likewise.
(vrndpq): Likewise.
(vrndnq): Likewise.
(vrndmq): Likewise.
(vrndaq): Likewise.
(vrev64q): Likewise.
(vnegq): Likewise.
(vdupq_n): Likewise.
(vabsq): Likewise.
(vrev32q): Likewise.
(vcvtbq_f32): Likewise.
(vcvttq_f32): Likewise.
(vcvtq): Likewise.
(vsubq_n): Likewise.
(vbrsrq_n): Likewise.
(vcvtq_n): Likewise.
(vsubq): Likewise.
(vorrq): Likewise.
(vabdq): Likewise.
(vaddq_n): Likewise.
(vandq): Likewise.
(vbicq): Likewise.
(vornq): Likewise.
(vmulq_n): Likewise.
(vmulq): Likewise.
(vcaddq_rot270): Likewise.
(vcmpeqq_n): Likewise.
(vcmpeqq): Likewise.
(vcaddq_rot90): Likewise.
(vcmpgeq_n): Likewise.
(vcmpgeq): Likewise.
(vcmpgtq_n): Likewise.
(vcmpgtq): Likewise.
(vcmpgtq): Likewise.
(vcmpleq_n): Likewise.
(vcmpleq_n): Likewise.
(vcmpleq): Likewise.
(vcmpleq): Likewise.
(vcmpltq_n): Likewise.
(vcmpltq_n): Likewise.
(vcmpltq): Likewise.
(vcmpltq): Likewise.
(vcmpneq_n): Likewise.
(vcmpneq_n): Likewise.
(vcmpneq): Likewise.
(vcmpneq): Likewise.
(vcmulq): Likewise.
(vcmulq): Likewise.
(vcmulq_rot180): Likewise.
(vcmulq_rot180): Likewise.
(vcmulq_rot270): Likewise.
(vcmulq_rot270): Likewise.
(vcmulq_rot90): Likewise.
(vcmulq_rot90): Likewise.
(veorq): Likewise.
(veorq): Likewise.
(vmaxnmaq): Likewise.
(vmaxnmaq): Likewise.
(vmaxnmavq): Likewise.
(vmaxnmavq): Likewise.
(vmaxnmq): Likewise.
(vmaxnmq): Likewise.
(vmaxnmvq): Likewise.
(vmaxnmvq): Likewise.
(vminnmaq): Likewise.
(vminnmaq): Likewise.
(vminnmavq): Likewise.
(vminnmavq): Likewise.
(vminnmq): Likewise.
(vminnmq): Likewise.
(vminnmvq): Likewise.
(vminnmvq): Likewise.
(vbicq_n): Likewise.
(vqmovntq): Likewise.
(vqmovntq): Likewise.
(vqmovnbq): Likewise.
(vqmovnbq): Likewise.
(vmulltq_poly): Likewise.
(vmulltq_poly): Likewise.
(vmullbq_poly): Likewise.
(vmullbq_poly): Likewise.
(vmovntq): Likewise.
(vmovntq): Likewise.
(vmovnbq): Likewise.
(vmovnbq): Likewise.
(vmlaldavxq): Likewise.
(vmlaldavxq): Likewise.
(vqmovuntq): Likewise.
(vqmovuntq): Likewise.
(vshlltq_n): Likewise.
(vshlltq_n): Likewise.
(vshllbq_n): Likewise.
(vshllbq_n): Likewise.
(vorrq_n): Likewise.
(vorrq_n): Likewise.
(vmlaldavq): Likewise.
(vmlaldavq): Likewise.
(vqmovunbq): Likewise.
(vqmovunbq): Likewise.
(vqdmulltq_n): Likewise.
(vqdmulltq_n): Likewise.
(vqdmulltq): Likewise.
(vqdmulltq): Likewise.
(vqdmullbq_n): Likewise.
(vqdmullbq_n): Likewise.
(vqdmullbq): Likewise.
(vqdmullbq): Likewise.
(vaddlvaq): Likewise.
(vaddlvaq): Likewise.
(vrmlaldavhq): Likewise.
(vrmlaldavhq): Likewise.
(vrmlaldavhxq): Likewise.
(vrmlaldavhxq): Likewise.
(vrmlsldavhq): Likewise.
(vrmlsldavhq): Likewise.
(vrmlsldavhxq): Likewise.
(vrmlsldavhxq): Likewise.
(vmlsldavxq): Likewise.
(vmlsldavxq): Likewise.
(vmlsldavq): Likewise.
(vmlsldavq): Likewise.
* config/arm/arm_mve_builtins.def (BINOP_NONE_NONE_IMM): Use it.
(BINOP_NONE_NONE_NONE): Likewise.
(BINOP_UNONE_NONE_NONE): Likewise.
(BINOP_UNONE_UNONE_IMM): Likewise.
(BINOP_UNONE_UNONE_NONE): Likewise.
(BINOP_UNONE_UNONE_UNONE): Likewise.
* config/arm/mve.md (mve_vabdq_f<mode>): Define RTL pattern.
(mve_vaddlvaq_<supf>v4si): Likewise.
(mve_vaddq_n_f<mode>): Likewise.
(mve_vandq_f<mode>): Likewise.
(mve_vbicq_f<mode>): Likewise.
(mve_vbicq_n_<supf><mode>): Likewise.
(mve_vcaddq_rot270_f<mode>): Likewise.
(mve_vcaddq_rot90_f<mode>): Likewise.
(mve_vcmpeqq_f<mode>): Likewise.
(mve_vcmpeqq_n_f<mode>): Likewise.
(mve_vcmpgeq_f<mode>): Likewise.
(mve_vcmpgeq_n_f<mode>): Likewise.
(mve_vcmpgtq_f<mode>): Likewise.
(mve_vcmpgtq_n_f<mode>): Likewise.
(mve_vcmpleq_f<mode>): Likewise.
(mve_vcmpleq_n_f<mode>): Likewise.
(mve_vcmpltq_f<mode>): Likewise.
(mve_vcmpltq_n_f<mode>): Likewise.
(mve_vcmpneq_f<mode>): Likewise.
(mve_vcmpneq_n_f<mode>): Likewise.
(mve_vcmulq_f<mode>): Likewise.
(mve_vcmulq_rot180_f<mode>): Likewise.
(mve_vcmulq_rot270_f<mode>): Likewise.
(mve_vcmulq_rot90_f<mode>): Likewise.
(mve_vctp<mode1>q_mhi): Likewise.
(mve_vcvtbq_f16_f32v8hf): Likewise.
(mve_vcvttq_f16_f32v8hf): Likewise.
(mve_veorq_f<mode>): Likewise.
(mve_vmaxnmaq_f<mode>): Likewise.
(mve_vmaxnmavq_f<mode>): Likewise.
(mve_vmaxnmq_f<mode>): Likewise.
(mve_vmaxnmvq_f<mode>): Likewise.
(mve_vminnmaq_f<mode>): Likewise.
(mve_vminnmavq_f<mode>): Likewise.
(mve_vminnmq_f<mode>): Likewise.
(mve_vminnmvq_f<mode>): Likewise.
(mve_vmlaldavq_<supf><mode>): Likewise.
(mve_vmlaldavxq_<supf><mode>): Likewise.
(mve_vmlsldavq_s<mode>): Likewise.
(mve_vmlsldavxq_s<mode>): Likewise.
(mve_vmovnbq_<supf><mode>): Likewise.
(mve_vmovntq_<supf><mode>): Likewise.
(mve_vmulq_f<mode>): Likewise.
(mve_vmulq_n_f<mode>): Likewise.
(mve_vornq_f<mode>): Likewise.
(mve_vorrq_f<mode>): Likewise.
(mve_vorrq_n_<supf><mode>): Likewise.
(mve_vqdmullbq_n_s<mode>): Likewise.
(mve_vqdmullbq_s<mode>): Likewise.
(mve_vqdmulltq_n_s<mode>): Likewise.
(mve_vqdmulltq_s<mode>): Likewise.
(mve_vqmovnbq_<supf><mode>): Likewise.
(mve_vqmovntq_<supf><mode>): Likewise.
(mve_vqmovunbq_s<mode>): Likewise.
(mve_vqmovuntq_s<mode>): Likewise.
(mve_vrmlaldavhxq_sv4si): Likewise.
(mve_vrmlsldavhq_sv4si): Likewise.
(mve_vrmlsldavhxq_sv4si): Likewise.
(mve_vshllbq_n_<supf><mode>): Likewise.
(mve_vshlltq_n_<supf><mode>): Likewise.
(mve_vsubq_f<mode>): Likewise.
(mve_vmulltq_poly_p<mode>): Likewise.
(mve_vmullbq_poly_p<mode>): Likewise.
(mve_vrmlaldavhq_<supf>v4si): Likewise.
gcc/testsuite/ChangeLog:
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vabdq_f16.c: New test.
* gcc.target/arm/mve/intrinsics/vabdq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddlvaq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddlvaq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot180_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot180_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot270_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot270_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot90_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot90_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vctp16q_m.c: Likewise.
* gcc.target/arm/mve/intrinsics/vctp32q_m.c: Likewise.
* gcc.target/arm/mve/intrinsics/vctp64q_m.c: Likewise.
* gcc.target/arm/mve/intrinsics/vctp8q_m.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtbq_f16_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvttq_f16_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmaq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmaq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmavq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmavq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmvq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmvq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmaq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmaq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmavq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmavq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmvq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmvq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavxq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavxq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavxq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavxq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovnbq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovnbq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovnbq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovnbq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovntq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovntq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovntq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovntq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_poly_p16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_poly_p8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_poly_p16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_poly_p8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmullbq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmullbq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmullbq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmullbq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulltq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulltq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulltq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulltq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovnbq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovnbq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovnbq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovnbq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovntq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovntq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovntq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovntq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovunbq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovunbq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovuntq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovuntq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlaldavhq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlaldavhq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlaldavhxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlsldavhq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlsldavhxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshllbq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshllbq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshllbq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshllbq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlltq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlltq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlltq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlltq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_f32.c: Likewise.
Srinath Parvathaneni [Tue, 17 Mar 2020 15:32:36 +0000 (15:32 +0000)]
[ARM][GCC][4/2x]: MVE intrinsics with binary operands.
This patch supports the following MVE ACLE intrinsics with binary operands.
vsubq_u8, vsubq_n_u8, vrmulhq_u8, vrhaddq_u8, vqsubq_u8, vqsubq_n_u8, vqaddq_u8,
vqaddq_n_u8, vorrq_u8, vornq_u8, vmulq_u8, vmulq_n_u8, vmulltq_int_u8, vmullbq_int_u8,
vmulhq_u8, vmladavq_u8, vminvq_u8, vminq_u8, vmaxvq_u8, vmaxq_u8, vhsubq_u8, vhsubq_n_u8,
vhaddq_u8, vhaddq_n_u8, veorq_u8, vcmpneq_n_u8, vcmphiq_u8, vcmphiq_n_u8, vcmpeqq_u8,
vcmpeqq_n_u8, vcmpcsq_u8, vcmpcsq_n_u8, vcaddq_rot90_u8, vcaddq_rot270_u8, vbicq_u8,
vandq_u8, vaddvq_p_u8, vaddvaq_u8, vaddq_n_u8, vabdq_u8, vshlq_r_u8, vrshlq_u8,
vrshlq_n_u8, vqshlq_u8, vqshlq_r_u8, vqrshlq_u8, vqrshlq_n_u8, vminavq_s8, vminaq_s8,
vmaxavq_s8, vmaxaq_s8, vbrsrq_n_u8, vshlq_n_u8, vrshrq_n_u8, vqshlq_n_u8, vcmpneq_n_s8,
vcmpltq_s8, vcmpltq_n_s8, vcmpleq_s8, vcmpleq_n_s8, vcmpgtq_s8, vcmpgtq_n_s8, vcmpgeq_s8,
vcmpgeq_n_s8, vcmpeqq_s8, vcmpeqq_n_s8, vqshluq_n_s8, vaddvq_p_s8, vsubq_s8, vsubq_n_s8,
vshlq_r_s8, vrshlq_s8, vrshlq_n_s8, vrmulhq_s8, vrhaddq_s8, vqsubq_s8, vqsubq_n_s8,
vqshlq_s8, vqshlq_r_s8, vqrshlq_s8, vqrshlq_n_s8, vqrdmulhq_s8, vqrdmulhq_n_s8, vqdmulhq_s8,
vqdmulhq_n_s8, vqaddq_s8, vqaddq_n_s8, vorrq_s8, vornq_s8, vmulq_s8, vmulq_n_s8, vmulltq_int_s8,
vmullbq_int_s8, vmulhq_s8, vmlsdavxq_s8, vmlsdavq_s8, vmladavxq_s8, vmladavq_s8, vminvq_s8,
vminq_s8, vmaxvq_s8, vmaxq_s8, vhsubq_s8, vhsubq_n_s8, vhcaddq_rot90_s8, vhcaddq_rot270_s8,
vhaddq_s8, vhaddq_n_s8, veorq_s8, vcaddq_rot90_s8, vcaddq_rot270_s8, vbrsrq_n_s8, vbicq_s8,
vandq_s8, vaddvaq_s8, vaddq_n_s8, vabdq_s8, vshlq_n_s8, vrshrq_n_s8, vqshlq_n_s8, vsubq_u16,
vsubq_n_u16, vrmulhq_u16, vrhaddq_u16, vqsubq_u16, vqsubq_n_u16, vqaddq_u16, vqaddq_n_u16,
vorrq_u16, vornq_u16, vmulq_u16, vmulq_n_u16, vmulltq_int_u16, vmullbq_int_u16, vmulhq_u16,
vmladavq_u16, vminvq_u16, vminq_u16, vmaxvq_u16, vmaxq_u16, vhsubq_u16, vhsubq_n_u16,
vhaddq_u16, vhaddq_n_u16, veorq_u16, vcmpneq_n_u16, vcmphiq_u16, vcmphiq_n_u16, vcmpeqq_u16,
vcmpeqq_n_u16, vcmpcsq_u16, vcmpcsq_n_u16, vcaddq_rot90_u16, vcaddq_rot270_u16, vbicq_u16,
vandq_u16, vaddvq_p_u16, vaddvaq_u16, vaddq_n_u16, vabdq_u16, vshlq_r_u16, vrshlq_u16,
vrshlq_n_u16, vqshlq_u16, vqshlq_r_u16, vqrshlq_u16, vqrshlq_n_u16, vminavq_s16, vminaq_s16,
vmaxavq_s16, vmaxaq_s16, vbrsrq_n_u16, vshlq_n_u16, vrshrq_n_u16, vqshlq_n_u16, vcmpneq_n_s16,
vcmpltq_s16, vcmpltq_n_s16, vcmpleq_s16, vcmpleq_n_s16, vcmpgtq_s16, vcmpgtq_n_s16,
vcmpgeq_s16, vcmpgeq_n_s16, vcmpeqq_s16, vcmpeqq_n_s16, vqshluq_n_s16, vaddvq_p_s16, vsubq_s16,
vsubq_n_s16, vshlq_r_s16, vrshlq_s16, vrshlq_n_s16, vrmulhq_s16, vrhaddq_s16, vqsubq_s16,
vqsubq_n_s16, vqshlq_s16, vqshlq_r_s16, vqrshlq_s16, vqrshlq_n_s16, vqrdmulhq_s16,
vqrdmulhq_n_s16, vqdmulhq_s16, vqdmulhq_n_s16, vqaddq_s16, vqaddq_n_s16, vorrq_s16, vornq_s16,
vmulq_s16, vmulq_n_s16, vmulltq_int_s16, vmullbq_int_s16, vmulhq_s16, vmlsdavxq_s16, vmlsdavq_s16,
vmladavxq_s16, vmladavq_s16, vminvq_s16, vminq_s16, vmaxvq_s16, vmaxq_s16, vhsubq_s16,
vhsubq_n_s16, vhcaddq_rot90_s16, vhcaddq_rot270_s16, vhaddq_s16, vhaddq_n_s16, veorq_s16,
vcaddq_rot90_s16, vcaddq_rot270_s16, vbrsrq_n_s16, vbicq_s16, vandq_s16, vaddvaq_s16, vaddq_n_s16,
vabdq_s16, vshlq_n_s16, vrshrq_n_s16, vqshlq_n_s16, vsubq_u32, vsubq_n_u32, vrmulhq_u32,
vrhaddq_u32, vqsubq_u32, vqsubq_n_u32, vqaddq_u32, vqaddq_n_u32, vorrq_u32, vornq_u32, vmulq_u32,
vmulq_n_u32, vmulltq_int_u32, vmullbq_int_u32, vmulhq_u32, vmladavq_u32, vminvq_u32, vminq_u32,
vmaxvq_u32, vmaxq_u32, vhsubq_u32, vhsubq_n_u32, vhaddq_u32, vhaddq_n_u32, veorq_u32, vcmpneq_n_u32,
vcmphiq_u32, vcmphiq_n_u32, vcmpeqq_u32, vcmpeqq_n_u32, vcmpcsq_u32, vcmpcsq_n_u32,
vcaddq_rot90_u32, vcaddq_rot270_u32, vbicq_u32, vandq_u32, vaddvq_p_u32, vaddvaq_u32, vaddq_n_u32,
vabdq_u32, vshlq_r_u32, vrshlq_u32, vrshlq_n_u32, vqshlq_u32, vqshlq_r_u32, vqrshlq_u32, vqrshlq_n_u32,
vminavq_s32, vminaq_s32, vmaxavq_s32, vmaxaq_s32, vbrsrq_n_u32, vshlq_n_u32, vrshrq_n_u32,
vqshlq_n_u32, vcmpneq_n_s32, vcmpltq_s32, vcmpltq_n_s32, vcmpleq_s32, vcmpleq_n_s32, vcmpgtq_s32,
vcmpgtq_n_s32, vcmpgeq_s32, vcmpgeq_n_s32, vcmpeqq_s32, vcmpeqq_n_s32, vqshluq_n_s32, vaddvq_p_s32,
vsubq_s32, vsubq_n_s32, vshlq_r_s32, vrshlq_s32, vrshlq_n_s32, vrmulhq_s32, vrhaddq_s32, vqsubq_s32,
vqsubq_n_s32, vqshlq_s32, vqshlq_r_s32, vqrshlq_s32, vqrshlq_n_s32, vqrdmulhq_s32, vqrdmulhq_n_s32,
vqdmulhq_s32, vqdmulhq_n_s32, vqaddq_s32, vqaddq_n_s32, vorrq_s32, vornq_s32, vmulq_s32, vmulq_n_s32,
vmulltq_int_s32, vmullbq_int_s32, vmulhq_s32, vmlsdavxq_s32, vmlsdavq_s32, vmladavxq_s32, vmladavq_s32,
vminvq_s32, vminq_s32, vmaxvq_s32, vmaxq_s32, vhsubq_s32, vhsubq_n_s32, vhcaddq_rot90_s32,
vhcaddq_rot270_s32, vhaddq_s32, vhaddq_n_s32, veorq_s32, vcaddq_rot90_s32, vcaddq_rot270_s32,
vbrsrq_n_s32, vbicq_s32, vandq_s32, vaddvaq_s32, vaddq_n_s32, vabdq_s32, vshlq_n_s32, vrshrq_n_s32,
vqshlq_n_s32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
In this patch new constraints "Ra" and "Rg" are added.
"Ra" checks that the constant is within the range 0 to 7, whereas "Rg" checks that the constant is one of
1, 2, 4 and 8.
Also, new predicates "mve_imm_7" and "mve_imm_selective_upto_8" are added, to check the matching
constraints "Ra" and "Rg" respectively.
The above intrinsics are defined using the already defined builtin qualifiers BINOP_NONE_NONE_IMM, BINOP_NONE_NONE_NONE,
BINOP_NONE_NONE_UNONE, BINOP_UNONE_NONE_IMM, BINOP_UNONE_NONE_NONE, BINOP_UNONE_UNONE_IMM, BINOP_UNONE_UNONE_NONE,
BINOP_UNONE_UNONE_UNONE.
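A usage sketch (not from the patch) of one of the immediate-operand
intrinsics, assuming an MVE-enabled target such as
-march=armv8.1-m.main+mve -mfloat-abi=hard; the shift count must be a
compile-time constant within the range the new predicates enforce for the
element size:

    #include "arm_mve.h"

    uint8x16_t
    shift_left_3 (uint8x16_t a)
    {
      /* Immediate operand: for 8-bit lanes the count must be in 0..7.  */
      return vshlq_n_u8 (a, 3);
    }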
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm_mve.h (vsubq_u8): Define macro.
(vsubq_n_u8): Likewise.
(vrmulhq_u8): Likewise.
(vrhaddq_u8): Likewise.
(vqsubq_u8): Likewise.
(vqsubq_n_u8): Likewise.
(vqaddq_u8): Likewise.
(vqaddq_n_u8): Likewise.
(vorrq_u8): Likewise.
(vornq_u8): Likewise.
(vmulq_u8): Likewise.
(vmulq_n_u8): Likewise.
(vmulltq_int_u8): Likewise.
(vmullbq_int_u8): Likewise.
(vmulhq_u8): Likewise.
(vmladavq_u8): Likewise.
(vminvq_u8): Likewise.
(vminq_u8): Likewise.
(vmaxvq_u8): Likewise.
(vmaxq_u8): Likewise.
(vhsubq_u8): Likewise.
(vhsubq_n_u8): Likewise.
(vhaddq_u8): Likewise.
(vhaddq_n_u8): Likewise.
(veorq_u8): Likewise.
(vcmpneq_n_u8): Likewise.
(vcmphiq_u8): Likewise.
(vcmphiq_n_u8): Likewise.
(vcmpeqq_u8): Likewise.
(vcmpeqq_n_u8): Likewise.
(vcmpcsq_u8): Likewise.
(vcmpcsq_n_u8): Likewise.
(vcaddq_rot90_u8): Likewise.
(vcaddq_rot270_u8): Likewise.
(vbicq_u8): Likewise.
(vandq_u8): Likewise.
(vaddvq_p_u8): Likewise.
(vaddvaq_u8): Likewise.
(vaddq_n_u8): Likewise.
(vabdq_u8): Likewise.
(vshlq_r_u8): Likewise.
(vrshlq_u8): Likewise.
(vrshlq_n_u8): Likewise.
(vqshlq_u8): Likewise.
(vqshlq_r_u8): Likewise.
(vqrshlq_u8): Likewise.
(vqrshlq_n_u8): Likewise.
(vminavq_s8): Likewise.
(vminaq_s8): Likewise.
(vmaxavq_s8): Likewise.
(vmaxaq_s8): Likewise.
(vbrsrq_n_u8): Likewise.
(vshlq_n_u8): Likewise.
(vrshrq_n_u8): Likewise.
(vqshlq_n_u8): Likewise.
(vcmpneq_n_s8): Likewise.
(vcmpltq_s8): Likewise.
(vcmpltq_n_s8): Likewise.
(vcmpleq_s8): Likewise.
(vcmpleq_n_s8): Likewise.
(vcmpgtq_s8): Likewise.
(vcmpgtq_n_s8): Likewise.
(vcmpgeq_s8): Likewise.
(vcmpgeq_n_s8): Likewise.
(vcmpeqq_s8): Likewise.
(vcmpeqq_n_s8): Likewise.
(vqshluq_n_s8): Likewise.
(vaddvq_p_s8): Likewise.
(vsubq_s8): Likewise.
(vsubq_n_s8): Likewise.
(vshlq_r_s8): Likewise.
(vrshlq_s8): Likewise.
(vrshlq_n_s8): Likewise.
(vrmulhq_s8): Likewise.
(vrhaddq_s8): Likewise.
(vqsubq_s8): Likewise.
(vqsubq_n_s8): Likewise.
(vqshlq_s8): Likewise.
(vqshlq_r_s8): Likewise.
(vqrshlq_s8): Likewise.
(vqrshlq_n_s8): Likewise.
(vqrdmulhq_s8): Likewise.
(vqrdmulhq_n_s8): Likewise.
(vqdmulhq_s8): Likewise.
(vqdmulhq_n_s8): Likewise.
(vqaddq_s8): Likewise.
(vqaddq_n_s8): Likewise.
(vorrq_s8): Likewise.
(vornq_s8): Likewise.
(vmulq_s8): Likewise.
(vmulq_n_s8): Likewise.
(vmulltq_int_s8): Likewise.
(vmullbq_int_s8): Likewise.
(vmulhq_s8): Likewise.
(vmlsdavxq_s8): Likewise.
(vmlsdavq_s8): Likewise.
(vmladavxq_s8): Likewise.
(vmladavq_s8): Likewise.
(vminvq_s8): Likewise.
(vminq_s8): Likewise.
(vmaxvq_s8): Likewise.
(vmaxq_s8): Likewise.
(vhsubq_s8): Likewise.
(vhsubq_n_s8): Likewise.
(vhcaddq_rot90_s8): Likewise.
(vhcaddq_rot270_s8): Likewise.
(vhaddq_s8): Likewise.
(vhaddq_n_s8): Likewise.
(veorq_s8): Likewise.
(vcaddq_rot90_s8): Likewise.
(vcaddq_rot270_s8): Likewise.
(vbrsrq_n_s8): Likewise.
(vbicq_s8): Likewise.
(vandq_s8): Likewise.
(vaddvaq_s8): Likewise.
(vaddq_n_s8): Likewise.
(vabdq_s8): Likewise.
(vshlq_n_s8): Likewise.
(vrshrq_n_s8): Likewise.
(vqshlq_n_s8): Likewise.
(vsubq_u16): Likewise.
(vsubq_n_u16): Likewise.
(vrmulhq_u16): Likewise.
(vrhaddq_u16): Likewise.
(vqsubq_u16): Likewise.
(vqsubq_n_u16): Likewise.
(vqaddq_u16): Likewise.
(vqaddq_n_u16): Likewise.
(vorrq_u16): Likewise.
(vornq_u16): Likewise.
(vmulq_u16): Likewise.
(vmulq_n_u16): Likewise.
(vmulltq_int_u16): Likewise.
(vmullbq_int_u16): Likewise.
(vmulhq_u16): Likewise.
(vmladavq_u16): Likewise.
(vminvq_u16): Likewise.
(vminq_u16): Likewise.
(vmaxvq_u16): Likewise.
(vmaxq_u16): Likewise.
(vhsubq_u16): Likewise.
(vhsubq_n_u16): Likewise.
(vhaddq_u16): Likewise.
(vhaddq_n_u16): Likewise.
(veorq_u16): Likewise.
(vcmpneq_n_u16): Likewise.
(vcmphiq_u16): Likewise.
(vcmphiq_n_u16): Likewise.
(vcmpeqq_u16): Likewise.
(vcmpeqq_n_u16): Likewise.
(vcmpcsq_u16): Likewise.
(vcmpcsq_n_u16): Likewise.
(vcaddq_rot90_u16): Likewise.
(vcaddq_rot270_u16): Likewise.
(vbicq_u16): Likewise.
(vandq_u16): Likewise.
(vaddvq_p_u16): Likewise.
(vaddvaq_u16): Likewise.
(vaddq_n_u16): Likewise.
(vabdq_u16): Likewise.
(vshlq_r_u16): Likewise.
(vrshlq_u16): Likewise.
(vrshlq_n_u16): Likewise.
(vqshlq_u16): Likewise.
(vqshlq_r_u16): Likewise.
(vqrshlq_u16): Likewise.
(vqrshlq_n_u16): Likewise.
(vminavq_s16): Likewise.
(vminaq_s16): Likewise.
(vmaxavq_s16): Likewise.
(vmaxaq_s16): Likewise.
(vbrsrq_n_u16): Likewise.
(vshlq_n_u16): Likewise.
(vrshrq_n_u16): Likewise.
(vqshlq_n_u16): Likewise.
(vcmpneq_n_s16): Likewise.
(vcmpltq_s16): Likewise.
(vcmpltq_n_s16): Likewise.
(vcmpleq_s16): Likewise.
(vcmpleq_n_s16): Likewise.
(vcmpgtq_s16): Likewise.
(vcmpgtq_n_s16): Likewise.
(vcmpgeq_s16): Likewise.
(vcmpgeq_n_s16): Likewise.
(vcmpeqq_s16): Likewise.
(vcmpeqq_n_s16): Likewise.
(vqshluq_n_s16): Likewise.
(vaddvq_p_s16): Likewise.
(vsubq_s16): Likewise.
(vsubq_n_s16): Likewise.
(vshlq_r_s16): Likewise.
(vrshlq_s16): Likewise.
(vrshlq_n_s16): Likewise.
(vrmulhq_s16): Likewise.
(vrhaddq_s16): Likewise.
(vqsubq_s16): Likewise.
(vqsubq_n_s16): Likewise.
(vqshlq_s16): Likewise.
(vqshlq_r_s16): Likewise.
(vqrshlq_s16): Likewise.
(vqrshlq_n_s16): Likewise.
(vqrdmulhq_s16): Likewise.
(vqrdmulhq_n_s16): Likewise.
(vqdmulhq_s16): Likewise.
(vqdmulhq_n_s16): Likewise.
(vqaddq_s16): Likewise.
(vqaddq_n_s16): Likewise.
(vorrq_s16): Likewise.
(vornq_s16): Likewise.
(vmulq_s16): Likewise.
(vmulq_n_s16): Likewise.
(vmulltq_int_s16): Likewise.
(vmullbq_int_s16): Likewise.
(vmulhq_s16): Likewise.
(vmlsdavxq_s16): Likewise.
(vmlsdavq_s16): Likewise.
(vmladavxq_s16): Likewise.
(vmladavq_s16): Likewise.
(vminvq_s16): Likewise.
(vminq_s16): Likewise.
(vmaxvq_s16): Likewise.
(vmaxq_s16): Likewise.
(vhsubq_s16): Likewise.
(vhsubq_n_s16): Likewise.
(vhcaddq_rot90_s16): Likewise.
(vhcaddq_rot270_s16): Likewise.
(vhaddq_s16): Likewise.
(vhaddq_n_s16): Likewise.
(veorq_s16): Likewise.
(vcaddq_rot90_s16): Likewise.
(vcaddq_rot270_s16): Likewise.
(vbrsrq_n_s16): Likewise.
(vbicq_s16): Likewise.
(vandq_s16): Likewise.
(vaddvaq_s16): Likewise.
(vaddq_n_s16): Likewise.
(vabdq_s16): Likewise.
(vshlq_n_s16): Likewise.
(vrshrq_n_s16): Likewise.
(vqshlq_n_s16): Likewise.
(vsubq_u32): Likewise.
(vsubq_n_u32): Likewise.
(vrmulhq_u32): Likewise.
(vrhaddq_u32): Likewise.
(vqsubq_u32): Likewise.
(vqsubq_n_u32): Likewise.
(vqaddq_u32): Likewise.
(vqaddq_n_u32): Likewise.
(vorrq_u32): Likewise.
(vornq_u32): Likewise.
(vmulq_u32): Likewise.
(vmulq_n_u32): Likewise.
(vmulltq_int_u32): Likewise.
(vmullbq_int_u32): Likewise.
(vmulhq_u32): Likewise.
(vmladavq_u32): Likewise.
(vminvq_u32): Likewise.
(vminq_u32): Likewise.
(vmaxvq_u32): Likewise.
(vmaxq_u32): Likewise.
(vhsubq_u32): Likewise.
(vhsubq_n_u32): Likewise.
(vhaddq_u32): Likewise.
(vhaddq_n_u32): Likewise.
(veorq_u32): Likewise.
(vcmpneq_n_u32): Likewise.
(vcmphiq_u32): Likewise.
(vcmphiq_n_u32): Likewise.
(vcmpeqq_u32): Likewise.
(vcmpeqq_n_u32): Likewise.
(vcmpcsq_u32): Likewise.
(vcmpcsq_n_u32): Likewise.
(vcaddq_rot90_u32): Likewise.
(vcaddq_rot270_u32): Likewise.
(vbicq_u32): Likewise.
(vandq_u32): Likewise.
(vaddvq_p_u32): Likewise.
(vaddvaq_u32): Likewise.
(vaddq_n_u32): Likewise.
(vabdq_u32): Likewise.
(vshlq_r_u32): Likewise.
(vrshlq_u32): Likewise.
(vrshlq_n_u32): Likewise.
(vqshlq_u32): Likewise.
(vqshlq_r_u32): Likewise.
(vqrshlq_u32): Likewise.
(vqrshlq_n_u32): Likewise.
(vminavq_s32): Likewise.
(vminaq_s32): Likewise.
(vmaxavq_s32): Likewise.
(vmaxaq_s32): Likewise.
(vbrsrq_n_u32): Likewise.
(vshlq_n_u32): Likewise.
(vrshrq_n_u32): Likewise.
(vqshlq_n_u32): Likewise.
(vcmpneq_n_s32): Likewise.
(vcmpltq_s32): Likewise.
(vcmpltq_n_s32): Likewise.
(vcmpleq_s32): Likewise.
(vcmpleq_n_s32): Likewise.
(vcmpgtq_s32): Likewise.
(vcmpgtq_n_s32): Likewise.
(vcmpgeq_s32): Likewise.
(vcmpgeq_n_s32): Likewise.
(vcmpeqq_s32): Likewise.
(vcmpeqq_n_s32): Likewise.
(vqshluq_n_s32): Likewise.
(vaddvq_p_s32): Likewise.
(vsubq_s32): Likewise.
(vsubq_n_s32): Likewise.
(vshlq_r_s32): Likewise.
(vrshlq_s32): Likewise.
(vrshlq_n_s32): Likewise.
(vrmulhq_s32): Likewise.
(vrhaddq_s32): Likewise.
(vqsubq_s32): Likewise.
(vqsubq_n_s32): Likewise.
(vqshlq_s32): Likewise.
(vqshlq_r_s32): Likewise.
(vqrshlq_s32): Likewise.
(vqrshlq_n_s32): Likewise.
(vqrdmulhq_s32): Likewise.
(vqrdmulhq_n_s32): Likewise.
(vqdmulhq_s32): Likewise.
(vqdmulhq_n_s32): Likewise.
(vqaddq_s32): Likewise.
(vqaddq_n_s32): Likewise.
(vorrq_s32): Likewise.
(vornq_s32): Likewise.
(vmulq_s32): Likewise.
(vmulq_n_s32): Likewise.
(vmulltq_int_s32): Likewise.
(vmullbq_int_s32): Likewise.
(vmulhq_s32): Likewise.
(vmlsdavxq_s32): Likewise.
(vmlsdavq_s32): Likewise.
(vmladavxq_s32): Likewise.
(vmladavq_s32): Likewise.
(vminvq_s32): Likewise.
(vminq_s32): Likewise.
(vmaxvq_s32): Likewise.
(vmaxq_s32): Likewise.
(vhsubq_s32): Likewise.
(vhsubq_n_s32): Likewise.
(vhcaddq_rot90_s32): Likewise.
(vhcaddq_rot270_s32): Likewise.
(vhaddq_s32): Likewise.
(vhaddq_n_s32): Likewise.
(veorq_s32): Likewise.
(vcaddq_rot90_s32): Likewise.
(vcaddq_rot270_s32): Likewise.
(vbrsrq_n_s32): Likewise.
(vbicq_s32): Likewise.
(vandq_s32): Likewise.
(vaddvaq_s32): Likewise.
(vaddq_n_s32): Likewise.
(vabdq_s32): Likewise.
(vshlq_n_s32): Likewise.
(vrshrq_n_s32): Likewise.
(vqshlq_n_s32): Likewise.
(__arm_vsubq_u8): Define intrinsic.
(__arm_vsubq_n_u8): Likewise.
(__arm_vrmulhq_u8): Likewise.
(__arm_vrhaddq_u8): Likewise.
(__arm_vqsubq_u8): Likewise.
(__arm_vqsubq_n_u8): Likewise.
(__arm_vqaddq_u8): Likewise.
(__arm_vqaddq_n_u8): Likewise.
(__arm_vorrq_u8): Likewise.
(__arm_vornq_u8): Likewise.
(__arm_vmulq_u8): Likewise.
(__arm_vmulq_n_u8): Likewise.
(__arm_vmulltq_int_u8): Likewise.
(__arm_vmullbq_int_u8): Likewise.
(__arm_vmulhq_u8): Likewise.
(__arm_vmladavq_u8): Likewise.
(__arm_vminvq_u8): Likewise.
(__arm_vminq_u8): Likewise.
(__arm_vmaxvq_u8): Likewise.
(__arm_vmaxq_u8): Likewise.
(__arm_vhsubq_u8): Likewise.
(__arm_vhsubq_n_u8): Likewise.
(__arm_vhaddq_u8): Likewise.
(__arm_vhaddq_n_u8): Likewise.
(__arm_veorq_u8): Likewise.
(__arm_vcmpneq_n_u8): Likewise.
(__arm_vcmphiq_u8): Likewise.
(__arm_vcmphiq_n_u8): Likewise.
(__arm_vcmpeqq_u8): Likewise.
(__arm_vcmpeqq_n_u8): Likewise.
(__arm_vcmpcsq_u8): Likewise.
(__arm_vcmpcsq_n_u8): Likewise.
(__arm_vcaddq_rot90_u8): Likewise.
(__arm_vcaddq_rot270_u8): Likewise.
(__arm_vbicq_u8): Likewise.
(__arm_vandq_u8): Likewise.
(__arm_vaddvq_p_u8): Likewise.
(__arm_vaddvaq_u8): Likewise.
(__arm_vaddq_n_u8): Likewise.
(__arm_vabdq_u8): Likewise.
(__arm_vshlq_r_u8): Likewise.
(__arm_vrshlq_u8): Likewise.
(__arm_vrshlq_n_u8): Likewise.
(__arm_vqshlq_u8): Likewise.
(__arm_vqshlq_r_u8): Likewise.
(__arm_vqrshlq_u8): Likewise.
(__arm_vqrshlq_n_u8): Likewise.
(__arm_vminavq_s8): Likewise.
(__arm_vminaq_s8): Likewise.
(__arm_vmaxavq_s8): Likewise.
(__arm_vmaxaq_s8): Likewise.
(__arm_vbrsrq_n_u8): Likewise.
(__arm_vshlq_n_u8): Likewise.
(__arm_vrshrq_n_u8): Likewise.
(__arm_vqshlq_n_u8): Likewise.
(__arm_vcmpneq_n_s8): Likewise.
(__arm_vcmpltq_s8): Likewise.
(__arm_vcmpltq_n_s8): Likewise.
(__arm_vcmpleq_s8): Likewise.
(__arm_vcmpleq_n_s8): Likewise.
(__arm_vcmpgtq_s8): Likewise.
(__arm_vcmpgtq_n_s8): Likewise.
(__arm_vcmpgeq_s8): Likewise.
(__arm_vcmpgeq_n_s8): Likewise.
(__arm_vcmpeqq_s8): Likewise.
(__arm_vcmpeqq_n_s8): Likewise.
(__arm_vqshluq_n_s8): Likewise.
(__arm_vaddvq_p_s8): Likewise.
(__arm_vsubq_s8): Likewise.
(__arm_vsubq_n_s8): Likewise.
(__arm_vshlq_r_s8): Likewise.
(__arm_vrshlq_s8): Likewise.
(__arm_vrshlq_n_s8): Likewise.
(__arm_vrmulhq_s8): Likewise.
(__arm_vrhaddq_s8): Likewise.
(__arm_vqsubq_s8): Likewise.
(__arm_vqsubq_n_s8): Likewise.
(__arm_vqshlq_s8): Likewise.
(__arm_vqshlq_r_s8): Likewise.
(__arm_vqrshlq_s8): Likewise.
(__arm_vqrshlq_n_s8): Likewise.
(__arm_vqrdmulhq_s8): Likewise.
(__arm_vqrdmulhq_n_s8): Likewise.
(__arm_vqdmulhq_s8): Likewise.
(__arm_vqdmulhq_n_s8): Likewise.
(__arm_vqaddq_s8): Likewise.
(__arm_vqaddq_n_s8): Likewise.
(__arm_vorrq_s8): Likewise.
(__arm_vornq_s8): Likewise.
(__arm_vmulq_s8): Likewise.
(__arm_vmulq_n_s8): Likewise.
(__arm_vmulltq_int_s8): Likewise.
(__arm_vmullbq_int_s8): Likewise.
(__arm_vmulhq_s8): Likewise.
(__arm_vmlsdavxq_s8): Likewise.
(__arm_vmlsdavq_s8): Likewise.
(__arm_vmladavxq_s8): Likewise.
(__arm_vmladavq_s8): Likewise.
(__arm_vminvq_s8): Likewise.
(__arm_vminq_s8): Likewise.
(__arm_vmaxvq_s8): Likewise.
(__arm_vmaxq_s8): Likewise.
(__arm_vhsubq_s8): Likewise.
(__arm_vhsubq_n_s8): Likewise.
(__arm_vhcaddq_rot90_s8): Likewise.
(__arm_vhcaddq_rot270_s8): Likewise.
(__arm_vhaddq_s8): Likewise.
(__arm_vhaddq_n_s8): Likewise.
(__arm_veorq_s8): Likewise.
(__arm_vcaddq_rot90_s8): Likewise.
(__arm_vcaddq_rot270_s8): Likewise.
(__arm_vbrsrq_n_s8): Likewise.
(__arm_vbicq_s8): Likewise.
(__arm_vandq_s8): Likewise.
(__arm_vaddvaq_s8): Likewise.
(__arm_vaddq_n_s8): Likewise.
(__arm_vabdq_s8): Likewise.
(__arm_vshlq_n_s8): Likewise.
(__arm_vrshrq_n_s8): Likewise.
(__arm_vqshlq_n_s8): Likewise.
(__arm_vsubq_u16): Likewise.
(__arm_vsubq_n_u16): Likewise.
(__arm_vrmulhq_u16): Likewise.
(__arm_vrhaddq_u16): Likewise.
(__arm_vqsubq_u16): Likewise.
(__arm_vqsubq_n_u16): Likewise.
(__arm_vqaddq_u16): Likewise.
(__arm_vqaddq_n_u16): Likewise.
(__arm_vorrq_u16): Likewise.
(__arm_vornq_u16): Likewise.
(__arm_vmulq_u16): Likewise.
(__arm_vmulq_n_u16): Likewise.
(__arm_vmulltq_int_u16): Likewise.
(__arm_vmullbq_int_u16): Likewise.
(__arm_vmulhq_u16): Likewise.
(__arm_vmladavq_u16): Likewise.
(__arm_vminvq_u16): Likewise.
(__arm_vminq_u16): Likewise.
(__arm_vmaxvq_u16): Likewise.
(__arm_vmaxq_u16): Likewise.
(__arm_vhsubq_u16): Likewise.
(__arm_vhsubq_n_u16): Likewise.
(__arm_vhaddq_u16): Likewise.
(__arm_vhaddq_n_u16): Likewise.
(__arm_veorq_u16): Likewise.
(__arm_vcmpneq_n_u16): Likewise.
(__arm_vcmphiq_u16): Likewise.
(__arm_vcmphiq_n_u16): Likewise.
(__arm_vcmpeqq_u16): Likewise.
(__arm_vcmpeqq_n_u16): Likewise.
(__arm_vcmpcsq_u16): Likewise.
(__arm_vcmpcsq_n_u16): Likewise.
(__arm_vcaddq_rot90_u16): Likewise.
(__arm_vcaddq_rot270_u16): Likewise.
(__arm_vbicq_u16): Likewise.
(__arm_vandq_u16): Likewise.
(__arm_vaddvq_p_u16): Likewise.
(__arm_vaddvaq_u16): Likewise.
(__arm_vaddq_n_u16): Likewise.
(__arm_vabdq_u16): Likewise.
(__arm_vshlq_r_u16): Likewise.
(__arm_vrshlq_u16): Likewise.
(__arm_vrshlq_n_u16): Likewise.
(__arm_vqshlq_u16): Likewise.
(__arm_vqshlq_r_u16): Likewise.
(__arm_vqrshlq_u16): Likewise.
(__arm_vqrshlq_n_u16): Likewise.
(__arm_vminavq_s16): Likewise.
(__arm_vminaq_s16): Likewise.
(__arm_vmaxavq_s16): Likewise.
(__arm_vmaxaq_s16): Likewise.
(__arm_vbrsrq_n_u16): Likewise.
(__arm_vshlq_n_u16): Likewise.
(__arm_vrshrq_n_u16): Likewise.
(__arm_vqshlq_n_u16): Likewise.
(__arm_vcmpneq_n_s16): Likewise.
(__arm_vcmpltq_s16): Likewise.
(__arm_vcmpltq_n_s16): Likewise.
(__arm_vcmpleq_s16): Likewise.
(__arm_vcmpleq_n_s16): Likewise.
(__arm_vcmpgtq_s16): Likewise.
(__arm_vcmpgtq_n_s16): Likewise.
(__arm_vcmpgeq_s16): Likewise.
(__arm_vcmpgeq_n_s16): Likewise.
(__arm_vcmpeqq_s16): Likewise.
(__arm_vcmpeqq_n_s16): Likewise.
(__arm_vqshluq_n_s16): Likewise.
(__arm_vaddvq_p_s16): Likewise.
(__arm_vsubq_s16): Likewise.
(__arm_vsubq_n_s16): Likewise.
(__arm_vshlq_r_s16): Likewise.
(__arm_vrshlq_s16): Likewise.
(__arm_vrshlq_n_s16): Likewise.
(__arm_vrmulhq_s16): Likewise.
(__arm_vrhaddq_s16): Likewise.
(__arm_vqsubq_s16): Likewise.
(__arm_vqsubq_n_s16): Likewise.
(__arm_vqshlq_s16): Likewise.
(__arm_vqshlq_r_s16): Likewise.
(__arm_vqrshlq_s16): Likewise.
(__arm_vqrshlq_n_s16): Likewise.
(__arm_vqrdmulhq_s16): Likewise.
(__arm_vqrdmulhq_n_s16): Likewise.
(__arm_vqdmulhq_s16): Likewise.
(__arm_vqdmulhq_n_s16): Likewise.
(__arm_vqaddq_s16): Likewise.
(__arm_vqaddq_n_s16): Likewise.
(__arm_vorrq_s16): Likewise.
(__arm_vornq_s16): Likewise.
(__arm_vmulq_s16): Likewise.
(__arm_vmulq_n_s16): Likewise.
(__arm_vmulltq_int_s16): Likewise.
(__arm_vmullbq_int_s16): Likewise.
(__arm_vmulhq_s16): Likewise.
(__arm_vmlsdavxq_s16): Likewise.
(__arm_vmlsdavq_s16): Likewise.
(__arm_vmladavxq_s16): Likewise.
(__arm_vmladavq_s16): Likewise.
(__arm_vminvq_s16): Likewise.
(__arm_vminq_s16): Likewise.
(__arm_vmaxvq_s16): Likewise.
(__arm_vmaxq_s16): Likewise.
(__arm_vhsubq_s16): Likewise.
(__arm_vhsubq_n_s16): Likewise.
(__arm_vhcaddq_rot90_s16): Likewise.
(__arm_vhcaddq_rot270_s16): Likewise.
(__arm_vhaddq_s16): Likewise.
(__arm_vhaddq_n_s16): Likewise.
(__arm_veorq_s16): Likewise.
(__arm_vcaddq_rot90_s16): Likewise.
(__arm_vcaddq_rot270_s16): Likewise.
(__arm_vbrsrq_n_s16): Likewise.
(__arm_vbicq_s16): Likewise.
(__arm_vandq_s16): Likewise.
(__arm_vaddvaq_s16): Likewise.
(__arm_vaddq_n_s16): Likewise.
(__arm_vabdq_s16): Likewise.
(__arm_vshlq_n_s16): Likewise.
(__arm_vrshrq_n_s16): Likewise.
(__arm_vqshlq_n_s16): Likewise.
(__arm_vsubq_u32): Likewise.
(__arm_vsubq_n_u32): Likewise.
(__arm_vrmulhq_u32): Likewise.
(__arm_vrhaddq_u32): Likewise.
(__arm_vqsubq_u32): Likewise.
(__arm_vqsubq_n_u32): Likewise.
(__arm_vqaddq_u32): Likewise.
(__arm_vqaddq_n_u32): Likewise.
(__arm_vorrq_u32): Likewise.
(__arm_vornq_u32): Likewise.
(__arm_vmulq_u32): Likewise.
(__arm_vmulq_n_u32): Likewise.
(__arm_vmulltq_int_u32): Likewise.
(__arm_vmullbq_int_u32): Likewise.
(__arm_vmulhq_u32): Likewise.
(__arm_vmladavq_u32): Likewise.
(__arm_vminvq_u32): Likewise.
(__arm_vminq_u32): Likewise.
(__arm_vmaxvq_u32): Likewise.
(__arm_vmaxq_u32): Likewise.
(__arm_vhsubq_u32): Likewise.
(__arm_vhsubq_n_u32): Likewise.
(__arm_vhaddq_u32): Likewise.
(__arm_vhaddq_n_u32): Likewise.
(__arm_veorq_u32): Likewise.
(__arm_vcmpneq_n_u32): Likewise.
(__arm_vcmphiq_u32): Likewise.
(__arm_vcmphiq_n_u32): Likewise.
(__arm_vcmpeqq_u32): Likewise.
(__arm_vcmpeqq_n_u32): Likewise.
(__arm_vcmpcsq_u32): Likewise.
(__arm_vcmpcsq_n_u32): Likewise.
(__arm_vcaddq_rot90_u32): Likewise.
(__arm_vcaddq_rot270_u32): Likewise.
(__arm_vbicq_u32): Likewise.
(__arm_vandq_u32): Likewise.
(__arm_vaddvq_p_u32): Likewise.
(__arm_vaddvaq_u32): Likewise.
(__arm_vaddq_n_u32): Likewise.
(__arm_vabdq_u32): Likewise.
(__arm_vshlq_r_u32): Likewise.
(__arm_vrshlq_u32): Likewise.
(__arm_vrshlq_n_u32): Likewise.
(__arm_vqshlq_u32): Likewise.
(__arm_vqshlq_r_u32): Likewise.
(__arm_vqrshlq_u32): Likewise.
(__arm_vqrshlq_n_u32): Likewise.
(__arm_vminavq_s32): Likewise.
(__arm_vminaq_s32): Likewise.
(__arm_vmaxavq_s32): Likewise.
(__arm_vmaxaq_s32): Likewise.
(__arm_vbrsrq_n_u32): Likewise.
(__arm_vshlq_n_u32): Likewise.
(__arm_vrshrq_n_u32): Likewise.
(__arm_vqshlq_n_u32): Likewise.
(__arm_vcmpneq_n_s32): Likewise.
(__arm_vcmpltq_s32): Likewise.
(__arm_vcmpltq_n_s32): Likewise.
(__arm_vcmpleq_s32): Likewise.
(__arm_vcmpleq_n_s32): Likewise.
(__arm_vcmpgtq_s32): Likewise.
(__arm_vcmpgtq_n_s32): Likewise.
(__arm_vcmpgeq_s32): Likewise.
(__arm_vcmpgeq_n_s32): Likewise.
(__arm_vcmpeqq_s32): Likewise.
(__arm_vcmpeqq_n_s32): Likewise.
(__arm_vqshluq_n_s32): Likewise.
(__arm_vaddvq_p_s32): Likewise.
(__arm_vsubq_s32): Likewise.
(__arm_vsubq_n_s32): Likewise.
(__arm_vshlq_r_s32): Likewise.
(__arm_vrshlq_s32): Likewise.
(__arm_vrshlq_n_s32): Likewise.
(__arm_vrmulhq_s32): Likewise.
(__arm_vrhaddq_s32): Likewise.
(__arm_vqsubq_s32): Likewise.
(__arm_vqsubq_n_s32): Likewise.
(__arm_vqshlq_s32): Likewise.
(__arm_vqshlq_r_s32): Likewise.
(__arm_vqrshlq_s32): Likewise.
(__arm_vqrshlq_n_s32): Likewise.
(__arm_vqrdmulhq_s32): Likewise.
(__arm_vqrdmulhq_n_s32): Likewise.
(__arm_vqdmulhq_s32): Likewise.
(__arm_vqdmulhq_n_s32): Likewise.
(__arm_vqaddq_s32): Likewise.
(__arm_vqaddq_n_s32): Likewise.
(__arm_vorrq_s32): Likewise.
(__arm_vornq_s32): Likewise.
(__arm_vmulq_s32): Likewise.
(__arm_vmulq_n_s32): Likewise.
(__arm_vmulltq_int_s32): Likewise.
(__arm_vmullbq_int_s32): Likewise.
(__arm_vmulhq_s32): Likewise.
(__arm_vmlsdavxq_s32): Likewise.
(__arm_vmlsdavq_s32): Likewise.
(__arm_vmladavxq_s32): Likewise.
(__arm_vmladavq_s32): Likewise.
(__arm_vminvq_s32): Likewise.
(__arm_vminq_s32): Likewise.
(__arm_vmaxvq_s32): Likewise.
(__arm_vmaxq_s32): Likewise.
(__arm_vhsubq_s32): Likewise.
(__arm_vhsubq_n_s32): Likewise.
(__arm_vhcaddq_rot90_s32): Likewise.
(__arm_vhcaddq_rot270_s32): Likewise.
(__arm_vhaddq_s32): Likewise.
(__arm_vhaddq_n_s32): Likewise.
(__arm_veorq_s32): Likewise.
(__arm_vcaddq_rot90_s32): Likewise.
(__arm_vcaddq_rot270_s32): Likewise.
(__arm_vbrsrq_n_s32): Likewise.
(__arm_vbicq_s32): Likewise.
(__arm_vandq_s32): Likewise.
(__arm_vaddvaq_s32): Likewise.
(__arm_vaddq_n_s32): Likewise.
(__arm_vabdq_s32): Likewise.
(__arm_vshlq_n_s32): Likewise.
(__arm_vrshrq_n_s32): Likewise.
(__arm_vqshlq_n_s32): Likewise.
(vsubq): Define polymorphic variant.
(vsubq_n): Likewise.
(vshlq_r): Likewise.
(vrshlq_n): Likewise.
(vrshlq): Likewise.
(vrmulhq): Likewise.
(vrhaddq): Likewise.
(vqsubq_n): Likewise.
(vqsubq): Likewise.
(vqshlq): Likewise.
(vqshlq_r): Likewise.
(vqshluq): Likewise.
(vrshrq_n): Likewise.
(vshlq_n): Likewise.
(vqshluq_n): Likewise.
(vqshlq_n): Likewise.
(vqrshlq_n): Likewise.
(vqrshlq): Likewise.
(vqrdmulhq_n): Likewise.
(vqrdmulhq): Likewise.
(vqdmulhq_n): Likewise.
(vqdmulhq): Likewise.
(vqaddq_n): Likewise.
(vqaddq): Likewise.
(vorrq_n): Likewise.
(vorrq): Likewise.
(vornq): Likewise.
(vmulq_n): Likewise.
(vmulq): Likewise.
(vmulltq_int): Likewise.
(vmullbq_int): Likewise.
(vmulhq): Likewise.
(vminq): Likewise.
(vminaq): Likewise.
(vmaxq): Likewise.
(vmaxaq): Likewise.
(vhsubq_n): Likewise.
(vhsubq): Likewise.
(vhcaddq_rot90): Likewise.
(vhcaddq_rot270): Likewise.
(vhaddq_n): Likewise.
(vhaddq): Likewise.
(veorq): Likewise.
(vcaddq_rot90): Likewise.
(vcaddq_rot270): Likewise.
(vbrsrq_n): Likewise.
(vbicq_n): Likewise.
(vbicq): Likewise.
(vaddq): Likewise.
(vaddq_n): Likewise.
(vandq): Likewise.
(vabdq): Likewise.
* config/arm/arm_mve_builtins.def (BINOP_NONE_NONE_IMM): Use it.
(BINOP_NONE_NONE_NONE): Likewise.
(BINOP_NONE_NONE_UNONE): Likewise.
(BINOP_UNONE_NONE_IMM): Likewise.
(BINOP_UNONE_NONE_NONE): Likewise.
(BINOP_UNONE_UNONE_IMM): Likewise.
(BINOP_UNONE_UNONE_NONE): Likewise.
(BINOP_UNONE_UNONE_UNONE): Likewise.
* config/arm/constraints.md (Ra): Define constraint to check that the
constant is in the range 0 to 7.
(Rg): Define constraint to check that the constant is one of 1, 2, 4
and 8.
* config/arm/mve.md (mve_vabdq_<supf>): Define RTL pattern.
(mve_vaddq_n_<supf>): Likewise.
(mve_vaddvaq_<supf>): Likewise.
(mve_vaddvq_p_<supf>): Likewise.
(mve_vandq_<supf>): Likewise.
(mve_vbicq_<supf>): Likewise.
(mve_vbrsrq_n_<supf>): Likewise.
(mve_vcaddq_rot270_<supf>): Likewise.
(mve_vcaddq_rot90_<supf>): Likewise.
(mve_vcmpcsq_n_u): Likewise.
(mve_vcmpcsq_u): Likewise.
(mve_vcmpeqq_n_<supf>): Likewise.
(mve_vcmpeqq_<supf>): Likewise.
(mve_vcmpgeq_n_s): Likewise.
(mve_vcmpgeq_s): Likewise.
(mve_vcmpgtq_n_s): Likewise.
(mve_vcmpgtq_s): Likewise.
(mve_vcmphiq_n_u): Likewise.
(mve_vcmphiq_u): Likewise.
(mve_vcmpleq_n_s): Likewise.
(mve_vcmpleq_s): Likewise.
(mve_vcmpltq_n_s): Likewise.
(mve_vcmpltq_s): Likewise.
(mve_vcmpneq_n_<supf>): Likewise.
(mve_vddupq_n_u): Likewise.
(mve_veorq_<supf>): Likewise.
(mve_vhaddq_n_<supf>): Likewise.
(mve_vhaddq_<supf>): Likewise.
(mve_vhcaddq_rot270_s): Likewise.
(mve_vhcaddq_rot90_s): Likewise.
(mve_vhsubq_n_<supf>): Likewise.
(mve_vhsubq_<supf>): Likewise.
(mve_vidupq_n_u): Likewise.
(mve_vmaxaq_s): Likewise.
(mve_vmaxavq_s): Likewise.
(mve_vmaxq_<supf>): Likewise.
(mve_vmaxvq_<supf>): Likewise.
(mve_vminaq_s): Likewise.
(mve_vminavq_s): Likewise.
(mve_vminq_<supf>): Likewise.
(mve_vminvq_<supf>): Likewise.
(mve_vmladavq_<supf>): Likewise.
(mve_vmladavxq_s): Likewise.
(mve_vmlsdavq_s): Likewise.
(mve_vmlsdavxq_s): Likewise.
(mve_vmulhq_<supf>): Likewise.
(mve_vmullbq_int_<supf>): Likewise.
(mve_vmulltq_int_<supf>): Likewise.
(mve_vmulq_n_<supf>): Likewise.
(mve_vmulq_<supf>): Likewise.
(mve_vornq_<supf>): Likewise.
(mve_vorrq_<supf>): Likewise.
(mve_vqaddq_n_<supf>): Likewise.
(mve_vqaddq_<supf>): Likewise.
(mve_vqdmulhq_n_s): Likewise.
(mve_vqdmulhq_s): Likewise.
(mve_vqrdmulhq_n_s): Likewise.
(mve_vqrdmulhq_s): Likewise.
(mve_vqrshlq_n_<supf>): Likewise.
(mve_vqrshlq_<supf>): Likewise.
(mve_vqshlq_n_<supf>): Likewise.
(mve_vqshlq_r_<supf>): Likewise.
(mve_vqshlq_<supf>): Likewise.
(mve_vqshluq_n_s): Likewise.
(mve_vqsubq_n_<supf>): Likewise.
(mve_vqsubq_<supf>): Likewise.
(mve_vrhaddq_<supf>): Likewise.
(mve_vrmulhq_<supf>): Likewise.
(mve_vrshlq_n_<supf>): Likewise.
(mve_vrshlq_<supf>): Likewise.
(mve_vrshrq_n_<supf>): Likewise.
(mve_vshlq_n_<supf>): Likewise.
(mve_vshlq_r_<supf>): Likewise.
(mve_vsubq_n_<supf>): Likewise.
(mve_vsubq_<supf>): Likewise.
* config/arm/predicates.md (mve_imm_7): Define predicate to check
the matching constraint Ra.
(mve_imm_selective_upto_8): Define predicate to check the matching
constraint Rg.
gcc/testsuite/ChangeLog:
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vabdq_s16.c: New test.
* gcc.target/arm/mve/intrinsics/vabdq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabdq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabdq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabdq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabdq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvaq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvaq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvaq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvaq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvaq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvaq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvq_p_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpcsq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpcsq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpcsq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpcsq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpcsq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpcsq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmphiq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmphiq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmphiq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmphiq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmphiq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmphiq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot270_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot270_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot270_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot90_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot90_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot90_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxaq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxaq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxaq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxavq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxavq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxavq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxvq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxvq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxvq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxvq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxvq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxvq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminaq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminaq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminaq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminavq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminavq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminavq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminvq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminvq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminvq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminvq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminvq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminvq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavxq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavxq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavxq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavxq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulhq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulhq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulhq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulhq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulhq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulhq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmulhq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmulhq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmulhq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmulhq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmulhq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmulhq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_r_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_r_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_r_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_r_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_r_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_r_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshluq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshluq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshluq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_r_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_r_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_r_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_r_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_r_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_r_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_u8.c: Likewise.
Srinath Parvathaneni [Tue, 17 Mar 2020 15:22:25 +0000 (15:22 +0000)]
[ARM][GCC][3/2x]: MVE intrinsics with binary operands.
This patch supports the following MVE ACLE intrinsics with binary operands.
vaddlvq_p_s32, vaddlvq_p_u32, vcmpneq_s8, vcmpneq_s16, vcmpneq_s32, vcmpneq_u8, vcmpneq_u16, vcmpneq_u32, vshlq_s8, vshlq_s16, vshlq_s32, vshlq_u8, vshlq_u16, vshlq_u32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
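As an illustration only (not part of the patch), here is a minimal usage sketch of a few of these intrinsics. It assumes arm_mve.h from this series, the usual ACLE MVE vector types, and an MVE-enabled target (something like -march=armv8.1-m.main+mve -mfloat-abi=hard); the function name is invented for the example.

  #include "arm_mve.h"

  /* Compare, shift and predicated long reduction using the type-suffixed
     spellings; the polymorphic vcmpneq/vshlq/vaddlvq_p variants added in
     this patch accept the same arguments.  */
  int64_t
  binary_sketch (int32x4_t a, int32x4_t b, mve_pred16_t p)
  {
    mve_pred16_t ne = vcmpneq_s32 (a, b);  /* lane-wise a != b as a predicate  */
    int32x4_t sh = vshlq_s32 (a, b);       /* lane-wise shift of a by b        */
    (void) ne;
    (void) sh;
    return vaddlvq_p_s32 (a, p);           /* predicated add-long across lanes */
  }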
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm-builtins.c (BINOP_NONE_NONE_UNONE_QUALIFIERS): Define
qualifier for binary operands.
(BINOP_UNONE_NONE_NONE_QUALIFIERS): Likewise.
(BINOP_UNONE_UNONE_NONE_QUALIFIERS): Likewise.
* config/arm/arm_mve.h (vaddlvq_p_s32): Define macro.
(vaddlvq_p_u32): Likewise.
(vcmpneq_s8): Likewise.
(vcmpneq_s16): Likewise.
(vcmpneq_s32): Likewise.
(vcmpneq_u8): Likewise.
(vcmpneq_u16): Likewise.
(vcmpneq_u32): Likewise.
(vshlq_s8): Likewise.
(vshlq_s16): Likewise.
(vshlq_s32): Likewise.
(vshlq_u8): Likewise.
(vshlq_u16): Likewise.
(vshlq_u32): Likewise.
(__arm_vaddlvq_p_s32): Define intrinsic.
(__arm_vaddlvq_p_u32): Likewise.
(__arm_vcmpneq_s8): Likewise.
(__arm_vcmpneq_s16): Likewise.
(__arm_vcmpneq_s32): Likewise.
(__arm_vcmpneq_u8): Likewise.
(__arm_vcmpneq_u16): Likewise.
(__arm_vcmpneq_u32): Likewise.
(__arm_vshlq_s8): Likewise.
(__arm_vshlq_s16): Likewise.
(__arm_vshlq_s32): Likewise.
(__arm_vshlq_u8): Likewise.
(__arm_vshlq_u16): Likewise.
(__arm_vshlq_u32): Likewise.
(vaddlvq_p): Define polymorphic variant.
(vcmpneq): Likewise.
(vshlq): Likewise.
* config/arm/arm_mve_builtins.def (BINOP_NONE_NONE_UNONE_QUALIFIERS):
Use it.
(BINOP_UNONE_NONE_NONE_QUALIFIERS): Likewise.
(BINOP_UNONE_UNONE_NONE_QUALIFIERS): Likewise.
* config/arm/mve.md (mve_vaddlvq_p_<supf>v4si): Define RTL pattern.
(mve_vcmpneq_<supf><mode>): Likewise.
(mve_vshlq_<supf><mode>): Likewise.
gcc/testsuite/ChangeLog:
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vaddlvq_p_s32.c: New test.
* gcc.target/arm/mve/intrinsics/vaddlvq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_u8.c: Likewise.
Srinath Parvathaneni [Tue, 17 Mar 2020 15:08:47 +0000 (15:08 +0000)]
[ARM][GCC][2/2x]: MVE intrinsics with binary operands.
This patch supports the following MVE ACLE intrinsics with binary operands.
vcvtq_n_s16_f16, vcvtq_n_s32_f32, vcvtq_n_u16_f16, vcvtq_n_u32_f32, vcreateq_u8, vcreateq_u16, vcreateq_u32, vcreateq_u64, vcreateq_s8, vcreateq_s16, vcreateq_s32, vcreateq_s64, vshrq_n_s8, vshrq_n_s16, vshrq_n_s32, vshrq_n_u8, vshrq_n_u16, vshrq_n_u32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
This patch adds new constraints "Rb" and "Rf", which check that the constant is within the range 1 to 8 and 1 to 32 respectively.
It also adds new predicates "mve_imm_8" and "mve_imm_32" to check the matching constraints Rb and Rf respectively.
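For illustration only (not from the patch), a minimal sketch of how the immediate-operand forms are used; it assumes arm_mve.h from this series and an MVE+FP capable target (e.g. -march=armv8.1-m.main+mve.fp), and the function names are invented:

  #include "arm_mve.h"

  /* The _n immediates must be integer constants in the documented range:
     1 to 8 for the 8-bit vshrq_n form (constraint Rb) and 1 to 32 for the
     32-bit vcvtq_n form (constraint Rf).  */
  uint8x16_t
  shr_sketch (uint8x16_t a)
  {
    return vshrq_n_u8 (a, 3);        /* shift each lane right by 3 */
  }

  int32x4_t
  cvt_sketch (float32x4_t f)
  {
    return vcvtq_n_s32_f32 (f, 16);  /* fixed-point convert with 16 fraction bits */
  }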
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm-builtins.c (BINOP_UNONE_UNONE_IMM_QUALIFIERS): Define
qualifier for binary operands.
(BINOP_UNONE_UNONE_UNONE_QUALIFIERS): Likewise.
(BINOP_UNONE_NONE_IMM_QUALIFIERS): Likewise.
* config/arm/arm_mve.h (vcvtq_n_s16_f16): Define macro.
(vcvtq_n_s32_f32): Likewise.
(vcvtq_n_u16_f16): Likewise.
(vcvtq_n_u32_f32): Likewise.
(vcreateq_u8): Likewise.
(vcreateq_u16): Likewise.
(vcreateq_u32): Likewise.
(vcreateq_u64): Likewise.
(vcreateq_s8): Likewise.
(vcreateq_s16): Likewise.
(vcreateq_s32): Likewise.
(vcreateq_s64): Likewise.
(vshrq_n_s8): Likewise.
(vshrq_n_s16): Likewise.
(vshrq_n_s32): Likewise.
(vshrq_n_u8): Likewise.
(vshrq_n_u16): Likewise.
(vshrq_n_u32): Likewise.
(__arm_vcreateq_u8): Define intrinsic.
(__arm_vcreateq_u16): Likewise.
(__arm_vcreateq_u32): Likewise.
(__arm_vcreateq_u64): Likewise.
(__arm_vcreateq_s8): Likewise.
(__arm_vcreateq_s16): Likewise.
(__arm_vcreateq_s32): Likewise.
(__arm_vcreateq_s64): Likewise.
(__arm_vshrq_n_s8): Likewise.
(__arm_vshrq_n_s16): Likewise.
(__arm_vshrq_n_s32): Likewise.
(__arm_vshrq_n_u8): Likewise.
(__arm_vshrq_n_u16): Likewise.
(__arm_vshrq_n_u32): Likewise.
(__arm_vcvtq_n_s16_f16): Likewise.
(__arm_vcvtq_n_s32_f32): Likewise.
(__arm_vcvtq_n_u16_f16): Likewise.
(__arm_vcvtq_n_u32_f32): Likewise.
(vshrq_n): Define polymorphic variant.
* config/arm/arm_mve_builtins.def (BINOP_UNONE_UNONE_IMM_QUALIFIERS):
Use it.
(BINOP_UNONE_UNONE_UNONE_QUALIFIERS): Likewise.
(BINOP_UNONE_NONE_IMM_QUALIFIERS): Likewise.
* config/arm/constraints.md (Rb): Define constraint to check constant is
in the range of 1 to 8.
(Rf): Define constraint to check constant is in the range of 1 to 32.
* config/arm/mve.md (mve_vcreateq_<supf><mode>): Define RTL pattern.
(mve_vshrq_n_<supf><mode>): Likewise.
(mve_vcvtq_n_from_f_<supf><mode>): Likewise.
* config/arm/predicates.md (mve_imm_8): Define predicate to check
the matching constraint Rb.
(mve_imm_32): Define predicate to check the matching constraint Rf.
gcc/testsuite/ChangeLog:
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vcreateq_s16.c: New test.
* gcc.target/arm/mve/intrinsics/vcreateq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcreateq_s64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcreateq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcreateq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcreateq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcreateq_u64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcreateq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_n_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_n_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_n_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_n_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_n_u8.c: Likewise.
Ville Voutilainen [Tue, 17 Mar 2020 14:38:25 +0000 (16:38 +0200)]
c++: Fix access checks for __is_assignable and __is_constructible
gcc/
PR c++/94197
* cp/method.c (assignable_expr): Use cp_unevaluated.
(is_xible_helper): Push a non-deferred access check for
the stub objects created by assignable_expr and constructible_expr.
testsuite/
PR c++/94197
* g++.dg/ext/pr94197.C: New.
Srinath Parvathaneni [Tue, 17 Mar 2020 14:51:51 +0000 (14:51 +0000)]
[ARM][GCC][1/2x]: MVE intrinsics with binary operands.
This patch supports the following MVE ACLE intrinsics with binary operands.
vsubq_n_f16, vsubq_n_f32, vbrsrq_n_f16, vbrsrq_n_f32, vcvtq_n_f16_s16, vcvtq_n_f32_s32, vcvtq_n_f16_u16, vcvtq_n_f32_u32, vcreateq_f16, vcreateq_f32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
This patch adds a new constraint "Rd", which checks that the constant is within the range 1 to 16.
It also adds a new predicate "mve_imm_16" to check the matching constraint Rd.
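A minimal illustrative sketch (not part of the patch) of two of these intrinsics, assuming arm_mve.h from this series and a target with MVE floating point (e.g. -march=armv8.1-m.main+mve.fp); the function names are hypothetical:

  #include "arm_mve.h"

  /* The vcvtq_n immediate must be an integer constant in 1 to 16
     (the new Rd constraint); vsubq_n takes a run-time scalar.  */
  float16x8_t
  subn_sketch (float16x8_t a, float16_t s)
  {
    return vsubq_n_f16 (a, s);       /* subtract the scalar s from each lane */
  }

  float16x8_t
  cvtn_sketch (int16x8_t a)
  {
    return vcvtq_n_f16_s16 (a, 8);   /* fixed-point convert with 8 fraction bits */
  }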
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm-builtins.c (BINOP_NONE_NONE_NONE_QUALIFIERS): Define
qualifier for binary operands.
(BINOP_NONE_NONE_IMM_QUALIFIERS): Likewise.
(BINOP_NONE_UNONE_IMM_QUALIFIERS): Likewise.
(BINOP_NONE_UNONE_UNONE_QUALIFIERS): Likewise.
* config/arm/arm_mve.h (vsubq_n_f16): Define macro.
(vsubq_n_f32): Likewise.
(vbrsrq_n_f16): Likewise.
(vbrsrq_n_f32): Likewise.
(vcvtq_n_f16_s16): Likewise.
(vcvtq_n_f32_s32): Likewise.
(vcvtq_n_f16_u16): Likewise.
(vcvtq_n_f32_u32): Likewise.
(vcreateq_f16): Likewise.
(vcreateq_f32): Likewise.
(__arm_vsubq_n_f16): Define intrinsic.
(__arm_vsubq_n_f32): Likewise.
(__arm_vbrsrq_n_f16): Likewise.
(__arm_vbrsrq_n_f32): Likewise.
(__arm_vcvtq_n_f16_s16): Likewise.
(__arm_vcvtq_n_f32_s32): Likewise.
(__arm_vcvtq_n_f16_u16): Likewise.
(__arm_vcvtq_n_f32_u32): Likewise.
(__arm_vcreateq_f16): Likewise.
(__arm_vcreateq_f32): Likewise.
(vsubq): Define polymorphic variant.
(vbrsrq): Likewise.
(vcvtq_n): Likewise.
* config/arm/arm_mve_builtins.def (BINOP_NONE_NONE_NONE_QUALIFIERS): Use
it.
(BINOP_NONE_NONE_IMM_QUALIFIERS): Likewise.
(BINOP_NONE_UNONE_IMM_QUALIFIERS): Likewise.
(BINOP_NONE_UNONE_UNONE_QUALIFIERS): Likewise.
* config/arm/constraints.md (Rd): Define constraint to check constant is
in the range of 1 to 16.
* config/arm/mve.md (mve_vsubq_n_f<mode>): Define RTL pattern.
(mve_vbrsrq_n_f<mode>): Likewise.
(mve_vcvtq_n_to_f_<supf><mode>): Likewise.
(mve_vcreateq_f<mode>): Likewise.
* config/arm/predicates.md (mve_imm_16): Define predicate to check
the matching constraint Rd.
gcc/testsuite/ChangeLog:
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vbrsrq_n_f16.c: New test.
* gcc.target/arm/mve/intrinsics/vbrsrq_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcreateq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcreateq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_n_f16_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_n_f16_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_n_f32_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_n_f32_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_n_f32.c: Likewise.
Srinath Parvathaneni [Tue, 17 Mar 2020 14:21:50 +0000 (14:21 +0000)]
[ARM][GCC][4/1x]: MVE intrinsics with unary operand.
This patch supports the following MVE ACLE intrinsics with a unary operand.
vctp16q, vctp32q, vctp64q, vctp8q, vpnot.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
There are a few conflicts in defining the machine registers; these are resolved by re-ordering VPR_REGNUM, APSRQ_REGNUM and APSRGE_REGNUM.
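For illustration (not part of the patch), a small sketch of the tail-predication helpers, assuming arm_mve.h from this series and an MVE-enabled target; the function name is invented:

  #include "arm_mve.h"

  /* vctp32q builds a predicate with the first n of the four 32-bit lanes
     active (all four if n >= 4); vpnot complements a predicate.  */
  mve_pred16_t
  tail_sketch (uint32_t n)
  {
    mve_pred16_t p = vctp32q (n);   /* lanes 0..n-1 active            */
    return vpnot (p);               /* the remaining lanes active     */
  }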
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm-builtins.c (hi_UP): Define mode.
* config/arm/arm.h (IS_VPR_REGNUM): Move.
* config/arm/arm.md (VPR_REGNUM): Define before APSRQ_REGNUM.
(APSRQ_REGNUM): Modify.
(APSRGE_REGNUM): Modify.
* config/arm/arm_mve.h (vctp16q): Define macro.
(vctp32q): Likewise.
(vctp64q): Likewise.
(vctp8q): Likewise.
(vpnot): Likewise.
(__arm_vctp16q): Define intrinsic.
(__arm_vctp32q): Likewise.
(__arm_vctp64q): Likewise.
(__arm_vctp8q): Likewise.
(__arm_vpnot): Likewise.
* config/arm/arm_mve_builtins.def (UNOP_UNONE_UNONE): Use builtin
qualifier.
* config/arm/mve.md (mve_vctp<mode1>qhi): Define RTL pattern.
(mve_vpnothi): Likewise.
gcc/testsuite/ChangeLog:
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vctp16q.c: New test.
* gcc.target/arm/mve/intrinsics/vctp32q.c: Likewise.
* gcc.target/arm/mve/intrinsics/vctp64q.c: Likewise.
* gcc.target/arm/mve/intrinsics/vctp8q.c: Likewise.
* gcc.target/arm/mve/intrinsics/vpnot.c: Likewise.
Srinath Parvathaneni [Tue, 17 Mar 2020 12:23:42 +0000 (12:23 +0000)]
[ARM][GCC][3/1x]: MVE intrinsics with unary operand.
This patch supports the following MVE ACLE intrinsics with a unary operand.
vdupq_n_s8, vdupq_n_s16, vdupq_n_s32, vabsq_s8, vabsq_s16, vabsq_s32, vclsq_s8, vclsq_s16, vclsq_s32, vclzq_s8, vclzq_s16, vclzq_s32, vnegq_s8, vnegq_s16, vnegq_s32, vaddlvq_s32, vaddvq_s8, vaddvq_s16, vaddvq_s32, vmovlbq_s8, vmovlbq_s16, vmovltq_s8, vmovltq_s16, vmvnq_s8, vmvnq_s16, vmvnq_s32, vrev16q_s8, vrev32q_s8, vrev32q_s16, vqabsq_s8, vqabsq_s16, vqabsq_s32, vqnegq_s8, vqnegq_s16, vqnegq_s32, vcvtaq_s16_f16, vcvtaq_s32_f32, vcvtnq_s16_f16, vcvtnq_s32_f32, vcvtpq_s16_f16, vcvtpq_s32_f32, vcvtmq_s16_f16, vcvtmq_s32_f32, vmvnq_u8, vmvnq_u16, vmvnq_u32, vdupq_n_u8, vdupq_n_u16, vdupq_n_u32, vclzq_u8, vclzq_u16, vclzq_u32, vaddvq_u8, vaddvq_u16, vaddvq_u32, vrev32q_u8, vrev32q_u16, vmovltq_u8, vmovltq_u16, vmovlbq_u8, vmovlbq_u16, vrev16q_u8, vaddlvq_u32, vcvtpq_u16_f16, vcvtpq_u32_f32, vcvtnq_u16_f16, vcvtmq_u16_f16, vcvtmq_u32_f32, vcvtaq_u16_f16, vcvtaq_u32_f32, vdupq_n, vabsq, vclsq, vclzq, vnegq, vaddlvq, vaddvq, vmovlbq, vmovltq, vmvnq, vrev16q, vrev32q, vqabsq, vqnegq.
This patch adds a new register class "EVEN_REGS", which allows only even registers.
The new constraint "e" allows only registers of the EVEN_REGS class.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
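As an illustration only (not part of the patch), a minimal sketch of a few of the unary intrinsics, assuming arm_mve.h from this series and an MVE-enabled target; the function name is made up:

  #include "arm_mve.h"

  /* A few of the unary-operand intrinsics; vdupq_n, vabsq, vclzq and
     vaddvq also get the polymorphic spellings listed below.  */
  int32_t
  unary_sketch (int8x16_t v, uint32x4_t u)
  {
    int8x16_t dup = vdupq_n_s8 (5);   /* broadcast the scalar 5 to all lanes  */
    int8x16_t mag = vabsq_s8 (v);     /* lane-wise absolute value             */
    uint32x4_t lz = vclzq_u32 (u);    /* lane-wise count of leading zeros     */
    (void) dup;
    (void) lz;
    return vaddvq_s8 (mag);           /* add across the vector, 32-bit result */
  }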
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm.h (enum reg_class): Define new class EVEN_REGS.
* config/arm/arm_mve.h (vdupq_n_s8): Define macro.
(vdupq_n_s16): Likewise.
(vdupq_n_s32): Likewise.
(vabsq_s8): Likewise.
(vabsq_s16): Likewise.
(vabsq_s32): Likewise.
(vclsq_s8): Likewise.
(vclsq_s16): Likewise.
(vclsq_s32): Likewise.
(vclzq_s8): Likewise.
(vclzq_s16): Likewise.
(vclzq_s32): Likewise.
(vnegq_s8): Likewise.
(vnegq_s16): Likewise.
(vnegq_s32): Likewise.
(vaddlvq_s32): Likewise.
(vaddvq_s8): Likewise.
(vaddvq_s16): Likewise.
(vaddvq_s32): Likewise.
(vmovlbq_s8): Likewise.
(vmovlbq_s16): Likewise.
(vmovltq_s8): Likewise.
(vmovltq_s16): Likewise.
(vmvnq_s8): Likewise.
(vmvnq_s16): Likewise.
(vmvnq_s32): Likewise.
(vrev16q_s8): Likewise.
(vrev32q_s8): Likewise.
(vrev32q_s16): Likewise.
(vqabsq_s8): Likewise.
(vqabsq_s16): Likewise.
(vqabsq_s32): Likewise.
(vqnegq_s8): Likewise.
(vqnegq_s16): Likewise.
(vqnegq_s32): Likewise.
(vcvtaq_s16_f16): Likewise.
(vcvtaq_s32_f32): Likewise.
(vcvtnq_s16_f16): Likewise.
(vcvtnq_s32_f32): Likewise.
(vcvtpq_s16_f16): Likewise.
(vcvtpq_s32_f32): Likewise.
(vcvtmq_s16_f16): Likewise.
(vcvtmq_s32_f32): Likewise.
(vmvnq_u8): Likewise.
(vmvnq_u16): Likewise.
(vmvnq_u32): Likewise.
(vdupq_n_u8): Likewise.
(vdupq_n_u16): Likewise.
(vdupq_n_u32): Likewise.
(vclzq_u8): Likewise.
(vclzq_u16): Likewise.
(vclzq_u32): Likewise.
(vaddvq_u8): Likewise.
(vaddvq_u16): Likewise.
(vaddvq_u32): Likewise.
(vrev32q_u8): Likewise.
(vrev32q_u16): Likewise.
(vmovltq_u8): Likewise.
(vmovltq_u16): Likewise.
(vmovlbq_u8): Likewise.
(vmovlbq_u16): Likewise.
(vrev16q_u8): Likewise.
(vaddlvq_u32): Likewise.
(vcvtpq_u16_f16): Likewise.
(vcvtpq_u32_f32): Likewise.
(vcvtnq_u16_f16): Likewise.
(vcvtmq_u16_f16): Likewise.
(vcvtmq_u32_f32): Likewise.
(vcvtaq_u16_f16): Likewise.
(vcvtaq_u32_f32): Likewise.
(__arm_vdupq_n_s8): Define intrinsic.
(__arm_vdupq_n_s16): Likewise.
(__arm_vdupq_n_s32): Likewise.
(__arm_vabsq_s8): Likewise.
(__arm_vabsq_s16): Likewise.
(__arm_vabsq_s32): Likewise.
(__arm_vclsq_s8): Likewise.
(__arm_vclsq_s16): Likewise.
(__arm_vclsq_s32): Likewise.
(__arm_vclzq_s8): Likewise.
(__arm_vclzq_s16): Likewise.
(__arm_vclzq_s32): Likewise.
(__arm_vnegq_s8): Likewise.
(__arm_vnegq_s16): Likewise.
(__arm_vnegq_s32): Likewise.
(__arm_vaddlvq_s32): Likewise.
(__arm_vaddvq_s8): Likewise.
(__arm_vaddvq_s16): Likewise.
(__arm_vaddvq_s32): Likewise.
(__arm_vmovlbq_s8): Likewise.
(__arm_vmovlbq_s16): Likewise.
(__arm_vmovltq_s8): Likewise.
(__arm_vmovltq_s16): Likewise.
(__arm_vmvnq_s8): Likewise.
(__arm_vmvnq_s16): Likewise.
(__arm_vmvnq_s32): Likewise.
(__arm_vrev16q_s8): Likewise.
(__arm_vrev32q_s8): Likewise.
(__arm_vrev32q_s16): Likewise.
(__arm_vqabsq_s8): Likewise.
(__arm_vqabsq_s16): Likewise.
(__arm_vqabsq_s32): Likewise.
(__arm_vqnegq_s8): Likewise.
(__arm_vqnegq_s16): Likewise.
(__arm_vqnegq_s32): Likewise.
(__arm_vmvnq_u8): Likewise.
(__arm_vmvnq_u16): Likewise.
(__arm_vmvnq_u32): Likewise.
(__arm_vdupq_n_u8): Likewise.
(__arm_vdupq_n_u16): Likewise.
(__arm_vdupq_n_u32): Likewise.
(__arm_vclzq_u8): Likewise.
(__arm_vclzq_u16): Likewise.
(__arm_vclzq_u32): Likewise.
(__arm_vaddvq_u8): Likewise.
(__arm_vaddvq_u16): Likewise.
(__arm_vaddvq_u32): Likewise.
(__arm_vrev32q_u8): Likewise.
(__arm_vrev32q_u16): Likewise.
(__arm_vmovltq_u8): Likewise.
(__arm_vmovltq_u16): Likewise.
(__arm_vmovlbq_u8): Likewise.
(__arm_vmovlbq_u16): Likewise.
(__arm_vrev16q_u8): Likewise.
(__arm_vaddlvq_u32): Likewise.
(__arm_vcvtpq_u16_f16): Likewise.
(__arm_vcvtpq_u32_f32): Likewise.
(__arm_vcvtnq_u16_f16): Likewise.
(__arm_vcvtmq_u16_f16): Likewise.
(__arm_vcvtmq_u32_f32): Likewise.
(__arm_vcvtaq_u16_f16): Likewise.
(__arm_vcvtaq_u32_f32): Likewise.
(__arm_vcvtaq_s16_f16): Likewise.
(__arm_vcvtaq_s32_f32): Likewise.
(__arm_vcvtnq_s16_f16): Likewise.
(__arm_vcvtnq_s32_f32): Likewise.
(__arm_vcvtpq_s16_f16): Likewise.
(__arm_vcvtpq_s32_f32): Likewise.
(__arm_vcvtmq_s16_f16): Likewise.
(__arm_vcvtmq_s32_f32): Likewise.
(vdupq_n): Define polymorphic variant.
(vabsq): Likewise.
(vclsq): Likewise.
(vclzq): Likewise.
(vnegq): Likewise.
(vaddlvq): Likewise.
(vaddvq): Likewise.
(vmovlbq): Likewise.
(vmovltq): Likewise.
(vmvnq): Likewise.
(vrev16q): Likewise.
(vrev32q): Likewise.
(vqabsq): Likewise.
(vqnegq): Likewise.
* config/arm/arm_mve_builtins.def (UNOP_SNONE_SNONE): Use it.
(UNOP_SNONE_NONE): Likewise.
(UNOP_UNONE_UNONE): Likewise.
(UNOP_UNONE_NONE): Likewise.
* config/arm/constraints.md (e): Define new constraint to allow only
even registers.
* config/arm/mve.md (mve_vqabsq_s<mode>): Define RTL pattern.
(mve_vnegq_s<mode>): Likewise.
(mve_vmvnq_<supf><mode>): Likewise.
(mve_vdupq_n_<supf><mode>): Likewise.
(mve_vclzq_<supf><mode>): Likewise.
(mve_vclsq_s<mode>): Likewise.
(mve_vaddvq_<supf><mode>): Likewise.
(mve_vabsq_s<mode>): Likewise.
(mve_vrev32q_<supf><mode>): Likewise.
(mve_vmovltq_<supf><mode>): Likewise.
(mve_vmovlbq_<supf><mode>): Likewise.
(mve_vcvtpq_<supf><mode>): Likewise.
(mve_vcvtnq_<supf><mode>): Likewise.
(mve_vcvtmq_<supf><mode>): Likewise.
(mve_vcvtaq_<supf><mode>): Likewise.
(mve_vrev16q_<supf>v16qi): Likewise.
(mve_vaddlvq_<supf>v4si): Likewise.
gcc/testsuite/ChangeLog:
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vabsq_s16.c: New test.
* gcc.target/arm/mve/intrinsics/vabsq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabsq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddlvq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddlvq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclsq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclsq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclsq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtaq_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtaq_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtaq_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtaq_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtmq_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtmq_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtmq_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtmq_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtnq_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtnq_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtnq_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtpq_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtpq_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtpq_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtpq_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovlbq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovlbq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovlbq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovlbq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovltq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovltq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovltq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovltq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vnegq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vnegq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vnegq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqabsq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqabsq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqabsq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqnegq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqnegq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqnegq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev16q_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev16q_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev32q_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev32q_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev32q_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev32q_u8.c: Likewise.
Jakub Jelinek [Tue, 17 Mar 2020 12:52:19 +0000 (13:52 +0100)]
Fix up duplicated duplicated words mostly in comments
In the r10-7197-gbae7b38cf8a21e068ad5c0bab089dedb78af3346 commit I
noticed a duplicated word in a message, which led me to grep for those, and
we have tons of them.
I've used
grep -v 'long long\|optab optab\|template template\|double double' *.[chS] */*.[chS] *.def config/*/* 2>/dev/null | grep ' \([a-zA-Z]\+\) \1 '
Note that the command will not detect doubled words at the start or end of
a line, or when one word is at the end of one line and the next one at the
start of the following line.
Some of it is fairly obvious, e.g. all the "the the" cases, which is
something I've posted and committed a patch for already, e.g. in 2016;
other cases are often valid, e.g. "that that" mostly looks ok to me.
Some cases are quite hard to figure out, so I've left some of them out of the
patch (e.g. "and and" in some cases isn't talking about bitwise/logical and
and so looks incorrect, but in other cases it is talking about those
operations).
In most cases the right solution seems to be to remove one of the duplicated
words, but not always.
I think the most important ones are those with user-visible messages (in the patch
3 of the first 4 hunks); the rest is just comments (and internal
documentation; for that see the doc/tm.texi changes).
2020-03-17 Jakub Jelinek <jakub@redhat.com>
* lra-spills.c (remove_pseudos): Fix up duplicated word issue in
a dump message.
* tree-sra.c (create_access_replacement): Fix up duplicated word issue
in a comment.
* read-rtl-function.c (find_param_by_name,
function_reader::parse_enum_value, function_reader::get_insn_by_uid):
Likewise.
* spellcheck.c (get_edit_distance_cutoff): Likewise.
* tree-data-ref.c (create_ifn_alias_checks): Likewise.
* tree.def (SWITCH_EXPR): Likewise.
* selftest.c (assert_str_contains): Likewise.
* ipa-param-manipulation.h (class ipa_param_body_adjustments):
Likewise.
* tree-ssa-math-opts.c (convert_expand_mult_copysign): Likewise.
* tree-ssa-loop-split.c (find_vdef_in_loop): Likewise.
* langhooks.h (struct lang_hooks_for_decls): Likewise.
* ipa-prop.h (struct ipa_param_descriptor): Likewise.
* tree-ssa-strlen.c (handle_builtin_string_cmp, handle_store):
Likewise.
* tree-ssa-dom.c (simplify_stmt_for_jump_threading): Likewise.
* tree-ssa-reassoc.c (reassociate_bb): Likewise.
* tree.c (component_ref_size): Likewise.
* hsa-common.c (hsa_init_compilation_unit_data): Likewise.
* gimple-ssa-sprintf.c (get_string_length, format_string,
format_directive): Likewise.
* omp-grid.c (grid_process_kernel_body_copy): Likewise.
* input.c (string_concat_db::get_string_concatenation,
test_lexer_string_locations_ucn4): Likewise.
* cfgexpand.c (pass_expand::execute): Likewise.
* gimple-ssa-warn-restrict.c (builtin_memref::offset_out_of_bounds,
maybe_diag_overlap): Likewise.
* rtl.c (RTX_CODE_HWINT_P_1): Likewise.
* shrink-wrap.c (spread_components): Likewise.
* tree-ssa-dse.c (initialize_ao_ref_for_dse, valid_ao_ref_for_dse):
Likewise.
* tree-call-cdce.c (shrink_wrap_one_built_in_call_with_conds):
Likewise.
* dwarf2out.c (dwarf2out_early_finish): Likewise.
* gimple-ssa-store-merging.c: Likewise.
* ira-costs.c (record_operand_costs): Likewise.
* tree-vect-loop.c (vectorizable_reduction): Likewise.
* target.def (dispatch): Likewise.
(validate_dims, gen_ccmp_first): Fix up duplicated word issue
in documentation text.
* doc/tm.texi: Regenerated.
* config/i386/x86-tune.def (X86_TUNE_PARTIAL_FLAG_REG_STALL): Fix up
duplicated word issue in a comment.
* config/i386/i386.c (ix86_test_loading_unspec): Likewise.
* config/i386/i386-features.c (remove_partial_avx_dependency):
Likewise.
* config/msp430/msp430.c (msp430_select_section): Likewise.
* config/gcn/gcn-run.c (load_image): Likewise.
* config/aarch64/aarch64-sve.md (sve_ld1r<mode>): Likewise.
* config/aarch64/aarch64.c (aarch64_gen_adjusted_ldpstp): Likewise.
* config/aarch64/falkor-tag-collision-avoidance.c
(single_dest_per_chain): Likewise.
* config/nvptx/nvptx.c (nvptx_record_fndecl): Likewise.
* config/fr30/fr30.c (fr30_arg_partial_bytes): Likewise.
* config/rs6000/rs6000-string.c (expand_cmp_vec_sequence): Likewise.
* config/rs6000/rs6000-p8swap.c (replace_swapped_load_constant):
Likewise.
* config/rs6000/rs6000-c.c (rs6000_target_modify_macros): Likewise.
* config/rs6000/rs6000.c (rs6000_option_override_internal): Likewise.
* config/rs6000/rs6000-logue.c
(rs6000_emit_probe_stack_range_stack_clash): Likewise.
* config/nds32/nds32-md-auxiliary.c (nds32_split_ashiftdi3): Likewise.
Fix various other issues in the comment.
c-family/
* c-common.c (resolve_overloaded_builtin): Fix up duplicated word
issue in a diagnostic message.
cp/
* pt.c (tsubst): Fix up duplicated word issue in a diagnostic message.
(lookup_template_class_1, tsubst_expr): Fix up duplicated word issue
in a comment.
* parser.c (cp_parser_statement, cp_parser_linkage_specification,
cp_parser_placeholder_type_specifier,
cp_parser_constraint_requires_parens): Likewise.
* name-lookup.c (suggest_alternative_in_explicit_scope): Likewise.
fortran/
* array.c (gfc_check_iter_variable): Fix up duplicated word issue
in a comment.
* arith.c (gfc_arith_concat): Likewise.
* resolve.c (gfc_resolve_ref): Likewise.
* frontend-passes.c (matmul_lhs_realloc): Likewise.
* module.c (gfc_match_submodule, load_needed): Likewise.
* trans-expr.c (gfc_init_se): Likewise.
Mihail Ionescu [Tue, 17 Mar 2020 12:39:39 +0000 (12:39 +0000)]
[GCC][PATCH][ARM] Add multilib mapping for Armv8.1-M+MVE with -mfloat-abi=hard
This patch adds a new multilib for armv8.1-m.main+mve with hard float abi. For
armv8.1-m.main+mve soft and softfp, the v8-M multilibs will be reused.
The following mappings are also updated:
"-mfloat-abi=hard -march=armv8.1-m.main+mve.fp -> armv8-m.main+fp/hard"
"-mfloat-abi=softfp -march=armv8.1-m.main+mve.fp -> armv8-m.main+fp/softfp"
"-mfloat-abi=soft -march=armv8.1-m.main+mve.fp -> armv8-m.main/nofp"
gcc/ChangeLog:
2020-03-17 Mihail Ionescu <mihail.ionescu@arm.com>
* config/arm/t-rmprofile: Create new multilib for
armv8.1-m.main+mve hard float and reuse v8-m.main ones for
v8.1-m.main+mve.
gcc/testsuite/ChangeLog:
2020-03-17 Mihail Ionescu <mihail.ionescu@arm.com>
* gcc.target/arm/multilib.exp: Add new v8.1-M entry.
libgcc/ChangeLog:
2020-03-17 Mihail Ionescu <mihail.ionescu@arm.com>
* config/arm/t-arm: Do not compile cmse_nonsecure_call.S for v8.1-m.
Jakub Jelinek [Tue, 17 Mar 2020 12:36:41 +0000 (13:36 +0100)]
tree-ssa-strlen: Fix up count_nonzero_bytes* [PR94015]
As I said already yesterday in another PR, I'm afraid the mixing of apples
and oranges will continue to bite us again and again: the function sometimes
computes whether bytes are zero or non-zero in the native representation of
EXP itself and sometimes in what EXP points to, yet it performs some handling
which must be specific to one or the other case unconditionally, and only
from time to time decides something based on whether nbytes is 0 or not.
So, this patch performs at least a partial cleanup to separate those two
cases into two functions.
In addition to the separation, the patch uses e.g. ctor_for_folding so that
it does handle volatile loads properly and various other checks instead of
directly using DECL_INITIAL or does guard native_encode_expr call the way it
is guarded elsewhere (that host and target byte sizes are expected).
I've left other issues I found as is for now, like *allnonnull being IMHO
wrongly computed: if we don't know anything about the bytes, such as for
_1 = MEM[s_2(D)];
MEM[whatever] = _1;
where nothing really is known about strlen(s) etc., the code right now
clears *nulterm and *allnul, but keeps *allnonnull set; the callers, however,
never seem to use that value for anything (so the question is why it is
computed and how exactly it should be defined). Another thing I find quite
weird is the distinction between count_nonzero_bytes failing (returning false)
and it succeeding but setting the values to a don't-know state (the warning is
only issued if it succeeds), plus what lenrange[2] is for; the size of the
store should already be visible from the store statement. Also, looking
at the type of the first MEM_REF operand to determine is_char_store
is really weird, because both in user code and through SCCVN, where pointer
conversions are useless, the type of the MEM_REF operand doesn't have to have
anything to do with what the code actually does.
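As a rough illustration of the two situations being separated (hypothetical
snippets, not the testcases from the PR), consider a store whose right-hand
side is a constant versus one whose value is loaded through a pointer:
/* The stored byte is a constant; count_nonzero_bytes can inspect the
   native representation of the value itself.  */
void set1 (char *dst) { *dst = 'a'; }
/* The stored value is loaded through s; what matters are the bytes that
   s points to, now handled by count_nonzero_bytes_addr.  */
void copy1 (char *dst, const char *s) { *dst = *s; }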
2020-03-17 Jakub Jelinek <jakub@redhat.com>
PR tree-optimization/94015
* tree-ssa-strlen.c (count_nonzero_bytes): Split portions of the
function where EXP is address of the bytes being stored rather than
the bytes themselves into count_nonzero_bytes_addr. Punt on zero
sized MEM_REF. Use VAR_P macro and handle CONST_DECL like VAR_DECLs.
Use ctor_for_folding instead of looking at DECL_INITIAL. Punt before
calling native_encode_expr if host or target doesn't have 8-bit
chars. Formatting fixes.
(count_nonzero_bytes_addr): New function.
* gcc.dg/pr94015.c: New test.
Srinath Parvathaneni [Tue, 17 Mar 2020 12:03:30 +0000 (12:03 +0000)]
[ARM][GCC][2/1x]: MVE intrinsics with unary operand.
This patch supports the following MVE ACLE intrinsics with a unary operand.
vmvnq_n_s16, vmvnq_n_s32, vrev64q_s8, vrev64q_s16, vrev64q_s32, vcvtq_s16_f16, vcvtq_s32_f32, vrev64q_u8, vrev64q_u16, vrev64q_u32, vmvnq_n_u16, vmvnq_n_u32, vcvtq_u16_f16, vcvtq_u32_f32, vrev64q.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
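A minimal usage sketch (modelled on the new testcases; it assumes the usual
MVE testing options such as -march=armv8.1-m.main+mve -mfloat-abi=hard
-mthumb):
#include "arm_mve.h"
int8x16_t
rev64_bytes (int8x16_t a)
{
  /* Reverses the byte order within each 64-bit lane; expected to emit
     a vrev64.8 instruction.  */
  return vrev64q_s8 (a);
}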
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm-builtins.c (UNOP_SNONE_SNONE_QUALIFIERS): Define.
(UNOP_SNONE_NONE_QUALIFIERS): Likewise.
(UNOP_SNONE_IMM_QUALIFIERS): Likewise.
(UNOP_UNONE_NONE_QUALIFIERS): Likewise.
(UNOP_UNONE_UNONE_QUALIFIERS): Likewise.
(UNOP_UNONE_IMM_QUALIFIERS): Likewise.
* config/arm/arm_mve.h (vmvnq_n_s16): Define macro.
(vmvnq_n_s32): Likewise.
(vrev64q_s8): Likewise.
(vrev64q_s16): Likewise.
(vrev64q_s32): Likewise.
(vcvtq_s16_f16): Likewise.
(vcvtq_s32_f32): Likewise.
(vrev64q_u8): Likewise.
(vrev64q_u16): Likewise.
(vrev64q_u32): Likewise.
(vmvnq_n_u16): Likewise.
(vmvnq_n_u32): Likewise.
(vcvtq_u16_f16): Likewise.
(vcvtq_u32_f32): Likewise.
(__arm_vmvnq_n_s16): Define intrinsic.
(__arm_vmvnq_n_s32): Likewise.
(__arm_vrev64q_s8): Likewise.
(__arm_vrev64q_s16): Likewise.
(__arm_vrev64q_s32): Likewise.
(__arm_vrev64q_u8): Likewise.
(__arm_vrev64q_u16): Likewise.
(__arm_vrev64q_u32): Likewise.
(__arm_vmvnq_n_u16): Likewise.
(__arm_vmvnq_n_u32): Likewise.
(__arm_vcvtq_s16_f16): Likewise.
(__arm_vcvtq_s32_f32): Likewise.
(__arm_vcvtq_u16_f16): Likewise.
(__arm_vcvtq_u32_f32): Likewise.
(vrev64q): Define polymorphic variant.
* config/arm/arm_mve_builtins.def (UNOP_SNONE_SNONE): Use it.
(UNOP_SNONE_NONE): Likewise.
(UNOP_SNONE_IMM): Likewise.
(UNOP_UNONE_UNONE): Likewise.
(UNOP_UNONE_NONE): Likewise.
(UNOP_UNONE_IMM): Likewise.
* config/arm/mve.md (mve_vrev64q_<supf><mode>): Define RTL pattern.
(mve_vcvtq_from_f_<supf><mode>): Likewise.
(mve_vmvnq_n_<supf><mode>): Likewise.
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vcvtq_s16_f16.c: New test.
* gcc.target/arm/mve/intrinsics/vcvtq_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_u8.c: Likewise.
Srinath Parvathaneni [Tue, 17 Mar 2020 11:50:54 +0000 (11:50 +0000)]
[ARM][GCC][1/1x]: Patch to support MVE ACLE intrinsics with unary operand.
This patch supports MVE ACLE intrinsics vcvtq_f16_s16, vcvtq_f32_s32, vcvtq_f16_u16, vcvtq_f32_u32n vrndxq_f16, vrndxq_f32, vrndq_f16, vrndq_f32, vrndpq_f16, vrndpq_f32, vrndnq_f16, vrndnq_f32, vrndmq_f16, vrndmq_f32, vrndaq_f16, vrndaq_f32, vrev64q_f16, vrev64q_f32, vnegq_f16, vnegq_f32, vdupq_n_f16, vdupq_n_f32, vabsq_f16, vabsq_f32, vrev32q_f16, vcvttq_f32_f16, vcvtbq_f32_f16.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
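A minimal usage sketch (not taken from the patch; it assumes MVE with floating
point, e.g. -march=armv8.1-m.main+mve.fp -mfloat-abi=hard -mthumb):
#include "arm_mve.h"
float32x4_t
negate (float32x4_t a)
{
  return vnegq_f32 (a);        /* vector negate, vneg.f32 */
}
float32x4_t
int_to_float (int32x4_t a)
{
  return vcvtq_f32_s32 (a);    /* signed int to float conversion, vcvt.f32.s32 */
}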
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm-builtins.c (UNOP_NONE_NONE_QUALIFIERS): Define macro.
(UNOP_NONE_SNONE_QUALIFIERS): Likewise.
(UNOP_NONE_UNONE_QUALIFIERS): Likewise.
* config/arm/arm_mve.h (vrndxq_f16): Define macro.
(vrndxq_f32): Likewise.
(vrndq_f16): Likewise.
(vrndq_f32): Likewise.
(vrndpq_f16): Likewise.
(vrndpq_f32): Likewise.
(vrndnq_f16): Likewise.
(vrndnq_f32): Likewise.
(vrndmq_f16): Likewise.
(vrndmq_f32): Likewise.
(vrndaq_f16): Likewise.
(vrndaq_f32): Likewise.
(vrev64q_f16): Likewise.
(vrev64q_f32): Likewise.
(vnegq_f16): Likewise.
(vnegq_f32): Likewise.
(vdupq_n_f16): Likewise.
(vdupq_n_f32): Likewise.
(vabsq_f16): Likewise.
(vabsq_f32): Likewise.
(vrev32q_f16): Likewise.
(vcvttq_f32_f16): Likewise.
(vcvtbq_f32_f16): Likewise.
(vcvtq_f16_s16): Likewise.
(vcvtq_f32_s32): Likewise.
(vcvtq_f16_u16): Likewise.
(vcvtq_f32_u32): Likewise.
(__arm_vrndxq_f16): Define intrinsic.
(__arm_vrndxq_f32): Likewise.
(__arm_vrndq_f16): Likewise.
(__arm_vrndq_f32): Likewise.
(__arm_vrndpq_f16): Likewise.
(__arm_vrndpq_f32): Likewise.
(__arm_vrndnq_f16): Likewise.
(__arm_vrndnq_f32): Likewise.
(__arm_vrndmq_f16): Likewise.
(__arm_vrndmq_f32): Likewise.
(__arm_vrndaq_f16): Likewise.
(__arm_vrndaq_f32): Likewise.
(__arm_vrev64q_f16): Likewise.
(__arm_vrev64q_f32): Likewise.
(__arm_vnegq_f16): Likewise.
(__arm_vnegq_f32): Likewise.
(__arm_vdupq_n_f16): Likewise.
(__arm_vdupq_n_f32): Likewise.
(__arm_vabsq_f16): Likewise.
(__arm_vabsq_f32): Likewise.
(__arm_vrev32q_f16): Likewise.
(__arm_vcvttq_f32_f16): Likewise.
(__arm_vcvtbq_f32_f16): Likewise.
(__arm_vcvtq_f16_s16): Likewise.
(__arm_vcvtq_f32_s32): Likewise.
(__arm_vcvtq_f16_u16): Likewise.
(__arm_vcvtq_f32_u32): Likewise.
(vrndxq): Define polymorphic variants.
(vrndq): Likewise.
(vrndpq): Likewise.
(vrndnq): Likewise.
(vrndmq): Likewise.
(vrndaq): Likewise.
(vrev64q): Likewise.
(vnegq): Likewise.
(vabsq): Likewise.
(vrev32q): Likewise.
(vcvtbq_f32): Likewise.
(vcvttq_f32): Likewise.
(vcvtq): Likewise.
* config/arm/arm_mve_builtins.def (VAR2): Define.
(VAR1): Define.
* config/arm/mve.md (mve_vrndxq_f<mode>): Add RTL pattern.
(mve_vrndq_f<mode>): Likewise.
(mve_vrndpq_f<mode>): Likewise.
(mve_vrndnq_f<mode>): Likewise.
(mve_vrndmq_f<mode>): Likewise.
(mve_vrndaq_f<mode>): Likewise.
(mve_vrev64q_f<mode>): Likewise.
(mve_vnegq_f<mode>): Likewise.
(mve_vdupq_n_f<mode>): Likewise.
(mve_vabsq_f<mode>): Likewise.
(mve_vrev32q_fv8hf): Likewise.
(mve_vcvttq_f32_f16v4sf): Likewise.
(mve_vcvtbq_f32_f16v4sf): Likewise.
(mve_vcvtq_to_f_<supf><mode>): Likewise.
gcc/testsuite/ChangeLog:
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vabsq_f16.c: New test.
* gcc.target/arm/mve/intrinsics/vabsq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtbq_f32_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_f16_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_f16_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_f32_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_f32_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvttq_f32_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vnegq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vnegq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev32q_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndaq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndaq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndmq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndmq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndnq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndnq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndpq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndpq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndxq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndxq_f32.c: Likewise.
Srinath Parvathaneni [Tue, 17 Mar 2020 10:19:31 +0000 (10:19 +0000)]
[ARM][GCC][4/x]: MVE ACLE vector interleaving store intrinsics.
This patch supports MVE ACLE intrinsics vst4q_s8, vst4q_s16, vst4q_s32, vst4q_u8, vst4q_u16, vst4q_u32, vst4q_f16 and vst4q_f32.
In this patch the file arm_mve_builtins.def is added to the source tree; in it the builtins for the MVE ACLE intrinsics are defined using builtin qualifiers.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
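A minimal usage sketch (not taken from the patch; it assumes the usual MVE
testing options such as -march=armv8.1-m.main+mve -mfloat-abi=hard -mthumb):
#include "arm_mve.h"
void
store_interleaved (int8_t *addr, int8x16x4_t value)
{
  /* Interleaving store of four 16-byte vectors, expected to emit
     vst40.8/vst41.8/vst42.8/vst43.8.  */
  vst4q_s8 (addr, value);
}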
2020-03-16 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm-builtins.c (CF): Define mve_builtin_data.
(VAR1): Define.
(ARM_BUILTIN_MVE_PATTERN_START): Define.
(arm_init_mve_builtins): Define function.
(arm_init_builtins): Add TARGET_HAVE_MVE check.
(arm_expand_builtin_1): Check the range of fcode.
(arm_expand_mve_builtin): Define function to expand MVE builtins.
(arm_expand_builtin): Check the range of fcode.
* config/arm/arm_mve.h (__ARM_FEATURE_MVE): Define MVE floating point
types.
(__ARM_MVE_PRESERVE_USER_NAMESPACE): Define to protect user namespace.
(vst4q_s8): Define macro.
(vst4q_s16): Likewise.
(vst4q_s32): Likewise.
(vst4q_u8): Likewise.
(vst4q_u16): Likewise.
(vst4q_u32): Likewise.
(vst4q_f16): Likewise.
(vst4q_f32): Likewise.
(__arm_vst4q_s8): Define inline builtin.
(__arm_vst4q_s16): Likewise.
(__arm_vst4q_s32): Likewise.
(__arm_vst4q_u8): Likewise.
(__arm_vst4q_u16): Likewise.
(__arm_vst4q_u32): Likewise.
(__arm_vst4q_f16): Likewise.
(__arm_vst4q_f32): Likewise.
(__ARM_mve_typeid): Define macro with MVE types.
(__ARM_mve_coerce): Define macro with _Generic feature.
(vst4q): Define polymorphic variant for different vst4q builtins.
* config/arm/arm_mve_builtins.def: New file.
* config/arm/iterators.md (VSTRUCT): Modify to allow XI and OI
modes in MVE.
* config/arm/mve.md (MVE_VLD_ST): Define iterator.
(unspec): Define unspec.
(mve_vst4q<mode>): Define RTL pattern.
* config/arm/neon.md (mov<mode>): Modify expand to allow XI and OI
modes in MVE.
(neon_mov<mode>): Modify RTL define_insn to allow XI and OI modes
in MVE.
(define_split): Allow OI mode split for MVE after reload.
(define_split): Allow XI mode split for MVE after reload.
* config/arm/t-arm (arm.o): Add entry for arm_mve_builtins.def.
(arm-builtins.o): Likewise.
2020-03-16 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vst4q_f16.c: New test.
* gcc.target/arm/mve/intrinsics/vst4q_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst4q_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst4q_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst4q_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst4q_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst4q_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst4q_u8.c: Likewise.
Jakub Jelinek [Tue, 17 Mar 2020 10:12:59 +0000 (11:12 +0100)]
testsuite: Fix pr94185.C testcase on i686-linux with C++98 [PR94185]
I'm getting on i686-linux
FAIL: g++.target/i386/pr94185.C -std=gnu++98 (test for excess errors)
This is because of a diagnostic that
4294967295 is unsigned only in ISO C90.
Adding a U suffix fixes it; the testcase still ICEs with unfixed gcc and
passes with current trunk.
2020-03-17 Jakub Jelinek <jakub@redhat.com>
PR target/94185
* g++.target/i386/pr94185.C (l): Use 4294967295U instead of 4294967295
to avoid FAIL with -m32 -std=c++98.
Christophe Lyon [Tue, 17 Mar 2020 09:26:08 +0000 (09:26 +0000)]
c: ignore initializers for elements of variable-size types [PR93577]
2020-03-17 Christophe Lyon <christophe.lyon@linaro.org>
gcc/
* c-typeck.c (process_init_element): Handle constructor_type with
type size represented by POLY_INT_CST.
gcc/testsuite/
* gcc.target/aarch64/sve/acle/general-c/sizeless-1.c: Remove
superfluous dg-error.
* gcc.target/aarch64/sve/acle/general-c/sizeless-2.c: Likewise.
Jakub Jelinek [Tue, 17 Mar 2020 09:43:46 +0000 (10:43 +0100)]
strlen: Punt on UB reads past end of string literal [PR94187]
The gcc.dg/pr68785.c test which contains:
int
foo (void)
{
return *(int *) "";
}
has UB in the program if it is ever called, but causes UB in the compiler
as well as, at least in theory, non-reproducible code generation.
The problem is that nbytes is in this case 4, prep is the
TREE_STRING_POINTER of a "" string literal with TREE_STRING_LENGTH of 1 and
we do:
4890 for (const char *p = prep; p != prep + nbytes; ++p)
4891 if (*p)
4892 {
4893 *allnul = false;
4894 break;
4895 }
and so read the bytes after the STRING_CST payload, which can be random.
I think we should just punt in this case.
2020-03-17 Jakub Jelinek <jakub@redhat.com>
PR tree-optimization/94187
* tree-ssa-strlen.c (count_nonzero_bytes): Punt if
nchars - offset < nbytes.
Jakub Jelinek [Tue, 17 Mar 2020 09:42:35 +0000 (10:42 +0100)]
expand: Don't depend on warning flags in code generation of strnlen [PR94189]
The following testcase FAILs with -O2 -fcompare-debug, but the reason isn't
that we'd emit different code based on -g or non-debug, but rather that
we emit different code depending on whether -w is used or not (or e.g.
-Wno-stringop-overflow or whether some other pass emitted some other warning
already on the call).
Code generation shouldn't depend on whether we emit a warning or not if at
all possible.
The following patch punts (i.e. doesn't optimize the strnlen call to a
constant value) if we would emit the warning were it enabled.
In the PR there is an alternate patch which does optimize the strnlen call
no matter whether we emit the warning or not, though I think I prefer the
version below: e.g. the strnlen call might be crossing field boundaries,
which in a strict reading is undefined, but I'd be afraid people do that in
real-world programs.
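For illustration only (a hypothetical snippet, not the pr94189.c testcase),
the situation involves a strnlen call whose bound exceeds the size of the
object it reads from:
char a[4] = "ab";
__SIZE_TYPE__
f (void)
{
  /* The bound 16 is larger than the 4-byte object, which is what the
     (optional) warning is about; code generation must not change based
     on whether that warning is enabled.  */
  return __builtin_strnlen (a, 16);
}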
2020-03-17 Jakub Jelinek <jakub@redhat.com>
PR middle-end/94189
* builtins.c (expand_builtin_strnlen): Do return NULL_RTX if we would
emit a warning if it was enabled and don't depend on TREE_NO_WARNING
for code-generation.
* gcc.dg/pr94189.c: New test.
Martin Liska [Tue, 17 Mar 2020 08:43:46 +0000 (09:43 +0100)]
Filter a test-case with gas.
PR lto/94157
* gcc.dg/lto/pr94157_0.c: Add gas effective
target filter.
Jason Merrill [Tue, 17 Mar 2020 01:16:35 +0000 (21:16 -0400)]
c++: Add test for PR 93901.
Joseph Myers [Tue, 17 Mar 2020 00:34:39 +0000 (00:34 +0000)]
Update gcc sv.po.
* sv.po: Update.
GCC Administrator [Tue, 17 Mar 2020 00:16:15 +0000 (00:16 +0000)]
Daily bump.
Iain Buclaw [Mon, 16 Mar 2020 22:53:20 +0000 (23:53 +0100)]
d: Fix assignment to anonymous union member corrupts sibling members in struct
gcc/d/ChangeLog:
PR d/92309
* types.cc (fixup_anonymous_offset): Don't set DECL_FIELD_OFFSET on
anonymous fields.
gcc/testsuite/ChangeLog:
PR d/92309
* gdc.dg/pr92309.d: New test.
Jonathan Wakely [Mon, 16 Mar 2020 22:53:42 +0000 (22:53 +0000)]
libstdc++: Add default constructor to net::service_already_exists (PR 94199)
The service_already_exists exception type specified in the TS doesn't
have any constructors defined. Since its base class isn't default
constructible, that means has no usable constructors. This may be a
defect in the TS.
This patch fixes it by adding a default constructor, but making it
private. The make_service function is declared as a friend to be able to
call that private constructor.
PR libstdc++/94199
* include/experimental/executor (service_already_exists): Add default
constructor. Declare make_service to be a friend.
* testsuite/experimental/net/execution_context/make_service.cc: New
test.
Iain Buclaw [Mon, 16 Mar 2020 22:04:49 +0000 (23:04 +0100)]
d: Fix multiple definition error when using mixins and interfaces.
gcc/d/ChangeLog:
PR d/92216
* decl.cc (make_thunk): Don't set TREE_PUBLIC on thunks if the target
function is external to the current compilation.
gcc/testsuite/ChangeLog:
PR d/92216
* gdc.dg/imports/pr92216.d: New.
* gdc.dg/pr92216.d: New test.
Jakub Jelinek [Mon, 16 Mar 2020 21:58:41 +0000 (22:58 +0100)]
c: Handle MEM_REF in c_fully_fold* [PR94179]
The recent match.pd changes can generate a MEM_REF which can be seen by the
C FE folding routines. Unlike the C++ FE, they weren't expected in the C FE
yet. MEM_REF should be handled like INDIRECT_REF, except that it has two
operands rather than just one and that we should preserve the type of the
second operand. Given that it already has to be an INTEGER_CST with pointer
type, I think we are fine, the recursive call should return the INTEGER_CST
unmodified and STRIP_TYPE_NOPS will not strip anything.
2020-03-16 Jakub Jelinek <jakub@redhat.com>
PR c/94179
* c-fold.c (c_fully_fold_internal): Handle MEM_REF.
* gcc.c-torture/compile/pr94179.c: New test.
Vladimir N. Makarov [Mon, 16 Mar 2020 20:42:19 +0000 (16:42 -0400)]
Fix PR94185: Do not reuse insn alternative after changing memory subreg.
2020-03-16 Vladimir Makarov <vmakarov@redhat.com>
PR target/94185
* lra-spills.c (remove_pseudos): Do not reuse insn alternative
after changing memory subreg.
2020-03-16 Vladimir Makarov <vmakarov@redhat.com>
PR target/94185
* g++.target/i386/pr94185.C: New test.
Richard Sandiford [Thu, 12 Mar 2020 11:54:27 +0000 (11:54 +0000)]
[testsuite] Avoid duplicate test names in sizeless tests
Jeff pointed out that using:
N: ... /* { dg-error {...} } */
N+1: /* { dg-error {...} "" { target *-*-* } .-1 } */
led to two identical test names for line N. Fixed by adding
a proper test name instead of "".
2020-03-16 Richard Sandiford <richard.sandiford@arm.com>
gcc/testsuite/
* gcc.target/aarch64/sve/acle/general-c/sizeless-1.c: Add a test
name to .-1 dg-error tests.
* gcc.target/aarch64/sve/acle/general-c/sizeless-2.c: Likewise.
Srinath Parvathaneni [Mon, 16 Mar 2020 17:33:03 +0000 (17:33 +0000)]
[ARM][GCC][3/x]: MVE ACLE intrinsics framework patch.
This patch is part of MVE ACLE intrinsics framework.
The patch supports the use of emulation for the single-precision arithmetic
operations for MVE. These changes are to support the MVE ACLE intrinsics which
operate on vector floating point arithmetic operations.
Please refer to Arm reference manual [1] for more details.
[1] https://developer.arm.com/docs/ddi0553/latest
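A rough sketch of what the new mve_libcall tests exercise (hypothetical
snippet; it assumes an MVE target without the floating-point extension, where
scalar float arithmetic is emulated through AEABI library calls):
float
add_floats (float a, float b)
{
  /* Without hardware FP, this is expected to expand to a call to the
     __aeabi_fadd software floating-point routine.  */
  return a + b;
}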
2020-03-16 Andre Vieira <andre.simoesdiasvieira@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm.c (arm_libcall_uses_aapcs_base): Modify function to add
emulator calls for double precision arithmetic operations for MVE.
2020-03-16 Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/mve_libcall1.c: New test.
* gcc.target/arm/mve/intrinsics/mve_libcall2.c: Likewise.
Srinath Parvathaneni [Mon, 16 Mar 2020 17:22:39 +0000 (17:22 +0000)]
[ARM][GCC][2/x]: MVE ACLE intrinsics framework patch.
This patch is part of MVE ACLE intrinsics framework.
This patch adds support to update (read/write) the APSR (Application Program Status Register)
and the FPSCR (Floating-point Status and Control Register) for MVE.
This patch also enables thumb2 mov RTL patterns for MVE.
A new feature bit vfp_base is added. This bit is enabled for all of VFP, MVE and MVE with floating point
extensions, and is used to enable the macro TARGET_VFP_BASE. Previously, all the VFP instructions, RTL patterns,
status and control registers were guarded by TARGET_HARD_FLOAT; this patch changes that so that the instructions,
RTL patterns, status and control registers common to MVE and VFP are guarded by the
TARGET_VFP_BASE macro.
The RTL patterns set_fpscr and get_fpscr are updated to use VFPCC_REGNUM because a few MVE intrinsics
set/get the carry bit of the FPSCR register.
Please refer to Arm reference manual [1] for more details.
[1] https://developer.arm.com/docs/ddi0553/latest
2020-03-16 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* common/config/arm/arm-common.c (arm_asm_auto_mfpu): When the vfp_base
feature bit is on and -mfpu=auto is passed as a compiler option, do not
generate an error about not finding any matching fpu, because in this
case an fpu is not required.
* config/arm/arm-cpus.in (vfp_base): Define feature bit, this bit is
enabled for MVE and also for all VFP extensions.
(VFPv2): Modify fgroup to enable vfp_base feature bit whenever VFPv2
is enabled.
(MVE): Define fgroup to enable feature bits mve, vfp_base and armv7em.
(MVE_FP): Define fgroup to enable the feature bits in fgroup MVE and FPv5
along with the feature bit mve_float.
(mve): Modify add options in armv8.1-m.main arch for MVE.
(mve.fp): Modify add options in armv8.1-m.main arch for MVE with
floating point.
* config/arm/arm.c (use_return_insn): Replace the
check with TARGET_VFP_BASE.
(thumb2_legitimate_index_p): Replace TARGET_HARD_FLOAT with
TARGET_VFP_BASE.
(arm_rtx_costs_internal): Replace "TARGET_HARD_FLOAT || TARGET_HAVE_MVE"
with TARGET_VFP_BASE, to allow cost calculations for copies in MVE as
well.
(arm_get_vfp_saved_size): Replace TARGET_HARD_FLOAT with
TARGET_VFP_BASE, to allow space calculation for VFP registers in MVE
as well.
(arm_compute_frame_layout): Likewise.
(arm_save_coproc_regs): Likewise.
(arm_fixed_condition_code_regs): Modify to enable using VFPCC_REGNUM
in MVE as well.
(arm_hard_regno_mode_ok): Replace "TARGET_HARD_FLOAT || TARGET_HAVE_MVE"
with equivalent macro TARGET_VFP_BASE.
(arm_expand_epilogue_apcs_frame): Likewise.
(arm_expand_epilogue): Likewise.
(arm_conditional_register_usage): Likewise.
(arm_declare_function_name): Add check to skip printing .fpu directive
in assembly file when TARGET_VFP_BASE is enabled and fpu_to_print is
"softvfp".
* config/arm/arm.h (TARGET_VFP_BASE): Define.
* config/arm/arm.md (arch): Add "mve" to arch.
(eq_attr "arch" "mve"): Enable on TARGET_HAVE_MVE is true.
(vfp_pop_multiple_with_writeback): Replace "TARGET_HARD_FLOAT
|| TARGET_HAVE_MVE" with equivalent macro TARGET_VFP_BASE.
* config/arm/constraints.md (Uf): Define to allow modification to FPCCR
in MVE.
* config/arm/thumb2.md (thumb2_movsfcc_soft_insn): Modify target guard
to not allow for MVE.
* config/arm/unspecs.md (UNSPEC_GET_FPSCR): Move to volatile unspecs
enum.
(VUNSPEC_GET_FPSCR): Define.
* config/arm/vfp.md (thumb2_movhi_vfp): Add support for VMSR and VMRS
instructions which move to general-purpose Register from Floating-point
Special register and vice-versa.
(thumb2_movhi_fp16): Likewise.
(thumb2_movsi_vfp): Add support for VMSR and VMRS instructions along
with MCR and MRC instructions which set and get Floating-point Status
and Control Register (FPSCR).
(movdi_vfp): Modify pattern to enable Single-precision scalar float move
in MVE.
(thumb2_movdf_vfp): Modify pattern to enable Double-precision scalar
float move patterns in MVE.
(thumb2_movsfcc_vfp): Modify pattern to enable single float conditional
code move patterns of VFP also in MVE by adding TARGET_VFP_BASE check.
(thumb2_movdfcc_vfp): Modify pattern to enable double float conditional
code move patterns of VFP also in MVE by adding TARGET_VFP_BASE check.
(push_multi_vfp): Add support to use VFP VPUSH pattern for MVE by adding
TARGET_VFP_BASE check.
(set_fpscr): Add support to set FPSCR register for MVE. Modify pattern
using VFPCC_REGNUM as few MVE intrinsics use carry bit of FPSCR
register.
(get_fpscr): Add support to get FPSCR register for MVE. Modify pattern
using VFPCC_REGNUM as few MVE intrinsics use carry bit of FPSCR
register.
2020-03-16 Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/mve_fp_fpu1.c: New test.
* gcc.target/arm/mve/intrinsics/mve_fp_fpu2.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_fpu1.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_fpu2.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_fpu3.c: Likewise.
Srinath Parvathaneni [Mon, 16 Mar 2020 17:06:29 +0000 (17:06 +0000)]
[ARM][GCC][1/x]: MVE ACLE intrinsics framework patch.
This patch creates the required framework for MVE ACLE intrinsics.
The following changes are done in this patch to support MVE ACLE intrinsics.
Header file arm_mve.h, which contains the definitions of the MVE ACLE intrinsics
and the different data types used in MVE, is added to the source tree. Machine description
file mve.md, which contains the RTL patterns defined for MVE, is also added.
A new register "p0" is added, which is used by MVE predicated patterns. A new register class "VPR_REG"
is added and its contents are defined in REG_CLASS_CONTENTS.
The vec-common.md file is modified to support the standard move patterns. The prefix of neon functions
which are also used by MVE is changed from "neon_" to "simd_".
e.g. neon_immediate_valid_for_move is changed to simd_immediate_valid_for_move.
In the patch the standard patterns mve_move, mve_store and mve_load for MVE are added, and the neon.md and vfp.md
files are modified to support these common patterns.
Please refer to Arm reference manual [1] for more details.
[1] https://developer.arm.com/docs/ddi0553/latest
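A minimal sketch of the kind of code the new framework handles (hypothetical
snippet, not one of the added testcases; it assumes -march=armv8.1-m.main+mve
-mfloat-abi=hard -mthumb):
#include "arm_mve.h"
int32x4_t
load_and_copy (int32x4_t *p)
{
  /* The load, register copy and return all go through the new
     mve_mov<mode> move patterns.  */
  int32x4_t v = *p;
  return v;
}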
2020-03-06 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config.gcc (arm_mve.h): Include mve intrinsics header file.
* config/arm/aout.h (p0): Add new register name for MVE predicated
cases.
* config/arm/arm-builtins.c (ARM_BUILTIN_SIMD_LANE_CHECK): Define macro
common to Neon and MVE.
(ARM_BUILTIN_NEON_LANE_CHECK): Renamed to ARM_BUILTIN_SIMD_LANE_CHECK.
(arm_init_simd_builtin_types): Disable poly types for MVE.
(arm_init_neon_builtins): Move a check to arm_init_builtins function.
(arm_init_builtins): Use ARM_BUILTIN_SIMD_LANE_CHECK instead of
ARM_BUILTIN_NEON_LANE_CHECK.
(mve_dereference_pointer): Add function.
(arm_expand_builtin_args): Call to mve_dereference_pointer when MVE is
enabled.
(arm_expand_neon_builtin): Moved to arm_expand_builtin function.
(arm_expand_builtin): Moved from arm_expand_neon_builtin function.
* config/arm/arm-c.c (__ARM_FEATURE_MVE): Define macro for MVE and MVE
with floating point enabled.
* config/arm/arm-protos.h (neon_immediate_valid_for_move): Renamed to
simd_immediate_valid_for_move.
(simd_immediate_valid_for_move): Renamed from
neon_immediate_valid_for_move function.
* config/arm/arm.c (arm_options_perform_arch_sanity_checks): Generate
error if vfpv2 feature bit is disabled and mve feature bit is also
disabled for HARD_FLOAT_ABI.
(use_return_insn): Check to not push VFP regs for MVE.
(aapcs_vfp_allocate): Add MVE check to have same Procedure Call Standard
as Neon.
(aapcs_vfp_allocate_return_reg): Likewise.
(thumb2_legitimate_address_p): Check to return 0 on valid Thumb-2
address operand for MVE.
(arm_rtx_costs_internal): MVE check to determine cost of rtx.
(neon_valid_immediate): Rename to simd_valid_immediate.
(simd_valid_immediate): Rename from neon_valid_immediate.
(simd_valid_immediate): MVE check on size of vector is 128 bits.
(neon_immediate_valid_for_move): Rename to
simd_immediate_valid_for_move.
(simd_immediate_valid_for_move): Rename from
neon_immediate_valid_for_move.
(neon_immediate_valid_for_logic): Modify call to neon_valid_immediate
function.
(neon_make_constant): Modify call to neon_valid_immediate function.
(neon_vector_mem_operand): Return VFP register for POST_INC or PRE_DEC
for MVE.
(output_move_neon): Add MVE check to generate vldm/vstm instructions.
(arm_compute_frame_layout): Calculate space for saved VFP registers for
MVE.
(arm_save_coproc_regs): Save coproc registers for MVE.
(arm_print_operand): Add case 'E' to print memory operands for MVE.
(arm_print_operand_address): Check to print register number for MVE.
(arm_hard_regno_mode_ok): Check for arm hard regno mode ok for MVE.
(arm_modes_tieable_p): Check to allow structure mode for MVE.
(arm_regno_class): Add VPR_REGNUM check.
(arm_expand_epilogue_apcs_frame): MVE check to calculate epilogue code
for APCS frame.
(arm_expand_epilogue): MVE check for enabling pop instructions in
epilogue.
(arm_print_asm_arch_directives): Modify function to disable print of
.arch_extension "mve" and "fp" for cases where MVE is enabled with
"SOFT FLOAT ABI".
(arm_vector_mode_supported_p): Check for modes available in MVE integer
and MVE floating point.
(arm_array_mode_supported_p): Add TARGET_HAVE_MVE check for array mode
pointer support.
(arm_conditional_register_usage): Enable usage of conditional register
for MVE.
(fixed_regs[VPR_REGNUM]): Enable VPR_REG for MVE.
(arm_declare_function_name): Modify function to disable print of
.arch_extension "mve" and "fp" for cases where MVE is enabled with
"SOFT FLOAT ABI".
* config/arm/arm.h (TARGET_HAVE_MVE): Disable for soft float abi and
when target general registers are required.
(TARGET_HAVE_MVE_FLOAT): Likewise.
(FIXED_REGISTERS): Add bit for VFP_REG class which is enabled in arm.c
for MVE.
(CALL_USED_REGISTERS): Set bit for VFP_REG class in CALL_USED_REGISTERS
which indicate this is not available for across function calls.
(FIRST_PSEUDO_REGISTER): Modify.
(VALID_MVE_MODE): Define valid MVE mode.
(VALID_MVE_SI_MODE): Define valid MVE SI mode.
(VALID_MVE_SF_MODE): Define valid MVE SF mode.
(VALID_MVE_STRUCT_MODE): Define valid MVE struct mode.
(VPR_REGNUM): Add Vector Predication Register in arm_regs_in_sequence
for MVE.
(IS_VPR_REGNUM): Macro to check for VPR_REG register.
(REG_ALLOC_ORDER): Add VPR_REGNUM entry.
(enum reg_class): Add VPR_REG entry.
(REG_CLASS_NAMES): Add VPR_REG entry.
* config/arm/arm.md (VPR_REGNUM): Define.
(conds): Check is_mve_type attribute to differentiate "conditional" and
"unconditional" instructions.
(arm_movsf_soft_insn): Modify RTL to not allow for MVE.
(movdf_soft_insn): Modify RTL to not allow for MVE.
(vfp_pop_multiple_with_writeback): Enable for MVE.
(include "mve.md"): Include mve.md file.
* config/arm/arm_mve.h: Add MVE intrinsics head file.
* config/arm/constraints.md (Up): Constraint to enable "p0" register in MVE
for vector predicated operands.
* config/arm/iterators.md (VNIM1): Define.
(VNINOTM1): Define.
(VHFBF_split): Define.
* config/arm/mve.md: New file.
(mve_mov<mode>): Define RTL for move, store and load in MVE.
(mve_mov<mode>): Define move RTL pattern with vec_duplicate operator for
second operand.
* config/arm/neon.md (neon_immediate_valid_for_move): Rename with
simd_immediate_valid_for_move.
(neon_mov<mode>): Split pattern and move expand pattern "movv8hf" which
is common to MVE and NEON to vec-common.md file.
(vec_init<mode><V_elem_l>): Add TARGET_HAVE_MVE check.
* config/arm/predicates.md (vpr_register_operand): Define.
* config/arm/t-arm: Add mve.md file.
* config/arm/types.md (mve_move): Add MVE instructions mve_move to
attribute "type".
(mve_store): Add MVE instructions mve_store to attribute "type".
(mve_load): Add MVE instructions mve_load to attribute "type".
(is_mve_type): Define attribute.
* config/arm/vec-common.md (mov<mode>): Modify RTL expand to support
standard move patterns in MVE along with NEON and IWMMXT with mode
iterator VNIM1.
(mov<mode>): Modify RTL expand to support standard move patterns in NEON
and IWMMXT with mode iterator V8HF.
(movv8hf): Define RTL expand to support standard "movv8hf" pattern in
NEON and MVE.
* config/arm/vfp.md (neon_immediate_valid_for_move): Rename to
simd_immediate_valid_for_move.
2020-03-16 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/mve_vector_float.c: New test.
* gcc.target/arm/mve/intrinsics/mve_vector_float1.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_vector_float2.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_vector_int.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_vector_int1.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_vector_int2.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_vector_uint.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_vector_uint1.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_vector_uint2.c: Likewise.
* gcc.target/arm/mve/mve.exp: New file.
* lib/target-supports.exp
(check_effective_target_arm_v8_1m_mve_fp_ok_nocache): Proc to check
armv8.1-m.main+mve.fp and returning corresponding options.
(check_effective_target_arm_v8_1m_mve_fp_ok): Proc to call
check_effective_target_arm_v8_1m_mve_fp_ok_nocache to check support of
MVE with floating point on the current target.
(add_options_for_arm_v8_1m_mve_fp): Proc to call
check_effective_target_arm_v8_1m_mve_fp_ok to return corresponding
compiler options for MVE with floating point.
(check_effective_target_arm_v8_1m_mve_ok_nocache): Modify to test and
return hard float-abi on success.
Iain Buclaw [Mon, 16 Mar 2020 16:00:07 +0000 (17:00 +0100)]
libphobos: Reset libtool_VERSION to 1:0:0
libphobos/ChangeLog:
2020-03-16 Iain Buclaw <ibuclaw@gdcproject.org>
PR d/92792
* Makefile.in: Regenerate.
* configure: Regenerate.
* configure.ac (libtool_VERSION): Reset to 1:0:0.
* libdruntime/Makefile.in: Regenerate.
Aaron Sawdey [Mon, 16 Mar 2020 14:29:05 +0000 (09:29 -0500)]
Fix ChangeLog formatting from my commit last Friday.
Uros Bizjak [Mon, 16 Mar 2020 13:38:06 +0000 (14:38 +0100)]
x32 does not support the MS ABI; skip testcases that require it.
* testsuite/20_util/bind/91371.cc: Skip for x32.
* testsuite/20_util/is_function/91371.cc: Ditto.
* testsuite/20_util/is_member_function_pointer/91371.cc: Ditto.
* testsuite/20_util/is_object/91371.cc: Ditto.
H.J. Lu [Mon, 16 Mar 2020 10:48:55 +0000 (03:48 -0700)]
i386: Use ix86_output_ssemov for SImode TYPE_SSEMOV
There is no need to set mode attribute to XImode since ix86_output_ssemov
can properly encode xmm16-xmm31 registers with and without AVX512VL.
Remove ext_sse_reg_operand since it is no longer needed.
gcc/
PR target/89229
* config/i386/i386.md (*movsi_internal): Call ix86_output_ssemov
for TYPE_SSEMOV. Remove ext_sse_reg_operand and TARGET_AVX512VL
check.
* config/i386/predicates.md (ext_sse_reg_operand): Removed.
gcc/testsuite/
PR target/89229
* gcc.target/i386/pr89229-7a.c: New test.
* gcc.target/i386/pr89229-7b.c: Likewise.
* gcc.target/i386/pr89229-7c.c: Likewise.
Iain Buclaw [Mon, 16 Mar 2020 08:48:54 +0000 (09:48 +0100)]
d/dmd: Merge upstream dmd
b061bd744
Fixes an ICE in the parser, and deprecates a previously allowed style of
syntax that deviated from GNU-style extended asm.
Reviewed-on: https://github.com/dlang/dmd/pull/10916
gcc/testsuite/ChangeLog:
2020-03-16 Iain Buclaw <ibuclaw@gdcproject.org>
* gdc.dg/asm1.d: Add new test for ICE in asm parser.
* gdc.dg/asm5.d: New test.
Iain Buclaw [Sun, 15 Mar 2020 12:59:14 +0000 (13:59 +0100)]
libphobos: Merge upstream druntime
6c45dd3a, phobos
68cc18adb.
Surrounds the gcc-style asm operands with parentheses, as the old style
is now deprecated.
Reviewed-on: https://github.com/dlang/druntime/pull/2986
Jakub Jelinek [Mon, 16 Mar 2020 08:03:59 +0000 (09:03 +0100)]
tree-inline: Fix a -fcompare-debug issue in the inliner [PR94167]
The following testcase fails with -fcompare-debug. The problem is that
bar is marked as address_taken only with -g and not without.
I've tracked it down to insert_init_stmt calling gimple_regimplify_operands
even on DEBUG_STMTs. That function will just insert normal stmts before
the DEBUG_STMT if the DEBUG_STMT operand isn't gimple val or invariant.
While DCE will turn those statements into debug temporaries, it can cause
differences in SSA_NAMEs and more importantly, the ipa references are
generated from those before the DCE happens.
On the testcase, the DEBUG_STMT value is (int)bar.
We could generate DEBUG_STMTs with debug temporaries instead, but I fail to
see the reason to do that: DEBUG_STMTs allow other expressions, and all we
want to ensure is that the expressions aren't too large (arbitrarily
complex), but during inlining/function versioning I don't see why something
would queue a DEBUG_STMT with arbitrarily complex expressions there.
2020-03-16 Jakub Jelinek <jakub@redhat.com>
PR debug/94167
* tree-inline.c (insert_init_stmt): Don't gimple_regimplify_operands
DEBUG_STMTs.
* gcc.dg/pr94167.c: New test.
Jakub Jelinek [Mon, 16 Mar 2020 08:02:21 +0000 (09:02 +0100)]
tree-ssa-reassoc: Use SSA_NAME_VERSION as a secondary comparison key in sort_by_mach_mode [PR94166]
2020-03-16 Jakub Jelinek <jakub@redhat.com>
PR tree-optimization/94166
* tree-ssa-reassoc.c (sort_by_mach_mode): Use SSA_NAME_VERSION
as secondary comparison key.
* gcc.dg/pr94166.c: New test.
Bin Cheng [Mon, 16 Mar 2020 03:09:14 +0000 (11:09 +0800)]
Update post order number for merged SCC.
Function loop_distribution::break_alias_scc_partitions needs to compute
SCCs with runtime alias edges skipped. As a result, partitions could be
re-assigned a larger post order number than the SCC's preceding partition and
get distributed before that partition. This fixes the issue by updating
the merged partition to the minimal post order number in the SCC.
gcc/
PR tree-optimization/94125
* tree-loop-distribution.c
(loop_distribution::break_alias_scc_partitions): Update post order
number for merged scc.
gcc/testsuite/
PR tree-optimization/94125
* gcc.dg/tree-ssa/pr94125.c: New test.
GCC Administrator [Mon, 16 Mar 2020 00:16:17 +0000 (00:16 +0000)]
Daily bump.
H.J. Lu [Sun, 15 Mar 2020 17:21:08 +0000 (10:21 -0700)]
i386: Use ix86_output_ssemov for SFmode TYPE_SSEMOV
There is no need to set mode attribute to V16SFmode since ix86_output_ssemov
can properly encode xmm16-xmm31 registers with and without AVX512VL.
gcc/
PR target/89229
* config/i386/i386.c (ix86_output_ssemov): Handle MODE_SI and
MODE_SF.
* config/i386/i386.md (*movsf_internal): Call ix86_output_ssemov
for TYPE_SSEMOV. Remove TARGET_PREFER_AVX256, TARGET_AVX512VL
and ext_sse_reg_operand check.
gcc/testsuite/
PR target/89229
* gcc.target/i386/pr89229-6a.c: New test.
* gcc.target/i386/pr89229-6b.c: Likewise.
* gcc.target/i386/pr89229-6c.c: Likewise.
Iain Sandoe [Sun, 15 Mar 2020 14:22:18 +0000 (14:22 +0000)]
coroutines: Fix indentation (NFC).
Whitespace-only change.
gcc/cp/ChangeLog:
2020-03-15 Iain Sandoe <iain@sandoe.co.uk>
* coroutines.cc (co_await_expander): Fix indentation.
Lewis Hyatt [Sun, 15 Mar 2020 12:58:30 +0000 (08:58 -0400)]
driver: Fix redundant descriptions in options
Addresses issues where the two-column format of options descriptions was
used, but the columns were separated by spaces rather than a single tab,
causing the help output to be more verbose than intended.
gcc/ChangeLog:
2020-03-15 Lewis Hyatt <lhyatt@gmail.com>
* common.opt: Avoid redundancy in the help text.
* config/arc/arc.opt: Likewise.
* config/cr16/cr16.opt: Likewise.
gcc/c-family/ChangeLog:
2020-03-15 Lewis Hyatt <lhyatt@gmail.com>
* c.opt: Avoid redundancy in the help text.
gcc/fortran/ChangeLog:
2020-03-15 Lewis Hyatt <lhyatt@gmail.com>
* lang.opt: Avoid redundancy in the help text.
gcc/testsuite/ChangeLog:
2020-03-15 Lewis Hyatt <lhyatt@gmail.com>
* gcc.misc-tests/help.exp: Adapt to new output for
-Walloc-size-larger-than= option.
Jakub Jelinek [Sun, 15 Mar 2020 00:27:40 +0000 (01:27 +0100)]
tree-nested: Fix handling of *reduction clauses with C array sections [PR93566]
tree-nested.c didn't handle C array sections in {,task_,in_}reduction clauses.
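A minimal sketch of the kind of construct involved (hypothetical, not the
actual libgomp testcase): an OpenMP reduction clause with a C array section
inside a GNU C nested function, which tree-nested.c has to rewrite:
void
outer (void)
{
  int a[4] = { 0, 0, 0, 0 };
  void inner (void)   /* nested function, processed by tree-nested.c */
  {
    #pragma omp parallel for reduction(+: a[0:4])
    for (int i = 0; i < 4; i++)
      a[i] += i;
  }
  inner ();
}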
2020-03-14 Jakub Jelinek <jakub@redhat.com>
PR middle-end/93566
* tree-nested.c (convert_nonlocal_omp_clauses,
convert_local_omp_clauses): Handle {,in_,task_}reduction clauses
with C/C++ array sections.
* testsuite/libgomp.c/pr93566.c: New test.
GCC Administrator [Sun, 15 Mar 2020 00:16:14 +0000 (00:16 +0000)]
Daily bump.
H.J. Lu [Sat, 14 Mar 2020 23:06:55 +0000 (16:06 -0700)]
i386: Use ix86_output_ssemov for DImode TYPE_SSEMOV
There is no need to set mode attribute to XImode since ix86_output_ssemov
can properly encode xmm16-xmm31 registers with and without AVX512VL.
gcc/
PR target/89229
* config/i386/i386.md (*movdi_internal): Call ix86_output_ssemov
for TYPE_SSEMOV. Remove ext_sse_reg_operand and TARGET_AVX512VL
check.
gcc/testsuite/
PR target/89229
* gcc.target/i386/pr89229-5a.c: New test.
* gcc.target/i386/pr89229-5b.c: Likewise.
* gcc.target/i386/pr89229-5c.c: Likewise.
Jason Merrill [Sat, 14 Mar 2020 21:10:39 +0000 (17:10 -0400)]
c++: Fix ICE-after-error on partial spec [92068]
Here the template arguments for the partial specialization are valid
arguments for the template, but not for a partial specialization, because
'd' can never be deduced to anything other than an empty pack.
gcc/cp/ChangeLog
2020-03-14 Jason Merrill <jason@redhat.com>
PR c++/92068
* pt.c (process_partial_specialization): Error rather than crash on
extra pack expansion.
Jason Merrill [Sat, 14 Mar 2020 21:10:39 +0000 (17:10 -0400)]
c++: Find parameter pack in typedef in lambda [92909].
find_parameter_packs_r doesn't look through typedefs, which is normally
correct, but that means we need to handle their declarations specially.
gcc/cp/ChangeLog
2020-03-14 Jason Merrill <jason@redhat.com>
PR c++/92909
* pt.c (find_parameter_packs_r): [DECL_EXPR]: Walk
DECL_ORIGINAL_TYPE of a typedef.
Jason Merrill [Sat, 14 Mar 2020 21:10:39 +0000 (17:10 -0400)]
c++: Fix CTAD with multiple-arg ctor template [93248].
When cp_unevaluated_operand is set, tsubst_decl thinks that if it sees a
PARM_DECL that isn't already in local_specializations, we're in a decltype
in a trailing return type or some such, and so we only want a substitution
for a single PARM_DECL. In this case, we want the whole chain, so make sure
cp_unevaluated_operand is cleared.
gcc/cp/ChangeLog
2020-03-14 Jason Merrill <jason@redhat.com>
PR c++/93248
* pt.c (build_deduction_guide): Clear cp_unevaluated_operand for
substituting DECL_ARGUMENTS.