[PATCH v2 25/35] tcg/ppc: Tidy up tcg_target_const_match

From: Richard Henderson
Subject: [PATCH v2 25/35] tcg/ppc: Tidy up tcg_target_const_match
Date: Sat, 28 Oct 2023 12:45:12 -0700
Signed-off-by: Richard Henderson <richard.henderson@linaro.org>
---
 tcg/ppc/tcg-target.c.inc | 27 ++++++++++++++++-----------
 1 file changed, 16 insertions(+), 11 deletions(-)

diff --git a/tcg/ppc/tcg-target.c.inc b/tcg/ppc/tcg-target.c.inc
index 1ac11acc7c..13d43ef9ba 100644
--- a/tcg/ppc/tcg-target.c.inc
+++ b/tcg/ppc/tcg-target.c.inc
@@ -282,31 +282,36 @@ static bool reloc_pc34(tcg_insn_unit *src_rw, const tcg_insn_unit *target)
 }
 
 /* test if a constant matches the constraint */
-static bool tcg_target_const_match(int64_t val, int ct,
+static bool tcg_target_const_match(int64_t sval, int ct,
                                    TCGType type, TCGCond cond, int vece)
 {
+    uint64_t uval = sval;
+
     if (ct & TCG_CT_CONST) {
         return 1;
     }
 
-    /* The only 32-bit constraint we use aside from
-       TCG_CT_CONST is TCG_CT_CONST_S16. */
     if (type == TCG_TYPE_I32) {
-        val = (int32_t)val;
+        uval = (uint32_t)sval;
+        sval = (int32_t)sval;
     }
 
-    if ((ct & TCG_CT_CONST_S16) && val == (int16_t)val) {
+    if ((ct & TCG_CT_CONST_S16) && sval == (int16_t)sval) {
         return 1;
-    } else if ((ct & TCG_CT_CONST_S32) && val == (int32_t)val) {
+    }
+    if ((ct & TCG_CT_CONST_S32) && sval == (int32_t)sval) {
         return 1;
-    } else if ((ct & TCG_CT_CONST_U32) && val == (uint32_t)val) {
+    }
+    if ((ct & TCG_CT_CONST_U32) && uval == (uint32_t)uval) {
         return 1;
-    } else if ((ct & TCG_CT_CONST_ZERO) && val == 0) {
+    }
+    if ((ct & TCG_CT_CONST_ZERO) && sval == 0) {
         return 1;
-    } else if ((ct & TCG_CT_CONST_MONE) && val == -1) {
+    }
+    if ((ct & TCG_CT_CONST_MONE) && sval == -1) {
         return 1;
-    } else if ((ct & TCG_CT_CONST_WSZ)
-               && val == (type == TCG_TYPE_I32 ? 32 : 64)) {
+    }
+    if ((ct & TCG_CT_CONST_WSZ) && sval == (type == TCG_TYPE_I32 ? 32 : 64)) {
         return 1;
     }
     return 0;
--
2.34.1
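
[Editorial illustration, not part of the patch.] A minimal standalone C sketch of the matching logic as it stands after this change. The constraint flag values, the TCGType enum, and the const_match wrapper below are simplified stand-ins for illustration, not QEMU's actual definitions; only the body of the function mirrors the patched code.

/* Standalone sketch of the patched constant-matching logic.
 * Flag values and the type enum are simplified stand-ins. */
#include <stdint.h>
#include <stdio.h>
#include <stdbool.h>

#define TCG_CT_CONST      (1 << 0)
#define TCG_CT_CONST_S16  (1 << 1)
#define TCG_CT_CONST_S32  (1 << 2)
#define TCG_CT_CONST_U32  (1 << 3)
#define TCG_CT_CONST_ZERO (1 << 4)
#define TCG_CT_CONST_MONE (1 << 5)
#define TCG_CT_CONST_WSZ  (1 << 6)

typedef enum { TCG_TYPE_I32, TCG_TYPE_I64 } TCGType;

static bool const_match(int64_t sval, int ct, TCGType type)
{
    uint64_t uval = sval;

    if (ct & TCG_CT_CONST) {
        return true;
    }
    if (type == TCG_TYPE_I32) {
        /* Keep a zero-extended and a sign-extended view of the 32-bit
           constant, so unsigned and signed tests stay distinct. */
        uval = (uint32_t)sval;
        sval = (int32_t)sval;
    }
    if ((ct & TCG_CT_CONST_S16) && sval == (int16_t)sval) {
        return true;
    }
    if ((ct & TCG_CT_CONST_S32) && sval == (int32_t)sval) {
        return true;
    }
    if ((ct & TCG_CT_CONST_U32) && uval == (uint32_t)uval) {
        return true;
    }
    if ((ct & TCG_CT_CONST_ZERO) && sval == 0) {
        return true;
    }
    if ((ct & TCG_CT_CONST_MONE) && sval == -1) {
        return true;
    }
    if ((ct & TCG_CT_CONST_WSZ) && sval == (type == TCG_TYPE_I32 ? 32 : 64)) {
        return true;
    }
    return false;
}

int main(void)
{
    /* 0xffffffff as a 32-bit operand: the zero-extended view satisfies
       U32, while the sign-extended view (-1) satisfies MONE and S16. */
    printf("%d\n", const_match(0xffffffffll, TCG_CT_CONST_U32, TCG_TYPE_I32));
    printf("%d\n", const_match(0xffffffffll, TCG_CT_CONST_MONE, TCG_TYPE_I32));
    printf("%d\n", const_match(0xffffffffll, TCG_CT_CONST_S16, TCG_TYPE_I32));
    return 0;
}

Compiled and run as an ordinary C file, each case prints 1: keeping separate sval/uval variables is what lets the same 32-bit constant satisfy both unsigned and signed constraints, the distinction this tidy-up makes explicit.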