Merge in wide-int.

From-SVN: r210113
Kenneth Zadeck 2014-05-06 16:25:05 +00:00, committed by Mike Stump
parent 6122336c83
commit 807e902eea
195 changed files with 11299 additions and 4874 deletions

gcc/ChangeLog.wide-int (new file, 899 lines)

@@ -0,0 +1,899 @@
2013-11-21 Kenneth Zadeck <zadeck@naturalbridge.com>
Mike Stump <mikestump@comcast.net>
Richard Sandiford <rdsandiford@googlemail.com>
Kyrylo Tkachov <kyrylo.tkachov@arm.com>
* alias.c (ao_ref_from_mem): Use wide-int interfaces.
(rtx_equal_for_memref_p): Update comment.
(adjust_offset_for_component_ref): Use wide-int interfaces.
* builtins.c (get_object_alignment_2): Likewise.
(c_readstr): Likewise.
(target_char_cast): Add comment.
(determine_block_size): Use wide-int interfaces.
(expand_builtin_signbit): Likewise.
(fold_builtin_int_roundingfn): Likewise.
(fold_builtin_bitop): Likewise.
(fold_builtin_bswap): Likewise.
(fold_builtin_logarithm): Use signop.
(fold_builtin_pow): Likewise.
(fold_builtin_memory_op): Use wide-int interfaces.
(fold_builtin_object_size): Likewise.
* cfgloop.c (alloc_loop): Initialize nb_iterations_upper_bound and
nb_iterations_estimate.
(record_niter_bound): Use wide-int interfaces.
(get_estimated_loop_iterations_int): Likewise.
(get_estimated_loop_iterations): Likewise.
(get_max_loop_iterations): Likewise.
* cfgloop.h: Include wide-int.h.
(struct nb_iter_bound): Change bound to widest_int.
(struct loop): Change nb_iterations_upper_bound and
nb_iterations_estimate to widest_int.
(record_niter_bound): Switch to use widest_int.
(get_estimated_loop_iterations): Likewise.
(get_max_loop_iterations): Likewise.
(gcov_type_to_double_int): Rename to gcov_type_to_wide_int and
update for wide-int.
* cgraph.c (cgraph_add_thunk): Use wide-int interfaces.
* combine.c (try_combine): Likewise.
(subst): Use CONST_SCALAR_INT_P rather than CONST_INT_P.
* config/aarch64/aarch64.c (aapcs_vfp_sub_candidate): Use wide-int
interfaces.
(aarch64_float_const_representable_p): Likewise.
* config/arc/arc.c: Include wide-int.h.
(arc_can_use_doloop_p): Use wide-int interfaces.
* config/arm/arm.c (aapcs_vfp_sub_candidate): Likewise.
(vfp3_const_double_index): Likewise.
* config/avr/avr.c (avr_out_round): Likewise.
(avr_fold_builtin): Likewise.
* config/bfin/bfin.c (bfin_local_alignment): Likewise.
(bfin_can_use_doloop_p): Likewise.
* config/darwin.c (darwin_mergeable_constant_section): Likewise.
(machopic_select_rtx_section): Update to handle CONST_WIDE_INT.
* config/i386/i386.c: Include wide-int.h.
(ix86_data_alignment): Use wide-int interfaces.
(ix86_local_alignment): Likewise.
(ix86_emit_swsqrtsf): Update real_from_integer.
* config/msp430/msp430.c (msp430_attr): Use wide-int interfaces.
* config/nds32/nds32.c (nds32_insert_attributes): Likewise.
* config/rs6000/predicates.md (any_operand): Add const_wide_int.
(zero_constant): Likewise.
(input_operand): Likewise.
(splat_input_operand): Likewise.
(non_logical_cint_operand): Change const_double to const_wide_int.
* config/rs6000/rs6000.c (num_insns_constant): Handle CONST_WIDE_INT.
(easy_altivec_constant): Remove comment.
(paired_expand_vector_init): Use CONSTANT_P.
(rs6000_legitimize_address): Handle CONST_WIDE_INT.
(rs6000_emit_move): Update checks.
(rs6000_aggregate_candidate): Use wide-int interfaces.
(rs6000_expand_ternop_builtin): Likewise.
(rs6000_output_move_128bit): Handle CONST_WIDE_INT.
(rs6000_assemble_integer): Likewise.
(rs6000_hash_constant): Likewise.
(output_toc): Likewise.
(rs6000_rtx_costs): Likewise.
(rs6000_emit_swrsqrt): Update call to real_from_integer.
* config/rs6000/rs6000-c.c: Include wide-int.h.
(altivec_resolve_overloaded_builtin): Use wide-int interfaces.
* config/rs6000/rs6000.h (TARGET_SUPPORTS_WIDE_INT): New.
* config/rs6000/rs6000.md: Use const_scalar_int_operand.
Handle CONST_WIDE_INT.
* config/sol2-c.c (solaris_pragma_align): Change low to unsigned HWI.
Use tree_fits_uhwi_p.
* config/sparc/sparc.c: Include wide-int.h.
(sparc_fold_builtin): Use wide-int interfaces.
* config/vax/vax.c: Include wide-int.h.
(vax_float_literal): Use real_from_integer.
* coretypes.h (struct hwivec_def): New.
(hwivec): New.
(const_hwivec): New.
* cse.c (hash_rtx_cb): Handle CONST_WIDE_INT.
(equiv_constant): Handle CONST_WIDE_INT.
* cselib.c (rtx_equal_for_cselib_1): Use CASE_CONST_UNIQUE.
(cselib_hash_rtx): Handle CONST_WIDE_INT.
* dbxout.c (stabstr_U): Use wide-int interfaces.
(dbxout_type): Update to use cst_fits_shwi_p.
* defaults.h (LOG2_BITS_PER_UNIT): Define.
(TARGET_SUPPORTS_WIDE_INT): Add default.
* dfp.c: Include wide-int.h.
(decimal_real_to_integer2): Use wide-int interfaces and rename to
decimal_real_to_integer.
* dfp.h (decimal_real_to_integer2): Return a wide_int and rename to
decimal_real_to_integer.
* doc/generic.texi (Constant expressions): Update for wide_int.
* doc/rtl.texi (const_double): Likewise.
(const_wide_int, CONST_WIDE_INT, CONST_WIDE_INT_VEC): New.
(CONST_WIDE_INT_NUNITS, CONST_WIDE_INT_ELT): New.
* doc/tm.texi.in (REAL_VALUE_TO_INT): Remove.
(REAL_VALUE_FROM_INT): Remove.
(TARGET_SUPPORTS_WIDE_INT): New.
* doc/tm.texi: Regenerate.
* dojump.c (prefer_and_bit_test): Use wide-int interfaces.
* double-int.h: Include wide-int.h.
(struct wi::int_traits): New.
* dwarf2out.c (get_full_len): New.
(dw_val_equal_p): Add case dw_val_class_wide_int.
(size_of_loc_descr): Likewise.
(output_loc_operands): Likewise.
(insert_double): Remove.
(insert_wide_int): New.
(add_AT_wide): New.
(print_die): Add case dw_val_class_wide_int.
(attr_checksum): Likewise.
(attr_checksum_ordered): Likewise.
(same_dw_val_p): Likewise.
(size_of_die): Likewise.
(value_format): Likewise.
(output_die): Likewise.
(double_int_type_size_in_bits): Rename to offset_int_type_size_in_bits.
Use wide-int.
(clz_loc_descriptor): Use wide-int interfaces.
(mem_loc_descriptor): Likewise. Handle CONST_WIDE_INT.
(loc_descriptor): Use wide-int interfaces. Handle CONST_WIDE_INT.
(round_up_to_align): Use wide-int interfaces.
(field_byte_offset): Likewise.
(insert_double): Rename to insert_wide_int. Use wide-int interfaces.
(add_const_value_attribute): Handle CONST_WIDE_INT. Update
CONST_DOUBLE handling. Use wide-int interfaces.
(add_bound_info): Use tree_fits_uhwi_p. Use wide-int interfaces.
(gen_enumeration_type_die): Use add_AT_wide.
(hash_loc_operands): Add case dw_val_class_wide_int.
(compare_loc_operands): Likewise.
* dwarf2out.h: Include wide-int.h.
(wide_int_ptr): New.
(enum dw_val_class): Add dw_val_class_wide_int.
(struct dw_val_struct): Add val_wide.
* emit-rtl.c (const_wide_int_htab): New.
(const_wide_int_htab_hash): New.
(const_wide_int_htab_eq): New.
(lookup_const_wide_int): New.
(const_double_htab_hash): Use wide-int interfaces.
(const_double_htab_eq): Likewise.
(rtx_to_double_int): Conditionally compile for wide-int.
(immed_double_int_const): Rename to immed_wide_int_const and
update for wide-int.
(immed_double_const): Conditionally compile for wide-int.
(init_emit_once): Use wide-int interfaces.
* explow.c (plus_constant): Likewise.
* expmed.c (mask_rtx): Move further up file. Use wide-int interfaces.
(lshift_value): Use wide-int interfaces.
(expand_mult): Likewise.
(choose_multiplier): Likewise.
(expand_smod_pow2): Likewise.
(make_tree): Likewise.
* expr.c (convert_modes): Consolidate handling of constants.
Use wide-int interfaces.
(emit_group_load_1): Add note.
(store_expr): Update comment.
(get_inner_reference): Use wide-int interfaces.
(expand_constructor): Update comment.
(expand_expr_real_2): Use wide-int interfaces.
(expand_expr_real_1): Likewise.
(reduce_to_bit_field_precision): Likewise.
(const_vector_from_tree): Likewise.
* final.c: Include wide-int-print.h.
(output_addr_const): Handle CONST_WIDE_INT. Use CONST_DOUBLE_AS_INT_P.
* fixed-value.c: Include wide-int.h.
(fixed_from_string): Use wide-int interfaces.
(fixed_to_decimal): Likewise.
(fixed_convert_from_real): Likewise.
(real_convert_from_fixed): Likewise.
* fold-const.h (mem_ref_offset): Return an offset_int.
(div_if_zero_remainder): Remove code parameter.
* fold-const.c (div_if_zero_remainder): Remove code parameter.
Use wide-int interfaces.
(may_negate_without_overflow_p): Use wide-int interfaces.
(negate_expr_p): Likewise.
(fold_negate_expr): Likewise.
(int_const_binop_1): Likewise.
(const_binop): Likewise.
(fold_convert_const_int_from_int): Likewise.
(fold_convert_const_int_from_real): Likewise.
(fold_convert_const_int_from_fixed): Likewise.
(fold_convert_const_fixed_from_int): Likewise.
(all_ones_mask_p): Take an unsigned size. Use wide-int interfaces.
(sign_bit_p): Use wide-int interfaces.
(make_range_step): Likewise.
(build_range_check): Likewise. Pass an integer of the correct type
instead of using integer_one_node.
(range_predecessor): Pass an integer of the correct type instead
of using integer_one_node.
(range_successor): Likewise.
(merge_ranges): Likewise.
(unextend): Use wide-int interfaces.
(extract_muldiv_1): Likewise.
(fold_div_compare): Likewise.
(fold_single_bit_test): Likewise.
(fold_sign_changed_comparison): Likewise.
(try_move_mult_to_index): Update calls to div_if_zero_remainder.
(fold_plusminus_mult_expr): Use wide-int interfaces.
(native_encode_int): Likewise.
(native_interpret_int): Likewise.
(fold_unary_loc): Likewise.
(pointer_may_wrap_p): Likewise.
(size_low_cst): Likewise.
(mask_with_tz): Likewise.
(fold_binary_loc): Likewise.
(fold_ternary_loc): Likewise.
(multiple_of_p): Likewise.
(tree_call_nonnegative_warnv_p): Update calls to
tree_int_cst_min_precision and real_from_integer.
(fold_negate_const): Use wide-int interfaces.
(fold_abs_const): Likewise.
(fold_relational_const): Use tree_int_cst_lt.
(round_up_loc): Use wide-int interfaces.
* genemit.c (gen_exp): Add CONST_WIDE_INT case.
* gengenrtl.c (excluded_rtx): Add CONST_WIDE_INT case.
* gengtype.c: Remove include of double-int.h.
(do_typedef): Use wide-int interfaces.
(open_base_files): Add wide-int.h.
(main): Add offset_int and widest_int typedefs.
* gengtype-lex.l: Handle "^".
(CXX_KEYWORD): Add "static".
* gengtype-parse.c (require3): New.
(require_template_declaration): Handle constant template arguments
and nested templates.
* gengtype-state.c: Don't include "double-int.h".
* genpreds.c (write_one_predicate_function): Update comment.
(write_tm_constrs_h): Add check for hval and lval use in
CONST_WIDE_INT.
* genrecog.c (validate_pattern): Add CONST_WIDE_INT case.
(add_to_sequence): Likewise.
* gensupport.c (struct std_pred_table): Add const_scalar_int_operand
and const_double_operand.
* gimple.c (preprocess_case_label_vec_for_gimple): Use wide-int
interfaces.
* gimple-fold.c (get_base_constructor): Likewise.
(fold_array_ctor_reference): Likewise.
(fold_nonarray_ctor_reference): Likewise.
(fold_const_aggregate_ref_1): Likewise.
(gimple_val_nonnegative_real_p): Likewise.
(gimple_fold_indirect_ref): Likewise.
* gimple-pretty-print.c (dump_ssaname_info): Likewise.
* gimple-ssa-strength-reduction.c: Include wide-int-print.h.
(struct slsr_cand_d): Change index to be widest_int.
(struct incr_info_d): Change incr to be widest_int.
(alloc_cand_and_find_basis): Use wide-int interfaces.
(slsr_process_phi): Likewise.
(backtrace_base_for_ref): Likewise. Return a widest_int.
(restructure_reference): Take a widest_int instead of a double_int.
(slsr_process_ref): Use wide-int interfaces.
(create_mul_ssa_cand): Likewise.
(create_mul_imm_cand): Likewise.
(create_add_ssa_cand): Likewise.
(create_add_imm_cand): Take a widest_int instead of a double_int.
(slsr_process_add): Use wide-int interfaces.
(slsr_process_cast): Likewise.
(slsr_process_copy): Likewise.
(dump_candidate): Likewise.
(dump_incr_vec): Likewise.
(replace_ref): Likewise.
(cand_increment): Likewise. Return a widest_int.
(cand_abs_increment): Likewise.
(replace_mult_candidate): Take a widest_int instead of a double_int.
(replace_unconditional_candidate): Use wide-int interfaces.
(incr_vec_index): Take a widest_int instead of a double_int.
(create_add_on_incoming_edge): Likewise.
(create_phi_basis): Use wide-int interfaces.
(replace_conditional_candidate): Likewise.
(record_increment): Take a widest_int instead of a double_int.
(record_phi_increments): Use wide-int interfaces.
(phi_incr_cost): Take a widest_int instead of a double_int.
(lowest_cost_path): Likewise.
(total_savings): Likewise.
(analyze_increments): Use wide-int interfaces.
(ncd_with_phi): Take a widest_int instead of a double_int.
(ncd_of_cand_and_phis): Likewise.
(nearest_common_dominator_for_cands): Likewise.
(insert_initializers): Use wide-int interfaces.
(all_phi_incrs_profitable): Likewise.
(replace_one_candidate): Likewise.
(replace_profitable_candidates): Likewise.
* godump.c: Include wide-int-print.h.
(go_output_typedef): Use wide-int interfaces.
* graphite-clast-to-gimple.c (gmp_cst_to_tree): Likewise.
* graphite-sese-to-poly.c (tree_int_to_gmp): Likewise.
(build_loop_iteration_domains): Likewise.
* hooks.h: Include wide-int.h rather than double-int.h.
(hook_bool_dint_dint_uint_bool_true): Delete.
(hook_bool_wint_wint_uint_bool_true): Declare.
* hooks.c (hook_bool_dint_dint_uint_bool_true): Removed.
(hook_bool_wint_wint_uint_bool_true): New.
* internal-fn.c (ubsan_expand_si_overflow_addsub_check): Use wide-int
interfaces.
(ubsan_expand_si_overflow_mul_check): Likewise.
* ipa-devirt.c (get_polymorphic_call_info): Likewise.
* ipa-prop.c (compute_complex_assign_jump_func): Likewise.
(get_ancestor_addr_info): Likewise.
(ipa_modify_call_arguments): Likewise.
* loop-doloop.c (doloop_modify): Likewise.
(doloop_optimize): Likewise.
* loop-iv.c (iv_number_of_iterations): Likewise.
* loop-unroll.c (decide_unroll_constant_iterations): Likewise.
(unroll_loop_constant_iterations): Likewise.
(decide_unroll_runtime_iterations): Likewise.
(unroll_loop_runtime_iterations): Likewise.
(decide_peel_simple): Likewise.
(decide_unroll_stupid): Likewise.
* lto-streamer-in.c (streamer_read_wi): Add.
(input_cfg): Use wide-int interfaces.
(lto_input_tree_1): Likewise.
* lto-streamer-out.c (streamer_write_wi): Add.
(hash_tree): Use wide-int interfaces.
(output_cfg): Likewise.
* Makefile.in (OBJS): Add wide-int.o and wide-int-print.o.
(GTFILES): Add wide-int.h and signop.h.
(TAGS): Look for .cc files too.
* omp-low.c (scan_omp_1_op): Use wide-int interfaces.
* optabs.c (expand_subword_shift): Likewise.
(expand_doubleword_shift): Likewise.
(expand_absneg_bit): Likewise.
(expand_copysign_absneg): Likewise.
(expand_copysign_bit): Likewise.
* postreload.c (reload_cse_simplify_set): Likewise.
* predict.c (predict_iv_comparison): Likewise.
* pretty-print.h: Include wide-int-print.h.
(pp_wide_int): New.
* print-rtl.c (print_rtx): Add CONST_WIDE_INT case.
* print-tree.c: Include wide-int-print.h.
(print_node_brief): Use wide-int interfaces.
(print_node): Likewise.
* read-rtl.c (validate_const_wide_int): New.
(read_rtx_code): Add CONST_WIDE_INT case.
* real.c: Include wide-int.h.
(real_to_integer2): Delete.
(real_to_integer): New function, returning a wide_int.
(real_from_integer): Take a wide_int rather than two HOST_WIDE_INTs.
(ten_to_ptwo): Update call to real_from_integer.
(real_digit): Likewise.
* real.h: Include signop.h, wide-int.h and insn-modes.h.
(real_to_integer2, REAL_VALUE_FROM_INT, REAL_VALUE_FROM_UNSIGNED_INT)
(REAL_VALUE_TO_INT): Delete.
(real_to_integer): Declare a wide-int form.
(real_from_integer): Take a wide_int rather than two HOST_WIDE_INTs.
* recog.c (const_int_operand): Improve comment.
(const_scalar_int_operand): New.
(const_double_operand): Add a separate definition for CONST_WIDE_INT.
* rtlanal.c (commutative_operand_precedence): Handle CONST_WIDE_INT.
(split_double): Likewise.
* rtl.c (DEF_RTL_EXPR): Handle CONST_WIDE_INT.
(rtx_size): Likewise.
(rtx_alloc_stat_v): New.
(rtx_alloc_stat): Now calls rtx_alloc_stat_v.
(cwi_output_hex): New.
(iterative_hash_rtx): Handle CONST_WIDE_INT.
(cwi_check_failed_bounds): New.
* rtl.def (CONST_WIDE_INT): New.
* rtl.h: Include <utility> and wide-int.h.
(struct hwivec_def): New.
(CWI_GET_NUM_ELEM): New.
(CWI_PUT_NUM_ELEM): New.
(struct rtx_def): Add num_elem and hwiv.
(CASE_CONST_SCALAR_INT): Modify for TARGET_SUPPORTS_WIDE_INT.
(CASE_CONST_UNIQUE): Likewise.
(CASE_CONST_ANY): Likewise.
(CONST_SCALAR_INT_P): Likewise.
(CONST_WIDE_INT_P): New.
(CWI_ELT): New.
(HWIVEC_CHECK): New.
(cwi_check_failed_bounds): New.
(CONST_WIDE_INT_VEC): New.
(CONST_WIDE_INT_NUNITS): New.
(CONST_WIDE_INT_ELT): New.
(rtx_mode_t): New type.
(wi::int_traits <rtx_mode_t>): New.
(wi::shwi): New.
(wi::min_value): New.
(wi::max_value): New.
(rtx_alloc_v): New.
(const_wide_int_alloc): New.
(immed_wide_int_const): New.
* sched-vis.c (print_value): Handle CONST_WIDE_INT.
* sel-sched-ir.c (lhs_and_rhs_separable_p): Update comment.
* signop.h: New file.
* simplify-rtx.c (mode_signbit_p): Handle CONST_WIDE_INT.
(simplify_const_unary_operation): Use wide-int interfaces.
(simplify_binary_operation_1): Likewise.
(simplify_const_binary_operation): Likewise.
(simplify_const_relational_operation): Likewise.
(simplify_immed_subreg): Likewise.
* stmt.c (expand_case): Likewise.
* stor-layout.h (set_min_and_max_values_for_integral_type): Take a
signop rather than a bool.
* stor-layout.c (layout_type): Use wide-int interfaces.
(initialize_sizetypes): Update calls to
set_min_and_max_values_for_integral_type.
(set_min_and_max_values_for_integral_type): Take a signop rather
than a bool. Use wide-int interfaces.
(fixup_signed_type): Update accordingly. Remove
HOST_BITS_PER_DOUBLE_INT limit.
(fixup_unsigned_type): Likewise.
* system.h (STATIC_CONSTANT_P): New.
(STATIC_ASSERT): New.
* target.def (can_use_doloop_p): Take widest_ints rather than
double_ints.
* target.h: Include wide-int.h rather than double-int.h.
* targhooks.h (can_use_doloop_if_innermost): Take widest_ints rather
than double_ints.
* targhooks.c (default_cxx_get_cookie_size): Use tree_int_cst_lt
rather than INT_CST_LT_UNSIGNED.
(can_use_doloop_if_innermost): Take widest_ints rather than
double_ints.
* tree-affine.c: Include wide-int-print.h.
(double_int_ext_for_comb): Delete.
(wide_int_ext_for_comb): New.
(aff_combination_zero): Use wide-int interfaces.
(aff_combination_const): Take a widest_int instead of a double_int.
(aff_combination_elt): Use wide-int interfaces.
(aff_combination_scale): Take a widest_int instead of a double_int.
(aff_combination_add_elt): Likewise.
(aff_combination_add_cst): Likewise.
(aff_combination_add): Use wide-int interfaces.
(aff_combination_convert): Likewise.
(tree_to_aff_combination): Likewise.
(add_elt_to_tree): Take a widest_int instead of a double_int.
(aff_combination_to_tree): Use wide-int interfaces.
(aff_combination_remove_elt): Likewise.
(aff_combination_add_product): Take a widest_int instead of
a double_int.
(aff_combination_mult): Use wide-int interfaces.
(aff_combination_expand): Likewise.
(double_int_constant_multiple_p): Delete.
(wide_int_constant_multiple_p): New.
(aff_combination_constant_multiple_p): Take a widest_int pointer
instead of a double_int pointer.
(print_aff): Use wide-int interfaces.
(get_inner_reference_aff): Take a widest_int pointer
instead of a double_int pointer.
(aff_comb_cannot_overlap_p): Take widest_ints instead of double_ints.
* tree-affine.h: Include wide-int.h.
(struct aff_comb_elt): Change type of coef to widest_int.
(struct affine_tree_combination): Change type of offset to widest_int.
(double_int_ext_for_comb): Delete.
(wide_int_ext_for_comb): New.
(aff_combination_const): Use widest_int instead of double_int.
(aff_combination_scale): Likewise.
(aff_combination_add_elt): Likewise.
(aff_combination_constant_multiple_p): Likewise.
(get_inner_reference_aff): Likewise.
(aff_comb_cannot_overlap_p): Likewise.
(aff_combination_zero_p): Use wide-int interfaces.
* tree.c: Include tree.h.
(init_ttree): Use make_int_cst.
(tree_code_size): Removed code for INTEGER_CST case.
(tree_size): Add INTEGER_CST case.
(make_node_stat): Update comment.
(get_int_cst_ext_nunits, build_new_int_cst, build_int_cstu): New.
(build_int_cst_type): Use wide-int interfaces.
(double_int_to_tree): Likewise.
(double_int_fits_to_tree_p): Delete.
(force_fit_type_double): Delete.
(force_fit_type): New.
(int_cst_hash_hash): Use wide-int interfaces.
(int_cst_hash_eq): Likewise.
(build_int_cst_wide): Delete.
(wide_int_to_tree): New.
(cache_integer_cst): Use wide-int interfaces.
(build_low_bits_mask): Likewise.
(cst_and_fits_in_hwi): Likewise.
(real_value_from_int_cst): Likewise.
(make_int_cst_stat): New.
(integer_zerop): Use wide_int interfaces.
(integer_onep): Likewise.
(integer_all_onesp): Likewise.
(integer_pow2p): Likewise.
(integer_nonzerop): Likewise.
(tree_log2): Likewise.
(tree_floor_log2): Likewise.
(tree_ctz): Likewise.
(int_size_in_bytes): Likewise.
(mem_ref_offset): Return an offset_int rather than a double_int.
(build_type_attribute_qual_variant): Use wide_int interfaces.
(type_hash_eq): Likewise.
(tree_int_cst_equal): Likewise.
(tree_int_cst_lt): Delete.
(tree_int_cst_compare): Likewise.
(tree_fits_shwi_p): Use wide_int interfaces.
(tree_fits_uhwi_p): Likewise.
(tree_int_cst_sign_bit): Likewise.
(tree_int_cst_sgn): Likewise.
(tree_int_cst_min_precision): Take a signop rather than a bool.
(simple_cst_equal): Use wide_int interfaces.
(compare_tree_int): Likewise.
(iterative_hash_expr): Likewise.
(int_fits_type_p): Likewise. Use tree_int_cst_lt rather than
INT_CST_LT.
(get_type_static_bounds): Use wide_int interfaces.
(tree_int_cst_elt_check_failed): New.
(build_common_tree_nodes): Reordered to set prec before filling in
value.
(int_cst_value): Check cst_and_fits_in_hwi.
(widest_int_cst_value): Use wide_int interfaces.
(upper_bound_in_type): Likewise.
(lower_bound_in_type): Likewise.
(num_ending_zeros): Likewise.
(drop_tree_overflow): Likewise.
* tree-call-cdce.c (check_pow): Update call to real_from_integer.
(gen_conditions_for_pow_cst_base): Likewise.
* tree-cfg.c: Include wide-int.h and wide-int-print.h.
(group_case_labels_stmt): Use wide-int interfaces.
(verify_gimple_assign_binary): Likewise.
(print_loop): Likewise.
* tree-chrec.c (tree_fold_binomial): Likewise.
* tree-core.h (struct tree_base): Add int_length.
(struct tree_int_cst): Change rep of value.
* tree-data-ref.c (dr_analyze_innermost): Use wide-int interfaces.
(dr_may_alias_p): Likewise.
(max_stmt_executions_tree): Likewise.
* tree.def (INTEGER_CST): Update comment.
* tree-dfa.c (get_ref_base_and_extent): Use wide-int interfaces.
* tree-dfa.h (get_addr_base_and_unit_offset_1): Likewise.
* tree-dump.c: Include wide-int.h and wide-int-print.h.
(dequeue_and_dump): Use wide-int interfaces.
* tree.h: Include wide-int.h.
(NULL_TREE): Moved to earlier loc in file.
(TREE_INT_CST_ELT_CHECK): New.
(tree_int_cst_elt_check_failed): New.
(TYPE_SIGN): New.
(TREE_INT_CST): Delete.
(TREE_INT_CST_LOW): Use wide-int interfaces.
(TREE_INT_CST_HIGH): Delete.
(TREE_INT_CST_NUNITS): New.
(TREE_INT_CST_EXT_NUNITS): Likewise.
(TREE_INT_CST_OFFSET_NUNITS): Likewise.
(TREE_INT_CST_ELT): Likewise.
(INT_CST_LT): Delete.
(tree_int_cst_elt_check): New (two forms).
(type_code_size): Update comment.
(make_int_cst_stat, make_int_cst): New.
(tree_to_double_int): Delete.
(double_int_fits_to_tree_p): Delete.
(force_fit_type_double): Delete.
(build_int_cstu): Replace with out-of-line function.
(build_int_cst_wide): Delete.
(tree_int_cst_lt): Define inline.
(tree_int_cst_le): New.
(tree_int_cst_compare): Define inline.
(tree_int_cst_min_precision): Take a signop rather than a bool.
(wi::int_traits <const_tree>): New.
(wi::int_traits <tree>): New.
(wi::extended_tree): New.
(wi::int_traits <wi::extended_tree>): New.
(wi::to_widest): New.
(wi::to_offset): New.
(wi::fits_to_tree_p): New.
(wi::min_value): New.
(wi::max_value): New.
* tree-inline.c (remap_gimple_op_r): Use wide-int interfaces.
(copy_tree_body_r): Likewise.
* tree-object-size.c (compute_object_offset): Likewise.
(addr_object_size): Likewise.
* tree-predcom.c: Include wide-int-print.h.
(struct dref_d): Change type of offset to widest_int.
(dump_dref): Call wide-int printer.
(aff_combination_dr_offset): Use wide-int interfaces.
(determine_offset): Take a widest_int pointer rather than a
double_int pointer.
(split_data_refs_to_components): Use wide-int interfaces.
(suitable_component_p): Likewise.
(order_drefs): Likewise.
(add_ref_to_chain): Likewise.
(valid_initializer_p): Likewise.
(determine_roots_comp): Likewise.
* tree-pretty-print.c: Include wide-int-print.h.
(dump_generic_node): Use wide-int interfaces.
* tree-sra.c (sra_ipa_modify_expr): Likewise.
* tree-ssa-address.c (addr_for_mem_ref): Likewise.
(move_fixed_address_to_symbol): Likewise.
(move_hint_to_base): Likewise.
(move_pointer_to_base): Likewise.
(move_variant_to_index): Likewise.
(most_expensive_mult_to_index): Likewise.
(addr_to_parts): Likewise.
(copy_ref_info): Likewise.
* tree-ssa-alias.c (indirect_ref_may_alias_decl_p): Likewise.
(indirect_refs_may_alias_p): Likewise.
(stmt_kills_ref_p_1): Likewise.
* tree-ssa.c (non_rewritable_mem_ref_base): Likewise.
* tree-ssa-ccp.c: Update comment at top of file. Include
wide-int-print.h.
(struct prop_value_d): Change type of mask to widest_int.
(extend_mask): New function.
(dump_lattice_value): Use wide-int interfaces.
(get_default_value): Likewise.
(set_constant_value): Likewise.
(set_value_varying): Likewise.
(valid_lattice_transition): Likewise.
(set_lattice_value): Likewise.
(value_to_double_int): Delete.
(value_to_wide_int): New.
(get_value_from_alignment): Use wide-int interfaces.
(get_value_for_expr): Likewise.
(do_dbg_cnt): Likewise.
(ccp_finalize): Likewise.
(ccp_lattice_meet): Likewise.
(bit_value_unop_1): Use widest_ints rather than double_ints.
(bit_value_binop_1): Likewise.
(bit_value_unop): Use wide-int interfaces.
(bit_value_binop): Likewise.
(bit_value_assume_aligned): Likewise.
(evaluate_stmt): Likewise.
(ccp_fold_stmt): Likewise.
(visit_cond_stmt): Likewise.
(ccp_visit_stmt): Likewise.
* tree-ssa-forwprop.c (forward_propagate_addr_expr_1): Likewise.
(constant_pointer_difference): Likewise.
(associate_pointerplus): Likewise.
(combine_conversions): Likewise.
* tree-ssa-loop.h: Include wide-int.h.
(struct tree_niter_desc): Change type of max to widest_int.
* tree-ssa-loop-im.c (mem_refs_may_alias_p): Use wide-int interfaces.
* tree-ssa-loop-ivcanon.c (remove_exits_and_undefined_stmts): Likewise.
(remove_redundant_iv_tests): Likewise.
(canonicalize_loop_induction_variables): Likewise.
* tree-ssa-loop-ivopts.c (alloc_iv): Likewise.
(constant_multiple_of): Take a widest_int pointer instead of
a double_int pointer.
(get_computation_aff): Use wide-int interfaces.
(ptr_difference_cost): Likewise.
(difference_cost): Likewise.
(get_loop_invariant_expr_id): Likewise.
(get_computation_cost_at): Likewise.
(iv_elimination_compare_lt): Likewise.
(may_eliminate_iv): Likewise.
* tree-ssa-loop-niter.h (estimated_loop_iterations): Use widest_int
instead of double_int.
(max_loop_iterations): Likewise.
(max_stmt_executions): Likewise.
(estimated_stmt_executions): Likewise.
* tree-ssa-loop-niter.c: Include wide-int-print.h.
(split_to_var_and_offset): Use wide-int interfaces.
(determine_value_range): Likewise.
(bound_difference_of_offsetted_base): Likewise.
(bounds_add): Take a widest_int instead of a double_int.
(number_of_iterations_ne_max): Use wide-int interfaces.
(number_of_iterations_ne): Likewise.
(number_of_iterations_lt_to_ne): Likewise.
(assert_loop_rolls_lt): Likewise.
(number_of_iterations_lt): Likewise.
(number_of_iterations_le): Likewise.
(number_of_iterations_cond): Likewise.
(number_of_iterations_exit): Likewise.
(finite_loop_p): Likewise.
(derive_constant_upper_bound_assign): Likewise.
(derive_constant_upper_bound): Return a widest_int.
(derive_constant_upper_bound_ops): Likewise.
(do_warn_aggressive_loop_optimizations): Use wide-int interfaces.
(record_estimate): Take a widest_int rather than a double_int.
(record_nonwrapping_iv): Use wide-int interfaces.
(double_int_cmp): Delete.
(wide_int_cmp): New.
(bound_index): Take a widest_int rather than a double_int.
(discover_iteration_bound_by_body_walk): Use wide-int interfaces.
(maybe_lower_iteration_bound): Likewise.
(estimate_numbers_of_iterations_loop): Likewise.
(estimated_loop_iterations): Take a widest_int pointer rather than
a double_int pointer.
(estimated_loop_iterations_int): Use wide-int interfaces.
(max_loop_iterations): Take a widest_int pointer rather than
a double_int pointer.
(max_loop_iterations_int): Use wide-int interfaces.
(max_stmt_executions): Take a widest_int pointer rather than
a double_int pointer.
(estimated_stmt_executions): Likewise.
(n_of_executions_at_most): Use wide-int interfaces.
(scev_probably_wraps_p): Likewise.
* tree-ssa-math-opts.c (gimple_expand_builtin_pow): Update calls
to real_to_integer.
* tree-scalar-evolution.c (simplify_peeled_chrec): Use wide-int
interfaces.
* tree-ssanames.c (set_range_info): Use wide_int_refs rather than
double_ints. Adjust for trailing_wide_ints <3> representation.
(set_nonzero_bits): Likewise.
(get_range_info): Return wide_ints rather than double_ints.
Adjust for trailing_wide_ints <3> representation.
(get_nonzero_bits): Likewise.
(duplicate_ssa_name_range_info): Adjust for trailing_wide_ints <3>
representation.
* tree-ssanames.h (struct range_info_def): Replace min, max and
nonzero_bits with a trailing_wide_ints <3>.
(set_range_info): Use wide_int_refs rather than double_ints.
(set_nonzero_bits): Likewise.
(get_range_info): Return wide_ints rather than double_ints.
(get_nonzero_bits): Likewise.
* tree-ssa-phiopt.c (jump_function_from_stmt): Use wide-int interfaces.
* tree-ssa-pre.c (phi_translate_1): Likewise.
* tree-ssa-reassoc.c (decrement_power): Use calls to real_from_integer.
(acceptable_pow_call): Likewise.
* tree-ssa-sccvn.c (copy_reference_ops_from_ref): Use wide-int
interfaces.
(vn_reference_fold_indirect): Likewise.
(vn_reference_maybe_forwprop_address): Likewise.
(valueize_refs_1): Likewise.
* tree-ssa-structalias.c (get_constraint_for_ptr_offset): Likewise.
* tree-ssa-uninit.c (is_value_included_in): Use wide-int interfaces,
tree_int_cst_lt and tree_int_cst_le.
* tree-streamer-in.c (unpack_ts_base_value_fields): Use wide-int
interfaces.
(streamer_alloc_tree): Likewise.
* tree-streamer-out.c (pack_ts_int_cst_value_fields): Likewise.
(streamer_write_tree_header): Likewise.
(streamer_write_integer_cst): Likewise.
* tree-switch-conversion.c (emit_case_bit_tests): Likewise.
(build_constructors): Likewise.
(array_value_type): Likewise.
* tree-vect-data-refs.c (vect_prune_runtime_alias_test_list): Likewise.
(vect_check_gather): Likewise.
* tree-vect-generic.c (build_replicated_const): Likewise.
(expand_vector_divmod): Likewise.
* tree-vect-loop.c (vect_transform_loop): Likewise.
* tree-vect-loop-manip.c (vect_do_peeling_for_loop_bound): Likewise.
(vect_do_peeling_for_alignment): Likewise.
* tree-vect-patterns.c (vect_recog_divmod_pattern): Likewise.
* tree-vrp.c: Include wide-int.h.
(operand_less_p): Use wide-int interfaces and tree_int_cst_lt.
(extract_range_from_assert): Use wide-int interfaces.
(vrp_int_const_binop): Likewise.
(zero_nonzero_bits_from_vr): Take wide_int pointers rather than
double_int pointers.
(ranges_from_anti_range): Use wide-int interfaces.
(quad_int_cmp): Delete.
(quad_int_pair_sort): Likewise.
(extract_range_from_binary_expr_1): Use wide-int interfaces.
(extract_range_from_unary_expr_1): Likewise.
(adjust_range_with_scev): Likewise.
(masked_increment): Take and return wide_ints rather than double_ints.
(register_edge_assert_for_2): Use wide-int interfaces.
(check_array_ref): Likewise.
(search_for_addr_array): Likewise.
(maybe_set_nonzero_bits): Likewise.
(union_ranges): Pass an integer of the correct type instead of
using integer_one_node.
(intersect_ranges): Likewise.
(simplify_truth_ops_using_ranges): Likewise.
(simplify_bit_ops_using_ranges): Use wide-int interfaces.
(range_fits_type_p): Likewise.
(simplify_cond_using_ranges): Likewise. Take a signop rather than
a bool.
(simplify_conversion_using_ranges): Use wide-int interfaces.
(simplify_float_conversion_using_ranges): Likewise.
(vrp_finalize): Likewise.
* value-prof.c (gimple_divmod_fixed_value_transform): Likewise.
(gimple_stringops_transform): Likewise.
* varasm.c (decode_addr_const): Likewise.
(const_hash_1): Likewise.
(const_rtx_hash_1): Likewise.
(output_constant): Likewise.
(array_size_for_constructor): Likewise.
(output_constructor_regular_field): Likewise.
(output_constructor_bitfield): Likewise.
* var-tracking.c (loc_cmp): Handle CONST_WIDE_INT.
* mkconfig.sh: Include machmode.h to pick up BITS_PER_UNIT for
GENERATOR_FILEs.
* gencheck.c: Define BITS_PER_UNIT.
* wide-int.cc: New.
* wide-int.h: New.
* wide-int-print.cc: New.
* wide-int-print.h: New.
ada:
* gcc-interface/cuintp.c (UI_From_gnu): Use wide-int interfaces.
* gcc-interface/decl.c (gnat_to_gnu_entity): Use TYPE_SIGN.
(annotate_value): Use wide-int interfaces.
* gcc-interface/utils.c (get_nonnull_operand): Use tree_fits_uhwi_p.
c:
* c-decl.c (check_bitfield_type_and_width): Use TYPE_SIGN.
(finish_enum): Use wide-int interfaces.
* c-parser.c (c_parser_cilk_clause_vectorlength): Likewise.
* c-typeck.c (build_c_cast): Likewise.
(set_nonincremental_init_from_string): Likewise.
(c_tree_equal): Likewise.
c-family:
* c-ada-spec.c: Include wide-int.h.
(ADA_HOST_WIDE_INT_PRINT_DOUBLE_HEX): Remove.
(dump_generic_ada_node): Use wide-int interfaces.
* c-common.c: Include wide-int-print.h.
(shorten_compare): Use wide-int interfaces and tree_int_cst_lt.
(pointer_int_sum): Use wide-int interfaces.
(c_common_nodes_and_builtins): Use make_int_cst.
(match_case_to_enum_1): Use tree_fits_uhwi_p and tree_fits_shwi_p.
(handle_alloc_size_attribute): Use wide-int interfaces.
(get_nonnull_operand): Likewise.
* c-format.c (get_constant): Use tree_fits_uhwi_p.
* c-lex.c: Include wide-int.h.
(narrowest_unsigned_type): Take a widest_int rather than two
HOST_WIDE_INTs.
(narrowest_signed_type): Likewise.
(interpret_integer): Update accordingly. Use wide-int interfaces.
(lex_charconst): Use wide-int interfaces.
* c-pretty-print.c: Include wide-int.h.
(pp_c_integer_constant): Use wide-int interfaces.
* cilk.c (declare_one_free_variable): Use tree_int_cst_lt instead of
INT_CST_LT_UNSIGNED.
cp:
* call.c: Include wide-int.h.
(type_passed_as): Use tree_int_cst_lt instead of INT_CST_LT_UNSIGNED.
(convert_for_arg_passing): Likewise.
* class.c: Include wide-int.h.
(walk_subobject_offsets): Use tree_int_cst_lt instead of INT_CST_LT.
(end_of_class): Use tree_int_cst_lt instead of INT_CST_LT_UNSIGNED.
(include_empty_classes): Likewise.
(layout_class_type): Use tree_int_cst_lt instead of INT_CST_LT.
* cvt.c: Include wide-int.h.
(ignore_overflows): Use wide_int_to_tree.
* decl.c: Include wide-int.h.
(check_array_designated_initializer): Use wide-int interfaces.
(compute_array_index_type): Use tree_int_cst_lt instead of INT_CST_LT.
(finish_enum_value_list): Use signop.
(build_enumerator): Use wide-int interfaces.
* init.c: Include wide-int.h.
(build_new_1): Use wide-int interfaces.
* mangle.c: Include wide-int.h.
(write_integer_cst): Use wide-int interfaces.
(write_array_type): Likewise.
* tree.c: Include wide-int.h.
(cp_tree_equal): Use tree_int_cst_equal.
* typeck2.c: Include wide-int.h.
(process_init_constructor_array): Use wide-int interfaces.
fortran:
* target-memory.c: Include wide-int.h.
(gfc_interpret_logical): Use wide-int interfaces.
* trans-array.c: Include wide-int.h.
(gfc_conv_array_initializer): Use wide-int interfaces.
* trans-const.c: Include wide-int.h.
(gfc_conv_string_init): Use wide-int interfaces.
(gfc_conv_mpz_to_tree): Likewise.
(gfc_conv_tree_to_mpz): Likewise.
* trans-decl.c (gfc_can_put_var_on_stack): Use tree_fits_uhwi_p.
* trans-expr.c: Include wide-int.h.
(gfc_conv_cst_int_power): Use wide-int interfaces.
(gfc_string_to_single_character): Likewise.
(gfc_optimize_len_trim): Likewise.
* trans-intrinsic.c: Include wide-int.h.
(trans_this_image): Use wide-int interfaces.
(gfc_conv_intrinsic_bound): Likewise.
(conv_intrinsic_cobound): Likewise.
* trans-types.c (gfc_init_types): Likewise.
(gfc_get_array_type_bounds): Pass an integer of the correct type
instead of using integer_one_node.
go:
* go-gcc.cc (Gcc_backend::type_size): Use tree_fits_uhwi_p.
java:
* boehm.c: Include wide-int.h.
(mark_reference_fields): Use a wide_int mask.
(get_boehm_type_descriptor): Use wide-int interfaces.
* expr.c: Include wide-int.h.
(build_newarray): Remove bogus "== INTEGER_CST".
(expand_java_pushc): Use real_from_integer.
(build_field_ref): Use tree_int_cst_lt instead of INT_CST_LT_UNSIGNED.
* jcf-parse.c: Include wide-int.h.
(get_constant): Use wide-int interfaces.
lto:
* lto.c (compare_tree_sccs_1): Use wide-int interfaces.
* lto-lang.c (get_nonnull_operand): Likewise.
objc:
* objc-act.c: Include wide-int.h.
(objc_decl_method_attributes): Use wide-int interfaces.
testsuite:
* gcc.dg/tree-ssa/pr45427.c: Update to look for 0x0 instead of 0.


@ -1464,6 +1464,8 @@ OBJS = \
vmsdbgout.o \
vtable-verify.o \
web.o \
wide-int.o \
wide-int-print.o \
xcoffout.o \
$(out_object_file) \
$(EXTRA_OBJS) \
@ -2229,7 +2231,7 @@ s-tm-texi: build/genhooks$(build_exeext) $(srcdir)/doc/tm.texi.in
GTFILES = $(CPP_ID_DATA_H) $(srcdir)/input.h $(srcdir)/coretypes.h \
$(host_xm_file_list) \
$(tm_file_list) $(HASHTAB_H) $(SPLAY_TREE_H) $(srcdir)/bitmap.h \
$(srcdir)/alias.h $(srcdir)/coverage.c $(srcdir)/rtl.h \
$(srcdir)/wide-int.h $(srcdir)/alias.h $(srcdir)/coverage.c $(srcdir)/rtl.h \
$(srcdir)/optabs.h $(srcdir)/tree.h $(srcdir)/tree-core.h \
$(srcdir)/libfuncs.h $(SYMTAB_H) \
$(srcdir)/real.h $(srcdir)/function.h $(srcdir)/insn-addr.h $(srcdir)/hwint.h \
@ -2240,6 +2242,7 @@ GTFILES = $(CPP_ID_DATA_H) $(srcdir)/input.h $(srcdir)/coretypes.h \
$(srcdir)/alias.c $(srcdir)/bitmap.c $(srcdir)/cselib.c $(srcdir)/cgraph.c \
$(srcdir)/ipa-prop.c $(srcdir)/ipa-cp.c $(srcdir)/ipa-utils.h \
$(srcdir)/dbxout.c \
$(srcdir)/signop.h \
$(srcdir)/dwarf2out.h \
$(srcdir)/dwarf2asm.c \
$(srcdir)/dwarf2cfi.c \
@ -2442,10 +2445,9 @@ gengtype-state.o build/gengtype-state.o: gengtype-state.c $(SYSTEM_H) \
gengtype-state.o: $(CONFIG_H)
CFLAGS-gengtype-state.o += -DGENERATOR_FILE
build/gengtype-state.o: $(BCONFIG_H)
gengtype.o build/gengtype.o : gengtype.c $(SYSTEM_H) gengtype.h \
rtl.def insn-notes.def errors.h double-int.h version.h $(HASHTAB_H) \
$(OBSTACK_H) $(XREGEX_H)
rtl.def insn-notes.def errors.h double-int.h version.h \
$(HASHTAB_H) $(OBSTACK_H) $(XREGEX_H)
gengtype.o: $(CONFIG_H)
CFLAGS-gengtype.o += -DGENERATOR_FILE
build/gengtype.o: $(BCONFIG_H)
@ -3752,7 +3754,7 @@ TAGS: lang.tags
incs="$$incs --include $$dir/TAGS.sub"; \
fi; \
done; \
etags -o TAGS.sub c-family/*.h c-family/*.c *.h *.c; \
etags -o TAGS.sub c-family/*.h c-family/*.c *.h *.c *.cc; \
etags --include TAGS.sub $$incs)
# -----------------------------------------------------


@ -160,7 +160,11 @@ UI_From_gnu (tree Input)
in a signed 64-bit integer. */
if (tree_fits_shwi_p (Input))
return UI_From_Int (tree_to_shwi (Input));
else if (TREE_INT_CST_HIGH (Input) < 0 && TYPE_UNSIGNED (gnu_type))
gcc_assert (TYPE_PRECISION (gnu_type) <= 64);
if (TYPE_UNSIGNED (gnu_type)
&& TYPE_PRECISION (gnu_type) == 64
&& wi::neg_p (Input, SIGNED))
return No_Uint;
#endif
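
The `wi::neg_p (Input, SIGNED)` guard above can be sketched outside GCC. This is an illustrative analogue, not the GCC or GNAT API: a 64-bit unsigned constant whose top bit is set reads as negative when reinterpreted with a signed view, so it cannot be returned through a signed 64-bit Uint.

```cpp
#include <cstdint>

// Hypothetical stand-in for the new UI_From_gnu check: under a 64-bit
// unsigned type, a value with the sign bit set is "negative when read
// as SIGNED", which is what wi::neg_p detects on the unsigned tree.
bool representable_in_signed_64 (uint64_t v, unsigned precision)
{
  if (precision == 64 && (v >> 63) != 0)
    return false;   // the front end answers No_Uint here
  return true;
}
```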


@ -1642,7 +1642,7 @@ gnat_to_gnu_entity (Entity_Id gnat_entity, tree gnu_expr, int definition)
TYPE_PRECISION (gnu_type) = esize;
TYPE_UNSIGNED (gnu_type) = is_unsigned;
set_min_and_max_values_for_integral_type (gnu_type, esize,
is_unsigned);
TYPE_SIGN (gnu_type));
process_attributes (&gnu_type, &attr_list, true, gnat_entity);
layout_type (gnu_type);
@ -7521,11 +7521,9 @@ annotate_value (tree gnu_size)
if (TREE_CODE (TREE_OPERAND (gnu_size, 1)) == INTEGER_CST)
{
tree op1 = TREE_OPERAND (gnu_size, 1);
double_int signed_op1
= tree_to_double_int (op1).sext (TYPE_PRECISION (sizetype));
if (signed_op1.is_negative ())
if (wi::neg_p (op1))
{
op1 = double_int_to_tree (sizetype, -signed_op1);
op1 = wide_int_to_tree (sizetype, wi::neg (op1));
pre_op1 = annotate_value (build1 (NEGATE_EXPR, sizetype, op1));
}
}


@ -6187,8 +6187,7 @@ static bool
get_nonnull_operand (tree arg_num_expr, unsigned HOST_WIDE_INT *valp)
{
/* Verify the arg number is a constant. */
if (TREE_CODE (arg_num_expr) != INTEGER_CST
|| TREE_INT_CST_HIGH (arg_num_expr) != 0)
if (!tree_fits_uhwi_p (arg_num_expr))
return false;
*valp = TREE_INT_CST_LOW (arg_num_expr);


@ -340,9 +340,10 @@ ao_ref_from_mem (ao_ref *ref, const_rtx mem)
if (MEM_EXPR (mem) != get_spill_slot_decl (false)
&& (ref->offset < 0
|| (DECL_P (ref->base)
&& (!tree_fits_uhwi_p (DECL_SIZE (ref->base))
|| (tree_to_uhwi (DECL_SIZE (ref->base))
< (unsigned HOST_WIDE_INT) (ref->offset + ref->size))))))
&& (DECL_SIZE (ref->base) == NULL_TREE
|| TREE_CODE (DECL_SIZE (ref->base)) != INTEGER_CST
|| wi::ltu_p (wi::to_offset (DECL_SIZE (ref->base)),
ref->offset + ref->size)))))
return false;
return true;
@ -1532,9 +1533,7 @@ rtx_equal_for_memref_p (const_rtx x, const_rtx y)
case VALUE:
CASE_CONST_UNIQUE:
/* There's no need to compare the contents of CONST_DOUBLEs or
CONST_INTs because pointer equality is a good enough
comparison for these nodes. */
/* Pointer equality guarantees equality for these nodes. */
return 0;
default:
@ -2275,15 +2274,22 @@ adjust_offset_for_component_ref (tree x, bool *known_p,
{
tree xoffset = component_ref_field_offset (x);
tree field = TREE_OPERAND (x, 1);
if (! tree_fits_uhwi_p (xoffset))
if (TREE_CODE (xoffset) != INTEGER_CST)
{
*known_p = false;
return;
}
*offset += (tree_to_uhwi (xoffset)
+ (tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field))
/ BITS_PER_UNIT));
offset_int woffset
= (wi::to_offset (xoffset)
+ wi::lrshift (wi::to_offset (DECL_FIELD_BIT_OFFSET (field)),
LOG2_BITS_PER_UNIT));
if (!wi::fits_uhwi_p (woffset))
{
*known_p = false;
return;
}
*offset += woffset.to_uhwi ();
x = TREE_OPERAND (x, 0);
}
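
The rewritten offset computation above can be sketched with plain integers. Names here are illustrative: GCC's `offset_int` is wider than 64 bits, so `unsigned __int128` plays its role, and `LOG2_BITS_PER_UNIT` is 3 for 8-bit bytes.

```cpp
#include <cstdint>

// Combine a byte offset with a field's bit offset shifted down to
// bytes, as adjust_offset_for_component_ref now does in offset_int.
unsigned __int128 component_byte_offset (unsigned __int128 byte_offset,
                                         unsigned __int128 field_bit_offset)
{
  return byte_offset + (field_bit_offset >> 3);
}

// The !wi::fits_uhwi_p (woffset) test: does the wide result still fit
// an unsigned HOST_WIDE_INT?  If not, the caller sets *known_p = false.
bool fits_uhwi (unsigned __int128 x)
{
  return x <= UINT64_MAX;
}
```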


@ -413,7 +413,7 @@ get_object_alignment_2 (tree exp, unsigned int *alignp,
bitpos += ptr_bitpos;
if (TREE_CODE (exp) == MEM_REF
|| TREE_CODE (exp) == TARGET_MEM_REF)
bitpos += mem_ref_offset (exp).low * BITS_PER_UNIT;
bitpos += mem_ref_offset (exp).to_short_addr () * BITS_PER_UNIT;
}
}
else if (TREE_CODE (exp) == STRING_CST)
@ -672,20 +672,24 @@ c_getstr (tree src)
return TREE_STRING_POINTER (src) + tree_to_uhwi (offset_node);
}
/* Return a CONST_INT or CONST_DOUBLE corresponding to target reading
/* Return a constant integer corresponding to target reading
GET_MODE_BITSIZE (MODE) bits from string constant STR. */
static rtx
c_readstr (const char *str, enum machine_mode mode)
{
HOST_WIDE_INT c[2];
HOST_WIDE_INT ch;
unsigned int i, j;
HOST_WIDE_INT tmp[MAX_BITSIZE_MODE_ANY_INT / HOST_BITS_PER_WIDE_INT];
gcc_assert (GET_MODE_CLASS (mode) == MODE_INT);
unsigned int len = (GET_MODE_PRECISION (mode) + HOST_BITS_PER_WIDE_INT - 1)
/ HOST_BITS_PER_WIDE_INT;
gcc_assert (len <= MAX_BITSIZE_MODE_ANY_INT / HOST_BITS_PER_WIDE_INT);
for (i = 0; i < len; i++)
tmp[i] = 0;
c[0] = 0;
c[1] = 0;
ch = 1;
for (i = 0; i < GET_MODE_SIZE (mode); i++)
{
@ -696,13 +700,14 @@ c_readstr (const char *str, enum machine_mode mode)
&& GET_MODE_SIZE (mode) >= UNITS_PER_WORD)
j = j + UNITS_PER_WORD - 2 * (j % UNITS_PER_WORD) - 1;
j *= BITS_PER_UNIT;
gcc_assert (j < HOST_BITS_PER_DOUBLE_INT);
if (ch)
ch = (unsigned char) str[i];
c[j / HOST_BITS_PER_WIDE_INT] |= ch << (j % HOST_BITS_PER_WIDE_INT);
tmp[j / HOST_BITS_PER_WIDE_INT] |= ch << (j % HOST_BITS_PER_WIDE_INT);
}
return immed_double_const (c[0], c[1], mode);
wide_int c = wide_int::from_array (tmp, len, GET_MODE_PRECISION (mode));
return immed_wide_int_const (c, mode);
}
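
The per-byte loop in the new `c_readstr` can be sketched standalone for a hypothetical little-endian mode of up to 8 bytes. The `if (ch) ch = (unsigned char) str[i];` idiom means that once the NUL is reached, zeros are stored for the remaining bytes.

```cpp
#include <cstdint>

// Illustrative analogue of c_readstr (not the GCC function): pack the
// string's bytes into successive byte positions of an integer, padding
// with zeros past the NUL terminator.
uint64_t read_string_le (const char *str, unsigned nbytes)
{
  uint64_t value = 0;
  unsigned ch = 1;
  for (unsigned i = 0; i < nbytes && i < 8; i++)
    {
      if (ch)
        ch = (unsigned char) str[i];
      value |= (uint64_t) ch << (i * 8);
    }
  return value;
}
```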
/* Cast a target constant CST to target CHAR and if that value fits into
@ -718,7 +723,9 @@ target_char_cast (tree cst, char *p)
|| CHAR_TYPE_SIZE > HOST_BITS_PER_WIDE_INT)
return 1;
/* Do not care if it fits or not right here. */
val = TREE_INT_CST_LOW (cst);
if (CHAR_TYPE_SIZE < HOST_BITS_PER_WIDE_INT)
val &= (((unsigned HOST_WIDE_INT) 1) << CHAR_TYPE_SIZE) - 1;
@ -3128,7 +3135,7 @@ determine_block_size (tree len, rtx len_rtx,
}
else
{
double_int min, max;
wide_int min, max;
enum value_range_type range_type = VR_UNDEFINED;
/* Determine bounds from the type. */
@ -3146,18 +3153,18 @@ determine_block_size (tree len, rtx len_rtx,
range_type = get_range_info (len, &min, &max);
if (range_type == VR_RANGE)
{
if (min.fits_uhwi () && *min_size < min.to_uhwi ())
if (wi::fits_uhwi_p (min) && *min_size < min.to_uhwi ())
*min_size = min.to_uhwi ();
if (max.fits_uhwi () && *max_size > max.to_uhwi ())
if (wi::fits_uhwi_p (max) && *max_size > max.to_uhwi ())
*probable_max_size = *max_size = max.to_uhwi ();
}
else if (range_type == VR_ANTI_RANGE)
{
/* Anti range 0...N lets us to determine minimal size to N+1. */
if (min.is_zero ())
if (min == 0)
{
if ((max + double_int_one).fits_uhwi ())
*min_size = (max + double_int_one).to_uhwi ();
if (wi::fits_uhwi_p (max) && max.to_uhwi () + 1 != 0)
*min_size = max.to_uhwi () + 1;
}
/* Code like
@ -3168,9 +3175,8 @@ determine_block_size (tree len, rtx len_rtx,
Produce anti range allowing negative values of N. We still
can use the information and make a guess that N is not negative.
*/
else if (!max.ule (double_int_one.lshift (30))
&& min.fits_uhwi ())
*probable_max_size = min.to_uhwi () - 1;
else if (!wi::leu_p (max, 1 << 30) && wi::fits_uhwi_p (min))
*probable_max_size = min.to_uhwi () - 1;
}
}
gcc_checking_assert (*max_size <=
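
The anti-range step above reduces to plain unsigned arithmetic: if the length is known *not* to lie in [0, max], then max + 1 is a valid minimum size, guarded against wraparound exactly like `max.to_uhwi () + 1 != 0`. A sketch with an illustrative name:

```cpp
#include <cstdint>

// Returns the derived minimum size, or 0 when max + 1 wraps and no
// bound can be learned (a minimum of 0 carries no information).
uint64_t min_size_from_anti_range (uint64_t max)
{
  return max + 1;   // unsigned wrap to 0 is exactly the "no bound" case
}
```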
@ -4943,12 +4949,12 @@ expand_builtin_signbit (tree exp, rtx target)
if (bitpos < GET_MODE_BITSIZE (rmode))
{
double_int mask = double_int_zero.set_bit (bitpos);
wide_int mask = wi::set_bit_in_zero (bitpos, GET_MODE_PRECISION (rmode));
if (GET_MODE_SIZE (imode) > GET_MODE_SIZE (rmode))
temp = gen_lowpart (rmode, temp);
temp = expand_binop (rmode, and_optab, temp,
immed_double_int_const (mask, rmode),
immed_wide_int_const (mask, rmode),
NULL_RTX, 1, OPTAB_LIB_WIDEN);
}
else
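
What `wi::set_bit_in_zero` plus the `and_optab` step compute can be shown in miniature for a 32-bit float: build a mask with only the sign bit set and AND it against the value's bit pattern. A sketch, not the expander itself.

```cpp
#include <cstdint>
#include <cstring>

// Mask with only bit `bitpos` set, the 32-bit case of wi::set_bit_in_zero.
uint32_t set_bit_in_zero32 (unsigned bitpos)
{
  return (uint32_t) 1 << bitpos;
}

// signbit(f) as "sign bit of the representation is set".
bool float_signbit (float f)
{
  uint32_t bits;
  std::memcpy (&bits, &f, sizeof bits);   // well-defined type pun
  return (bits & set_bit_in_zero32 (31)) != 0;
}
```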
@ -8012,8 +8018,8 @@ fold_builtin_int_roundingfn (location_t loc, tree fndecl, tree arg)
{
tree itype = TREE_TYPE (TREE_TYPE (fndecl));
tree ftype = TREE_TYPE (arg);
double_int val;
REAL_VALUE_TYPE r;
bool fail = false;
switch (DECL_FUNCTION_CODE (fndecl))
{
@ -8039,9 +8045,9 @@ fold_builtin_int_roundingfn (location_t loc, tree fndecl, tree arg)
gcc_unreachable ();
}
real_to_integer2 ((HOST_WIDE_INT *)&val.low, &val.high, &r);
if (double_int_fits_to_tree_p (itype, val))
return double_int_to_tree (itype, val);
wide_int val = real_to_integer (&r, &fail, TYPE_PRECISION (itype));
if (!fail)
return wide_int_to_tree (itype, val);
}
}
@ -8074,94 +8080,39 @@ fold_builtin_bitop (tree fndecl, tree arg)
/* Optimize for constant argument. */
if (TREE_CODE (arg) == INTEGER_CST && !TREE_OVERFLOW (arg))
{
HOST_WIDE_INT hi, width, result;
unsigned HOST_WIDE_INT lo;
tree type;
type = TREE_TYPE (arg);
width = TYPE_PRECISION (type);
lo = TREE_INT_CST_LOW (arg);
/* Clear all the bits that are beyond the type's precision. */
if (width > HOST_BITS_PER_WIDE_INT)
{
hi = TREE_INT_CST_HIGH (arg);
if (width < HOST_BITS_PER_DOUBLE_INT)
hi &= ~(HOST_WIDE_INT_M1U << (width - HOST_BITS_PER_WIDE_INT));
}
else
{
hi = 0;
if (width < HOST_BITS_PER_WIDE_INT)
lo &= ~(HOST_WIDE_INT_M1U << width);
}
tree type = TREE_TYPE (arg);
int result;
switch (DECL_FUNCTION_CODE (fndecl))
{
CASE_INT_FN (BUILT_IN_FFS):
if (lo != 0)
result = ffs_hwi (lo);
else if (hi != 0)
result = HOST_BITS_PER_WIDE_INT + ffs_hwi (hi);
else
result = 0;
result = wi::ffs (arg);
break;
CASE_INT_FN (BUILT_IN_CLZ):
if (hi != 0)
result = width - floor_log2 (hi) - 1 - HOST_BITS_PER_WIDE_INT;
else if (lo != 0)
result = width - floor_log2 (lo) - 1;
if (wi::ne_p (arg, 0))
result = wi::clz (arg);
else if (! CLZ_DEFINED_VALUE_AT_ZERO (TYPE_MODE (type), result))
result = width;
result = TYPE_PRECISION (type);
break;
CASE_INT_FN (BUILT_IN_CTZ):
if (lo != 0)
result = ctz_hwi (lo);
else if (hi != 0)
result = HOST_BITS_PER_WIDE_INT + ctz_hwi (hi);
if (wi::ne_p (arg, 0))
result = wi::ctz (arg);
else if (! CTZ_DEFINED_VALUE_AT_ZERO (TYPE_MODE (type), result))
result = width;
result = TYPE_PRECISION (type);
break;
CASE_INT_FN (BUILT_IN_CLRSB):
if (width > 2 * HOST_BITS_PER_WIDE_INT)
return NULL_TREE;
if (width > HOST_BITS_PER_WIDE_INT
&& (hi & ((unsigned HOST_WIDE_INT) 1
<< (width - HOST_BITS_PER_WIDE_INT - 1))) != 0)
{
hi = ~hi & ~(HOST_WIDE_INT_M1U
<< (width - HOST_BITS_PER_WIDE_INT - 1));
lo = ~lo;
}
else if (width <= HOST_BITS_PER_WIDE_INT
&& (lo & ((unsigned HOST_WIDE_INT) 1 << (width - 1))) != 0)
lo = ~lo & ~(HOST_WIDE_INT_M1U << (width - 1));
if (hi != 0)
result = width - floor_log2 (hi) - 2 - HOST_BITS_PER_WIDE_INT;
else if (lo != 0)
result = width - floor_log2 (lo) - 2;
else
result = width - 1;
result = wi::clrsb (arg);
break;
CASE_INT_FN (BUILT_IN_POPCOUNT):
result = 0;
while (lo)
result++, lo &= lo - 1;
while (hi)
result++, hi &= (unsigned HOST_WIDE_INT) hi - 1;
result = wi::popcount (arg);
break;
CASE_INT_FN (BUILT_IN_PARITY):
result = 0;
while (lo)
result++, lo &= lo - 1;
while (hi)
result++, hi &= (unsigned HOST_WIDE_INT) hi - 1;
result &= 1;
result = wi::parity (arg);
break;
default:
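
The wi:: helpers the folder now calls have familiar fixed-width counterparts. These 32-bit stand-ins (using GCC's own builtins, not the wide-int API) show the semantics; real wide_ints have arbitrary precision, and the zero cases of clz/ctz are still dispatched separately above via `*_DEFINED_VALUE_AT_ZERO`.

```cpp
#include <cstdint>

// Illustrative 32-bit versions of wi::ffs, wi::clz, wi::ctz,
// wi::popcount and wi::parity.
int ffs32 (uint32_t x)      { return x ? __builtin_ctz (x) + 1 : 0; }
int clz32 (uint32_t x)      { return __builtin_clz (x); }      // requires x != 0
int ctz32 (uint32_t x)      { return __builtin_ctz (x); }      // requires x != 0
int popcount32 (uint32_t x) { return __builtin_popcount (x); }
int parity32 (uint32_t x)   { return __builtin_popcount (x) & 1; }
```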
@ -8185,49 +8136,24 @@ fold_builtin_bswap (tree fndecl, tree arg)
/* Optimize constant value. */
if (TREE_CODE (arg) == INTEGER_CST && !TREE_OVERFLOW (arg))
{
HOST_WIDE_INT hi, width, r_hi = 0;
unsigned HOST_WIDE_INT lo, r_lo = 0;
tree type = TREE_TYPE (TREE_TYPE (fndecl));
width = TYPE_PRECISION (type);
lo = TREE_INT_CST_LOW (arg);
hi = TREE_INT_CST_HIGH (arg);
switch (DECL_FUNCTION_CODE (fndecl))
{
case BUILT_IN_BSWAP16:
case BUILT_IN_BSWAP32:
case BUILT_IN_BSWAP64:
{
int s;
for (s = 0; s < width; s += 8)
{
int d = width - s - 8;
unsigned HOST_WIDE_INT byte;
if (s < HOST_BITS_PER_WIDE_INT)
byte = (lo >> s) & 0xff;
else
byte = (hi >> (s - HOST_BITS_PER_WIDE_INT)) & 0xff;
if (d < HOST_BITS_PER_WIDE_INT)
r_lo |= byte << d;
else
r_hi |= byte << (d - HOST_BITS_PER_WIDE_INT);
}
signop sgn = TYPE_SIGN (type);
tree result =
wide_int_to_tree (type,
wide_int::from (arg, TYPE_PRECISION (type),
sgn).bswap ());
return result;
}
break;
default:
gcc_unreachable ();
}
if (width < HOST_BITS_PER_WIDE_INT)
return build_int_cst (type, r_lo);
else
return build_int_cst_wide (type, r_lo, r_hi);
}
return NULL_TREE;
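
What `wide_int::bswap` computes for a 32-bit argument is the same per-byte loop the removed double_int code spelled out by hand: the byte at shift `s` moves to shift `24 - s`.

```cpp
#include <cstdint>

// Standalone 32-bit byte swap, mirroring the old loop's structure.
uint32_t bswap32 (uint32_t x)
{
  uint32_t r = 0;
  for (int s = 0; s < 32; s += 8)
    r |= ((x >> s) & 0xff) << (24 - s);
  return r;
}
```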
@ -8289,7 +8215,7 @@ fold_builtin_logarithm (location_t loc, tree fndecl, tree arg,
/* Prepare to do logN(exp10(exponent) -> exponent*logN(10). */
{
REAL_VALUE_TYPE dconst10;
real_from_integer (&dconst10, VOIDmode, 10, 0, 0);
real_from_integer (&dconst10, VOIDmode, 10, SIGNED);
x = build_real (type, dconst10);
}
exponent = CALL_EXPR_ARG (arg, 0);
@ -8442,7 +8368,7 @@ fold_builtin_pow (location_t loc, tree fndecl, tree arg0, tree arg1, tree type)
/* Check for an integer exponent. */
n = real_to_integer (&c);
real_from_integer (&cint, VOIDmode, n, n < 0 ? -1 : 0, 0);
real_from_integer (&cint, VOIDmode, n, SIGNED);
if (real_identical (&c, &cint))
{
/* Attempt to evaluate pow at compile-time, unless this should
@ -8814,20 +8740,18 @@ fold_builtin_memory_op (location_t loc, tree dest, tree src,
else if (TREE_CODE (src_base) == MEM_REF
&& TREE_CODE (dest_base) == MEM_REF)
{
double_int off;
if (! operand_equal_p (TREE_OPERAND (src_base, 0),
TREE_OPERAND (dest_base, 0), 0))
return NULL_TREE;
off = mem_ref_offset (src_base) +
double_int::from_shwi (src_offset);
if (!off.fits_shwi ())
offset_int off = mem_ref_offset (src_base) + src_offset;
if (!wi::fits_shwi_p (off))
return NULL_TREE;
src_offset = off.low;
off = mem_ref_offset (dest_base) +
double_int::from_shwi (dest_offset);
if (!off.fits_shwi ())
src_offset = off.to_shwi ();
off = mem_ref_offset (dest_base) + dest_offset;
if (!wi::fits_shwi_p (off))
return NULL_TREE;
dest_offset = off.low;
dest_offset = off.to_shwi ();
if (ranges_overlap_p (src_offset, maxsize,
dest_offset, maxsize))
return NULL_TREE;
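
Once both offsets are known to fit a signed HOST_WIDE_INT, the question handed to `ranges_overlap_p` is ordinary interval intersection. A simplified sketch (GCC's helper takes separate sizes; a single size keeps it short):

```cpp
#include <cstdint>

// Do the byte ranges [src, src+size) and [dest, dest+size) intersect?
bool ranges_overlap (int64_t src_offset, int64_t dest_offset, int64_t size)
{
  return src_offset < dest_offset + size
         && dest_offset < src_offset + size;
}
```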
@ -12690,8 +12614,7 @@ fold_builtin_object_size (tree ptr, tree ost)
if (TREE_CODE (ptr) == ADDR_EXPR)
{
bytes = compute_builtin_object_size (ptr, object_size_type);
if (double_int_fits_to_tree_p (size_type_node,
double_int::from_uhwi (bytes)))
if (wi::fits_to_tree_p (bytes, size_type_node))
return build_int_cstu (size_type_node, bytes);
}
else if (TREE_CODE (ptr) == SSA_NAME)
@ -12701,8 +12624,7 @@ fold_builtin_object_size (tree ptr, tree ost)
it. */
bytes = compute_builtin_object_size (ptr, object_size_type);
if (bytes != (unsigned HOST_WIDE_INT) (object_size_type < 2 ? -1 : 0)
&& double_int_fits_to_tree_p (size_type_node,
double_int::from_uhwi (bytes)))
&& wi::fits_to_tree_p (bytes, size_type_node))
return build_int_cstu (size_type_node, bytes);
}


@ -29,21 +29,7 @@ along with GCC; see the file COPYING3. If not see
#include "cpplib.h"
#include "c-pragma.h"
#include "cpp-id-data.h"
/* Adapted from hwint.h to use the Ada prefix. */
#if HOST_BITS_PER_WIDE_INT == HOST_BITS_PER_LONG
# if HOST_BITS_PER_WIDE_INT == 64
# define ADA_HOST_WIDE_INT_PRINT_DOUBLE_HEX \
"16#%" HOST_LONG_FORMAT "x%016" HOST_LONG_FORMAT "x#"
# else
# define ADA_HOST_WIDE_INT_PRINT_DOUBLE_HEX \
"16#%" HOST_LONG_FORMAT "x%08" HOST_LONG_FORMAT "x#"
# endif
#else
/* We can assume that 'long long' is at least 64 bits. */
# define ADA_HOST_WIDE_INT_PRINT_DOUBLE_HEX \
"16#%" HOST_LONG_LONG_FORMAT "x%016" HOST_LONG_LONG_FORMAT "x#"
#endif /* HOST_BITS_PER_WIDE_INT == HOST_BITS_PER_LONG */
#include "wide-int.h"
/* Local functions, macros and variables. */
static int dump_generic_ada_node (pretty_printer *, tree, tree, int, int,
@ -2211,19 +2197,19 @@ dump_generic_ada_node (pretty_printer *buffer, tree node, tree type, int spc,
pp_unsigned_wide_integer (buffer, tree_to_uhwi (node));
else
{
tree val = node;
unsigned HOST_WIDE_INT low = TREE_INT_CST_LOW (val);
HOST_WIDE_INT high = TREE_INT_CST_HIGH (val);
if (tree_int_cst_sgn (val) < 0)
wide_int val = node;
int i;
if (wi::neg_p (val))
{
pp_minus (buffer);
high = ~high + !low;
low = -low;
val = -val;
}
sprintf (pp_buffer (buffer)->digit_buffer,
ADA_HOST_WIDE_INT_PRINT_DOUBLE_HEX,
(unsigned HOST_WIDE_INT) high, low);
"16#%" HOST_WIDE_INT_PRINT "x",
val.elt (val.get_len () - 1));
for (i = val.get_len () - 2; i >= 0; i--)
sprintf (pp_buffer (buffer)->digit_buffer,
HOST_WIDE_INT_PRINT_PADDED_HEX, val.elt (i));
pp_string (buffer, pp_buffer (buffer)->digit_buffer);
}
break;


@ -49,6 +49,7 @@ along with GCC; see the file COPYING3. If not see
#include "cgraph.h"
#include "target-def.h"
#include "gimplify.h"
#include "wide-int-print.h"
cpp_reader *parse_in; /* Declared in c-pragma.h. */
@ -4122,9 +4123,12 @@ shorten_compare (location_t loc, tree *op0_ptr, tree *op1_ptr,
{
/* Convert primop1 to target type, but do not introduce
additional overflow. We know primop1 is an int_cst. */
primop1 = force_fit_type_double (*restype_ptr,
tree_to_double_int (primop1),
0, TREE_OVERFLOW (primop1));
primop1 = force_fit_type (*restype_ptr,
wide_int::from
(primop1,
TYPE_PRECISION (*restype_ptr),
TYPE_SIGN (TREE_TYPE (primop1))),
0, TREE_OVERFLOW (primop1));
}
if (type != *restype_ptr)
{
@ -4132,20 +4136,10 @@ shorten_compare (location_t loc, tree *op0_ptr, tree *op1_ptr,
maxval = convert (*restype_ptr, maxval);
}
if (unsignedp && unsignedp0)
{
min_gt = INT_CST_LT_UNSIGNED (primop1, minval);
max_gt = INT_CST_LT_UNSIGNED (primop1, maxval);
min_lt = INT_CST_LT_UNSIGNED (minval, primop1);
max_lt = INT_CST_LT_UNSIGNED (maxval, primop1);
}
else
{
min_gt = INT_CST_LT (primop1, minval);
max_gt = INT_CST_LT (primop1, maxval);
min_lt = INT_CST_LT (minval, primop1);
max_lt = INT_CST_LT (maxval, primop1);
}
min_gt = tree_int_cst_lt (primop1, minval);
max_gt = tree_int_cst_lt (primop1, maxval);
min_lt = tree_int_cst_lt (minval, primop1);
max_lt = tree_int_cst_lt (maxval, primop1);
val = 0;
/* This used to be a switch, but Genix compiler can't handle that. */
@ -4434,8 +4428,7 @@ pointer_int_sum (location_t loc, enum tree_code resultcode,
convert (TREE_TYPE (intop), size_exp), 1);
intop = convert (sizetype, t);
if (TREE_OVERFLOW_P (intop) && !TREE_OVERFLOW (t))
intop = build_int_cst_wide (TREE_TYPE (intop), TREE_INT_CST_LOW (intop),
TREE_INT_CST_HIGH (intop));
intop = wide_int_to_tree (TREE_TYPE (intop), intop);
}
/* Create the sum or difference. */
@ -5512,7 +5505,7 @@ c_common_nodes_and_builtins (void)
}
/* This node must not be shared. */
void_zero_node = make_node (INTEGER_CST);
void_zero_node = make_int_cst (1, 1);
TREE_TYPE (void_zero_node) = void_type_node;
void_list_node = build_void_list_node ();
@ -5719,7 +5712,7 @@ c_common_nodes_and_builtins (void)
/* Create the built-in __null node. It is important that this is
not shared. */
null_node = make_node (INTEGER_CST);
null_node = make_int_cst (1, 1);
TREE_TYPE (null_node) = c_common_type_for_size (POINTER_SIZE, 0);
/* Since builtin_types isn't gc'ed, don't export these nodes. */
@ -6097,22 +6090,14 @@ c_add_case_label (location_t loc, splay_tree cases, tree cond, tree orig_type,
static void
match_case_to_enum_1 (tree key, tree type, tree label)
{
char buf[2 + 2*HOST_BITS_PER_WIDE_INT/4 + 1];
char buf[WIDE_INT_PRINT_BUFFER_SIZE];
/* ??? Not working too hard to print the double-word value.
Should perhaps be done with %lwd in the diagnostic routines? */
if (TREE_INT_CST_HIGH (key) == 0)
snprintf (buf, sizeof (buf), HOST_WIDE_INT_PRINT_UNSIGNED,
TREE_INT_CST_LOW (key));
else if (!TYPE_UNSIGNED (type)
&& TREE_INT_CST_HIGH (key) == -1
&& TREE_INT_CST_LOW (key) != 0)
snprintf (buf, sizeof (buf), "-" HOST_WIDE_INT_PRINT_UNSIGNED,
-TREE_INT_CST_LOW (key));
if (tree_fits_uhwi_p (key))
print_dec (key, buf, UNSIGNED);
else if (tree_fits_shwi_p (key))
print_dec (key, buf, SIGNED);
else
snprintf (buf, sizeof (buf), HOST_WIDE_INT_PRINT_DOUBLE_HEX,
(unsigned HOST_WIDE_INT) TREE_INT_CST_HIGH (key),
(unsigned HOST_WIDE_INT) TREE_INT_CST_LOW (key));
print_hex (key, buf);
if (TYPE_NAME (type) == 0)
warning_at (DECL_SOURCE_LOCATION (CASE_LABEL (label)),
@ -8849,13 +8834,14 @@ check_nonnull_arg (void * ARG_UNUSED (ctx), tree param,
static bool
get_nonnull_operand (tree arg_num_expr, unsigned HOST_WIDE_INT *valp)
{
/* Verify the arg number is a constant. */
if (TREE_CODE (arg_num_expr) != INTEGER_CST
|| TREE_INT_CST_HIGH (arg_num_expr) != 0)
/* Verify the arg number is a small constant. */
if (tree_fits_uhwi_p (arg_num_expr))
{
*valp = TREE_INT_CST_LOW (arg_num_expr);
return true;
}
else
return false;
*valp = TREE_INT_CST_LOW (arg_num_expr);
return true;
}
/* Handle a "nothrow" attribute; arguments as in


@ -227,7 +227,7 @@ check_format_string (tree fntype, unsigned HOST_WIDE_INT format_num,
static bool
get_constant (tree expr, unsigned HOST_WIDE_INT *value, int validated_p)
{
if (TREE_CODE (expr) != INTEGER_CST || TREE_INT_CST_HIGH (expr) != 0)
if (!tree_fits_uhwi_p (expr))
{
gcc_assert (!validated_p);
return false;


@ -35,6 +35,7 @@ along with GCC; see the file COPYING3. If not see
#include "splay-tree.h"
#include "debug.h"
#include "target.h"
#include "wide-int.h"
/* We may keep statistics about how long which files took to compile. */
static int header_time, body_time;
@ -49,9 +50,9 @@ static tree interpret_float (const cpp_token *, unsigned int, const char *,
enum overflow_type *);
static tree interpret_fixed (const cpp_token *, unsigned int);
static enum integer_type_kind narrowest_unsigned_type
(unsigned HOST_WIDE_INT, unsigned HOST_WIDE_INT, unsigned int);
(const widest_int &, unsigned int);
static enum integer_type_kind narrowest_signed_type
(unsigned HOST_WIDE_INT, unsigned HOST_WIDE_INT, unsigned int);
(const widest_int &, unsigned int);
static enum cpp_ttype lex_string (const cpp_token *, tree *, bool, bool);
static tree lex_charconst (const cpp_token *);
static void update_header_times (const char *);
@ -527,9 +528,7 @@ c_lex_with_flags (tree *value, location_t *loc, unsigned char *cpp_flags,
there isn't one. */
static enum integer_type_kind
narrowest_unsigned_type (unsigned HOST_WIDE_INT low,
unsigned HOST_WIDE_INT high,
unsigned int flags)
narrowest_unsigned_type (const widest_int &val, unsigned int flags)
{
int itk;
@ -548,9 +547,7 @@ narrowest_unsigned_type (unsigned HOST_WIDE_INT low,
continue;
upper = TYPE_MAX_VALUE (integer_types[itk]);
if ((unsigned HOST_WIDE_INT) TREE_INT_CST_HIGH (upper) > high
|| ((unsigned HOST_WIDE_INT) TREE_INT_CST_HIGH (upper) == high
&& TREE_INT_CST_LOW (upper) >= low))
if (wi::geu_p (wi::to_widest (upper), val))
return (enum integer_type_kind) itk;
}
@ -559,8 +556,7 @@ narrowest_unsigned_type (unsigned HOST_WIDE_INT low,
/* Ditto, but narrowest signed type. */
static enum integer_type_kind
narrowest_signed_type (unsigned HOST_WIDE_INT low,
unsigned HOST_WIDE_INT high, unsigned int flags)
narrowest_signed_type (const widest_int &val, unsigned int flags)
{
int itk;
@ -571,7 +567,6 @@ narrowest_signed_type (unsigned HOST_WIDE_INT low,
else
itk = itk_long_long;
for (; itk < itk_none; itk += 2 /* skip signed types */)
{
tree upper;
@ -580,9 +575,7 @@ narrowest_signed_type (unsigned HOST_WIDE_INT low,
continue;
upper = TYPE_MAX_VALUE (integer_types[itk]);
if ((unsigned HOST_WIDE_INT) TREE_INT_CST_HIGH (upper) > high
|| ((unsigned HOST_WIDE_INT) TREE_INT_CST_HIGH (upper) == high
&& TREE_INT_CST_LOW (upper) >= low))
if (wi::geu_p (wi::to_widest (upper), val))
return (enum integer_type_kind) itk;
}
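
The narrowest-type walk above reduces to: return the first candidate whose maximum is >= the constant, the `wi::geu_p (upper, val)` test. A standalone sketch with stand-in maxima (not GCC's `integer_types` table):

```cpp
#include <cstdint>

// Index of the narrowest candidate that can hold val; -1 plays the
// role of itk_none.  Maxima approximate unsigned int / unsigned long /
// unsigned long long on a typical 64-bit host.
int narrowest_unsigned_index (uint64_t val)
{
  const uint64_t maxima[] = { UINT32_MAX, UINT64_MAX, UINT64_MAX };
  for (int i = 0; i < 3; i++)
    if (maxima[i] >= val)
      return i;
  return -1;
}
```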
@ -597,6 +590,7 @@ interpret_integer (const cpp_token *token, unsigned int flags,
tree value, type;
enum integer_type_kind itk;
cpp_num integer;
HOST_WIDE_INT ival[3];
*overflow = OT_NONE;
@ -604,18 +598,23 @@ interpret_integer (const cpp_token *token, unsigned int flags,
if (integer.overflow)
*overflow = OT_OVERFLOW;
ival[0] = integer.low;
ival[1] = integer.high;
ival[2] = 0;
widest_int wval = widest_int::from_array (ival, 3);
/* The type of a constant with a U suffix is straightforward. */
if (flags & CPP_N_UNSIGNED)
itk = narrowest_unsigned_type (integer.low, integer.high, flags);
itk = narrowest_unsigned_type (wval, flags);
else
{
/* The type of a potentially-signed integer constant varies
depending on the base it's in, the standard in use, and the
length suffixes. */
enum integer_type_kind itk_u
= narrowest_unsigned_type (integer.low, integer.high, flags);
= narrowest_unsigned_type (wval, flags);
enum integer_type_kind itk_s
= narrowest_signed_type (integer.low, integer.high, flags);
= narrowest_signed_type (wval, flags);
/* In both C89 and C99, octal and hex constants may be signed or
unsigned, whichever fits tighter. We do not warn about this
@ -667,7 +666,7 @@ interpret_integer (const cpp_token *token, unsigned int flags,
: "integer constant is too large for %<long%> type");
}
value = build_int_cst_wide (type, integer.low, integer.high);
value = wide_int_to_tree (type, wval);
/* Convert imaginary to a complex type. */
if (flags & CPP_N_IMAGINARY)
@ -1165,9 +1164,9 @@ lex_charconst (const cpp_token *token)
/* Cast to cppchar_signed_t to get correct sign-extension of RESULT
before possibly widening to HOST_WIDE_INT for build_int_cst. */
if (unsignedp || (cppchar_signed_t) result >= 0)
value = build_int_cst_wide (type, result, 0);
value = build_int_cst (type, result);
else
value = build_int_cst_wide (type, (cppchar_signed_t) result, -1);
value = build_int_cst (type, (cppchar_signed_t) result);
return value;
}


@ -30,6 +30,7 @@ along with GCC; see the file COPYING3. If not see
#include "tree-pretty-print.h"
#include "tree-iterator.h"
#include "diagnostic.h"
#include "wide-int-print.h"
/* The pretty-printer code is primarily designed to closely follow
(GNU) C and C++ grammars. That is to be contrasted with spaghetti
@ -923,16 +924,14 @@ pp_c_integer_constant (c_pretty_printer *pp, tree i)
pp_unsigned_wide_integer (pp, tree_to_uhwi (i));
else
{
unsigned HOST_WIDE_INT low = TREE_INT_CST_LOW (i);
HOST_WIDE_INT high = TREE_INT_CST_HIGH (i);
if (tree_int_cst_sgn (i) < 0)
wide_int wi = i;
if (wi::lt_p (i, 0, TYPE_SIGN (TREE_TYPE (i))))
{
pp_minus (pp);
high = ~high + !low;
low = -low;
wi = -wi;
}
sprintf (pp_buffer (pp)->digit_buffer, HOST_WIDE_INT_PRINT_DOUBLE_HEX,
(unsigned HOST_WIDE_INT) high, (unsigned HOST_WIDE_INT) low);
print_hex (wi, pp_buffer (pp)->digit_buffer);
pp_string (pp, pp_buffer (pp)->digit_buffer);
}
if (TYPE_UNSIGNED (type))
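
The negative-constant path above, in miniature: print `-`, negate, then emit hex digits, the `wi = -wi; print_hex (wi, ...)` sequence. Negation is done in unsigned arithmetic so even INT64_MIN is safe, much as wide_int negation is defined at every precision.

```cpp
#include <cstdio>
#include <cstdint>
#include <string>

// Illustrative hex printer for a signed constant (not pp_c_integer_constant).
std::string int_to_hex (int64_t v)
{
  std::string out;
  uint64_t mag = (uint64_t) v;
  if (v < 0)
    {
      out += '-';
      mag = ~(uint64_t) v + 1;   // two's-complement negate, no UB
    }
  char buf[32];
  std::snprintf (buf, sizeof buf, "0x%llx", (unsigned long long) mag);
  return out + buf;
}
```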


@ -666,8 +666,7 @@ declare_one_free_variable (const void *var0, void **map0,
/* Maybe promote to int. */
if (INTEGRAL_TYPE_P (var_type) && COMPLETE_TYPE_P (var_type)
&& INT_CST_LT_UNSIGNED (TYPE_SIZE (var_type),
TYPE_SIZE (integer_type_node)))
&& tree_int_cst_lt (TYPE_SIZE (var_type), TYPE_SIZE (integer_type_node)))
arg_type = integer_type_node;
else
arg_type = var_type;


@@ -4880,8 +4880,8 @@ check_bitfield_type_and_width (tree *type, tree *width, tree orig_name)
{
struct lang_type *lt = TYPE_LANG_SPECIFIC (*type);
if (!lt
|| w < tree_int_cst_min_precision (lt->enum_min, TYPE_UNSIGNED (*type))
|| w < tree_int_cst_min_precision (lt->enum_max, TYPE_UNSIGNED (*type)))
|| w < tree_int_cst_min_precision (lt->enum_min, TYPE_SIGN (*type))
|| w < tree_int_cst_min_precision (lt->enum_max, TYPE_SIGN (*type)))
warning (0, "%qs is narrower than values of its type", name);
}
}
@@ -7605,7 +7605,8 @@ finish_enum (tree enumtype, tree values, tree attributes)
{
tree pair, tem;
tree minnode = 0, maxnode = 0;
int precision, unsign;
int precision;
signop sign;
bool toplevel = (file_scope == current_scope);
struct lang_type *lt;
@@ -7632,13 +7633,13 @@ finish_enum (tree enumtype, tree values, tree attributes)
as one of the integral types - the narrowest one that fits, except
that normally we only go as narrow as int - and signed iff any of
the values are negative. */
unsign = (tree_int_cst_sgn (minnode) >= 0);
precision = MAX (tree_int_cst_min_precision (minnode, unsign),
tree_int_cst_min_precision (maxnode, unsign));
sign = (tree_int_cst_sgn (minnode) >= 0) ? UNSIGNED : SIGNED;
precision = MAX (tree_int_cst_min_precision (minnode, sign),
tree_int_cst_min_precision (maxnode, sign));
if (TYPE_PACKED (enumtype) || precision > TYPE_PRECISION (integer_type_node))
{
tem = c_common_type_for_size (precision, unsign);
tem = c_common_type_for_size (precision, sign == UNSIGNED ? 1 : 0);
if (tem == NULL)
{
warning (0, "enumeration values exceed range of largest integer");
@@ -7646,7 +7647,7 @@ finish_enum (tree enumtype, tree values, tree attributes)
}
}
else
tem = unsign ? unsigned_type_node : integer_type_node;
tem = sign == UNSIGNED ? unsigned_type_node : integer_type_node;
TYPE_MIN_VALUE (enumtype) = TYPE_MIN_VALUE (tem);
TYPE_MAX_VALUE (enumtype) = TYPE_MAX_VALUE (tem);


@@ -13616,7 +13616,7 @@ c_parser_cilk_clause_vectorlength (c_parser *parser, tree clauses,
|| !INTEGRAL_TYPE_P (TREE_TYPE (expr)))
error_at (loc, "vectorlength must be an integer constant");
else if (exact_log2 (TREE_INT_CST_LOW (expr)) == -1)
else if (wi::exact_log2 (expr) == -1)
error_at (loc, "vectorlength must be a power of 2");
else
{


@@ -50,6 +50,7 @@ along with GCC; see the file COPYING3. If not see
#include "c-family/c-common.h"
#include "c-family/c-ubsan.h"
#include "cilk.h"
#include "wide-int.h"
/* Possible cases of implicit bad conversions. Used to select
diagnostic messages in convert_for_assignment. */
@@ -5126,9 +5127,7 @@ build_c_cast (location_t loc, tree type, tree expr)
}
else if (TREE_OVERFLOW (value))
/* Reset VALUE's overflow flags, ensuring constant sharing. */
value = build_int_cst_wide (TREE_TYPE (value),
TREE_INT_CST_LOW (value),
TREE_INT_CST_HIGH (value));
value = wide_int_to_tree (TREE_TYPE (value), value);
}
}
@@ -8078,20 +8077,20 @@ set_nonincremental_init_from_string (tree str,
{
if (wchar_bytes == 1)
{
val[1] = (unsigned char) *p++;
val[0] = 0;
val[0] = (unsigned char) *p++;
val[1] = 0;
}
else
{
val[0] = 0;
val[1] = 0;
val[0] = 0;
for (byte = 0; byte < wchar_bytes; byte++)
{
if (BYTES_BIG_ENDIAN)
bitpos = (wchar_bytes - byte - 1) * charwidth;
else
bitpos = byte * charwidth;
val[bitpos < HOST_BITS_PER_WIDE_INT]
val[bitpos % HOST_BITS_PER_WIDE_INT]
|= ((unsigned HOST_WIDE_INT) ((unsigned char) *p++))
<< (bitpos % HOST_BITS_PER_WIDE_INT);
}
@@ -8102,24 +8101,26 @@ set_nonincremental_init_from_string (tree str,
bitpos = ((wchar_bytes - 1) * charwidth) + HOST_BITS_PER_CHAR;
if (bitpos < HOST_BITS_PER_WIDE_INT)
{
if (val[1] & (((HOST_WIDE_INT) 1) << (bitpos - 1)))
if (val[0] & (((HOST_WIDE_INT) 1) << (bitpos - 1)))
{
val[1] |= ((HOST_WIDE_INT) -1) << bitpos;
val[0] = -1;
val[0] |= ((HOST_WIDE_INT) -1) << bitpos;
val[1] = -1;
}
}
else if (bitpos == HOST_BITS_PER_WIDE_INT)
{
if (val[1] < 0)
val[0] = -1;
if (val[0] < 0)
val[1] = -1;
}
else if (val[0] & (((HOST_WIDE_INT) 1)
else if (val[1] & (((HOST_WIDE_INT) 1)
<< (bitpos - 1 - HOST_BITS_PER_WIDE_INT)))
val[0] |= ((HOST_WIDE_INT) -1)
val[1] |= ((HOST_WIDE_INT) -1)
<< (bitpos - HOST_BITS_PER_WIDE_INT);
}
value = build_int_cst_wide (type, val[1], val[0]);
value = wide_int_to_tree (type,
wide_int::from_array (val, 2,
HOST_BITS_PER_WIDE_INT * 2));
add_pending_init (input_location, purpose, value, NULL_TREE, true,
braced_init_obstack);
}
@@ -12365,8 +12366,7 @@ c_tree_equal (tree t1, tree t2)
switch (code1)
{
case INTEGER_CST:
return TREE_INT_CST_LOW (t1) == TREE_INT_CST_LOW (t2)
&& TREE_INT_CST_HIGH (t1) == TREE_INT_CST_HIGH (t2);
return wi::eq_p (t1, t2);
case REAL_CST:
return REAL_VALUES_EQUAL (TREE_REAL_CST (t1), TREE_REAL_CST (t2));


@@ -336,7 +336,8 @@ alloc_loop (void)
loop->exits = ggc_alloc_cleared_loop_exit ();
loop->exits->next = loop->exits->prev = loop->exits;
loop->can_be_parallel = false;
loop->nb_iterations_upper_bound = 0;
loop->nb_iterations_estimate = 0;
return loop;
}
@@ -1787,21 +1788,21 @@ get_loop_location (struct loop *loop)
I_BOUND times. */
void
record_niter_bound (struct loop *loop, double_int i_bound, bool realistic,
bool upper)
record_niter_bound (struct loop *loop, const widest_int &i_bound,
bool realistic, bool upper)
{
/* Update the bounds only when there is no previous estimation, or when the
current estimation is smaller. */
if (upper
&& (!loop->any_upper_bound
|| i_bound.ult (loop->nb_iterations_upper_bound)))
|| wi::ltu_p (i_bound, loop->nb_iterations_upper_bound)))
{
loop->any_upper_bound = true;
loop->nb_iterations_upper_bound = i_bound;
}
if (realistic
&& (!loop->any_estimate
|| i_bound.ult (loop->nb_iterations_estimate)))
|| wi::ltu_p (i_bound, loop->nb_iterations_estimate)))
{
loop->any_estimate = true;
loop->nb_iterations_estimate = i_bound;
@@ -1811,7 +1812,8 @@ record_niter_bound (struct loop *loop, double_int i_bound, bool realistic,
number of iterations, use the upper bound instead. */
if (loop->any_upper_bound
&& loop->any_estimate
&& loop->nb_iterations_upper_bound.ult (loop->nb_iterations_estimate))
&& wi::ltu_p (loop->nb_iterations_upper_bound,
loop->nb_iterations_estimate))
loop->nb_iterations_estimate = loop->nb_iterations_upper_bound;
}
@@ -1822,13 +1824,13 @@ record_niter_bound (struct loop *loop, double_int i_bound, bool realistic,
HOST_WIDE_INT
get_estimated_loop_iterations_int (struct loop *loop)
{
double_int nit;
widest_int nit;
HOST_WIDE_INT hwi_nit;
if (!get_estimated_loop_iterations (loop, &nit))
return -1;
if (!nit.fits_shwi ())
if (!wi::fits_shwi_p (nit))
return -1;
hwi_nit = nit.to_shwi ();
@@ -1859,7 +1861,7 @@ max_stmt_executions_int (struct loop *loop)
returns true. */
bool
get_estimated_loop_iterations (struct loop *loop, double_int *nit)
get_estimated_loop_iterations (struct loop *loop, widest_int *nit)
{
/* Even if the bound is not recorded, possibly we can derrive one from
profile. */
@@ -1867,7 +1869,7 @@ get_estimated_loop_iterations (struct loop *loop, double_int *nit)
{
if (loop->header->count)
{
*nit = gcov_type_to_double_int
*nit = gcov_type_to_wide_int
(expected_loop_iterations_unbounded (loop) + 1);
return true;
}
@@ -1883,7 +1885,7 @@ get_estimated_loop_iterations (struct loop *loop, double_int *nit)
false, otherwise returns true. */
bool
get_max_loop_iterations (struct loop *loop, double_int *nit)
get_max_loop_iterations (struct loop *loop, widest_int *nit)
{
if (!loop->any_upper_bound)
return false;
@@ -1899,13 +1901,13 @@ get_max_loop_iterations (struct loop *loop, double_int *nit)
HOST_WIDE_INT
get_max_loop_iterations_int (struct loop *loop)
{
double_int nit;
widest_int nit;
HOST_WIDE_INT hwi_nit;
if (!get_max_loop_iterations (loop, &nit))
return -1;
if (!nit.fits_shwi ())
if (!wi::fits_shwi_p (nit))
return -1;
hwi_nit = nit.to_shwi ();


@@ -21,6 +21,7 @@ along with GCC; see the file COPYING3. If not see
#define GCC_CFGLOOP_H
#include "double-int.h"
#include "wide-int.h"
#include "bitmap.h"
#include "sbitmap.h"
#include "function.h"
@@ -62,7 +63,7 @@ struct GTY ((chain_next ("%h.next"))) nb_iter_bound {
overflows (as MAX + 1 is sometimes produced as the estimate on number
of executions of STMT).
b) it is consistent with the result of number_of_iterations_exit. */
double_int bound;
widest_int bound;
/* True if the statement will cause the loop to be leaved the (at most)
BOUND + 1-st time it is executed, that is, all the statements after it
@@ -146,12 +147,12 @@ struct GTY ((chain_next ("%h.next"))) loop {
/* An integer guaranteed to be greater or equal to nb_iterations. Only
valid if any_upper_bound is true. */
double_int nb_iterations_upper_bound;
widest_int nb_iterations_upper_bound;
/* An integer giving an estimate on nb_iterations. Unlike
nb_iterations_upper_bound, there is no guarantee that it is at least
nb_iterations. */
double_int nb_iterations_estimate;
widest_int nb_iterations_estimate;
bool any_upper_bound;
bool any_estimate;
@@ -737,27 +738,27 @@ loop_outermost (struct loop *loop)
return (*loop->superloops)[1];
}
extern void record_niter_bound (struct loop *, double_int, bool, bool);
extern void record_niter_bound (struct loop *, const widest_int &, bool, bool);
extern HOST_WIDE_INT get_estimated_loop_iterations_int (struct loop *);
extern HOST_WIDE_INT get_max_loop_iterations_int (struct loop *);
extern bool get_estimated_loop_iterations (struct loop *loop, double_int *nit);
extern bool get_max_loop_iterations (struct loop *loop, double_int *nit);
extern bool get_estimated_loop_iterations (struct loop *loop, widest_int *nit);
extern bool get_max_loop_iterations (struct loop *loop, widest_int *nit);
extern int bb_loop_depth (const_basic_block);
/* Converts VAL to double_int. */
/* Converts VAL to widest_int. */
static inline double_int
gcov_type_to_double_int (gcov_type val)
static inline widest_int
gcov_type_to_wide_int (gcov_type val)
{
double_int ret;
HOST_WIDE_INT a[2];
ret.low = (unsigned HOST_WIDE_INT) val;
a[0] = (unsigned HOST_WIDE_INT) val;
/* If HOST_BITS_PER_WIDE_INT == HOST_BITS_PER_WIDEST_INT, avoid shifting by
the size of type. */
val >>= HOST_BITS_PER_WIDE_INT - 1;
val >>= 1;
ret.high = (unsigned HOST_WIDE_INT) val;
a[1] = (unsigned HOST_WIDE_INT) val;
return ret;
return widest_int::from_array (a, 2);
}
#endif /* GCC_CFGLOOP_H */


@@ -650,8 +650,7 @@ cgraph_add_thunk (struct cgraph_node *decl_node ATTRIBUTE_UNUSED,
node = cgraph_create_node (alias);
gcc_checking_assert (!virtual_offset
|| tree_to_double_int (virtual_offset) ==
double_int::from_shwi (virtual_value));
|| wi::eq_p (virtual_offset, virtual_value));
node->thunk.fixed_offset = fixed_offset;
node->thunk.this_adjusting = this_adjusting;
node->thunk.virtual_value = virtual_value;


@@ -2671,22 +2671,15 @@ try_combine (rtx i3, rtx i2, rtx i1, rtx i0, int *new_direct_jump_p,
offset = -1;
}
if (offset >= 0
&& (GET_MODE_PRECISION (GET_MODE (SET_DEST (temp)))
<= HOST_BITS_PER_DOUBLE_INT))
if (offset >= 0)
{
double_int m, o, i;
rtx inner = SET_SRC (PATTERN (i3));
rtx outer = SET_SRC (temp);
o = rtx_to_double_int (outer);
i = rtx_to_double_int (inner);
m = double_int::mask (width);
i &= m;
m = m.llshift (offset, HOST_BITS_PER_DOUBLE_INT);
i = i.llshift (offset, HOST_BITS_PER_DOUBLE_INT);
o = o.and_not (m) | i;
wide_int o
= wi::insert (std::make_pair (outer, GET_MODE (SET_DEST (temp))),
std::make_pair (inner, GET_MODE (dest)),
offset, width);
combine_merges++;
subst_insn = i3;
@@ -2699,7 +2692,7 @@ try_combine (rtx i3, rtx i2, rtx i1, rtx i0, int *new_direct_jump_p,
resulting insn the new pattern for I3. Then skip to where we
validate the pattern. Everything was set up above. */
SUBST (SET_SRC (temp),
immed_double_int_const (o, GET_MODE (SET_DEST (temp))));
immed_wide_int_const (o, GET_MODE (SET_DEST (temp))));
newpat = PATTERN (i2);
@@ -5139,7 +5132,7 @@ subst (rtx x, rtx from, rtx to, int in_dest, int in_cond, int unique_copy)
if (! x)
x = gen_rtx_CLOBBER (mode, const0_rtx);
}
else if (CONST_INT_P (new_rtx)
else if (CONST_SCALAR_INT_P (new_rtx)
&& GET_CODE (x) == ZERO_EXTEND)
{
x = simplify_unary_operation (ZERO_EXTEND, GET_MODE (x),


@@ -6132,8 +6132,10 @@ aapcs_vfp_sub_candidate (const_tree type, enum machine_mode *modep)
int count;
tree index = TYPE_DOMAIN (type);
/* Can't handle incomplete types. */
if (!COMPLETE_TYPE_P (type))
/* Can't handle incomplete types nor sizes that are not
fixed. */
if (!COMPLETE_TYPE_P (type)
|| TREE_CODE (TYPE_SIZE (type)) != INTEGER_CST)
return -1;
count = aapcs_vfp_sub_candidate (TREE_TYPE (type), modep);
@@ -6150,9 +6152,7 @@ aapcs_vfp_sub_candidate (const_tree type, enum machine_mode *modep)
- tree_to_uhwi (TYPE_MIN_VALUE (index)));
/* There must be no padding. */
if (!tree_fits_uhwi_p (TYPE_SIZE (type))
|| ((HOST_WIDE_INT) tree_to_uhwi (TYPE_SIZE (type))
!= count * GET_MODE_BITSIZE (*modep)))
if (wi::ne_p (TYPE_SIZE (type), count * GET_MODE_BITSIZE (*modep)))
return -1;
return count;
@@ -6164,8 +6164,10 @@ aapcs_vfp_sub_candidate (const_tree type, enum machine_mode *modep)
int sub_count;
tree field;
/* Can't handle incomplete types. */
if (!COMPLETE_TYPE_P (type))
/* Can't handle incomplete types nor sizes that are not
fixed. */
if (!COMPLETE_TYPE_P (type)
|| TREE_CODE (TYPE_SIZE (type)) != INTEGER_CST)
return -1;
for (field = TYPE_FIELDS (type); field; field = TREE_CHAIN (field))
@@ -6180,9 +6182,7 @@ aapcs_vfp_sub_candidate (const_tree type, enum machine_mode *modep)
}
/* There must be no padding. */
if (!tree_fits_uhwi_p (TYPE_SIZE (type))
|| ((HOST_WIDE_INT) tree_to_uhwi (TYPE_SIZE (type))
!= count * GET_MODE_BITSIZE (*modep)))
if (wi::ne_p (TYPE_SIZE (type), count * GET_MODE_BITSIZE (*modep)))
return -1;
return count;
@@ -6196,8 +6196,10 @@ aapcs_vfp_sub_candidate (const_tree type, enum machine_mode *modep)
int sub_count;
tree field;
/* Can't handle incomplete types. */
if (!COMPLETE_TYPE_P (type))
/* Can't handle incomplete types nor sizes that are not
fixed. */
if (!COMPLETE_TYPE_P (type)
|| TREE_CODE (TYPE_SIZE (type)) != INTEGER_CST)
return -1;
for (field = TYPE_FIELDS (type); field; field = TREE_CHAIN (field))
@@ -6212,9 +6214,7 @@ aapcs_vfp_sub_candidate (const_tree type, enum machine_mode *modep)
}
/* There must be no padding. */
if (!tree_fits_uhwi_p (TYPE_SIZE (type))
|| ((HOST_WIDE_INT) tree_to_uhwi (TYPE_SIZE (type))
!= count * GET_MODE_BITSIZE (*modep)))
if (wi::ne_p (TYPE_SIZE (type), count * GET_MODE_BITSIZE (*modep)))
return -1;
return count;
@@ -7557,8 +7557,8 @@ aarch64_float_const_representable_p (rtx x)
int point_pos = 2 * HOST_BITS_PER_WIDE_INT - 1;
int exponent;
unsigned HOST_WIDE_INT mantissa, mask;
HOST_WIDE_INT m1, m2;
REAL_VALUE_TYPE r, m;
bool fail;
if (!CONST_DOUBLE_P (x))
return false;
@@ -7582,16 +7582,16 @@ aarch64_float_const_representable_p (rtx x)
WARNING: If we ever have a representation using more than 2 * H_W_I - 1
bits for the mantissa, this can fail (low bits will be lost). */
real_ldexp (&m, &r, point_pos - exponent);
REAL_VALUE_TO_INT (&m1, &m2, m);
wide_int w = real_to_integer (&m, &fail, HOST_BITS_PER_WIDE_INT * 2);
/* If the low part of the mantissa has bits set we cannot represent
the value. */
if (m1 != 0)
if (w.elt (0) != 0)
return false;
/* We have rejected the lower HOST_WIDE_INT, so update our
understanding of how many bits lie in the mantissa and
look only at the high HOST_WIDE_INT. */
mantissa = m2;
mantissa = w.elt (1);
point_pos -= HOST_BITS_PER_WIDE_INT;
/* We can only represent values with a mantissa of the form 1.xxxx. */


@@ -65,6 +65,7 @@ along with GCC; see the file COPYING3. If not see
#include "tree-pass.h"
#include "context.h"
#include "pass_manager.h"
#include "wide-int.h"
/* Which cpu we're compiling for (A5, ARC600, ARC601, ARC700). */
static const char *arc_cpu_string = "";
@@ -391,7 +392,8 @@ static bool arc_return_in_memory (const_tree, const_tree);
static void arc_init_simd_builtins (void);
static bool arc_vector_mode_supported_p (enum machine_mode);
static bool arc_can_use_doloop_p (double_int, double_int, unsigned int, bool);
static bool arc_can_use_doloop_p (const widest_int &, const widest_int &,
unsigned int, bool);
static const char *arc_invalid_within_doloop (const_rtx);
static void output_short_suffix (FILE *file);
@@ -5700,7 +5702,7 @@ arc_pass_by_reference (cumulative_args_t ca_v ATTRIBUTE_UNUSED,
/* Implement TARGET_CAN_USE_DOLOOP_P. */
static bool
arc_can_use_doloop_p (double_int iterations, double_int,
arc_can_use_doloop_p (const widest_int &iterations, const widest_int &,
unsigned int loop_depth, bool entered_at_top)
{
if (loop_depth > 1)
@@ -5708,9 +5710,8 @@ arc_can_use_doloop_p (double_int iterations, double_int,
/* Setting up the loop with two sr instructions costs 6 cycles. */
if (TARGET_ARC700
&& !entered_at_top
&& iterations.high == 0
&& iterations.low > 0
&& iterations.low <= (flag_pic ? 6 : 3))
&& wi::gtu_p (iterations, 0)
&& wi::leu_p (iterations, flag_pic ? 6 : 3))
return false;
return true;
}


@@ -5121,8 +5121,10 @@ aapcs_vfp_sub_candidate (const_tree type, enum machine_mode *modep)
int count;
tree index = TYPE_DOMAIN (type);
/* Can't handle incomplete types. */
if (!COMPLETE_TYPE_P (type))
/* Can't handle incomplete types nor sizes that are not
fixed. */
if (!COMPLETE_TYPE_P (type)
|| TREE_CODE (TYPE_SIZE (type)) != INTEGER_CST)
return -1;
count = aapcs_vfp_sub_candidate (TREE_TYPE (type), modep);
@@ -5139,9 +5141,7 @@ aapcs_vfp_sub_candidate (const_tree type, enum machine_mode *modep)
- tree_to_uhwi (TYPE_MIN_VALUE (index)));
/* There must be no padding. */
if (!tree_fits_uhwi_p (TYPE_SIZE (type))
|| ((HOST_WIDE_INT) tree_to_uhwi (TYPE_SIZE (type))
!= count * GET_MODE_BITSIZE (*modep)))
if (wi::ne_p (TYPE_SIZE (type), count * GET_MODE_BITSIZE (*modep)))
return -1;
return count;
@@ -5153,8 +5153,10 @@ aapcs_vfp_sub_candidate (const_tree type, enum machine_mode *modep)
int sub_count;
tree field;
/* Can't handle incomplete types. */
if (!COMPLETE_TYPE_P (type))
/* Can't handle incomplete types nor sizes that are not
fixed. */
if (!COMPLETE_TYPE_P (type)
|| TREE_CODE (TYPE_SIZE (type)) != INTEGER_CST)
return -1;
for (field = TYPE_FIELDS (type); field; field = DECL_CHAIN (field))
@@ -5169,9 +5171,7 @@ aapcs_vfp_sub_candidate (const_tree type, enum machine_mode *modep)
}
/* There must be no padding. */
if (!tree_fits_uhwi_p (TYPE_SIZE (type))
|| ((HOST_WIDE_INT) tree_to_uhwi (TYPE_SIZE (type))
!= count * GET_MODE_BITSIZE (*modep)))
if (wi::ne_p (TYPE_SIZE (type), count * GET_MODE_BITSIZE (*modep)))
return -1;
return count;
@@ -5185,8 +5185,10 @@ aapcs_vfp_sub_candidate (const_tree type, enum machine_mode *modep)
int sub_count;
tree field;
/* Can't handle incomplete types. */
if (!COMPLETE_TYPE_P (type))
/* Can't handle incomplete types nor sizes that are not
fixed. */
if (!COMPLETE_TYPE_P (type)
|| TREE_CODE (TYPE_SIZE (type)) != INTEGER_CST)
return -1;
for (field = TYPE_FIELDS (type); field; field = DECL_CHAIN (field))
@@ -5201,9 +5203,7 @@ aapcs_vfp_sub_candidate (const_tree type, enum machine_mode *modep)
}
/* There must be no padding. */
if (!tree_fits_uhwi_p (TYPE_SIZE (type))
|| ((HOST_WIDE_INT) tree_to_uhwi (TYPE_SIZE (type))
!= count * GET_MODE_BITSIZE (*modep)))
if (wi::ne_p (TYPE_SIZE (type), count * GET_MODE_BITSIZE (*modep)))
return -1;
return count;
@@ -11920,8 +11920,8 @@ vfp3_const_double_index (rtx x)
int sign, exponent;
unsigned HOST_WIDE_INT mantissa, mant_hi;
unsigned HOST_WIDE_INT mask;
HOST_WIDE_INT m1, m2;
int point_pos = 2 * HOST_BITS_PER_WIDE_INT - 1;
bool fail;
if (!TARGET_VFP3 || !CONST_DOUBLE_P (x))
return -1;
@@ -11941,9 +11941,9 @@ vfp3_const_double_index (rtx x)
WARNING: If there's ever a VFP version which uses more than 2 * H_W_I - 1
bits for the mantissa, this may fail (low bits would be lost). */
real_ldexp (&m, &r, point_pos - exponent);
REAL_VALUE_TO_INT (&m1, &m2, m);
mantissa = m1;
mant_hi = m2;
wide_int w = real_to_integer (&m, &fail, HOST_BITS_PER_WIDE_INT * 2);
mantissa = w.elt (0);
mant_hi = w.elt (1);
/* If there are bits set in the low part of the mantissa, we can't
represent this value. */


@@ -7566,6 +7566,8 @@ avr_out_round (rtx insn ATTRIBUTE_UNUSED, rtx *xop, int *plen)
// The smallest fractional bit not cleared by the rounding is 2^(-RP).
int fbit = (int) GET_MODE_FBIT (mode);
double_int i_add = double_int_zero.set_bit (fbit-1 - INTVAL (xop[2]));
wide_int wi_add = wi::set_bit_in_zero (fbit-1 - INTVAL (xop[2]),
GET_MODE_PRECISION (imode));
// Lengths of PLUS and AND parts.
int len_add = 0, *plen_add = plen ? &len_add : NULL;
int len_and = 0, *plen_and = plen ? &len_and : NULL;
@@ -7595,7 +7597,7 @@ avr_out_round (rtx insn ATTRIBUTE_UNUSED, rtx *xop, int *plen)
// Rounding point ^^^^^^^
// Added above ^^^^^^^^^
rtx xreg = simplify_gen_subreg (imode, xop[0], mode, 0);
rtx xmask = immed_double_int_const (-i_add - i_add, imode);
rtx xmask = immed_wide_int_const (-wi_add - wi_add, imode);
xpattern = gen_rtx_SET (VOIDmode, xreg, gen_rtx_AND (imode, xreg, xmask));
@@ -12246,7 +12248,7 @@ avr_fold_builtin (tree fndecl, int n_args ATTRIBUTE_UNUSED, tree *arg,
break;
}
tmap = double_int_to_tree (map_type, tree_to_double_int (arg[0]));
tmap = wide_int_to_tree (map_type, arg[0]);
map = TREE_INT_CST_LOW (tmap);
if (TREE_CODE (tval) != INTEGER_CST
@@ -12351,8 +12353,7 @@ avr_fold_builtin (tree fndecl, int n_args ATTRIBUTE_UNUSED, tree *arg,
/* Use map o G^-1 instead of original map to undo the effect of G. */
tmap = double_int_to_tree (map_type,
double_int::from_uhwi (best_g.map));
tmap = wide_int_to_tree (map_type, best_g.map);
return build_call_expr (fndecl, 3, tmap, tbits, tval);
} /* AVR_BUILTIN_INSERT_BITS */


@@ -3288,8 +3288,8 @@ bfin_local_alignment (tree type, unsigned align)
memcpy can use 32 bit loads/stores. */
if (TYPE_SIZE (type)
&& TREE_CODE (TYPE_SIZE (type)) == INTEGER_CST
&& (TREE_INT_CST_LOW (TYPE_SIZE (type)) > 8
|| TREE_INT_CST_HIGH (TYPE_SIZE (type))) && align < 32)
&& wi::gtu_p (TYPE_SIZE (type), 8)
&& align < 32)
return 32;
return align;
}
@@ -3371,15 +3371,14 @@ find_prev_insn_start (rtx insn)
/* Implement TARGET_CAN_USE_DOLOOP_P. */
static bool
bfin_can_use_doloop_p (double_int, double_int iterations_max,
bfin_can_use_doloop_p (const widest_int &, const widest_int &iterations_max,
unsigned int, bool)
{
/* Due to limitations in the hardware (an initial loop count of 0
does not loop 2^32 times) we must avoid to generate a hardware
loops when we cannot rule out this case. */
if (!flag_unsafe_loop_optimizations
&& (iterations_max.high != 0
|| iterations_max.low >= 0xFFFFFFFF))
&& wi::geu_p (iterations_max, 0xFFFFFFFF))
return false;
return true;
}


@@ -1299,22 +1299,17 @@ darwin_mergeable_constant_section (tree exp,
{
tree size = TYPE_SIZE_UNIT (TREE_TYPE (exp));
if (TREE_CODE (size) == INTEGER_CST
&& TREE_INT_CST_LOW (size) == 4
&& TREE_INT_CST_HIGH (size) == 0)
return darwin_sections[literal4_section];
else if (TREE_CODE (size) == INTEGER_CST
&& TREE_INT_CST_LOW (size) == 8
&& TREE_INT_CST_HIGH (size) == 0)
return darwin_sections[literal8_section];
else if (HAVE_GAS_LITERAL16
&& TARGET_64BIT
&& TREE_CODE (size) == INTEGER_CST
&& TREE_INT_CST_LOW (size) == 16
&& TREE_INT_CST_HIGH (size) == 0)
return darwin_sections[literal16_section];
else
return readonly_data_section;
if (TREE_CODE (size) == INTEGER_CST)
{
if (wi::eq_p (size, 4))
return darwin_sections[literal4_section];
else if (wi::eq_p (size, 8))
return darwin_sections[literal8_section];
else if (HAVE_GAS_LITERAL16
&& TARGET_64BIT
&& wi::eq_p (size, 16))
return darwin_sections[literal16_section];
}
}
return readonly_data_section;
@@ -1741,16 +1736,19 @@ machopic_select_rtx_section (enum machine_mode mode, rtx x,
{
if (GET_MODE_SIZE (mode) == 8
&& (GET_CODE (x) == CONST_INT
|| GET_CODE (x) == CONST_WIDE_INT
|| GET_CODE (x) == CONST_DOUBLE))
return darwin_sections[literal8_section];
else if (GET_MODE_SIZE (mode) == 4
&& (GET_CODE (x) == CONST_INT
|| GET_CODE (x) == CONST_WIDE_INT
|| GET_CODE (x) == CONST_DOUBLE))
return darwin_sections[literal4_section];
else if (HAVE_GAS_LITERAL16
&& TARGET_64BIT
&& GET_MODE_SIZE (mode) == 16
&& (GET_CODE (x) == CONST_INT
|| GET_CODE (x) == CONST_WIDE_INT
|| GET_CODE (x) == CONST_DOUBLE
|| GET_CODE (x) == CONST_VECTOR))
return darwin_sections[literal16_section];


@@ -78,6 +78,7 @@ along with GCC; see the file COPYING3. If not see
#include "diagnostic.h"
#include "dumpfile.h"
#include "tree-pass.h"
#include "wide-int.h"
#include "context.h"
#include "pass_manager.h"
#include "target-globals.h"
@@ -26582,14 +26583,12 @@ ix86_data_alignment (tree type, int align, bool opt)
&& TYPE_SIZE (type)
&& TREE_CODE (TYPE_SIZE (type)) == INTEGER_CST)
{
if ((TREE_INT_CST_LOW (TYPE_SIZE (type)) >= (unsigned) max_align_compat
|| TREE_INT_CST_HIGH (TYPE_SIZE (type)))
if (wi::geu_p (TYPE_SIZE (type), max_align_compat)
&& align < max_align_compat)
align = max_align_compat;
if ((TREE_INT_CST_LOW (TYPE_SIZE (type)) >= (unsigned) max_align
|| TREE_INT_CST_HIGH (TYPE_SIZE (type)))
&& align < max_align)
align = max_align;
if (wi::geu_p (TYPE_SIZE (type), max_align)
&& align < max_align)
align = max_align;
}
/* x86-64 ABI requires arrays greater than 16 bytes to be aligned
@@ -26599,8 +26598,8 @@ ix86_data_alignment (tree type, int align, bool opt)
if ((opt ? AGGREGATE_TYPE_P (type) : TREE_CODE (type) == ARRAY_TYPE)
&& TYPE_SIZE (type)
&& TREE_CODE (TYPE_SIZE (type)) == INTEGER_CST
&& (TREE_INT_CST_LOW (TYPE_SIZE (type)) >= 128
|| TREE_INT_CST_HIGH (TYPE_SIZE (type))) && align < 128)
&& wi::geu_p (TYPE_SIZE (type), 128)
&& align < 128)
return 128;
}
@@ -26709,13 +26708,13 @@ ix86_local_alignment (tree exp, enum machine_mode mode,
&& TARGET_SSE)
{
if (AGGREGATE_TYPE_P (type)
&& (va_list_type_node == NULL_TREE
|| (TYPE_MAIN_VARIANT (type)
!= TYPE_MAIN_VARIANT (va_list_type_node)))
&& TYPE_SIZE (type)
&& TREE_CODE (TYPE_SIZE (type)) == INTEGER_CST
&& (TREE_INT_CST_LOW (TYPE_SIZE (type)) >= 16
|| TREE_INT_CST_HIGH (TYPE_SIZE (type))) && align < 128)
&& (va_list_type_node == NULL_TREE
|| (TYPE_MAIN_VARIANT (type)
!= TYPE_MAIN_VARIANT (va_list_type_node)))
&& TYPE_SIZE (type)
&& TREE_CODE (TYPE_SIZE (type)) == INTEGER_CST
&& wi::geu_p (TYPE_SIZE (type), 16)
&& align < 128)
return 128;
}
if (TREE_CODE (type) == ARRAY_TYPE)
@@ -41375,7 +41374,7 @@ void ix86_emit_swsqrtsf (rtx res, rtx a, enum machine_mode mode,
e2 = gen_reg_rtx (mode);
e3 = gen_reg_rtx (mode);
real_from_integer (&r, VOIDmode, -3, -1, 0);
real_from_integer (&r, VOIDmode, -3, SIGNED);
mthree = CONST_DOUBLE_FROM_REAL_VALUE (r, SFmode);
real_arithmetic (&r, NEGATE_EXPR, &dconsthalf, NULL);


@@ -1085,7 +1085,7 @@ msp430_attr (tree * node,
break;
case INTEGER_CST:
if (TREE_INT_CST_LOW (value) > 63)
if (wi::gtu_p (value, 63))
/* Allow the attribute to be added - the linker script
being used may still recognise this value. */
warning (OPT_Wattributes,


@@ -3148,8 +3148,8 @@ nds32_insert_attributes (tree decl, tree *attributes)
id = TREE_VALUE (id_list);
/* Issue error if it is not a valid integer value. */
if (TREE_CODE (id) != INTEGER_CST
|| TREE_INT_CST_LOW (id) < lower_bound
|| TREE_INT_CST_LOW (id) > upper_bound)
|| wi::ltu_p (id, lower_bound)
|| wi::gtu_p (id, upper_bound))
error ("invalid id value for interrupt/exception attribute");
/* Advance to next id. */
@@ -3176,8 +3176,8 @@ nds32_insert_attributes (tree decl, tree *attributes)
/* 3. Check valid integer value for reset. */
if (TREE_CODE (id) != INTEGER_CST
|| TREE_INT_CST_LOW (id) < lower_bound
|| TREE_INT_CST_LOW (id) > upper_bound)
|| wi::ltu_p (id, lower_bound)
|| wi::gtu_p (id, upper_bound))
error ("invalid id value for reset attribute");
/* 4. Check valid function for nmi/warm. */


@@ -19,7 +19,7 @@
;; Return 1 for anything except PARALLEL.
(define_predicate "any_operand"
(match_code "const_int,const_double,const,symbol_ref,label_ref,subreg,reg,mem"))
(match_code "const_int,const_double,const_wide_int,const,symbol_ref,label_ref,subreg,reg,mem"))
;; Return 1 for any PARALLEL.
(define_predicate "any_parallel_operand"
@@ -601,7 +601,7 @@
;; Return 1 if operand is constant zero (scalars and vectors).
(define_predicate "zero_constant"
(and (match_code "const_int,const_double,const_vector")
(and (match_code "const_int,const_double,const_wide_int,const_vector")
(match_test "op == CONST0_RTX (mode)")))
;; Return 1 if operand is 0.0.
@@ -796,7 +796,7 @@
;; Return 1 if op is a constant that is not a logical operand, but could
;; be split into one.
(define_predicate "non_logical_cint_operand"
(and (match_code "const_int,const_double")
(and (match_code "const_int,const_wide_int")
(and (not (match_operand 0 "logical_operand"))
(match_operand 0 "reg_or_logical_cint_operand"))))
@@ -1073,7 +1073,7 @@
;; Return 1 if this operand is a valid input for a move insn.
(define_predicate "input_operand"
(match_code "symbol_ref,const,reg,subreg,mem,
const_double,const_vector,const_int")
const_double,const_wide_int,const_vector,const_int")
{
/* Memory is always valid. */
if (memory_operand (op, mode))
@@ -1086,8 +1086,7 @@
/* Allow any integer constant. */
if (GET_MODE_CLASS (mode) == MODE_INT
&& (GET_CODE (op) == CONST_INT
|| GET_CODE (op) == CONST_DOUBLE))
&& CONST_SCALAR_INT_P (op))
return 1;
/* Allow easy vector constants. */
@@ -1126,7 +1125,7 @@
;; Return 1 if this operand is a valid input for a vsx_splat insn.
(define_predicate "splat_input_operand"
(match_code "symbol_ref,const,reg,subreg,mem,
const_double,const_vector,const_int")
const_double,const_wide_int,const_vector,const_int")
{
if (MEM_P (op))
{


@@ -28,6 +28,7 @@
#include "tree.h"
#include "stor-layout.h"
#include "stringpool.h"
#include "wide-int.h"
#include "c-family/c-common.h"
#include "c-family/c-pragma.h"
#include "diagnostic-core.h"
@@ -4304,8 +4305,7 @@ altivec_resolve_overloaded_builtin (location_t loc, tree fndecl,
mode = TYPE_MODE (arg1_type);
if ((mode == V2DFmode || mode == V2DImode) && VECTOR_MEM_VSX_P (mode)
&& TREE_CODE (arg2) == INTEGER_CST
&& TREE_INT_CST_HIGH (arg2) == 0
&& (TREE_INT_CST_LOW (arg2) == 0 || TREE_INT_CST_LOW (arg2) == 1))
&& wi::ltu_p (arg2, 2))
{
tree call = NULL_TREE;
@@ -4319,8 +4319,7 @@ altivec_resolve_overloaded_builtin (location_t loc, tree fndecl,
}
else if (mode == V1TImode && VECTOR_MEM_VSX_P (mode)
&& TREE_CODE (arg2) == INTEGER_CST
&& TREE_INT_CST_HIGH (arg2) == 0
&& TREE_INT_CST_LOW (arg2) == 0)
&& wi::eq_p (arg2, 0))
{
tree call = rs6000_builtin_decls[VSX_BUILTIN_VEC_EXT_V1TI];
return build_call_expr (call, 2, arg1, arg2);
@@ -4409,8 +4408,7 @@ altivec_resolve_overloaded_builtin (location_t loc, tree fndecl,
mode = TYPE_MODE (arg1_type);
if ((mode == V2DFmode || mode == V2DImode) && VECTOR_UNIT_VSX_P (mode)
&& TREE_CODE (arg2) == INTEGER_CST
&& TREE_INT_CST_HIGH (arg2) == 0
&& (TREE_INT_CST_LOW (arg2) == 0 || TREE_INT_CST_LOW (arg2) == 1))
&& wi::ltu_p (arg2, 2))
{
tree call = NULL_TREE;
@@ -4426,8 +4424,7 @@ altivec_resolve_overloaded_builtin (location_t loc, tree fndecl,
}
else if (mode == V1TImode && VECTOR_UNIT_VSX_P (mode)
&& TREE_CODE (arg2) == INTEGER_CST
&& TREE_INT_CST_HIGH (arg2) == 0
&& TREE_INT_CST_LOW (arg2) == 0)
&& wi::eq_p (arg2, 0))
{
tree call = rs6000_builtin_decls[VSX_BUILTIN_VEC_SET_V1TI];


@@ -4969,6 +4969,15 @@ num_insns_constant (rtx op, enum machine_mode mode)
else
return num_insns_constant_wide (INTVAL (op));
case CONST_WIDE_INT:
{
int i;
int ins = CONST_WIDE_INT_NUNITS (op) - 1;
for (i = 0; i < CONST_WIDE_INT_NUNITS (op); i++)
ins += num_insns_constant_wide (CONST_WIDE_INT_ELT (op, i));
return ins;
}
case CONST_DOUBLE:
if (mode == SFmode || mode == SDmode)
{
@@ -5143,8 +5152,6 @@ easy_altivec_constant (rtx op, enum machine_mode mode)
else if (mode == V2DImode)
{
/* In case the compiler is built 32-bit, CONST_DOUBLE constants are not
easy. */
if (GET_CODE (CONST_VECTOR_ELT (op, 0)) != CONST_INT
|| GET_CODE (CONST_VECTOR_ELT (op, 1)) != CONST_INT)
return false;
@@ -5309,9 +5316,7 @@ paired_expand_vector_init (rtx target, rtx vals)
for (i = 0; i < n_elts; ++i)
{
x = XVECEXP (vals, 0, i);
if (!(CONST_INT_P (x)
|| GET_CODE (x) == CONST_DOUBLE
|| GET_CODE (x) == CONST_FIXED))
if (!CONSTANT_P (x))
++n_var;
}
if (n_var == 0)
@ -5463,9 +5468,7 @@ rs6000_expand_vector_init (rtx target, rtx vals)
for (i = 0; i < n_elts; ++i)
{
x = XVECEXP (vals, 0, i);
if (!(CONST_INT_P (x)
|| GET_CODE (x) == CONST_DOUBLE
|| GET_CODE (x) == CONST_FIXED))
if (!CONSTANT_P (x))
++n_var, one_var = i;
else if (x != CONST0_RTX (inner_mode))
all_const_zero = false;
@ -6703,6 +6706,7 @@ rs6000_legitimize_address (rtx x, rtx oldx ATTRIBUTE_UNUSED,
&& TARGET_NO_TOC
&& ! flag_pic
&& GET_CODE (x) != CONST_INT
&& GET_CODE (x) != CONST_WIDE_INT
&& GET_CODE (x) != CONST_DOUBLE
&& CONSTANT_P (x)
&& GET_MODE_NUNITS (mode) == 1
@ -8167,21 +8171,12 @@ rs6000_emit_move (rtx dest, rtx source, enum machine_mode mode)
}
/* Sanity checks. Check that we get CONST_DOUBLE only when we should. */
if (GET_CODE (operands[1]) == CONST_DOUBLE
&& ! FLOAT_MODE_P (mode)
if (CONST_WIDE_INT_P (operands[1])
&& GET_MODE_BITSIZE (mode) <= HOST_BITS_PER_WIDE_INT)
{
/* FIXME. This should never happen. */
/* Since it seems that it does, do the safe thing and convert
to a CONST_INT. */
operands[1] = gen_int_mode (CONST_DOUBLE_LOW (operands[1]), mode);
/* This should be fixed with the introduction of CONST_WIDE_INT. */
gcc_unreachable ();
}
gcc_assert (GET_CODE (operands[1]) != CONST_DOUBLE
|| FLOAT_MODE_P (mode)
|| ((CONST_DOUBLE_HIGH (operands[1]) != 0
|| CONST_DOUBLE_LOW (operands[1]) < 0)
&& (CONST_DOUBLE_HIGH (operands[1]) != -1
|| CONST_DOUBLE_LOW (operands[1]) >= 0)));
/* Check if GCC is setting up a block move that will end up using FP
registers as temporaries. We must make sure this is acceptable. */
@ -8697,8 +8692,10 @@ rs6000_aggregate_candidate (const_tree type, enum machine_mode *modep)
int count;
tree index = TYPE_DOMAIN (type);
/* Can't handle incomplete types. */
if (!COMPLETE_TYPE_P (type))
/* Can't handle incomplete types nor sizes that are not
fixed. */
if (!COMPLETE_TYPE_P (type)
|| TREE_CODE (TYPE_SIZE (type)) != INTEGER_CST)
return -1;
count = rs6000_aggregate_candidate (TREE_TYPE (type), modep);
@ -8715,9 +8712,7 @@ rs6000_aggregate_candidate (const_tree type, enum machine_mode *modep)
- tree_to_uhwi (TYPE_MIN_VALUE (index)));
/* There must be no padding. */
if (!tree_fits_uhwi_p (TYPE_SIZE (type))
|| ((HOST_WIDE_INT) tree_to_uhwi (TYPE_SIZE (type))
!= count * GET_MODE_BITSIZE (*modep)))
if (wi::ne_p (TYPE_SIZE (type), count * GET_MODE_BITSIZE (*modep)))
return -1;
return count;
@ -8729,8 +8724,10 @@ rs6000_aggregate_candidate (const_tree type, enum machine_mode *modep)
int sub_count;
tree field;
/* Can't handle incomplete types. */
if (!COMPLETE_TYPE_P (type))
/* Can't handle incomplete types nor sizes that are not
fixed. */
if (!COMPLETE_TYPE_P (type)
|| TREE_CODE (TYPE_SIZE (type)) != INTEGER_CST)
return -1;
for (field = TYPE_FIELDS (type); field; field = TREE_CHAIN (field))
@ -8745,9 +8742,7 @@ rs6000_aggregate_candidate (const_tree type, enum machine_mode *modep)
}
/* There must be no padding. */
if (!tree_fits_uhwi_p (TYPE_SIZE (type))
|| ((HOST_WIDE_INT) tree_to_uhwi (TYPE_SIZE (type))
!= count * GET_MODE_BITSIZE (*modep)))
if (wi::ne_p (TYPE_SIZE (type), count * GET_MODE_BITSIZE (*modep)))
return -1;
return count;
@ -8761,9 +8756,10 @@ rs6000_aggregate_candidate (const_tree type, enum machine_mode *modep)
int sub_count;
tree field;
/* Can't handle incomplete types. */
if (!COMPLETE_TYPE_P (type))
return -1;
/* Can't handle incomplete types nor sizes that are not
fixed. */
if (!COMPLETE_TYPE_P (type)
|| TREE_CODE (TYPE_SIZE (type)) != INTEGER_CST)
return -1;
for (field = TYPE_FIELDS (type); field; field = TREE_CHAIN (field))
{
@ -8777,9 +8773,7 @@ rs6000_aggregate_candidate (const_tree type, enum machine_mode *modep)
}
/* There must be no padding. */
if (!tree_fits_uhwi_p (TYPE_SIZE (type))
|| ((HOST_WIDE_INT) tree_to_uhwi (TYPE_SIZE (type))
!= count * GET_MODE_BITSIZE (*modep)))
if (wi::ne_p (TYPE_SIZE (type), count * GET_MODE_BITSIZE (*modep)))
return -1;
return count;
@ -12474,16 +12468,14 @@ rs6000_expand_ternop_builtin (enum insn_code icode, tree exp, rtx target)
/* Check whether the 2nd and 3rd arguments are integer constants and in
range and prepare arguments. */
STRIP_NOPS (arg1);
if (TREE_CODE (arg1) != INTEGER_CST
|| !IN_RANGE (TREE_INT_CST_LOW (arg1), 0, 1))
if (TREE_CODE (arg1) != INTEGER_CST || wi::geu_p (arg1, 2))
{
error ("argument 2 must be 0 or 1");
return const0_rtx;
}
STRIP_NOPS (arg2);
if (TREE_CODE (arg2) != INTEGER_CST
|| !IN_RANGE (TREE_INT_CST_LOW (arg2), 0, 15))
if (TREE_CODE (arg2) != INTEGER_CST || wi::geu_p (arg2, 16))
{
error ("argument 3 must be in the range 0..15");
return const0_rtx;
@ -17456,6 +17448,7 @@ rs6000_output_move_128bit (rtx operands[])
/* Constants. */
else if (dest_regno >= 0
&& (GET_CODE (src) == CONST_INT
|| GET_CODE (src) == CONST_WIDE_INT
|| GET_CODE (src) == CONST_DOUBLE
|| GET_CODE (src) == CONST_VECTOR))
{
@ -18495,8 +18488,7 @@ rs6000_assemble_integer (rtx x, unsigned int size, int aligned_p)
if (TARGET_RELOCATABLE
&& in_section != toc_section
&& !recurse
&& GET_CODE (x) != CONST_INT
&& GET_CODE (x) != CONST_DOUBLE
&& !CONST_SCALAR_INT_P (x)
&& CONSTANT_P (x))
{
char buf[256];
@ -25243,6 +25235,15 @@ rs6000_hash_constant (rtx k)
case LABEL_REF:
return result * 1231 + (unsigned) INSN_UID (XEXP (k, 0));
case CONST_WIDE_INT:
{
int i;
flen = CONST_WIDE_INT_NUNITS (k);
for (i = 0; i < flen; i++)
result = result * 613 + CONST_WIDE_INT_ELT (k, i);
return result;
}
case CONST_DOUBLE:
if (mode != VOIDmode)
return real_hash (CONST_DOUBLE_REAL_VALUE (k)) * result;
@ -25447,7 +25448,7 @@ output_toc (FILE *file, rtx x, int labelno, enum machine_mode mode)
/* If we're going to put a double constant in the TOC, make sure it's
aligned properly when strict alignment is on. */
if (GET_CODE (x) == CONST_DOUBLE
if ((CONST_DOUBLE_P (x) || CONST_WIDE_INT_P (x))
&& STRICT_ALIGNMENT
&& GET_MODE_BITSIZE (mode) >= 64
&& ! (TARGET_NO_FP_IN_TOC && ! TARGET_MINIMAL_TOC)) {
@ -29453,6 +29454,7 @@ rs6000_rtx_costs (rtx x, int code, int outer_code, int opno ATTRIBUTE_UNUSED,
/* FALLTHRU */
case CONST_DOUBLE:
case CONST_WIDE_INT:
case CONST:
case HIGH:
case SYMBOL_REF:
@ -30092,7 +30094,7 @@ rs6000_emit_swrsqrt (rtx dst, rtx src)
gcc_assert (code != CODE_FOR_nothing);
/* Load up the constant 1.5 either as a scalar, or as a vector. */
real_from_integer (&dconst3_2, VOIDmode, 3, 0, 0);
real_from_integer (&dconst3_2, VOIDmode, 3, SIGNED);
SET_REAL_EXP (&dconst3_2, REAL_EXP (&dconst3_2) - 1);
halfthree = rs6000_load_constant_and_splat (mode, dconst3_2);

View file

@ -2689,3 +2689,4 @@ enum rs6000_builtin_type_index
extern GTY(()) tree rs6000_builtin_types[RS6000_BTI_MAX];
extern GTY(()) tree rs6000_builtin_decls[RS6000_BUILTIN_COUNT];
#define TARGET_SUPPORTS_WIDE_INT 1

View file

@ -10336,7 +10336,7 @@
(define_split
[(set (match_operand:DI 0 "gpc_reg_operand" "")
(match_operand:DI 1 "const_double_operand" ""))]
(match_operand:DI 1 "const_scalar_int_operand" ""))]
"TARGET_POWERPC64 && num_insns_constant (operands[1], DImode) > 1"
[(set (match_dup 0) (match_dup 2))
(set (match_dup 0) (plus:DI (match_dup 0) (match_dup 3)))]
@ -10402,7 +10402,7 @@
(define_split
[(set (match_operand:TI2 0 "int_reg_operand" "")
(match_operand:TI2 1 "const_double_operand" ""))]
(match_operand:TI2 1 "const_scalar_int_operand" ""))]
"TARGET_POWERPC64
&& (VECTOR_MEM_NONE_P (<MODE>mode)
|| (reload_completed && INT_REGNO_P (REGNO (operands[0]))))"
@ -10414,12 +10414,12 @@
<MODE>mode);
operands[3] = operand_subword_force (operands[0], WORDS_BIG_ENDIAN != 0,
<MODE>mode);
if (GET_CODE (operands[1]) == CONST_DOUBLE)
if (CONST_WIDE_INT_P (operands[1]))
{
operands[4] = GEN_INT (CONST_DOUBLE_HIGH (operands[1]));
operands[5] = GEN_INT (CONST_DOUBLE_LOW (operands[1]));
operands[4] = GEN_INT (CONST_WIDE_INT_ELT (operands[1], 1));
operands[5] = GEN_INT (CONST_WIDE_INT_ELT (operands[1], 0));
}
else if (GET_CODE (operands[1]) == CONST_INT)
else if (CONST_INT_P (operands[1]))
{
operands[4] = GEN_INT (- (INTVAL (operands[1]) < 0));
operands[5] = operands[1];

View file

@ -474,9 +474,7 @@ s390_handle_hotpatch_attribute (tree *node, tree name, tree args,
if (TREE_CODE (expr) != INTEGER_CST
|| !INTEGRAL_TYPE_P (TREE_TYPE (expr))
|| TREE_INT_CST_HIGH (expr) != 0
|| TREE_INT_CST_LOW (expr) > (unsigned int)
s390_hotpatch_trampoline_halfwords_max)
|| wi::gtu_p (expr, s390_hotpatch_trampoline_halfwords_max))
{
error ("requested %qE attribute is not a non-negative integer"
" constant or too large (max. %d)", name,

View file

@ -86,7 +86,7 @@ solaris_pragma_align (cpp_reader *pfile ATTRIBUTE_UNUSED)
{
tree t, x;
enum cpp_ttype ttype;
HOST_WIDE_INT low;
unsigned HOST_WIDE_INT low;
if (pragma_lex (&x) != CPP_NUMBER
|| pragma_lex (&t) != CPP_OPEN_PAREN)
@ -96,7 +96,7 @@ solaris_pragma_align (cpp_reader *pfile ATTRIBUTE_UNUSED)
}
low = TREE_INT_CST_LOW (x);
if (TREE_INT_CST_HIGH (x) != 0
if (!tree_fits_uhwi_p (x)
|| (low != 1 && low != 2 && low != 4 && low != 8 && low != 16
&& low != 32 && low != 64 && low != 128))
{

View file

@ -69,6 +69,7 @@ along with GCC; see the file COPYING3. If not see
#include "opts.h"
#include "tree-pass.h"
#include "context.h"
#include "wide-int.h"
/* Processor costs */
@ -10930,30 +10931,30 @@ sparc_fold_builtin (tree fndecl, int n_args ATTRIBUTE_UNUSED,
&& TREE_CODE (arg2) == INTEGER_CST)
{
bool overflow = false;
double_int result = TREE_INT_CST (arg2);
double_int tmp;
wide_int result = arg2;
wide_int tmp;
unsigned i;
for (i = 0; i < VECTOR_CST_NELTS (arg0); ++i)
{
double_int e0 = TREE_INT_CST (VECTOR_CST_ELT (arg0, i));
double_int e1 = TREE_INT_CST (VECTOR_CST_ELT (arg1, i));
tree e0 = VECTOR_CST_ELT (arg0, i);
tree e1 = VECTOR_CST_ELT (arg1, i);
bool neg1_ovf, neg2_ovf, add1_ovf, add2_ovf;
tmp = e1.neg_with_overflow (&neg1_ovf);
tmp = e0.add_with_sign (tmp, false, &add1_ovf);
if (tmp.is_negative ())
tmp = tmp.neg_with_overflow (&neg2_ovf);
tmp = wi::neg (e1, &neg1_ovf);
tmp = wi::add (e0, tmp, SIGNED, &add1_ovf);
if (wi::neg_p (tmp))
tmp = wi::neg (tmp, &neg2_ovf);
else
neg2_ovf = false;
result = result.add_with_sign (tmp, false, &add2_ovf);
result = wi::add (result, tmp, SIGNED, &add2_ovf);
overflow |= neg1_ovf | neg2_ovf | add1_ovf | add2_ovf;
}
gcc_assert (!overflow);
return build_int_cst_wide (rtype, result.low, result.high);
return wide_int_to_tree (rtype, result);
}
default:

View file

@ -45,6 +45,7 @@ along with GCC; see the file COPYING3. If not see
#include "tm_p.h"
#include "target.h"
#include "target-def.h"
#include "wide-int.h"
static void vax_option_override (void);
static bool vax_legitimate_address_p (enum machine_mode, rtx, bool);
@ -645,7 +646,7 @@ vax_float_literal (rtx c)
{
int x = 1 << i;
bool ok;
REAL_VALUE_FROM_INT (s, x, 0, mode);
real_from_integer (&s, mode, x, SIGNED);
if (REAL_VALUES_EQUAL (r, s))
return true;

View file

@ -58,6 +58,9 @@ typedef const struct rtx_def *const_rtx;
struct rtvec_def;
typedef struct rtvec_def *rtvec;
typedef const struct rtvec_def *const_rtvec;
struct hwivec_def;
typedef struct hwivec_def *hwivec;
typedef const struct hwivec_def *const_hwivec;
union tree_node;
typedef union tree_node *tree;
typedef const union tree_node *const_tree;

View file

@ -41,6 +41,7 @@ along with GCC; see the file COPYING3. If not see
#include "c-family/c-objc.h"
#include "timevar.h"
#include "cgraph.h"
#include "wide-int.h"
/* The various kinds of conversion. */
@ -6576,8 +6577,7 @@ type_passed_as (tree type)
else if (targetm.calls.promote_prototypes (type)
&& INTEGRAL_TYPE_P (type)
&& COMPLETE_TYPE_P (type)
&& INT_CST_LT_UNSIGNED (TYPE_SIZE (type),
TYPE_SIZE (integer_type_node)))
&& tree_int_cst_lt (TYPE_SIZE (type), TYPE_SIZE (integer_type_node)))
type = integer_type_node;
return type;
@ -6617,8 +6617,7 @@ convert_for_arg_passing (tree type, tree val, tsubst_flags_t complain)
else if (targetm.calls.promote_prototypes (type)
&& INTEGRAL_TYPE_P (type)
&& COMPLETE_TYPE_P (type)
&& INT_CST_LT_UNSIGNED (TYPE_SIZE (type),
TYPE_SIZE (integer_type_node)))
&& tree_int_cst_lt (TYPE_SIZE (type), TYPE_SIZE (integer_type_node)))
val = cp_perform_integral_promotions (val, complain);
if ((complain & tf_warning)
&& warn_suggest_attribute_format)

View file

@ -40,6 +40,7 @@ along with GCC; see the file COPYING3. If not see
#include "dumpfile.h"
#include "splay-tree.h"
#include "gimplify.h"
#include "wide-int.h"
/* The number of nested classes being processed. If we are not in the
scope of any class, this is zero. */
@ -3811,7 +3812,7 @@ walk_subobject_offsets (tree type,
/* If this OFFSET is bigger than the MAX_OFFSET, then we should
stop. */
if (max_offset && INT_CST_LT (max_offset, offset))
if (max_offset && tree_int_cst_lt (max_offset, offset))
return 0;
if (type == error_mark_node)
@ -3968,8 +3969,8 @@ walk_subobject_offsets (tree type,
for (index = size_zero_node;
/* G++ 3.2 had an off-by-one error here. */
(abi_version_at_least (2)
? !INT_CST_LT (TYPE_MAX_VALUE (domain), index)
: INT_CST_LT (index, TYPE_MAX_VALUE (domain)));
? !tree_int_cst_lt (TYPE_MAX_VALUE (domain), index)
: tree_int_cst_lt (index, TYPE_MAX_VALUE (domain)));
index = size_binop (PLUS_EXPR, index, size_one_node))
{
r = walk_subobject_offsets (TREE_TYPE (type),
@ -3985,7 +3986,7 @@ walk_subobject_offsets (tree type,
/* If this new OFFSET is bigger than the MAX_OFFSET, then
there's no point in iterating through the remaining
elements of the array. */
if (max_offset && INT_CST_LT (max_offset, offset))
if (max_offset && tree_int_cst_lt (max_offset, offset))
break;
}
}
@ -5922,7 +5923,7 @@ end_of_class (tree t, int include_virtuals_p)
continue;
offset = end_of_base (base_binfo);
if (INT_CST_LT_UNSIGNED (result, offset))
if (tree_int_cst_lt (result, offset))
result = offset;
}
@ -5932,7 +5933,7 @@ end_of_class (tree t, int include_virtuals_p)
vec_safe_iterate (vbases, i, &base_binfo); i++)
{
offset = end_of_base (base_binfo);
if (INT_CST_LT_UNSIGNED (result, offset))
if (tree_int_cst_lt (result, offset))
result = offset;
}
@ -6012,7 +6013,7 @@ include_empty_classes (record_layout_info rli)
CLASSTYPE_AS_BASE (rli->t) != NULL_TREE);
rli_size = rli_size_unit_so_far (rli);
if (TREE_CODE (rli_size) == INTEGER_CST
&& INT_CST_LT_UNSIGNED (rli_size, eoc))
&& tree_int_cst_lt (rli_size, eoc))
{
if (!abi_version_at_least (2))
/* In version 1 of the ABI, the size of a class that ends with
@ -6128,7 +6129,7 @@ layout_class_type (tree t, tree *virtuals_p)
type, then there are some special rules for allocating
it. */
if (DECL_C_BIT_FIELD (field)
&& INT_CST_LT (TYPE_SIZE (type), DECL_SIZE (field)))
&& tree_int_cst_lt (TYPE_SIZE (type), DECL_SIZE (field)))
{
unsigned int itk;
tree integer_type;
@ -6139,10 +6140,10 @@ layout_class_type (tree t, tree *virtuals_p)
bits as additional padding. */
for (itk = itk_char; itk != itk_none; ++itk)
if (integer_types[itk] != NULL_TREE
&& (INT_CST_LT (size_int (MAX_FIXED_MODE_SIZE),
TYPE_SIZE (integer_types[itk]))
|| INT_CST_LT (DECL_SIZE (field),
TYPE_SIZE (integer_types[itk]))))
&& (tree_int_cst_lt (size_int (MAX_FIXED_MODE_SIZE),
TYPE_SIZE (integer_types[itk]))
|| tree_int_cst_lt (DECL_SIZE (field),
TYPE_SIZE (integer_types[itk]))))
break;
/* ITK now indicates a type that is too large for the
@ -6158,7 +6159,7 @@ layout_class_type (tree t, tree *virtuals_p)
3.2 always created a padding field, even if it had zero
width. */
if (!abi_version_at_least (2)
|| INT_CST_LT (TYPE_SIZE (integer_type), DECL_SIZE (field)))
|| tree_int_cst_lt (TYPE_SIZE (integer_type), DECL_SIZE (field)))
{
if (abi_version_at_least (2) && TREE_CODE (t) == UNION_TYPE)
/* In a union, the padding field must have the full width

View file

@ -36,6 +36,7 @@ along with GCC; see the file COPYING3. If not see
#include "convert.h"
#include "decl.h"
#include "target.h"
#include "wide-int.h"
static tree cp_convert_to_pointer (tree, tree, tsubst_flags_t);
static tree convert_to_pointer_force (tree, tree, tsubst_flags_t);
@ -582,9 +583,7 @@ ignore_overflows (tree expr, tree orig)
{
gcc_assert (!TREE_OVERFLOW (orig));
/* Ensure constant sharing. */
expr = build_int_cst_wide (TREE_TYPE (expr),
TREE_INT_CST_LOW (expr),
TREE_INT_CST_HIGH (expr));
expr = wide_int_to_tree (TREE_TYPE (expr), expr);
}
return expr;
}

View file

@ -60,6 +60,7 @@ along with GCC; see the file COPYING3. If not see
#include "plugin.h"
#include "cgraph.h"
#include "cilk.h"
#include "wide-int.h"
/* Possible cases of bad specifiers type used by bad_specifiers. */
enum bad_spec_place {
@ -4844,7 +4845,7 @@ check_array_designated_initializer (constructor_elt *ce,
if (TREE_CODE (ce->index) == INTEGER_CST)
{
/* A C99 designator is OK if it matches the current index. */
if (TREE_INT_CST_LOW (ce->index) == index)
if (wi::eq_p (ce->index, index))
return true;
else
sorry ("non-trivial designated initializers not supported");
@ -8316,7 +8317,7 @@ compute_array_index_type (tree name, tree size, tsubst_flags_t complain)
constant_expression_error (size);
/* An array must have a positive number of elements. */
if (INT_CST_LT (size, integer_zero_node))
if (tree_int_cst_lt (size, integer_zero_node))
{
if (!(complain & tf_error))
return error_mark_node;
@ -12677,9 +12678,9 @@ finish_enum_value_list (tree enumtype)
enumeration. We must do this before the type of MINNODE and
MAXNODE are transformed, since tree_int_cst_min_precision relies
on the TREE_TYPE of the value it is passed. */
bool unsignedp = tree_int_cst_sgn (minnode) >= 0;
int lowprec = tree_int_cst_min_precision (minnode, unsignedp);
int highprec = tree_int_cst_min_precision (maxnode, unsignedp);
signop sgn = tree_int_cst_sgn (minnode) >= 0 ? UNSIGNED : SIGNED;
int lowprec = tree_int_cst_min_precision (minnode, sgn);
int highprec = tree_int_cst_min_precision (maxnode, sgn);
int precision = MAX (lowprec, highprec);
unsigned int itk;
bool use_short_enum;
@ -12711,7 +12712,7 @@ finish_enum_value_list (tree enumtype)
underlying_type = integer_types[itk];
if (underlying_type != NULL_TREE
&& TYPE_PRECISION (underlying_type) >= precision
&& TYPE_UNSIGNED (underlying_type) == unsignedp)
&& TYPE_SIGN (underlying_type) == sgn)
break;
}
if (itk == itk_none)
@ -12758,12 +12759,11 @@ finish_enum_value_list (tree enumtype)
= build_distinct_type_copy (underlying_type);
TYPE_PRECISION (ENUM_UNDERLYING_TYPE (enumtype)) = precision;
set_min_and_max_values_for_integral_type
(ENUM_UNDERLYING_TYPE (enumtype), precision, unsignedp);
(ENUM_UNDERLYING_TYPE (enumtype), precision, sgn);
/* If -fstrict-enums, still constrain TYPE_MIN/MAX_VALUE. */
if (flag_strict_enums)
set_min_and_max_values_for_integral_type (enumtype, precision,
unsignedp);
set_min_and_max_values_for_integral_type (enumtype, precision, sgn);
}
else
underlying_type = ENUM_UNDERLYING_TYPE (enumtype);
@ -12887,14 +12887,14 @@ build_enumerator (tree name, tree value, tree enumtype, location_t loc)
value = error_mark_node;
else
{
double_int di = TREE_INT_CST (prev_value)
.add_with_sign (double_int_one,
false, &overflowed);
tree type = TREE_TYPE (prev_value);
signop sgn = TYPE_SIGN (type);
widest_int wi = wi::add (wi::to_widest (prev_value), 1, sgn,
&overflowed);
if (!overflowed)
{
tree type = TREE_TYPE (prev_value);
bool pos = TYPE_UNSIGNED (type) || !di.is_negative ();
if (!double_int_fits_to_tree_p (type, di))
bool pos = !wi::neg_p (wi, sgn);
if (!wi::fits_to_tree_p (wi, type))
{
unsigned int itk;
for (itk = itk_int; itk != itk_none; itk++)
@ -12902,7 +12902,7 @@ build_enumerator (tree name, tree value, tree enumtype, location_t loc)
type = integer_types[itk];
if (type != NULL_TREE
&& (pos || !TYPE_UNSIGNED (type))
&& double_int_fits_to_tree_p (type, di))
&& wi::fits_to_tree_p (wi, type))
break;
}
if (type && cxx_dialect < cxx11
@ -12914,7 +12914,7 @@ incremented enumerator value is too large for %<long%>");
if (type == NULL_TREE)
overflowed = true;
else
value = double_int_to_tree (type, di);
value = wide_int_to_tree (type, wi);
}
if (overflowed)

View file

@ -31,6 +31,7 @@ along with GCC; see the file COPYING3. If not see
#include "flags.h"
#include "target.h"
#include "gimplify.h"
#include "wide-int.h"
static bool begin_init_stmts (tree *, tree *);
static tree finish_init_stmts (bool, tree, tree);
@ -2284,10 +2285,10 @@ build_new_1 (vec<tree, va_gc> **placement, tree type, tree nelts,
/* For arrays, a bounds checks on the NELTS parameter. */
tree outer_nelts_check = NULL_TREE;
bool outer_nelts_from_type = false;
double_int inner_nelts_count = double_int_one;
offset_int inner_nelts_count = 1;
tree alloc_call, alloc_expr;
/* Size of the inner array elements. */
double_int inner_size;
offset_int inner_size;
/* The address returned by the call to "operator new". This node is
a VAR_DECL and is therefore reusable. */
tree alloc_node;
@ -2343,9 +2344,8 @@ build_new_1 (vec<tree, va_gc> **placement, tree type, tree nelts,
if (TREE_CODE (inner_nelts_cst) == INTEGER_CST)
{
bool overflow;
double_int result = TREE_INT_CST (inner_nelts_cst)
.mul_with_sign (inner_nelts_count,
false, &overflow);
offset_int result = wi::mul (wi::to_offset (inner_nelts_cst),
inner_nelts_count, SIGNED, &overflow);
if (overflow)
{
if (complain & tf_error)
@ -2456,42 +2456,40 @@ build_new_1 (vec<tree, va_gc> **placement, tree type, tree nelts,
{
/* Maximum available size in bytes. Half of the address space
minus the cookie size. */
double_int max_size
= double_int_one.llshift (TYPE_PRECISION (sizetype) - 1,
HOST_BITS_PER_DOUBLE_INT);
offset_int max_size
= wi::set_bit_in_zero <offset_int> (TYPE_PRECISION (sizetype) - 1);
/* Maximum number of outer elements which can be allocated. */
double_int max_outer_nelts;
offset_int max_outer_nelts;
tree max_outer_nelts_tree;
gcc_assert (TREE_CODE (size) == INTEGER_CST);
cookie_size = targetm.cxx.get_cookie_size (elt_type);
gcc_assert (TREE_CODE (cookie_size) == INTEGER_CST);
gcc_checking_assert (TREE_INT_CST (cookie_size).ult (max_size));
gcc_checking_assert (wi::ltu_p (wi::to_offset (cookie_size), max_size));
/* Unconditionally subtract the cookie size. This decreases the
maximum object size and is safe even if we choose not to use
a cookie after all. */
max_size -= TREE_INT_CST (cookie_size);
max_size -= wi::to_offset (cookie_size);
bool overflow;
inner_size = TREE_INT_CST (size)
.mul_with_sign (inner_nelts_count, false, &overflow);
if (overflow || inner_size.ugt (max_size))
inner_size = wi::mul (wi::to_offset (size), inner_nelts_count, SIGNED,
&overflow);
if (overflow || wi::gtu_p (inner_size, max_size))
{
if (complain & tf_error)
error ("size of array is too large");
return error_mark_node;
}
max_outer_nelts = max_size.udiv (inner_size, TRUNC_DIV_EXPR);
max_outer_nelts = wi::udiv_trunc (max_size, inner_size);
/* Only keep the top-most seven bits, to simplify encoding the
constant in the instruction stream. */
{
unsigned shift = HOST_BITS_PER_DOUBLE_INT - 7
- (max_outer_nelts.high ? clz_hwi (max_outer_nelts.high)
: (HOST_BITS_PER_WIDE_INT + clz_hwi (max_outer_nelts.low)));
max_outer_nelts
= max_outer_nelts.lrshift (shift, HOST_BITS_PER_DOUBLE_INT)
.llshift (shift, HOST_BITS_PER_DOUBLE_INT);
unsigned shift = (max_outer_nelts.get_precision ()) - 7
- wi::clz (max_outer_nelts);
max_outer_nelts = wi::lshift (wi::lrshift (max_outer_nelts, shift),
shift);
}
max_outer_nelts_tree = double_int_to_tree (sizetype, max_outer_nelts);
max_outer_nelts_tree = wide_int_to_tree (sizetype, max_outer_nelts);
size = size_binop (MULT_EXPR, size, convert (sizetype, nelts));
outer_nelts_check = fold_build2 (LE_EXPR, boolean_type_node,
@ -2572,7 +2570,7 @@ build_new_1 (vec<tree, va_gc> **placement, tree type, tree nelts,
cookie_size = NULL_TREE;
/* No size arithmetic necessary, so the size check is
not needed. */
if (outer_nelts_check != NULL && inner_size.is_one ())
if (outer_nelts_check != NULL && inner_size == 1)
outer_nelts_check = NULL_TREE;
}
/* Perform the overflow check. */
@ -2617,7 +2615,7 @@ build_new_1 (vec<tree, va_gc> **placement, tree type, tree nelts,
cookie_size = NULL_TREE;
/* No size arithmetic necessary, so the size check is
not needed. */
if (outer_nelts_check != NULL && inner_size.is_one ())
if (outer_nelts_check != NULL && inner_size == 1)
outer_nelts_check = NULL_TREE;
}

View file

@ -57,6 +57,7 @@ along with GCC; see the file COPYING3. If not see
#include "flags.h"
#include "target.h"
#include "cgraph.h"
#include "wide-int.h"
/* Debugging support. */
@ -1513,8 +1514,8 @@ static inline void
write_integer_cst (const tree cst)
{
int sign = tree_int_cst_sgn (cst);
if (TREE_INT_CST_HIGH (cst) + (sign < 0))
widest_int abs_value = wi::abs (wi::to_widest (cst));
if (!wi::fits_uhwi_p (abs_value))
{
/* A bignum. We do this in chunks, each of which fits in a
HOST_WIDE_INT. */
@ -1540,8 +1541,7 @@ write_integer_cst (const tree cst)
type = c_common_signed_or_unsigned_type (1, TREE_TYPE (cst));
base = build_int_cstu (type, chunk);
n = build_int_cst_wide (type,
TREE_INT_CST_LOW (cst), TREE_INT_CST_HIGH (cst));
n = wide_int_to_tree (type, cst);
if (sign < 0)
{
@ -1568,14 +1568,9 @@ write_integer_cst (const tree cst)
else
{
/* A small num. */
unsigned HOST_WIDE_INT low = TREE_INT_CST_LOW (cst);
if (sign < 0)
{
write_char ('n');
low = -low;
}
write_unsigned_number (low);
write_char ('n');
write_unsigned_number (abs_value.to_uhwi ());
}
}
@ -3226,12 +3221,12 @@ write_array_type (const tree type)
{
/* The ABI specifies that we should mangle the number of
elements in the array, not the largest allowed index. */
double_int dmax = tree_to_double_int (max) + double_int_one;
offset_int wmax = wi::to_offset (max) + 1;
/* Truncate the result - this will mangle [0, SIZE_INT_MAX]
number of elements as zero. */
dmax = dmax.zext (TYPE_PRECISION (TREE_TYPE (max)));
gcc_assert (dmax.fits_uhwi ());
write_unsigned_number (dmax.low);
wmax = wi::zext (wmax, TYPE_PRECISION (TREE_TYPE (max)));
gcc_assert (wi::fits_uhwi_p (wmax));
write_unsigned_number (wmax.to_uhwi ());
}
else
{

View file

@ -36,6 +36,7 @@ along with GCC; see the file COPYING3. If not see
#include "hash-table.h"
#include "gimple-expr.h"
#include "gimplify.h"
#include "wide-int.h"
static tree bot_manip (tree *, int *, void *);
static tree bot_replace (tree *, int *, void *);
@ -2620,8 +2621,7 @@ cp_tree_equal (tree t1, tree t2)
switch (code1)
{
case INTEGER_CST:
return TREE_INT_CST_LOW (t1) == TREE_INT_CST_LOW (t2)
&& TREE_INT_CST_HIGH (t1) == TREE_INT_CST_HIGH (t2);
return tree_int_cst_equal (t1, t2);
case REAL_CST:
return REAL_VALUES_EQUAL (TREE_REAL_CST (t1), TREE_REAL_CST (t2));

View file

@ -36,6 +36,7 @@ along with GCC; see the file COPYING3. If not see
#include "cp-tree.h"
#include "flags.h"
#include "diagnostic-core.h"
#include "wide-int.h"
static tree
process_init_constructor (tree type, tree init, tsubst_flags_t complain);
@ -1165,12 +1166,10 @@ process_init_constructor_array (tree type, tree init,
{
tree domain = TYPE_DOMAIN (type);
if (domain && TREE_CONSTANT (TYPE_MAX_VALUE (domain)))
len = (tree_to_double_int (TYPE_MAX_VALUE (domain))
- tree_to_double_int (TYPE_MIN_VALUE (domain))
+ double_int_one)
.ext (TYPE_PRECISION (TREE_TYPE (domain)),
TYPE_UNSIGNED (TREE_TYPE (domain)))
.low;
len = wi::ext (wi::to_offset (TYPE_MAX_VALUE (domain))
- wi::to_offset (TYPE_MIN_VALUE (domain)) + 1,
TYPE_PRECISION (TREE_TYPE (domain)),
TYPE_SIGN (TREE_TYPE (domain))).to_uhwi ();
else
unbounded = true; /* Take as many as there are. */
}

View file

@ -2336,15 +2336,20 @@ hash_rtx_cb (const_rtx x, enum machine_mode mode,
+ (unsigned int) INTVAL (x));
return hash;
case CONST_WIDE_INT:
for (i = 0; i < CONST_WIDE_INT_NUNITS (x); i++)
hash += CONST_WIDE_INT_ELT (x, i);
return hash;
case CONST_DOUBLE:
/* This is like the general case, except that it only counts
the integers representing the constant. */
hash += (unsigned int) code + (unsigned int) GET_MODE (x);
if (GET_MODE (x) != VOIDmode)
hash += real_hash (CONST_DOUBLE_REAL_VALUE (x));
else
if (TARGET_SUPPORTS_WIDE_INT == 0 && GET_MODE (x) == VOIDmode)
hash += ((unsigned int) CONST_DOUBLE_LOW (x)
+ (unsigned int) CONST_DOUBLE_HIGH (x));
else
hash += real_hash (CONST_DOUBLE_REAL_VALUE (x));
return hash;
case CONST_FIXED:
@ -3779,6 +3784,7 @@ equiv_constant (rtx x)
/* See if we previously assigned a constant value to this SUBREG. */
if ((new_rtx = lookup_as_function (x, CONST_INT)) != 0
|| (new_rtx = lookup_as_function (x, CONST_WIDE_INT)) != 0
|| (new_rtx = lookup_as_function (x, CONST_DOUBLE)) != 0
|| (new_rtx = lookup_as_function (x, CONST_FIXED)) != 0)
return new_rtx;

View file

@ -942,8 +942,7 @@ rtx_equal_for_cselib_1 (rtx x, rtx y, enum machine_mode memmode)
/* These won't be handled correctly by the code below. */
switch (GET_CODE (x))
{
case CONST_DOUBLE:
case CONST_FIXED:
CASE_CONST_UNIQUE:
case DEBUG_EXPR:
return 0;
@ -1125,15 +1124,20 @@ cselib_hash_rtx (rtx x, int create, enum machine_mode memmode)
hash += ((unsigned) CONST_INT << 7) + UINTVAL (x);
return hash ? hash : (unsigned int) CONST_INT;
case CONST_WIDE_INT:
for (i = 0; i < CONST_WIDE_INT_NUNITS (x); i++)
hash += CONST_WIDE_INT_ELT (x, i);
return hash;
case CONST_DOUBLE:
/* This is like the general case, except that it only counts
the integers representing the constant. */
hash += (unsigned) code + (unsigned) GET_MODE (x);
if (GET_MODE (x) != VOIDmode)
hash += real_hash (CONST_DOUBLE_REAL_VALUE (x));
else
if (TARGET_SUPPORTS_WIDE_INT == 0 && GET_MODE (x) == VOIDmode)
hash += ((unsigned) CONST_DOUBLE_LOW (x)
+ (unsigned) CONST_DOUBLE_HIGH (x));
else
hash += real_hash (CONST_DOUBLE_REAL_VALUE (x));
return hash ? hash : (unsigned int) CONST_DOUBLE;
case CONST_FIXED:

View file

@ -692,88 +692,39 @@ stabstr_U (unsigned HOST_WIDE_INT num)
static void
stabstr_O (tree cst)
{
unsigned HOST_WIDE_INT high = TREE_INT_CST_HIGH (cst);
unsigned HOST_WIDE_INT low = TREE_INT_CST_LOW (cst);
char buf[128];
char *p = buf + sizeof buf;
/* GDB wants constants with no extra leading "1" bits, so
we need to remove any sign-extension that might be
present. */
{
const unsigned int width = TYPE_PRECISION (TREE_TYPE (cst));
if (width == HOST_BITS_PER_DOUBLE_INT)
;
else if (width > HOST_BITS_PER_WIDE_INT)
high &= (((HOST_WIDE_INT) 1 << (width - HOST_BITS_PER_WIDE_INT)) - 1);
else if (width == HOST_BITS_PER_WIDE_INT)
high = 0;
else
high = 0, low &= (((HOST_WIDE_INT) 1 << width) - 1);
}
int prec = TYPE_PRECISION (TREE_TYPE (cst));
int res_pres = prec % 3;
int i;
unsigned int digit;
/* Leading zero for base indicator. */
stabstr_C ('0');
/* If the value is zero, the base indicator will serve as the value
all by itself. */
if (high == 0 && low == 0)
if (wi::eq_p (cst, 0))
return;
/* If the high half is zero, we need only print the low half normally. */
if (high == 0)
NUMBER_FMT_LOOP (p, low, 8);
else
/* GDB wants constants with no extra leading "1" bits, so
we need to remove any sign-extension that might be
present. */
if (res_pres == 1)
{
/* When high != 0, we need to print enough zeroes from low to
give the digits from high their proper place-values. Hence
NUMBER_FMT_LOOP cannot be used. */
const int n_digits = HOST_BITS_PER_WIDE_INT / 3;
int i;
for (i = 1; i <= n_digits; i++)
{
unsigned int digit = low % 8;
low /= 8;
*--p = '0' + digit;
}
/* Octal digits carry exactly three bits of information. The
width of a HOST_WIDE_INT is not normally a multiple of three.
Therefore, the next digit printed probably needs to carry
information from both low and high. */
if (HOST_BITS_PER_WIDE_INT % 3 != 0)
{
const int n_leftover_bits = HOST_BITS_PER_WIDE_INT % 3;
const int n_bits_from_high = 3 - n_leftover_bits;
const unsigned HOST_WIDE_INT
low_mask = (((unsigned HOST_WIDE_INT)1) << n_leftover_bits) - 1;
const unsigned HOST_WIDE_INT
high_mask = (((unsigned HOST_WIDE_INT)1) << n_bits_from_high) - 1;
unsigned int digit;
/* At this point, only the bottom n_leftover_bits bits of low
should be set. */
gcc_assert (!(low & ~low_mask));
digit = (low | ((high & high_mask) << n_leftover_bits));
high >>= n_bits_from_high;
*--p = '0' + digit;
}
/* Now we can format high in the normal manner. However, if
the only bits of high that were set were handled by the
digit split between low and high, high will now be zero, and
we don't want to print extra digits in that case. */
if (high)
NUMBER_FMT_LOOP (p, high, 8);
digit = wi::extract_uhwi (cst, prec - 1, 1);
stabstr_C ('0' + digit);
}
else if (res_pres == 2)
{
digit = wi::extract_uhwi (cst, prec - 2, 2);
stabstr_C ('0' + digit);
}
obstack_grow (&stabstr_ob, p, (buf + sizeof buf) - p);
prec -= res_pres;
for (i = prec - 3; i >= 0; i = i - 3)
{
digit = wi::extract_uhwi (cst, i, 3);
stabstr_C ('0' + digit);
}
}
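As an editorial aside (not part of this patch): the digit arithmetic in the loop above — a partial high-order digit of `prec % 3` bits followed by one octal digit per three bits, as `wi::extract_uhwi (cst, i, 3)` does — can be modelled by this standalone sketch. The function name and the plain `uint64_t` value are illustrative stand-ins for the wide-int machinery.

```cpp
#include <cassert>
#include <cstdint>
#include <string>

// Illustrative model of the digit loop above: emit PREC bits of V as
// octal, printing the partial high-order digit first (res_pres bits)
// and then one digit per three bits.
static std::string
to_octal_digits (uint64_t v, int prec)
{
  std::string s;
  int res_pres = prec % 3;
  prec -= res_pres;
  if (res_pres != 0)
    s += static_cast<char> ('0' + ((v >> prec) & ((1u << res_pres) - 1)));
  for (int i = prec - 3; i >= 0; i -= 3)
    s += static_cast<char> ('0' + ((v >> i) & 7));
  return s;
}
```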
/* Called whenever it is safe to break a stabs string into multiple
@ -2301,10 +2252,7 @@ dbxout_type (tree type, int full)
if (TREE_CODE (value) == CONST_DECL)
value = DECL_INITIAL (value);
if (TREE_INT_CST_HIGH (value) == 0)
stabstr_D (TREE_INT_CST_LOW (value));
else if (TREE_INT_CST_HIGH (value) == -1
&& (HOST_WIDE_INT) TREE_INT_CST_LOW (value) < 0)
if (cst_and_fits_in_hwi (value))
stabstr_D (TREE_INT_CST_LOW (value));
else
stabstr_O (value);


@ -471,6 +471,14 @@ see the files COPYING3 and COPYING.RUNTIME respectively. If not, see
your target, you should override these values by defining the
appropriate symbols in your tm.h file. */
#if BITS_PER_UNIT == 8
#define LOG2_BITS_PER_UNIT 3
#elif BITS_PER_UNIT == 16
#define LOG2_BITS_PER_UNIT 4
#else
#error Unknown BITS_PER_UNIT
#endif
#ifndef BITS_PER_WORD
#define BITS_PER_WORD (BITS_PER_UNIT * UNITS_PER_WORD)
#endif
@ -1392,6 +1400,14 @@ see the files COPYING3 and COPYING.RUNTIME respectively. If not, see
#define SWITCHABLE_TARGET 0
#endif
/* If the target supports integers that are wider than two
HOST_WIDE_INTs on the host compiler, then the target should define
TARGET_SUPPORTS_WIDE_INT and make the appropriate fixups.
Otherwise the compiler really is not robust. */
#ifndef TARGET_SUPPORTS_WIDE_INT
#define TARGET_SUPPORTS_WIDE_INT 0
#endif
#endif /* GCC_INSN_FLAGS_H */
#endif /* ! GCC_DEFAULTS_H */


@ -24,6 +24,7 @@ along with GCC; see the file COPYING3. If not see
#include "tree.h"
#include "tm_p.h"
#include "dfp.h"
#include "wide-int.h"
/* The order of the following headers is important for making sure
decNumber structure is large enough to hold decimal128 digits. */
@ -604,11 +605,11 @@ decimal_real_to_integer (const REAL_VALUE_TYPE *r)
return real_to_integer (&to);
}
/* Likewise, but to an integer pair, HI+LOW. */
/* Likewise, but returns a wide_int with PRECISION. *FAIL is set if the
value does not fit. */
void
decimal_real_to_integer2 (HOST_WIDE_INT *plow, HOST_WIDE_INT *phigh,
const REAL_VALUE_TYPE *r)
wide_int
decimal_real_to_integer (const REAL_VALUE_TYPE *r, bool *fail, int precision)
{
decContext set;
decNumber dn, dn2, dn3;
@ -628,7 +629,7 @@ decimal_real_to_integer2 (HOST_WIDE_INT *plow, HOST_WIDE_INT *phigh,
function. */
decNumberToString (&dn, string);
real_from_string (&to, string);
real_to_integer2 (plow, phigh, &to);
return real_to_integer (&to, fail, precision);
}
/* Perform the decimal floating point operation described by CODE.


@ -38,7 +38,7 @@ void decimal_real_convert (REAL_VALUE_TYPE *, enum machine_mode, const REAL_VALU
void decimal_real_to_decimal (char *, const REAL_VALUE_TYPE *, size_t, size_t, int);
void decimal_do_fix_trunc (REAL_VALUE_TYPE *, const REAL_VALUE_TYPE *);
void decimal_real_maxval (REAL_VALUE_TYPE *, int, enum machine_mode);
void decimal_real_to_integer2 (HOST_WIDE_INT *, HOST_WIDE_INT *, const REAL_VALUE_TYPE *);
wide_int decimal_real_to_integer (const REAL_VALUE_TYPE *, bool *, int);
HOST_WIDE_INT decimal_real_to_integer (const REAL_VALUE_TYPE *);
#ifdef TREE_CODE


@ -1022,10 +1022,15 @@ As this example indicates, the operands are zero-indexed.
@node Constant expressions
@subsection Constant expressions
@tindex INTEGER_CST
@findex TREE_INT_CST_HIGH
@findex TREE_INT_CST_LOW
@findex tree_int_cst_lt
@findex tree_int_cst_equal
@tindex tree_fits_uhwi_p
@tindex tree_fits_shwi_p
@tindex tree_to_uhwi
@tindex tree_to_shwi
@tindex TREE_INT_CST_NUNITS
@tindex TREE_INT_CST_ELT
@tindex TREE_INT_CST_LOW
@tindex REAL_CST
@tindex FIXED_CST
@tindex COMPLEX_CST
@ -1044,36 +1049,18 @@ These nodes represent integer constants. Note that the type of these
constants is obtained with @code{TREE_TYPE}; they are not always of type
@code{int}. In particular, @code{char} constants are represented with
@code{INTEGER_CST} nodes. The value of the integer constant @code{e} is
given by
@smallexample
((TREE_INT_CST_HIGH (e) << HOST_BITS_PER_WIDE_INT)
+ TREE_INT_CST_LOW (e))
@end smallexample
@noindent
HOST_BITS_PER_WIDE_INT is at least thirty-two on all platforms. Both
@code{TREE_INT_CST_HIGH} and @code{TREE_INT_CST_LOW} return a
@code{HOST_WIDE_INT}. The value of an @code{INTEGER_CST} is interpreted
as a signed or unsigned quantity depending on the type of the constant.
In general, the expression given above will overflow, so it should not
be used to calculate the value of the constant.
represented in an array of @code{HOST_WIDE_INT}s. There are enough elements
in the array to represent the value without taking extra elements for
redundant 0s or -1s. The number of elements used to represent @code{e}
is available via @code{TREE_INT_CST_NUNITS}. Element @code{i} can be
extracted by using @code{TREE_INT_CST_ELT (e, i)}.
@code{TREE_INT_CST_LOW} is a shorthand for @code{TREE_INT_CST_ELT (e, 0)}.
The variable @code{integer_zero_node} is an integer constant with value
zero. Similarly, @code{integer_one_node} is an integer constant with
value one. The @code{size_zero_node} and @code{size_one_node} variables
are analogous, but have type @code{size_t} rather than @code{int}.
The function @code{tree_int_cst_lt} is a predicate which holds if its
first argument is less than its second. Both constants are assumed to
have the same signedness (i.e., either both should be signed or both
should be unsigned.) The full width of the constant is used when doing
the comparison; the usual rules about promotions and conversions are
ignored. Similarly, @code{tree_int_cst_equal} holds if the two
constants are equal. The @code{tree_int_cst_sgn} function returns the
sign of a constant. The value is @code{1}, @code{0}, or @code{-1}
according on whether the constant is greater than, equal to, or less
than zero. Again, the signedness of the constant's type is taken into
account; an unsigned constant is never less than zero, no matter what
its bit-pattern.
The functions @code{tree_fits_shwi_p} and @code{tree_fits_uhwi_p}
can be used to tell if the value is small enough to fit in a
signed HOST_WIDE_INT or an unsigned HOST_WIDE_INT respectively.
The value can then be extracted using @code{tree_to_shwi} and
@code{tree_to_uhwi}.
@item REAL_CST


@ -1540,17 +1540,21 @@ Similarly, there is only one object for the integer whose value is
@findex const_double
@item (const_double:@var{m} @var{i0} @var{i1} @dots{})
Represents either a floating-point constant of mode @var{m} or an
integer constant too large to fit into @code{HOST_BITS_PER_WIDE_INT}
bits but small enough to fit within twice that number of bits (GCC
does not provide a mechanism to represent even larger constants). In
the latter case, @var{m} will be @code{VOIDmode}. For integral values
constants for modes with more bits than twice the number in
@code{HOST_WIDE_INT} the implied high order bits of that constant are
copies of the top bit of @code{CONST_DOUBLE_HIGH}. Note however that
integral values are neither inherently signed nor inherently unsigned;
where necessary, signedness is determined by the rtl operation
instead.
This represents either a floating-point constant of mode @var{m} or
(on older ports that do not define
@code{TARGET_SUPPORTS_WIDE_INT}) an integer constant too large to fit
into @code{HOST_BITS_PER_WIDE_INT} bits but small enough to fit within
twice that number of bits. In the latter case, @var{m} will be
@code{VOIDmode}. For integral values constants for modes with more
bits than twice the number in @code{HOST_WIDE_INT} the implied high
order bits of that constant are copies of the top bit of
@code{CONST_DOUBLE_HIGH}. Note however that integral values are
neither inherently signed nor inherently unsigned; where necessary,
signedness is determined by the rtl operation instead.
On more modern ports, @code{CONST_DOUBLE} only represents floating
point values. New ports define @code{TARGET_SUPPORTS_WIDE_INT} to
make this designation.
@findex CONST_DOUBLE_LOW
If @var{m} is @code{VOIDmode}, the bits of the value are stored in
@ -1565,6 +1569,37 @@ machine's or host machine's floating point format. To convert them to
the precise bit pattern used by the target machine, use the macro
@code{REAL_VALUE_TO_TARGET_DOUBLE} and friends (@pxref{Data Output}).
@findex CONST_WIDE_INT
@item (const_wide_int:@var{m} @var{nunits} @var{elt0} @dots{})
This contains an array of @code{HOST_WIDE_INT}s that is large enough
to hold any constant that can be represented on the target. This form
of rtl is only used on targets that define
@code{TARGET_SUPPORTS_WIDE_INT} to be nonzero and then
@code{CONST_DOUBLE}s are only used to hold floating-point values. If
the target leaves @code{TARGET_SUPPORTS_WIDE_INT} defined as 0,
@code{CONST_WIDE_INT}s are not used and @code{CONST_DOUBLE}s are as
they were before.
The values are stored in a compressed format. The higher-order
0s or -1s are not represented if they are just the logical sign
extension of the number that is represented.
@findex CONST_WIDE_INT_VEC
@item CONST_WIDE_INT_VEC (@var{code})
Returns the entire array of @code{HOST_WIDE_INT}s that are used to
store the value. This macro should rarely be used.
@findex CONST_WIDE_INT_NUNITS
@item CONST_WIDE_INT_NUNITS (@var{code})
The number of @code{HOST_WIDE_INT}s used to represent the number.
Note that this generally is smaller than the number of
@code{HOST_WIDE_INT}s implied by the mode size.
@findex CONST_WIDE_INT_ELT
@item CONST_WIDE_INT_ELT (@var{code},@var{i})
Returns the @code{i}th element of the array. Element 0 contains
the low order bits of the constant.
@findex const_fixed
@item (const_fixed:@var{m} @dots{})
Represents a fixed-point constant of mode @var{m}.


@ -9704,18 +9704,6 @@ Returns the negative of the floating point value @var{x}.
Returns the absolute value of @var{x}.
@end deftypefn
@deftypefn Macro void REAL_VALUE_TO_INT (HOST_WIDE_INT @var{low}, HOST_WIDE_INT @var{high}, REAL_VALUE_TYPE @var{x})
Converts a floating point value @var{x} into a double-precision integer
which is then stored into @var{low} and @var{high}. If the value is not
integral, it is truncated.
@end deftypefn
@deftypefn Macro void REAL_VALUE_FROM_INT (REAL_VALUE_TYPE @var{x}, HOST_WIDE_INT @var{low}, HOST_WIDE_INT @var{high}, enum machine_mode @var{mode})
Converts a double-precision integer found in @var{low} and @var{high},
into a floating point value which is then stored into @var{x}. The
value is truncated to fit in mode @var{mode}.
@end deftypefn
@node Mode Switching
@section Mode Switching Instructions
@cindex mode switching
@ -11024,7 +11012,7 @@ function version at run-time for a given set of function versions.
body must be generated.
@end deftypefn
@deftypefn {Target Hook} bool TARGET_CAN_USE_DOLOOP_P (double_int @var{iterations}, double_int @var{iterations_max}, unsigned int @var{loop_depth}, bool @var{entered_at_top})
@deftypefn {Target Hook} bool TARGET_CAN_USE_DOLOOP_P (const widest_int @var{&iterations}, const widest_int @var{&iterations_max}, unsigned int @var{loop_depth}, bool @var{entered_at_top})
Return true if it is possible to use low-overhead loops (@code{doloop_end}
and @code{doloop_begin}) for a particular loop. @var{iterations} gives the
exact number of iterations, or 0 if not known. @var{iterations_max} gives
@ -11440,3 +11428,49 @@ If defined, this function returns an appropriate alignment in bits for an atomic
@deftypefn {Target Hook} void TARGET_ATOMIC_ASSIGN_EXPAND_FENV (tree *@var{hold}, tree *@var{clear}, tree *@var{update})
ISO C11 requires atomic compound assignments that may raise floating-point exceptions to raise exceptions corresponding to the arithmetic operation whose result was successfully stored in a compare-and-exchange sequence. This requires code equivalent to calls to @code{feholdexcept}, @code{feclearexcept} and @code{feupdateenv} to be generated at appropriate points in the compare-and-exchange sequence. This hook should set @code{*@var{hold}} to an expression equivalent to the call to @code{feholdexcept}, @code{*@var{clear}} to an expression equivalent to the call to @code{feclearexcept} and @code{*@var{update}} to an expression equivalent to the call to @code{feupdateenv}. The three expressions are @code{NULL_TREE} on entry to the hook and may be left as @code{NULL_TREE} if no code is required in a particular place. The default implementation leaves all three expressions as @code{NULL_TREE}. The @code{__atomic_feraiseexcept} function from @code{libatomic} may be of use as part of the code generated in @code{*@var{update}}.
@end deftypefn
@defmac TARGET_SUPPORTS_WIDE_INT
On older ports, large integers are stored in @code{CONST_DOUBLE} rtl
objects. Newer ports define @code{TARGET_SUPPORTS_WIDE_INT} to be nonzero
to indicate that large integers are stored in
@code{CONST_WIDE_INT} rtl objects. The @code{CONST_WIDE_INT} allows
very large integer constants to be represented. @code{CONST_DOUBLE}
is limited to twice the size of the host's @code{HOST_WIDE_INT}
representation.
Converting a port mostly requires looking for the places where
@code{CONST_DOUBLE}s are used with @code{VOIDmode} and replacing that
code with code that accesses @code{CONST_WIDE_INT}s. @samp{"grep -i
const_double"} at the port level gets you to 95% of the changes that
need to be made. There are a few places that require a deeper look.
@itemize @bullet
@item
There is no equivalent to @code{hval} and @code{lval} for
@code{CONST_WIDE_INT}s. This would be difficult to express in the md
language since there are a variable number of elements.
Most ports only check that @code{hval} is either 0 or -1 to see if the
value is small. As mentioned above, this will no longer be necessary
since small constants are always @code{CONST_INT}. Of course there
are still a few exceptions; the Alpha's constraint used by the zap
instruction certainly requires careful examination by C code.
However, all the current code does is pass the hval and lval to C
code, so evolving the C code to look at the @code{CONST_WIDE_INT} is
not really a large change.
@item
Because there is no standard template that ports use to materialize
constants, there is likely to be some futzing that is unique to each
port in this code.
@item
The rtx costs may have to be adjusted to properly account for larger
constants that are represented as @code{CONST_WIDE_INT}.
@end itemize
All in all, it does not take long to convert ports that the
maintainer is familiar with.
@end defmac


@ -7362,18 +7362,6 @@ Returns the negative of the floating point value @var{x}.
Returns the absolute value of @var{x}.
@end deftypefn
@deftypefn Macro void REAL_VALUE_TO_INT (HOST_WIDE_INT @var{low}, HOST_WIDE_INT @var{high}, REAL_VALUE_TYPE @var{x})
Converts a floating point value @var{x} into a double-precision integer
which is then stored into @var{low} and @var{high}. If the value is not
integral, it is truncated.
@end deftypefn
@deftypefn Macro void REAL_VALUE_FROM_INT (REAL_VALUE_TYPE @var{x}, HOST_WIDE_INT @var{low}, HOST_WIDE_INT @var{high}, enum machine_mode @var{mode})
Converts a double-precision integer found in @var{low} and @var{high},
into a floating point value which is then stored into @var{x}. The
value is truncated to fit in mode @var{mode}.
@end deftypefn
@node Mode Switching
@section Mode Switching Instructions
@cindex mode switching
@ -8425,3 +8413,49 @@ and the associated definitions of those functions.
@hook TARGET_ATOMIC_ALIGN_FOR_MODE
@hook TARGET_ATOMIC_ASSIGN_EXPAND_FENV
@defmac TARGET_SUPPORTS_WIDE_INT
On older ports, large integers are stored in @code{CONST_DOUBLE} rtl
objects. Newer ports define @code{TARGET_SUPPORTS_WIDE_INT} to be nonzero
to indicate that large integers are stored in
@code{CONST_WIDE_INT} rtl objects. The @code{CONST_WIDE_INT} allows
very large integer constants to be represented. @code{CONST_DOUBLE}
is limited to twice the size of the host's @code{HOST_WIDE_INT}
representation.
Converting a port mostly requires looking for the places where
@code{CONST_DOUBLE}s are used with @code{VOIDmode} and replacing that
code with code that accesses @code{CONST_WIDE_INT}s. @samp{"grep -i
const_double"} at the port level gets you to 95% of the changes that
need to be made. There are a few places that require a deeper look.
@itemize @bullet
@item
There is no equivalent to @code{hval} and @code{lval} for
@code{CONST_WIDE_INT}s. This would be difficult to express in the md
language since there are a variable number of elements.
Most ports only check that @code{hval} is either 0 or -1 to see if the
value is small. As mentioned above, this will no longer be necessary
since small constants are always @code{CONST_INT}. Of course there
are still a few exceptions; the Alpha's constraint used by the zap
instruction certainly requires careful examination by C code.
However, all the current code does is pass the hval and lval to C
code, so evolving the C code to look at the @code{CONST_WIDE_INT} is
not really a large change.
@item
Because there is no standard template that ports use to materialize
constants, there is likely to be some futzing that is unique to each
port in this code.
@item
The rtx costs may have to be adjusted to properly account for larger
constants that are represented as @code{CONST_WIDE_INT}.
@end itemize
All in all, it does not take long to convert ports that the
maintainer is familiar with.
@end defmac


@ -166,6 +166,7 @@ static bool
prefer_and_bit_test (enum machine_mode mode, int bitnum)
{
bool speed_p;
wide_int mask = wi::set_bit_in_zero (bitnum, GET_MODE_PRECISION (mode));
if (and_test == 0)
{
@ -186,8 +187,7 @@ prefer_and_bit_test (enum machine_mode mode, int bitnum)
}
/* Fill in the integers. */
XEXP (and_test, 1)
= immed_double_int_const (double_int_zero.set_bit (bitnum), mode);
XEXP (and_test, 1) = immed_wide_int_const (mask, mode);
XEXP (XEXP (shift_test, 0), 1) = GEN_INT (bitnum);
speed_p = optimize_insn_for_speed_p ();


@ -20,6 +20,8 @@ along with GCC; see the file COPYING3. If not see
#ifndef DOUBLE_INT_H
#define DOUBLE_INT_H
#include "wide-int.h"
/* A large integer is currently represented as a pair of HOST_WIDE_INTs.
It therefore represents a number with precision of
2 * HOST_BITS_PER_WIDE_INT bits (it is however possible that the
@ -435,4 +437,36 @@ void mpz_set_double_int (mpz_t, double_int, bool);
double_int mpz_get_double_int (const_tree, mpz_t, bool);
#endif
namespace wi
{
template <>
struct int_traits <double_int>
{
static const enum precision_type precision_type = CONST_PRECISION;
static const bool host_dependent_precision = true;
static const unsigned int precision = HOST_BITS_PER_DOUBLE_INT;
static unsigned int get_precision (const double_int &);
static wi::storage_ref decompose (HOST_WIDE_INT *, unsigned int,
const double_int &);
};
}
inline unsigned int
wi::int_traits <double_int>::get_precision (const double_int &)
{
return precision;
}
inline wi::storage_ref
wi::int_traits <double_int>::decompose (HOST_WIDE_INT *scratch, unsigned int p,
const double_int &x)
{
gcc_checking_assert (precision == p);
scratch[0] = x.low;
if ((x.high == 0 && scratch[0] >= 0) || (x.high == -1 && scratch[0] < 0))
return wi::storage_ref (scratch, 1, precision);
scratch[1] = x.high;
return wi::storage_ref (scratch, 2, precision);
}
#endif /* DOUBLE_INT_H */
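As an editorial aside (not part of this patch): the decision in `wi::int_traits<double_int>::decompose` above — the pair collapses to a single block exactly when the high half is pure sign extension of the low half — can be checked with this standalone sketch, using `int64_t` as an illustrative stand-in for `HOST_WIDE_INT`.

```cpp
#include <cassert>
#include <cstdint>

// Sketch of the 1-vs-2 element decision in decompose above: one
// block suffices when {low, high} is just the sign extension of low.
static unsigned
blocks_needed (int64_t low, int64_t high)
{
  if ((high == 0 && low >= 0) || (high == -1 && low < 0))
    return 1;
  return 2;
}
```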


@ -357,6 +357,16 @@ dump_struct_debug (tree type, enum debug_info_usage usage,
#endif
/* Get the number of HOST_WIDE_INTs needed to represent the precision
of the number. */
static unsigned int
get_full_len (const wide_int &op)
{
return ((op.get_precision () + HOST_BITS_PER_WIDE_INT - 1)
/ HOST_BITS_PER_WIDE_INT);
}
static bool
should_emit_struct_debug (tree type, enum debug_info_usage usage)
{
@ -1392,6 +1402,9 @@ dw_val_equal_p (dw_val_node *a, dw_val_node *b)
return (a->v.val_double.high == b->v.val_double.high
&& a->v.val_double.low == b->v.val_double.low);
case dw_val_class_wide_int:
return *a->v.val_wide == *b->v.val_wide;
case dw_val_class_vec:
{
size_t a_len = a->v.val_vec.elt_size * a->v.val_vec.length;
@ -1648,6 +1661,10 @@ size_of_loc_descr (dw_loc_descr_ref loc)
case dw_val_class_const_double:
size += HOST_BITS_PER_DOUBLE_INT / BITS_PER_UNIT;
break;
case dw_val_class_wide_int:
size += (get_full_len (*loc->dw_loc_oprnd2.v.val_wide)
* HOST_BITS_PER_WIDE_INT / BITS_PER_UNIT);
break;
default:
gcc_unreachable ();
}
@ -1825,6 +1842,20 @@ output_loc_operands (dw_loc_descr_ref loc, int for_eh_or_skip)
second, NULL);
}
break;
case dw_val_class_wide_int:
{
int i;
int len = get_full_len (*val2->v.val_wide);
if (WORDS_BIG_ENDIAN)
for (i = len - 1; i >= 0; --i)
dw2_asm_output_data (HOST_BITS_PER_WIDE_INT / HOST_BITS_PER_CHAR,
val2->v.val_wide->elt (i), NULL);
else
for (i = 0; i < len; ++i)
dw2_asm_output_data (HOST_BITS_PER_WIDE_INT / HOST_BITS_PER_CHAR,
val2->v.val_wide->elt (i), NULL);
}
break;
case dw_val_class_addr:
gcc_assert (val1->v.val_unsigned == DWARF2_ADDR_SIZE);
dw2_asm_output_addr_rtx (DWARF2_ADDR_SIZE, val2->v.val_addr, NULL);
@ -2034,6 +2065,21 @@ output_loc_operands (dw_loc_descr_ref loc, int for_eh_or_skip)
dw2_asm_output_data (l, second, NULL);
}
break;
case dw_val_class_wide_int:
{
int i;
int len = get_full_len (*val2->v.val_wide);
l = HOST_BITS_PER_WIDE_INT / HOST_BITS_PER_CHAR;
dw2_asm_output_data (1, len * l, NULL);
if (WORDS_BIG_ENDIAN)
for (i = len - 1; i >= 0; --i)
dw2_asm_output_data (l, val2->v.val_wide->elt (i), NULL);
else
for (i = 0; i < len; ++i)
dw2_asm_output_data (l, val2->v.val_wide->elt (i), NULL);
}
break;
default:
gcc_unreachable ();
}
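As an editorial aside (not part of this patch): both `dw_val_class_wide_int` emission loops above pick the element order from `WORDS_BIG_ENDIAN`; this small sketch models that ordering (the function is illustrative, not GCC code).

```cpp
#include <cassert>
#include <vector>

// Model of the two loops above: on WORDS_BIG_ENDIAN targets the
// most-significant element is emitted first, otherwise the
// least-significant one.
static std::vector<int>
emit_order (int len, bool words_big_endian)
{
  std::vector<int> order;
  if (words_big_endian)
    for (int i = len - 1; i >= 0; --i)
      order.push_back (i);
  else
    for (int i = 0; i < len; ++i)
      order.push_back (i);
  return order;
}
```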
@ -3126,7 +3172,7 @@ static void add_AT_location_description (dw_die_ref, enum dwarf_attribute,
static void add_data_member_location_attribute (dw_die_ref, tree);
static bool add_const_value_attribute (dw_die_ref, rtx);
static void insert_int (HOST_WIDE_INT, unsigned, unsigned char *);
static void insert_double (double_int, unsigned char *);
static void insert_wide_int (const wide_int &, unsigned char *, int);
static void insert_float (const_rtx, unsigned char *);
static rtx rtl_for_decl_location (tree);
static bool add_location_or_const_value_attribute (dw_die_ref, tree, bool,
@ -3758,6 +3804,21 @@ AT_unsigned (dw_attr_ref a)
return a->dw_attr_val.v.val_unsigned;
}
/* Add an unsigned wide integer attribute value to a DIE. */
static inline void
add_AT_wide (dw_die_ref die, enum dwarf_attribute attr_kind,
const wide_int& w)
{
dw_attr_node attr;
attr.dw_attr = attr_kind;
attr.dw_attr_val.val_class = dw_val_class_wide_int;
attr.dw_attr_val.v.val_wide = ggc_alloc_cleared_wide_int ();
*attr.dw_attr_val.v.val_wide = w;
add_dwarf_attr (die, &attr);
}
/* Add an unsigned double integer attribute value to a DIE. */
static inline void
@ -5332,6 +5393,21 @@ print_die (dw_die_ref die, FILE *outfile)
a->dw_attr_val.v.val_double.high,
a->dw_attr_val.v.val_double.low);
break;
case dw_val_class_wide_int:
{
int i = a->dw_attr_val.v.val_wide->get_len ();
fprintf (outfile, "constant (");
gcc_assert (i > 0);
if (a->dw_attr_val.v.val_wide->elt (i - 1) == 0)
fprintf (outfile, "0x");
fprintf (outfile, HOST_WIDE_INT_PRINT_HEX,
a->dw_attr_val.v.val_wide->elt (--i));
while (--i >= 0)
fprintf (outfile, HOST_WIDE_INT_PRINT_PADDED_HEX,
a->dw_attr_val.v.val_wide->elt (i));
fprintf (outfile, ")");
break;
}
case dw_val_class_vec:
fprintf (outfile, "floating-point or vector constant");
break;
@ -5505,6 +5581,9 @@ attr_checksum (dw_attr_ref at, struct md5_ctx *ctx, int *mark)
case dw_val_class_const_double:
CHECKSUM (at->dw_attr_val.v.val_double);
break;
case dw_val_class_wide_int:
CHECKSUM (*at->dw_attr_val.v.val_wide);
break;
case dw_val_class_vec:
CHECKSUM_BLOCK (at->dw_attr_val.v.val_vec.array,
(at->dw_attr_val.v.val_vec.length
@ -5782,6 +5861,12 @@ attr_checksum_ordered (enum dwarf_tag tag, dw_attr_ref at,
CHECKSUM (at->dw_attr_val.v.val_double);
break;
case dw_val_class_wide_int:
CHECKSUM_ULEB128 (DW_FORM_block);
CHECKSUM_ULEB128 (sizeof (*at->dw_attr_val.v.val_wide));
CHECKSUM (*at->dw_attr_val.v.val_wide);
break;
case dw_val_class_vec:
CHECKSUM_ULEB128 (DW_FORM_block);
CHECKSUM_ULEB128 (at->dw_attr_val.v.val_vec.length
@ -6264,6 +6349,8 @@ same_dw_val_p (const dw_val_node *v1, const dw_val_node *v2, int *mark)
case dw_val_class_const_double:
return v1->v.val_double.high == v2->v.val_double.high
&& v1->v.val_double.low == v2->v.val_double.low;
case dw_val_class_wide_int:
return *v1->v.val_wide == *v2->v.val_wide;
case dw_val_class_vec:
if (v1->v.val_vec.length != v2->v.val_vec.length
|| v1->v.val_vec.elt_size != v2->v.val_vec.elt_size)
@ -7819,6 +7906,13 @@ size_of_die (dw_die_ref die)
if (HOST_BITS_PER_WIDE_INT >= 64)
size++; /* block */
break;
case dw_val_class_wide_int:
size += (get_full_len (*a->dw_attr_val.v.val_wide)
* HOST_BITS_PER_WIDE_INT / HOST_BITS_PER_CHAR);
if (get_full_len (*a->dw_attr_val.v.val_wide) * HOST_BITS_PER_WIDE_INT
> 64)
size++; /* block */
break;
case dw_val_class_vec:
size += constant_size (a->dw_attr_val.v.val_vec.length
* a->dw_attr_val.v.val_vec.elt_size)
@ -8188,6 +8282,20 @@ value_format (dw_attr_ref a)
default:
return DW_FORM_block1;
}
case dw_val_class_wide_int:
switch (get_full_len (*a->dw_attr_val.v.val_wide) * HOST_BITS_PER_WIDE_INT)
{
case 8:
return DW_FORM_data1;
case 16:
return DW_FORM_data2;
case 32:
return DW_FORM_data4;
case 64:
return DW_FORM_data8;
default:
return DW_FORM_block1;
}
case dw_val_class_vec:
switch (constant_size (a->dw_attr_val.v.val_vec.length
* a->dw_attr_val.v.val_vec.elt_size))
@ -8627,6 +8735,32 @@ output_die (dw_die_ref die)
}
break;
case dw_val_class_wide_int:
{
int i;
int len = get_full_len (*a->dw_attr_val.v.val_wide);
int l = HOST_BITS_PER_WIDE_INT / HOST_BITS_PER_CHAR;
if (len * HOST_BITS_PER_WIDE_INT > 64)
dw2_asm_output_data (1, get_full_len (*a->dw_attr_val.v.val_wide) * l,
NULL);
if (WORDS_BIG_ENDIAN)
for (i = len - 1; i >= 0; --i)
{
dw2_asm_output_data (l, a->dw_attr_val.v.val_wide->elt (i),
name);
name = NULL;
}
else
for (i = 0; i < len; ++i)
{
dw2_asm_output_data (l, a->dw_attr_val.v.val_wide->elt (i),
name);
name = NULL;
}
}
break;
case dw_val_class_vec:
{
unsigned int elt_size = a->dw_attr_val.v.val_vec.elt_size;
@ -10320,19 +10454,19 @@ simple_type_size_in_bits (const_tree type)
return TYPE_ALIGN (type);
}
/* Similarly, but return a double_int instead of UHWI. */
/* Similarly, but return an offset_int instead of UHWI. */
static inline double_int
double_int_type_size_in_bits (const_tree type)
static inline offset_int
offset_int_type_size_in_bits (const_tree type)
{
if (TREE_CODE (type) == ERROR_MARK)
return double_int::from_uhwi (BITS_PER_WORD);
return BITS_PER_WORD;
else if (TYPE_SIZE (type) == NULL_TREE)
return double_int_zero;
return 0;
else if (TREE_CODE (TYPE_SIZE (type)) == INTEGER_CST)
return tree_to_double_int (TYPE_SIZE (type));
return wi::to_offset (TYPE_SIZE (type));
else
return double_int::from_uhwi (TYPE_ALIGN (type));
return TYPE_ALIGN (type);
}
/* Given a pointer to a tree node for a subrange type, return a pointer
@ -11826,9 +11960,7 @@ clz_loc_descriptor (rtx rtl, enum machine_mode mode,
rtx msb;
if (GET_MODE_CLASS (mode) != MODE_INT
|| GET_MODE (XEXP (rtl, 0)) != mode
|| (GET_CODE (rtl) == CLZ
&& GET_MODE_BITSIZE (mode) > HOST_BITS_PER_DOUBLE_INT))
|| GET_MODE (XEXP (rtl, 0)) != mode)
return NULL;
op0 = mem_loc_descriptor (XEXP (rtl, 0), mode, mem_mode,
@ -11872,9 +12004,9 @@ clz_loc_descriptor (rtx rtl, enum machine_mode mode,
msb = GEN_INT ((unsigned HOST_WIDE_INT) 1
<< (GET_MODE_BITSIZE (mode) - 1));
else
msb = immed_double_const (0, (unsigned HOST_WIDE_INT) 1
<< (GET_MODE_BITSIZE (mode)
- HOST_BITS_PER_WIDE_INT - 1), mode);
msb = immed_wide_int_const
(wi::set_bit_in_zero (GET_MODE_PRECISION (mode) - 1,
GET_MODE_PRECISION (mode)), mode);
if (GET_CODE (msb) == CONST_INT && INTVAL (msb) < 0)
tmp = new_loc_descr (HOST_BITS_PER_WIDE_INT == 32
? DW_OP_const4u : HOST_BITS_PER_WIDE_INT == 64
@ -12800,10 +12932,14 @@ mem_loc_descriptor (rtx rtl, enum machine_mode mode,
{
dw_die_ref type_die;
/* Note that a CONST_DOUBLE rtx could represent either an integer
or a floating-point constant. A CONST_DOUBLE is used whenever
the constant requires more than one word in order to be
adequately represented. We output CONST_DOUBLEs as blocks. */
/* Note that if TARGET_SUPPORTS_WIDE_INT == 0, a
CONST_DOUBLE rtx could represent either a large integer
or a floating-point constant. If TARGET_SUPPORTS_WIDE_INT != 0,
the value is always a floating point constant.
When it is an integer, a CONST_DOUBLE is used whenever
the constant requires 2 HWIs to be adequately represented.
We output CONST_DOUBLEs as blocks. */
if (mode == VOIDmode
|| (GET_MODE (rtl) == VOIDmode
&& GET_MODE_BITSIZE (mode) != HOST_BITS_PER_DOUBLE_INT))
@ -12816,7 +12952,16 @@ mem_loc_descriptor (rtx rtl, enum machine_mode mode,
mem_loc_result->dw_loc_oprnd1.val_class = dw_val_class_die_ref;
mem_loc_result->dw_loc_oprnd1.v.val_die_ref.die = type_die;
mem_loc_result->dw_loc_oprnd1.v.val_die_ref.external = 0;
if (SCALAR_FLOAT_MODE_P (mode))
#if TARGET_SUPPORTS_WIDE_INT == 0
if (!SCALAR_FLOAT_MODE_P (mode))
{
mem_loc_result->dw_loc_oprnd2.val_class
= dw_val_class_const_double;
mem_loc_result->dw_loc_oprnd2.v.val_double
= rtx_to_double_int (rtl);
}
else
#endif
{
unsigned int length = GET_MODE_SIZE (mode);
unsigned char *array
@ -12828,13 +12973,26 @@ mem_loc_descriptor (rtx rtl, enum machine_mode mode,
mem_loc_result->dw_loc_oprnd2.v.val_vec.elt_size = 4;
mem_loc_result->dw_loc_oprnd2.v.val_vec.array = array;
}
else
{
mem_loc_result->dw_loc_oprnd2.val_class
= dw_val_class_const_double;
mem_loc_result->dw_loc_oprnd2.v.val_double
= rtx_to_double_int (rtl);
}
}
break;
case CONST_WIDE_INT:
if (!dwarf_strict)
{
dw_die_ref type_die;
type_die = base_type_for_mode (mode,
GET_MODE_CLASS (mode) == MODE_INT);
if (type_die == NULL)
return NULL;
mem_loc_result = new_loc_descr (DW_OP_GNU_const_type, 0, 0);
mem_loc_result->dw_loc_oprnd1.val_class = dw_val_class_die_ref;
mem_loc_result->dw_loc_oprnd1.v.val_die_ref.die = type_die;
mem_loc_result->dw_loc_oprnd1.v.val_die_ref.external = 0;
mem_loc_result->dw_loc_oprnd2.val_class
= dw_val_class_wide_int;
mem_loc_result->dw_loc_oprnd2.v.val_wide = ggc_alloc_cleared_wide_int ();
*mem_loc_result->dw_loc_oprnd2.v.val_wide = std::make_pair (rtl, mode);
}
break;
@ -13305,7 +13463,15 @@ loc_descriptor (rtx rtl, enum machine_mode mode,
adequately represented. We output CONST_DOUBLEs as blocks. */
loc_result = new_loc_descr (DW_OP_implicit_value,
GET_MODE_SIZE (mode), 0);
if (SCALAR_FLOAT_MODE_P (mode))
#if TARGET_SUPPORTS_WIDE_INT == 0
if (!SCALAR_FLOAT_MODE_P (mode))
{
loc_result->dw_loc_oprnd2.val_class = dw_val_class_const_double;
loc_result->dw_loc_oprnd2.v.val_double
= rtx_to_double_int (rtl);
}
else
#endif
{
unsigned int length = GET_MODE_SIZE (mode);
unsigned char *array
@ -13317,12 +13483,20 @@ loc_descriptor (rtx rtl, enum machine_mode mode,
loc_result->dw_loc_oprnd2.v.val_vec.elt_size = 4;
loc_result->dw_loc_oprnd2.v.val_vec.array = array;
}
else
{
loc_result->dw_loc_oprnd2.val_class = dw_val_class_const_double;
loc_result->dw_loc_oprnd2.v.val_double
= rtx_to_double_int (rtl);
}
}
break;
case CONST_WIDE_INT:
if (mode == VOIDmode)
mode = GET_MODE (rtl);
if (mode != VOIDmode && (dwarf_version >= 4 || !dwarf_strict))
{
loc_result = new_loc_descr (DW_OP_implicit_value,
GET_MODE_SIZE (mode), 0);
loc_result->dw_loc_oprnd2.val_class = dw_val_class_wide_int;
loc_result->dw_loc_oprnd2.v.val_wide = ggc_alloc_cleared_wide_int ();
*loc_result->dw_loc_oprnd2.v.val_wide = std::make_pair (rtl, mode);
}
break;
@ -13338,6 +13512,7 @@ loc_descriptor (rtx rtl, enum machine_mode mode,
ggc_alloc_atomic (length * elt_size);
unsigned int i;
unsigned char *p;
enum machine_mode imode = GET_MODE_INNER (mode);
gcc_assert (mode == GET_MODE (rtl) || VOIDmode == GET_MODE (rtl));
switch (GET_MODE_CLASS (mode))
@ -13346,15 +13521,7 @@ loc_descriptor (rtx rtl, enum machine_mode mode,
for (i = 0, p = array; i < length; i++, p += elt_size)
{
rtx elt = CONST_VECTOR_ELT (rtl, i);
double_int val = rtx_to_double_int (elt);
if (elt_size <= sizeof (HOST_WIDE_INT))
insert_int (val.to_shwi (), elt_size, p);
else
{
gcc_assert (elt_size == 2 * sizeof (HOST_WIDE_INT));
insert_double (val, p);
}
insert_wide_int (std::make_pair (elt, imode), p, elt_size);
}
break;
@ -14676,15 +14843,10 @@ simple_decl_align_in_bits (const_tree decl)
/* Return the result of rounding T up to ALIGN. */
static inline double_int
round_up_to_align (double_int t, unsigned int align)
static inline offset_int
round_up_to_align (const offset_int &t, unsigned int align)
{
double_int alignd = double_int::from_uhwi (align);
t += alignd;
t += double_int_minus_one;
t = t.div (alignd, true, TRUNC_DIV_EXPR);
t *= alignd;
return t;
return wi::udiv_trunc (t + align - 1, align) * align;
}
/* Given a pointer to a FIELD_DECL, compute and return the byte offset of the
@ -14697,9 +14859,9 @@ round_up_to_align (double_int t, unsigned int align)
static HOST_WIDE_INT
field_byte_offset (const_tree decl)
{
double_int object_offset_in_bits;
double_int object_offset_in_bytes;
double_int bitpos_int;
offset_int object_offset_in_bits;
offset_int object_offset_in_bytes;
offset_int bitpos_int;
if (TREE_CODE (decl) == ERROR_MARK)
return 0;
@ -14712,21 +14874,21 @@ field_byte_offset (const_tree decl)
if (TREE_CODE (bit_position (decl)) != INTEGER_CST)
return 0;
bitpos_int = tree_to_double_int (bit_position (decl));
bitpos_int = wi::to_offset (bit_position (decl));
#ifdef PCC_BITFIELD_TYPE_MATTERS
if (PCC_BITFIELD_TYPE_MATTERS)
{
tree type;
tree field_size_tree;
double_int deepest_bitpos;
double_int field_size_in_bits;
offset_int deepest_bitpos;
offset_int field_size_in_bits;
unsigned int type_align_in_bits;
unsigned int decl_align_in_bits;
double_int type_size_in_bits;
offset_int type_size_in_bits;
type = field_type (decl);
type_size_in_bits = double_int_type_size_in_bits (type);
type_size_in_bits = offset_int_type_size_in_bits (type);
type_align_in_bits = simple_type_align_in_bits (type);
field_size_tree = DECL_SIZE (decl);
@ -14738,7 +14900,7 @@ field_byte_offset (const_tree decl)
/* If the size of the field is not constant, use the type size. */
if (TREE_CODE (field_size_tree) == INTEGER_CST)
field_size_in_bits = tree_to_double_int (field_size_tree);
field_size_in_bits = wi::to_offset (field_size_tree);
else
field_size_in_bits = type_size_in_bits;
@ -14802,7 +14964,7 @@ field_byte_offset (const_tree decl)
object_offset_in_bits
= round_up_to_align (object_offset_in_bits, type_align_in_bits);
if (object_offset_in_bits.ugt (bitpos_int))
if (wi::gtu_p (object_offset_in_bits, bitpos_int))
{
object_offset_in_bits = deepest_bitpos - type_size_in_bits;
@ -14816,8 +14978,7 @@ field_byte_offset (const_tree decl)
object_offset_in_bits = bitpos_int;
object_offset_in_bytes
= object_offset_in_bits.div (double_int::from_uhwi (BITS_PER_UNIT),
true, TRUNC_DIV_EXPR);
= wi::lrshift (object_offset_in_bits, LOG2_BITS_PER_UNIT);
return object_offset_in_bytes.to_shwi ();
}
@ -14993,22 +15154,36 @@ extract_int (const unsigned char *src, unsigned int size)
return val;
}
/* Writes double_int values to dw_vec_const array. */
/* Writes wide_int values to dw_vec_const array. */
static void
insert_double (double_int val, unsigned char *dest)
insert_wide_int (const wide_int &val, unsigned char *dest, int elt_size)
{
unsigned char *p0 = dest;
unsigned char *p1 = dest + sizeof (HOST_WIDE_INT);
int i;
if (WORDS_BIG_ENDIAN)
if (elt_size <= HOST_BITS_PER_WIDE_INT/BITS_PER_UNIT)
{
p0 = p1;
p1 = dest;
insert_int ((HOST_WIDE_INT) val.elt (0), elt_size, dest);
return;
}
insert_int ((HOST_WIDE_INT) val.low, sizeof (HOST_WIDE_INT), p0);
insert_int ((HOST_WIDE_INT) val.high, sizeof (HOST_WIDE_INT), p1);
/* We'd have to extend this code to support odd sizes. */
gcc_assert (elt_size % (HOST_BITS_PER_WIDE_INT / BITS_PER_UNIT) == 0);
int n = elt_size / (HOST_BITS_PER_WIDE_INT / BITS_PER_UNIT);
if (WORDS_BIG_ENDIAN)
for (i = n - 1; i >= 0; i--)
{
insert_int ((HOST_WIDE_INT) val.elt (i), sizeof (HOST_WIDE_INT), dest);
dest += sizeof (HOST_WIDE_INT);
}
else
for (i = 0; i < n; i++)
{
insert_int ((HOST_WIDE_INT) val.elt (i), sizeof (HOST_WIDE_INT), dest);
dest += sizeof (HOST_WIDE_INT);
}
}
/* Writes floating point values to dw_vec_const array. */
@ -15053,6 +15228,11 @@ add_const_value_attribute (dw_die_ref die, rtx rtl)
}
return true;
case CONST_WIDE_INT:
add_AT_wide (die, DW_AT_const_value,
std::make_pair (rtl, GET_MODE (rtl)));
return true;
case CONST_DOUBLE:
/* Note that a CONST_DOUBLE rtx could represent either an integer or a
floating-point constant. A CONST_DOUBLE is used whenever the
@ -15061,7 +15241,10 @@ add_const_value_attribute (dw_die_ref die, rtx rtl)
{
enum machine_mode mode = GET_MODE (rtl);
if (SCALAR_FLOAT_MODE_P (mode))
if (TARGET_SUPPORTS_WIDE_INT == 0 && !SCALAR_FLOAT_MODE_P (mode))
add_AT_double (die, DW_AT_const_value,
CONST_DOUBLE_HIGH (rtl), CONST_DOUBLE_LOW (rtl));
else
{
unsigned int length = GET_MODE_SIZE (mode);
unsigned char *array = (unsigned char *) ggc_alloc_atomic (length);
@ -15069,9 +15252,6 @@ add_const_value_attribute (dw_die_ref die, rtx rtl)
insert_float (rtl, array);
add_AT_vec (die, DW_AT_const_value, length / 4, 4, array);
}
else
add_AT_double (die, DW_AT_const_value,
CONST_DOUBLE_HIGH (rtl), CONST_DOUBLE_LOW (rtl));
}
return true;
@ -15084,6 +15264,7 @@ add_const_value_attribute (dw_die_ref die, rtx rtl)
(length * elt_size);
unsigned int i;
unsigned char *p;
enum machine_mode imode = GET_MODE_INNER (mode);
switch (GET_MODE_CLASS (mode))
{
@ -15091,15 +15272,7 @@ add_const_value_attribute (dw_die_ref die, rtx rtl)
for (i = 0, p = array; i < length; i++, p += elt_size)
{
rtx elt = CONST_VECTOR_ELT (rtl, i);
double_int val = rtx_to_double_int (elt);
if (elt_size <= sizeof (HOST_WIDE_INT))
insert_int (val.to_shwi (), elt_size, p);
else
{
gcc_assert (elt_size == 2 * sizeof (HOST_WIDE_INT));
insert_double (val, p);
}
insert_wide_int (std::make_pair (elt, imode), p, elt_size);
}
break;
@ -16237,7 +16410,7 @@ add_bound_info (dw_die_ref subrange_die, enum dwarf_attribute bound_attr, tree b
consumers will treat DW_FORM_data[1248] as unsigned values,
regardless of the underlying type. */
else if (prec <= HOST_BITS_PER_WIDE_INT
|| TREE_INT_CST_HIGH (bound) == 0)
|| tree_fits_uhwi_p (bound))
{
if (TYPE_UNSIGNED (TREE_TYPE (bound)))
add_AT_unsigned (subrange_die, bound_attr,
@ -16250,8 +16423,7 @@ add_bound_info (dw_die_ref subrange_die, enum dwarf_attribute bound_attr, tree b
the precision of its type. The precision and signedness
of the type will be necessary to re-interpret it
unambiguously. */
add_AT_double (subrange_die, bound_attr, TREE_INT_CST_HIGH (bound),
TREE_INT_CST_LOW (bound));
add_AT_wide (subrange_die, bound_attr, bound);
}
break;
@ -17410,8 +17582,7 @@ gen_enumeration_type_die (tree type, dw_die_ref context_die)
/* Enumeration constants may be wider than HOST_WIDE_INT. Handle
that here. TODO: This should be re-worked to use correct
signed/unsigned double tags for all cases. */
add_AT_double (enum_die, DW_AT_const_value,
TREE_INT_CST_HIGH (value), TREE_INT_CST_LOW (value));
add_AT_wide (enum_die, DW_AT_const_value, value);
}
add_gnat_descriptive_type_attribute (type_die, type, context_die);
@ -23549,6 +23720,9 @@ hash_loc_operands (dw_loc_descr_ref loc, hashval_t hash)
hash = iterative_hash_object (val2->v.val_double.low, hash);
hash = iterative_hash_object (val2->v.val_double.high, hash);
break;
case dw_val_class_wide_int:
hash = iterative_hash_object (*val2->v.val_wide, hash);
break;
case dw_val_class_addr:
hash = iterative_hash_rtx (val2->v.val_addr, hash);
break;
@ -23638,6 +23812,9 @@ hash_loc_operands (dw_loc_descr_ref loc, hashval_t hash)
hash = iterative_hash_object (val2->v.val_double.low, hash);
hash = iterative_hash_object (val2->v.val_double.high, hash);
break;
case dw_val_class_wide_int:
hash = iterative_hash_object (*val2->v.val_wide, hash);
break;
default:
gcc_unreachable ();
}
@ -23786,6 +23963,8 @@ compare_loc_operands (dw_loc_descr_ref x, dw_loc_descr_ref y)
case dw_val_class_const_double:
return valx2->v.val_double.low == valy2->v.val_double.low
&& valx2->v.val_double.high == valy2->v.val_double.high;
case dw_val_class_wide_int:
return *valx2->v.val_wide == *valy2->v.val_wide;
case dw_val_class_addr:
return rtx_equal_p (valx2->v.val_addr, valy2->v.val_addr);
default:
@ -23829,6 +24008,8 @@ compare_loc_operands (dw_loc_descr_ref x, dw_loc_descr_ref y)
case dw_val_class_const_double:
return valx2->v.val_double.low == valy2->v.val_double.low
&& valx2->v.val_double.high == valy2->v.val_double.high;
case dw_val_class_wide_int:
return *valx2->v.val_wide == *valy2->v.val_wide;
default:
gcc_unreachable ();
}


@ -21,6 +21,7 @@ along with GCC; see the file COPYING3. If not see
#define GCC_DWARF2OUT_H 1
#include "dwarf2.h" /* ??? Remove this once only used by dwarf2foo.c. */
#include "wide-int.h"
typedef struct die_struct *dw_die_ref;
typedef const struct die_struct *const_dw_die_ref;
@ -29,6 +30,7 @@ typedef struct dw_val_node *dw_val_ref;
typedef struct dw_cfi_node *dw_cfi_ref;
typedef struct dw_loc_descr_node *dw_loc_descr_ref;
typedef struct dw_loc_list_struct *dw_loc_list_ref;
typedef wide_int *wide_int_ptr;
/* Call frames are described using a sequence of Call Frame
@ -136,6 +138,7 @@ enum dw_val_class
dw_val_class_const,
dw_val_class_unsigned_const,
dw_val_class_const_double,
dw_val_class_wide_int,
dw_val_class_vec,
dw_val_class_flag,
dw_val_class_die_ref,
@ -176,6 +179,7 @@ struct GTY(()) dw_val_node {
HOST_WIDE_INT GTY ((default)) val_int;
unsigned HOST_WIDE_INT GTY ((tag ("dw_val_class_unsigned_const"))) val_unsigned;
double_int GTY ((tag ("dw_val_class_const_double"))) val_double;
wide_int_ptr GTY ((tag ("dw_val_class_wide_int"))) val_wide;
dw_vec_const GTY ((tag ("dw_val_class_vec"))) val_vec;
struct dw_val_die_union
{


@ -126,6 +126,9 @@ rtx cc0_rtx;
static GTY ((if_marked ("ggc_marked_p"), param_is (struct rtx_def)))
htab_t const_int_htab;
static GTY ((if_marked ("ggc_marked_p"), param_is (struct rtx_def)))
htab_t const_wide_int_htab;
/* A hash table storing register attribute structures. */
static GTY ((if_marked ("ggc_marked_p"), param_is (struct reg_attrs)))
htab_t reg_attrs_htab;
@ -147,6 +150,11 @@ static void set_used_decls (tree);
static void mark_label_nuses (rtx);
static hashval_t const_int_htab_hash (const void *);
static int const_int_htab_eq (const void *, const void *);
#if TARGET_SUPPORTS_WIDE_INT
static hashval_t const_wide_int_htab_hash (const void *);
static int const_wide_int_htab_eq (const void *, const void *);
static rtx lookup_const_wide_int (rtx);
#endif
static hashval_t const_double_htab_hash (const void *);
static int const_double_htab_eq (const void *, const void *);
static rtx lookup_const_double (rtx);
@ -181,6 +189,43 @@ const_int_htab_eq (const void *x, const void *y)
return (INTVAL ((const_rtx) x) == *((const HOST_WIDE_INT *) y));
}
#if TARGET_SUPPORTS_WIDE_INT
/* Returns a hash code for X (which is really a CONST_WIDE_INT). */
static hashval_t
const_wide_int_htab_hash (const void *x)
{
int i;
HOST_WIDE_INT hash = 0;
const_rtx xr = (const_rtx) x;
for (i = 0; i < CONST_WIDE_INT_NUNITS (xr); i++)
hash += CONST_WIDE_INT_ELT (xr, i);
return (hashval_t) hash;
}
/* Returns nonzero if the value represented by X (which is really a
CONST_WIDE_INT) is the same as that given by Y (which is really a
CONST_WIDE_INT). */
static int
const_wide_int_htab_eq (const void *x, const void *y)
{
int i;
const_rtx xr = (const_rtx) x;
const_rtx yr = (const_rtx) y;
if (CONST_WIDE_INT_NUNITS (xr) != CONST_WIDE_INT_NUNITS (yr))
return 0;
for (i = 0; i < CONST_WIDE_INT_NUNITS (xr); i++)
if (CONST_WIDE_INT_ELT (xr, i) != CONST_WIDE_INT_ELT (yr, i))
return 0;
return 1;
}
#endif
/* Returns a hash code for X (which is really a CONST_DOUBLE). */
static hashval_t
const_double_htab_hash (const void *x)
@ -188,7 +233,7 @@ const_double_htab_hash (const void *x)
const_rtx const value = (const_rtx) x;
hashval_t h;
if (GET_MODE (value) == VOIDmode)
if (TARGET_SUPPORTS_WIDE_INT == 0 && GET_MODE (value) == VOIDmode)
h = CONST_DOUBLE_LOW (value) ^ CONST_DOUBLE_HIGH (value);
else
{
@ -208,7 +253,7 @@ const_double_htab_eq (const void *x, const void *y)
if (GET_MODE (a) != GET_MODE (b))
return 0;
if (GET_MODE (a) == VOIDmode)
if (TARGET_SUPPORTS_WIDE_INT == 0 && GET_MODE (a) == VOIDmode)
return (CONST_DOUBLE_LOW (a) == CONST_DOUBLE_LOW (b)
&& CONST_DOUBLE_HIGH (a) == CONST_DOUBLE_HIGH (b));
else
@ -446,6 +491,7 @@ const_fixed_from_fixed_value (FIXED_VALUE_TYPE value, enum machine_mode mode)
return lookup_const_fixed (fixed);
}
#if TARGET_SUPPORTS_WIDE_INT == 0
/* Constructs double_int from rtx CST. */
double_int
@ -465,17 +511,70 @@ rtx_to_double_int (const_rtx cst)
return r;
}
#endif
#if TARGET_SUPPORTS_WIDE_INT
/* Determine whether CONST_WIDE_INT WINT already exists in the hash table.
If so, return its counterpart; otherwise add it to the hash table and
return it. */
/* Return a CONST_DOUBLE or CONST_INT for a value specified as
a double_int. */
static rtx
lookup_const_wide_int (rtx wint)
{
void **slot = htab_find_slot (const_wide_int_htab, wint, INSERT);
if (*slot == 0)
*slot = wint;
return (rtx) *slot;
}
#endif
/* Return an rtx constant for V, given that the constant has mode MODE.
The returned rtx will be a CONST_INT if V fits, otherwise it will be
a CONST_DOUBLE (if !TARGET_SUPPORTS_WIDE_INT) or a CONST_WIDE_INT
(if TARGET_SUPPORTS_WIDE_INT). */
rtx
immed_double_int_const (double_int i, enum machine_mode mode)
immed_wide_int_const (const wide_int_ref &v, enum machine_mode mode)
{
return immed_double_const (i.low, i.high, mode);
unsigned int len = v.get_len ();
unsigned int prec = GET_MODE_PRECISION (mode);
/* Allow truncation but not extension since we do not know if the
number is signed or unsigned. */
gcc_assert (prec <= v.get_precision ());
if (len < 2 || prec <= HOST_BITS_PER_WIDE_INT)
return gen_int_mode (v.elt (0), mode);
#if TARGET_SUPPORTS_WIDE_INT
{
unsigned int i;
rtx value;
unsigned int blocks_needed
= (prec + HOST_BITS_PER_WIDE_INT - 1) / HOST_BITS_PER_WIDE_INT;
if (len > blocks_needed)
len = blocks_needed;
value = const_wide_int_alloc (len);
/* It is so tempting to just put the mode in here. Must control
myself ... */
PUT_MODE (value, VOIDmode);
CWI_PUT_NUM_ELEM (value, len);
for (i = 0; i < len; i++)
CONST_WIDE_INT_ELT (value, i) = v.elt (i);
return lookup_const_wide_int (value);
}
#else
return immed_double_const (v.elt (0), v.elt (1), mode);
#endif
}
#if TARGET_SUPPORTS_WIDE_INT == 0
/* Return a CONST_DOUBLE or CONST_INT for a value specified as a pair
of ints: I0 is the low-order word and I1 is the high-order word.
For values that are larger than HOST_BITS_PER_DOUBLE_INT, the
@ -527,6 +626,7 @@ immed_double_const (HOST_WIDE_INT i0, HOST_WIDE_INT i1, enum machine_mode mode)
return lookup_const_double (value);
}
#endif
rtx
gen_rtx_REG (enum machine_mode mode, unsigned int regno)
@ -5629,11 +5729,15 @@ init_emit_once (void)
enum machine_mode mode;
enum machine_mode double_mode;
/* Initialize the CONST_INT, CONST_DOUBLE, CONST_FIXED, and memory attribute
hash tables. */
/* Initialize the CONST_INT, CONST_WIDE_INT, CONST_DOUBLE,
CONST_FIXED, and memory attribute hash tables. */
const_int_htab = htab_create_ggc (37, const_int_htab_hash,
const_int_htab_eq, NULL);
#if TARGET_SUPPORTS_WIDE_INT
const_wide_int_htab = htab_create_ggc (37, const_wide_int_htab_hash,
const_wide_int_htab_eq, NULL);
#endif
const_double_htab = htab_create_ggc (37, const_double_htab_hash,
const_double_htab_eq, NULL);
@ -5695,9 +5799,9 @@ init_emit_once (void)
else
const_true_rtx = gen_rtx_CONST_INT (VOIDmode, STORE_FLAG_VALUE);
REAL_VALUE_FROM_INT (dconst0, 0, 0, double_mode);
REAL_VALUE_FROM_INT (dconst1, 1, 0, double_mode);
REAL_VALUE_FROM_INT (dconst2, 2, 0, double_mode);
real_from_integer (&dconst0, double_mode, 0, SIGNED);
real_from_integer (&dconst1, double_mode, 1, SIGNED);
real_from_integer (&dconst2, double_mode, 2, SIGNED);
dconstm1 = dconst1;
dconstm1.sign = 1;


@ -96,38 +96,9 @@ plus_constant (enum machine_mode mode, rtx x, HOST_WIDE_INT c)
switch (code)
{
case CONST_INT:
if (GET_MODE_BITSIZE (mode) > HOST_BITS_PER_WIDE_INT)
{
double_int di_x = double_int::from_shwi (INTVAL (x));
double_int di_c = double_int::from_shwi (c);
bool overflow;
double_int v = di_x.add_with_sign (di_c, false, &overflow);
if (overflow)
gcc_unreachable ();
return immed_double_int_const (v, mode);
}
return gen_int_mode (UINTVAL (x) + c, mode);
case CONST_DOUBLE:
{
double_int di_x = double_int::from_pair (CONST_DOUBLE_HIGH (x),
CONST_DOUBLE_LOW (x));
double_int di_c = double_int::from_shwi (c);
bool overflow;
double_int v = di_x.add_with_sign (di_c, false, &overflow);
if (overflow)
/* Sorry, we have no way to represent overflows this wide.
To fix, add constant support wider than CONST_DOUBLE. */
gcc_assert (GET_MODE_BITSIZE (mode) <= HOST_BITS_PER_DOUBLE_INT);
return immed_double_int_const (v, mode);
}
CASE_CONST_SCALAR_INT:
return immed_wide_int_const (wi::add (std::make_pair (x, mode), c),
mode);
case MEM:
/* If this is a reference to the constant pool, try replacing it with
a reference to a new constant. If the resulting address isn't


@ -62,7 +62,6 @@ static rtx extract_fixed_bit_field (enum machine_mode, rtx,
static rtx extract_fixed_bit_field_1 (enum machine_mode, rtx,
unsigned HOST_WIDE_INT,
unsigned HOST_WIDE_INT, rtx, int);
static rtx mask_rtx (enum machine_mode, int, int, int);
static rtx lshift_value (enum machine_mode, unsigned HOST_WIDE_INT, int);
static rtx extract_split_bit_field (rtx, unsigned HOST_WIDE_INT,
unsigned HOST_WIDE_INT, int);
@ -70,6 +69,19 @@ static void do_cmp_and_jump (rtx, rtx, enum rtx_code, enum machine_mode, rtx);
static rtx expand_smod_pow2 (enum machine_mode, rtx, HOST_WIDE_INT);
static rtx expand_sdiv_pow2 (enum machine_mode, rtx, HOST_WIDE_INT);
/* Return a constant integer mask value of mode MODE with BITSIZE ones
followed by BITPOS zeros, or the complement of that if COMPLEMENT.
The mask is truncated if necessary to the width of mode MODE. The
mask is zero-extended if BITSIZE+BITPOS is too small for MODE. */
static inline rtx
mask_rtx (enum machine_mode mode, int bitpos, int bitsize, bool complement)
{
return immed_wide_int_const
(wi::shifted_mask (bitpos, bitsize, complement,
GET_MODE_PRECISION (mode)), mode);
}
/* Test whether a value is zero or a power of two. */
#define EXACT_POWER_OF_2_OR_ZERO_P(x) \
(((x) & ((x) - (unsigned HOST_WIDE_INT) 1)) == 0)
@ -1885,26 +1897,6 @@ extract_fixed_bit_field_1 (enum machine_mode tmode, rtx op0,
return expand_shift (RSHIFT_EXPR, mode, op0,
GET_MODE_BITSIZE (mode) - bitsize, target, 0);
}
/* Return a constant integer (CONST_INT or CONST_DOUBLE) mask value
of mode MODE with BITSIZE ones followed by BITPOS zeros, or the
complement of that if COMPLEMENT. The mask is truncated if
necessary to the width of mode MODE. The mask is zero-extended if
BITSIZE+BITPOS is too small for MODE. */
static rtx
mask_rtx (enum machine_mode mode, int bitpos, int bitsize, int complement)
{
double_int mask;
mask = double_int::mask (bitsize);
mask = mask.llshift (bitpos, HOST_BITS_PER_DOUBLE_INT);
if (complement)
mask = ~mask;
return immed_double_int_const (mask, mode);
}
/* Return a constant integer (CONST_INT or CONST_DOUBLE) rtx with the value
VALUE << BITPOS. */
@ -1913,12 +1905,7 @@ static rtx
lshift_value (enum machine_mode mode, unsigned HOST_WIDE_INT value,
int bitpos)
{
double_int val;
val = double_int::from_uhwi (value);
val = val.llshift (bitpos, HOST_BITS_PER_DOUBLE_INT);
return immed_double_int_const (val, mode);
return immed_wide_int_const (wi::lshift (value, bitpos), mode);
}
/* Extract a bit field that is split across two words
@ -3154,38 +3141,22 @@ expand_mult (enum machine_mode mode, rtx op0, rtx op1, rtx target,
only if the constant value exactly fits in an `unsigned int' without
any truncation. This means that multiplying by negative values does
not work; results are off by 2^32 on a 32 bit machine. */
if (CONST_INT_P (scalar_op1))
{
coeff = INTVAL (scalar_op1);
is_neg = coeff < 0;
}
#if TARGET_SUPPORTS_WIDE_INT
else if (CONST_WIDE_INT_P (scalar_op1))
#else
else if (CONST_DOUBLE_AS_INT_P (scalar_op1))
#endif
{
/* If we are multiplying in DImode, it may still be a win
to try to work with shifts and adds. */
if (CONST_DOUBLE_HIGH (scalar_op1) == 0
&& (CONST_DOUBLE_LOW (scalar_op1) > 0
|| (CONST_DOUBLE_LOW (scalar_op1) < 0
&& EXACT_POWER_OF_2_OR_ZERO_P
(CONST_DOUBLE_LOW (scalar_op1)))))
{
coeff = CONST_DOUBLE_LOW (scalar_op1);
is_neg = false;
}
else if (CONST_DOUBLE_LOW (scalar_op1) == 0)
{
coeff = CONST_DOUBLE_HIGH (scalar_op1);
if (EXACT_POWER_OF_2_OR_ZERO_P (coeff))
{
int shift = floor_log2 (coeff) + HOST_BITS_PER_WIDE_INT;
if (shift < HOST_BITS_PER_DOUBLE_INT - 1
|| mode_bitsize <= HOST_BITS_PER_DOUBLE_INT)
return expand_shift (LSHIFT_EXPR, mode, op0,
shift, target, unsignedp);
}
goto skip_synth;
}
int shift = wi::exact_log2 (std::make_pair (scalar_op1, mode));
/* Perfect power of 2 (other than 1, which is handled above). */
if (shift > 0)
return expand_shift (LSHIFT_EXPR, mode, op0,
shift, target, unsignedp);
else
goto skip_synth;
}
@ -3362,7 +3333,6 @@ choose_multiplier (unsigned HOST_WIDE_INT d, int n, int precision,
unsigned HOST_WIDE_INT *multiplier_ptr,
int *post_shift_ptr, int *lgup_ptr)
{
double_int mhigh, mlow;
int lgup, post_shift;
int pow, pow2;
@ -3374,23 +3344,13 @@ choose_multiplier (unsigned HOST_WIDE_INT d, int n, int precision,
pow = n + lgup;
pow2 = n + lgup - precision;
/* We could handle this with some effort, but this case is much
better handled directly with a scc insn, so rely on caller using
that. */
gcc_assert (pow != HOST_BITS_PER_DOUBLE_INT);
/* mlow = 2^(N + lgup)/d */
double_int val = double_int_zero.set_bit (pow);
mlow = val.div (double_int::from_uhwi (d), true, TRUNC_DIV_EXPR);
wide_int val = wi::set_bit_in_zero (pow, HOST_BITS_PER_DOUBLE_INT);
wide_int mlow = wi::udiv_trunc (val, d);
/* mhigh = (2^(N + lgup) + 2^(N + lgup - precision))/d */
val |= double_int_zero.set_bit (pow2);
mhigh = val.div (double_int::from_uhwi (d), true, TRUNC_DIV_EXPR);
gcc_assert (!mhigh.high || val.high - d < d);
gcc_assert (mhigh.high <= 1 && mlow.high <= 1);
/* Assert that mlow < mhigh. */
gcc_assert (mlow.ult (mhigh));
val |= wi::set_bit_in_zero (pow2, HOST_BITS_PER_DOUBLE_INT);
wide_int mhigh = wi::udiv_trunc (val, d);
/* If precision == N, then mlow, mhigh exceed 2^N
(but they do not exceed 2^(N+1)). */
@ -3398,14 +3358,15 @@ choose_multiplier (unsigned HOST_WIDE_INT d, int n, int precision,
/* Reduce to lowest terms. */
for (post_shift = lgup; post_shift > 0; post_shift--)
{
int shft = HOST_BITS_PER_WIDE_INT - 1;
unsigned HOST_WIDE_INT ml_lo = (mlow.high << shft) | (mlow.low >> 1);
unsigned HOST_WIDE_INT mh_lo = (mhigh.high << shft) | (mhigh.low >> 1);
unsigned HOST_WIDE_INT ml_lo = wi::extract_uhwi (mlow, 1,
HOST_BITS_PER_WIDE_INT);
unsigned HOST_WIDE_INT mh_lo = wi::extract_uhwi (mhigh, 1,
HOST_BITS_PER_WIDE_INT);
if (ml_lo >= mh_lo)
break;
mlow = double_int::from_uhwi (ml_lo);
mhigh = double_int::from_uhwi (mh_lo);
mlow = wi::uhwi (ml_lo, HOST_BITS_PER_DOUBLE_INT);
mhigh = wi::uhwi (mh_lo, HOST_BITS_PER_DOUBLE_INT);
}
*post_shift_ptr = post_shift;
@ -3413,13 +3374,13 @@ choose_multiplier (unsigned HOST_WIDE_INT d, int n, int precision,
if (n < HOST_BITS_PER_WIDE_INT)
{
unsigned HOST_WIDE_INT mask = ((unsigned HOST_WIDE_INT) 1 << n) - 1;
*multiplier_ptr = mhigh.low & mask;
return mhigh.low >= mask;
*multiplier_ptr = mhigh.to_uhwi () & mask;
return mhigh.to_uhwi () >= mask;
}
else
{
*multiplier_ptr = mhigh.low;
return mhigh.high;
*multiplier_ptr = mhigh.to_uhwi ();
return wi::extract_uhwi (mhigh, HOST_BITS_PER_WIDE_INT, 1);
}
}
@ -3686,9 +3647,9 @@ expmed_mult_highpart (enum machine_mode mode, rtx op0, rtx op1,
static rtx
expand_smod_pow2 (enum machine_mode mode, rtx op0, HOST_WIDE_INT d)
{
unsigned HOST_WIDE_INT masklow, maskhigh;
rtx result, temp, shift, label;
int logd;
int prec = GET_MODE_PRECISION (mode);
logd = floor_log2 (d);
result = gen_reg_rtx (mode);
@ -3701,8 +3662,8 @@ expand_smod_pow2 (enum machine_mode mode, rtx op0, HOST_WIDE_INT d)
mode, 0, -1);
if (signmask)
{
HOST_WIDE_INT masklow = ((HOST_WIDE_INT) 1 << logd) - 1;
signmask = force_reg (mode, signmask);
masklow = ((HOST_WIDE_INT) 1 << logd) - 1;
shift = GEN_INT (GET_MODE_BITSIZE (mode) - logd);
/* Use the rtx_cost of a LSHIFTRT instruction to determine
@ -3749,19 +3710,11 @@ expand_smod_pow2 (enum machine_mode mode, rtx op0, HOST_WIDE_INT d)
modulus. By including the signbit in the operation, many targets
can avoid an explicit compare operation in the following comparison
against zero. */
masklow = ((HOST_WIDE_INT) 1 << logd) - 1;
if (GET_MODE_BITSIZE (mode) <= HOST_BITS_PER_WIDE_INT)
{
masklow |= HOST_WIDE_INT_M1U << (GET_MODE_BITSIZE (mode) - 1);
maskhigh = -1;
}
else
maskhigh = HOST_WIDE_INT_M1U
<< (GET_MODE_BITSIZE (mode) - HOST_BITS_PER_WIDE_INT - 1);
wide_int mask = wi::mask (logd, false, prec);
mask = wi::set_bit (mask, prec - 1);
temp = expand_binop (mode, and_optab, op0,
immed_double_const (masklow, maskhigh, mode),
immed_wide_int_const (mask, mode),
result, 1, OPTAB_LIB_WIDEN);
if (temp != result)
emit_move_insn (result, temp);
@ -3771,10 +3724,10 @@ expand_smod_pow2 (enum machine_mode mode, rtx op0, HOST_WIDE_INT d)
temp = expand_binop (mode, sub_optab, result, const1_rtx, result,
0, OPTAB_LIB_WIDEN);
masklow = HOST_WIDE_INT_M1U << logd;
maskhigh = -1;
mask = wi::mask (logd, true, prec);
temp = expand_binop (mode, ior_optab, temp,
immed_double_const (masklow, maskhigh, mode),
immed_wide_int_const (mask, mode),
result, 1, OPTAB_LIB_WIDEN);
temp = expand_binop (mode, add_optab, temp, const1_rtx, result,
0, OPTAB_LIB_WIDEN);
@ -5013,24 +4966,16 @@ make_tree (tree type, rtx x)
switch (GET_CODE (x))
{
case CONST_INT:
{
HOST_WIDE_INT hi = 0;
if (INTVAL (x) < 0
&& !(TYPE_UNSIGNED (type)
&& (GET_MODE_BITSIZE (TYPE_MODE (type))
< HOST_BITS_PER_WIDE_INT)))
hi = -1;
t = build_int_cst_wide (type, INTVAL (x), hi);
return t;
}
case CONST_WIDE_INT:
t = wide_int_to_tree (type, std::make_pair (x, TYPE_MODE (type)));
return t;
case CONST_DOUBLE:
if (GET_MODE (x) == VOIDmode)
t = build_int_cst_wide (type,
CONST_DOUBLE_LOW (x), CONST_DOUBLE_HIGH (x));
STATIC_ASSERT (HOST_BITS_PER_WIDE_INT * 2 <= MAX_BITSIZE_MODE_ANY_INT);
if (TARGET_SUPPORTS_WIDE_INT == 0 && GET_MODE (x) == VOIDmode)
t = wide_int_to_tree (type,
wide_int::from_array (&CONST_DOUBLE_LOW (x), 2,
HOST_BITS_PER_WIDE_INT * 2));
else
{
REAL_VALUE_TYPE d;


@ -711,64 +711,32 @@ convert_modes (enum machine_mode mode, enum machine_mode oldmode, rtx x, int uns
if (mode == oldmode)
return x;
/* There is one case that we must handle specially: If we are converting
a CONST_INT into a mode whose size is twice HOST_BITS_PER_WIDE_INT and
we are to interpret the constant as unsigned, gen_lowpart will do
the wrong if the constant appears negative. What we want to do is
make the high-order word of the constant zero, not all ones. */
if (unsignedp && GET_MODE_CLASS (mode) == MODE_INT
&& GET_MODE_BITSIZE (mode) == HOST_BITS_PER_DOUBLE_INT
&& CONST_INT_P (x) && INTVAL (x) < 0)
if (CONST_SCALAR_INT_P (x) && GET_MODE_CLASS (mode) == MODE_INT)
{
double_int val = double_int::from_uhwi (INTVAL (x));
/* We need to zero extend VAL. */
if (oldmode != VOIDmode)
val = val.zext (GET_MODE_BITSIZE (oldmode));
return immed_double_int_const (val, mode);
/* If the caller did not tell us the old mode, then there is not
much to do with respect to canonicalization. We have to
assume that all the bits are significant. */
if (GET_MODE_CLASS (oldmode) != MODE_INT)
oldmode = MAX_MODE_INT;
wide_int w = wide_int::from (std::make_pair (x, oldmode),
GET_MODE_PRECISION (mode),
unsignedp ? UNSIGNED : SIGNED);
return immed_wide_int_const (w, mode);
}
/* We can do this with a gen_lowpart if both desired and current modes
are integer, and this is either a constant integer, a register, or a
non-volatile MEM. Except for the constant case where MODE is no
wider than HOST_BITS_PER_WIDE_INT, we must be narrowing the operand. */
non-volatile MEM. */
if (GET_MODE_CLASS (mode) == MODE_INT
&& GET_MODE_CLASS (oldmode) == MODE_INT
&& GET_MODE_PRECISION (mode) <= GET_MODE_PRECISION (oldmode)
&& ((MEM_P (x) && !MEM_VOLATILE_P (x) && direct_load[(int) mode])
|| (REG_P (x)
&& (!HARD_REGISTER_P (x)
|| HARD_REGNO_MODE_OK (REGNO (x), mode))
&& TRULY_NOOP_TRUNCATION_MODES_P (mode, GET_MODE (x)))))
if ((CONST_INT_P (x)
&& GET_MODE_PRECISION (mode) <= HOST_BITS_PER_WIDE_INT)
|| (GET_MODE_CLASS (mode) == MODE_INT
&& GET_MODE_CLASS (oldmode) == MODE_INT
&& (CONST_DOUBLE_AS_INT_P (x)
|| (GET_MODE_PRECISION (mode) <= GET_MODE_PRECISION (oldmode)
&& ((MEM_P (x) && ! MEM_VOLATILE_P (x)
&& direct_load[(int) mode])
|| (REG_P (x)
&& (! HARD_REGISTER_P (x)
|| HARD_REGNO_MODE_OK (REGNO (x), mode))
&& TRULY_NOOP_TRUNCATION_MODES_P (mode,
GET_MODE (x))))))))
{
/* ?? If we don't know OLDMODE, we have to assume here that
X does not need sign- or zero-extension. This may not be
the case, but it's the best we can do. */
if (CONST_INT_P (x) && oldmode != VOIDmode
&& GET_MODE_PRECISION (mode) > GET_MODE_PRECISION (oldmode))
{
HOST_WIDE_INT val = INTVAL (x);
/* We must sign or zero-extend in this case. Start by
zero-extending, then sign extend if we need to. */
val &= GET_MODE_MASK (oldmode);
if (! unsignedp
&& val_signbit_known_set_p (oldmode, val))
val |= ~GET_MODE_MASK (oldmode);
return gen_int_mode (val, mode);
}
return gen_lowpart (mode, x);
}
return gen_lowpart (mode, x);
/* Converting from integer constant into mode is always equivalent to an
subreg operation. */
@ -1794,6 +1762,7 @@ emit_group_load_1 (rtx *tmps, rtx dst, rtx orig_src, tree type, int ssize)
{
rtx first, second;
/* TODO: const_wide_int can have sizes other than this... */
gcc_assert (2 * len == ssize);
split_double (src, &first, &second);
if (i)
@ -5330,8 +5299,8 @@ store_expr (tree exp, rtx target, int call_param_p, bool nontemporal)
/* If TEMP is a VOIDmode constant and the mode of the type of EXP is not
the same as that of TARGET, adjust the constant. This is needed, for
example, in case it is a CONST_DOUBLE and we want only a word-sized
value. */
example, in case it is a CONST_DOUBLE or CONST_WIDE_INT and we want
only a word-sized value. */
if (CONSTANT_P (temp) && GET_MODE (temp) == VOIDmode
&& TREE_CODE (exp) != ERROR_MARK
&& GET_MODE (target) != TYPE_MODE (TREE_TYPE (exp)))
@ -6692,7 +6661,7 @@ get_inner_reference (tree exp, HOST_WIDE_INT *pbitsize,
enum machine_mode mode = VOIDmode;
bool blkmode_bitfield = false;
tree offset = size_zero_node;
double_int bit_offset = double_int_zero;
offset_int bit_offset = 0;
/* First get the mode, signedness, and size. We do this from just the
outermost expression. */
@ -6755,7 +6724,7 @@ get_inner_reference (tree exp, HOST_WIDE_INT *pbitsize,
switch (TREE_CODE (exp))
{
case BIT_FIELD_REF:
bit_offset += tree_to_double_int (TREE_OPERAND (exp, 2));
bit_offset += wi::to_offset (TREE_OPERAND (exp, 2));
break;
case COMPONENT_REF:
@ -6770,7 +6739,7 @@ get_inner_reference (tree exp, HOST_WIDE_INT *pbitsize,
break;
offset = size_binop (PLUS_EXPR, offset, this_offset);
bit_offset += tree_to_double_int (DECL_FIELD_BIT_OFFSET (field));
bit_offset += wi::to_offset (DECL_FIELD_BIT_OFFSET (field));
/* ??? Right now we don't do anything with DECL_OFFSET_ALIGN. */
}
@ -6802,7 +6771,7 @@ get_inner_reference (tree exp, HOST_WIDE_INT *pbitsize,
break;
case IMAGPART_EXPR:
bit_offset += double_int::from_uhwi (*pbitsize);
bit_offset += *pbitsize;
break;
case VIEW_CONVERT_EXPR:
@ -6823,9 +6792,8 @@ get_inner_reference (tree exp, HOST_WIDE_INT *pbitsize,
tree off = TREE_OPERAND (exp, 1);
if (!integer_zerop (off))
{
double_int boff, coff = mem_ref_offset (exp);
boff = coff.lshift (BITS_PER_UNIT == 8
? 3 : exact_log2 (BITS_PER_UNIT));
offset_int boff, coff = mem_ref_offset (exp);
boff = wi::lshift (coff, LOG2_BITS_PER_UNIT);
bit_offset += boff;
}
exp = TREE_OPERAND (TREE_OPERAND (exp, 0), 0);
@ -6849,11 +6817,11 @@ get_inner_reference (tree exp, HOST_WIDE_INT *pbitsize,
this conversion. */
if (TREE_CODE (offset) == INTEGER_CST)
{
double_int tem = tree_to_double_int (offset);
tem = tem.sext (TYPE_PRECISION (sizetype));
tem = tem.lshift (BITS_PER_UNIT == 8 ? 3 : exact_log2 (BITS_PER_UNIT));
offset_int tem = wi::sext (wi::to_offset (offset),
TYPE_PRECISION (sizetype));
tem = wi::lshift (tem, LOG2_BITS_PER_UNIT);
tem += bit_offset;
if (tem.fits_shwi ())
if (wi::fits_shwi_p (tem))
{
*pbitpos = tem.to_shwi ();
*poffset = offset = NULL_TREE;
@@ -6864,20 +6832,16 @@ get_inner_reference (tree exp, HOST_WIDE_INT *pbitsize,
if (offset)
{
/* Avoid returning a negative bitpos as this may wreak havoc later. */
if (bit_offset.is_negative ())
if (wi::neg_p (bit_offset))
{
double_int mask
= double_int::mask (BITS_PER_UNIT == 8
? 3 : exact_log2 (BITS_PER_UNIT));
double_int tem = bit_offset.and_not (mask);
offset_int mask = wi::mask <offset_int> (LOG2_BITS_PER_UNIT, false);
offset_int tem = bit_offset.and_not (mask);
/* TEM is the bitpos rounded to BITS_PER_UNIT towards -Inf.
Subtract it to BIT_OFFSET and add it (scaled) to OFFSET. */
bit_offset -= tem;
tem = tem.arshift (BITS_PER_UNIT == 8
? 3 : exact_log2 (BITS_PER_UNIT),
HOST_BITS_PER_DOUBLE_INT);
tem = wi::arshift (tem, LOG2_BITS_PER_UNIT);
offset = size_binop (PLUS_EXPR, offset,
double_int_to_tree (sizetype, tem));
wide_int_to_tree (sizetype, tem));
}
*pbitpos = bit_offset.to_shwi ();
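For reference, the mask/and_not/arshift sequence this hunk rewrites keeps the same arithmetic as the old double_int code: round the (possibly negative) bit offset down to a byte boundary, subtract, and shift. A minimal standalone sketch with plain `int64_t` standing in for `offset_int` (hypothetical helper, not GCC code):

```cpp
#include <cassert>
#include <cstdint>

// Split a possibly negative bit offset into a whole-byte adjustment plus a
// non-negative residual, mirroring wi::mask / and_not / wi::arshift above.
// Assumes BITS_PER_UNIT == 8 and arithmetic right shift of negatives.
struct split_offset { int64_t bytes; int64_t bits; };

static split_offset
split_bit_offset (int64_t bit_offset)
{
  const int log2_bits_per_unit = 3;                       // LOG2_BITS_PER_UNIT
  int64_t mask = (int64_t{1} << log2_bits_per_unit) - 1;  // wi::mask (3, false)
  int64_t tem = bit_offset & ~mask;                       // bit_offset.and_not (mask)
  // TEM is the bit position rounded towards -Inf; the arithmetic shift
  // scales it to bytes while keeping the sign, as wi::arshift does.
  return { tem >> log2_bits_per_unit, bit_offset - tem };
}
```

With bit_offset = -3 this yields a -1 byte adjustment and a residual of 5 bits, which is why the rewritten code can then store a non-negative *pbitpos.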
@@ -7813,11 +7777,12 @@ expand_constructor (tree exp, rtx target, enum expand_modifier modifier,
/* All elts simple constants => refer to a constant in memory. But
if this is a non-BLKmode mode, let it store a field at a time
since that should make a CONST_INT or CONST_DOUBLE when we
fold. Likewise, if we have a target we can use, it is best to
store directly into the target unless the type is large enough
that memcpy will be used. If we are making an initializer and
all operands are constant, put it in memory as well.
since that should make a CONST_INT, CONST_WIDE_INT or
CONST_DOUBLE when we fold. Likewise, if we have a target we can
use, it is best to store directly into the target unless the type
is large enough that memcpy will be used. If we are making an
initializer and all operands are constant, put it in memory as
well.
FIXME: Avoid trying to fill vector constructors piece-meal.
Output them with output_constant_def below unless we're sure
@@ -8294,17 +8259,18 @@ expand_expr_real_2 (sepops ops, rtx target, enum machine_mode tmode,
&& TREE_CONSTANT (treeop1))
{
rtx constant_part;
HOST_WIDE_INT wc;
enum machine_mode wmode = TYPE_MODE (TREE_TYPE (treeop1));
op1 = expand_expr (treeop1, subtarget, VOIDmode,
EXPAND_SUM);
/* Use immed_double_const to ensure that the constant is
/* Use wi::shwi to ensure that the constant is
truncated according to the mode of OP1, then sign extended
to a HOST_WIDE_INT. Using the constant directly can result
in non-canonical RTL in a 64x32 cross compile. */
constant_part
= immed_double_const (TREE_INT_CST_LOW (treeop0),
(HOST_WIDE_INT) 0,
TYPE_MODE (TREE_TYPE (treeop1)));
wc = TREE_INT_CST_LOW (treeop0);
constant_part =
immed_wide_int_const (wi::shwi (wc, wmode), wmode);
op1 = plus_constant (mode, op1, INTVAL (constant_part));
if (modifier != EXPAND_SUM && modifier != EXPAND_INITIALIZER)
op1 = force_operand (op1, target);
@@ -8316,6 +8282,8 @@ expand_expr_real_2 (sepops ops, rtx target, enum machine_mode tmode,
&& TREE_CONSTANT (treeop0))
{
rtx constant_part;
HOST_WIDE_INT wc;
enum machine_mode wmode = TYPE_MODE (TREE_TYPE (treeop0));
op0 = expand_expr (treeop0, subtarget, VOIDmode,
(modifier == EXPAND_INITIALIZER
@@ -8330,14 +8298,13 @@ expand_expr_real_2 (sepops ops, rtx target, enum machine_mode tmode,
return simplify_gen_binary (PLUS, mode, op0, op1);
goto binop2;
}
/* Use immed_double_const to ensure that the constant is
/* Use wi::shwi to ensure that the constant is
truncated according to the mode of OP1, then sign extended
to a HOST_WIDE_INT. Using the constant directly can result
in non-canonical RTL in a 64x32 cross compile. */
wc = TREE_INT_CST_LOW (treeop1);
constant_part
= immed_double_const (TREE_INT_CST_LOW (treeop1),
(HOST_WIDE_INT) 0,
TYPE_MODE (TREE_TYPE (treeop0)));
= immed_wide_int_const (wi::shwi (wc, wmode), wmode);
op0 = plus_constant (mode, op0, INTVAL (constant_part));
if (modifier != EXPAND_SUM && modifier != EXPAND_INITIALIZER)
op0 = force_operand (op0, target);
@@ -8860,10 +8827,14 @@ expand_expr_real_2 (sepops ops, rtx target, enum machine_mode tmode,
for unsigned bitfield expand this as XOR with a proper constant
instead. */
if (reduce_bit_field && TYPE_UNSIGNED (type))
temp = expand_binop (mode, xor_optab, op0,
immed_double_int_const
(double_int::mask (TYPE_PRECISION (type)), mode),
target, 1, OPTAB_LIB_WIDEN);
{
wide_int mask = wi::mask (TYPE_PRECISION (type),
false, GET_MODE_PRECISION (mode));
temp = expand_binop (mode, xor_optab, op0,
immed_wide_int_const (mask, mode),
target, 1, OPTAB_LIB_WIDEN);
}
else
temp = expand_unop (mode, one_cmpl_optab, op0, target, 1);
gcc_assert (temp);
@@ -9534,9 +9505,15 @@ expand_expr_real_1 (tree exp, rtx target, enum machine_mode tmode,
return decl_rtl;
case INTEGER_CST:
temp = immed_double_const (TREE_INT_CST_LOW (exp),
TREE_INT_CST_HIGH (exp), mode);
/* Given that TYPE_PRECISION (type) is not always equal to
GET_MODE_PRECISION (TYPE_MODE (type)), we need to extend from
the former to the latter according to the signedness of the
type. */
temp = immed_wide_int_const (wide_int::from
(exp,
GET_MODE_PRECISION (TYPE_MODE (type)),
TYPE_SIGN (type)),
TYPE_MODE (type));
return temp;
case VECTOR_CST:
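The comment added in the INTEGER_CST hunk above notes that TYPE_PRECISION can be narrower than the mode precision, so the constant must be extended according to TYPE_SIGN. The extension itself amounts to this (a sketch over a single 64-bit host word, not the wide_int implementation):

```cpp
#include <cassert>
#include <cstdint>

// Extend the low TYPE_PREC bits of VAL to the full 64-bit word, either
// sign- or zero-extending, mirroring wide_int::from (exp, prec, sign).
static uint64_t
extend_to_mode (uint64_t val, int type_prec, bool is_signed)
{
  uint64_t mask = type_prec >= 64 ? ~uint64_t{0}
                                  : (uint64_t{1} << type_prec) - 1;
  val &= mask;
  if (is_signed && ((val >> (type_prec - 1)) & 1))
    val |= ~mask;               // replicate the sign bit upwards
  return val;
}
```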
@@ -9723,7 +9700,7 @@ expand_expr_real_1 (tree exp, rtx target, enum machine_mode tmode,
might end up in a register. */
if (mem_ref_refers_to_non_mem_p (exp))
{
HOST_WIDE_INT offset = mem_ref_offset (exp).low;
HOST_WIDE_INT offset = mem_ref_offset (exp).to_short_addr ();
base = TREE_OPERAND (base, 0);
if (offset == 0
&& tree_fits_uhwi_p (TYPE_SIZE (type))
@@ -9758,8 +9735,7 @@ expand_expr_real_1 (tree exp, rtx target, enum machine_mode tmode,
op0 = memory_address_addr_space (mode, op0, as);
if (!integer_zerop (TREE_OPERAND (exp, 1)))
{
rtx off
= immed_double_int_const (mem_ref_offset (exp), address_mode);
rtx off = immed_wide_int_const (mem_ref_offset (exp), address_mode);
op0 = simplify_gen_binary (PLUS, address_mode, op0, off);
op0 = memory_address_addr_space (mode, op0, as);
}
@@ -10649,9 +10625,10 @@ reduce_to_bit_field_precision (rtx exp, rtx target, tree type)
}
else if (TYPE_UNSIGNED (type))
{
rtx mask = immed_double_int_const (double_int::mask (prec),
GET_MODE (exp));
return expand_and (GET_MODE (exp), exp, mask, target);
enum machine_mode mode = GET_MODE (exp);
rtx mask = immed_wide_int_const
(wi::mask (prec, false, GET_MODE_PRECISION (mode)), mode);
return expand_and (mode, exp, mask, target);
}
else
{
@@ -11226,8 +11203,7 @@ const_vector_from_tree (tree exp)
RTVEC_ELT (v, i) = CONST_FIXED_FROM_FIXED_VALUE (TREE_FIXED_CST (elt),
inner);
else
RTVEC_ELT (v, i) = immed_double_int_const (tree_to_double_int (elt),
inner);
RTVEC_ELT (v, i) = immed_wide_int_const (elt, inner);
}
return gen_rtx_CONST_VECTOR (mode, v);


@@ -80,6 +80,7 @@ along with GCC; see the file COPYING3. If not see
#include "params.h"
#include "tree-pretty-print.h" /* for dump_function_header */
#include "asan.h"
#include "wide-int-print.h"
#ifdef XCOFF_DEBUGGING_INFO
#include "xcoffout.h" /* Needed for external data
@@ -3885,8 +3886,21 @@ output_addr_const (FILE *file, rtx x)
output_addr_const (file, XEXP (x, 0));
break;
case CONST_WIDE_INT:
/* We do not know the mode here so we have to use a round about
way to build a wide-int to get it printed properly. */
{
wide_int w = wide_int::from_array (&CONST_WIDE_INT_ELT (x, 0),
CONST_WIDE_INT_NUNITS (x),
CONST_WIDE_INT_NUNITS (x)
* HOST_BITS_PER_WIDE_INT,
false);
print_decs (w, file);
}
break;
case CONST_DOUBLE:
if (GET_MODE (x) == VOIDmode)
if (CONST_DOUBLE_AS_INT_P (x))
{
/* We can use %d if the number is one word and positive. */
if (CONST_DOUBLE_HIGH (x))
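The new CONST_WIDE_INT case builds a wide_int from the array of HOST_WIDE_INT elements and prints it in decimal. Printing a multi-word value reduces to repeated division by 10 across the limbs; a standalone sketch using GCC's `unsigned __int128` for the two-limb intermediate (unsigned only for brevity; print_decs also handles signed values):

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>
#include <string>
#include <vector>

// Decimal-print a little-endian array of 64-bit limbs by repeatedly
// dividing the whole multi-limb value by 10, most significant limb first.
static std::string
limbs_to_decimal (std::vector<uint64_t> limbs)
{
  std::string out;
  bool nonzero = true;
  while (nonzero)
    {
      nonzero = false;
      unsigned rem = 0;
      for (size_t i = limbs.size (); i-- > 0; )
        {
          unsigned __int128 cur = ((unsigned __int128) rem << 64) | limbs[i];
          limbs[i] = (uint64_t) (cur / 10);
          rem = (unsigned) (cur % 10);
          if (limbs[i])
            nonzero = true;
        }
      out.push_back ('0' + rem);
    }
  std::reverse (out.begin (), out.end ());
  return out;
}
```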


@@ -23,6 +23,7 @@ along with GCC; see the file COPYING3. If not see
#include "tm.h"
#include "tree.h"
#include "diagnostic-core.h"
#include "wide-int.h"
/* Compare two fixed objects for bitwise identity. */
@@ -113,6 +114,7 @@ fixed_from_string (FIXED_VALUE_TYPE *f, const char *str, enum machine_mode mode)
REAL_VALUE_TYPE real_value, fixed_value, base_value;
unsigned int fbit;
enum fixed_value_range_code temp;
bool fail;
f->mode = mode;
fbit = GET_MODE_FBIT (mode);
@@ -127,8 +129,10 @@ fixed_from_string (FIXED_VALUE_TYPE *f, const char *str, enum machine_mode mode)
"large fixed-point constant implicitly truncated to fixed-point type");
real_2expN (&base_value, fbit, mode);
real_arithmetic (&fixed_value, MULT_EXPR, &real_value, &base_value);
real_to_integer2 ((HOST_WIDE_INT *)&f->data.low, &f->data.high,
&fixed_value);
wide_int w = real_to_integer (&fixed_value, &fail,
GET_MODE_PRECISION (mode));
f->data.low = w.elt (0);
f->data.high = w.elt (1);
if (temp == FIXED_MAX_EPS && ALL_FRACT_MODE_P (f->mode))
{
@@ -153,9 +157,12 @@ fixed_to_decimal (char *str, const FIXED_VALUE_TYPE *f_orig,
{
REAL_VALUE_TYPE real_value, base_value, fixed_value;
signop sgn = UNSIGNED_FIXED_POINT_MODE_P (f_orig->mode) ? UNSIGNED : SIGNED;
real_2expN (&base_value, GET_MODE_FBIT (f_orig->mode), f_orig->mode);
real_from_integer (&real_value, VOIDmode, f_orig->data.low, f_orig->data.high,
UNSIGNED_FIXED_POINT_MODE_P (f_orig->mode));
real_from_integer (&real_value, VOIDmode,
wide_int::from (f_orig->data,
GET_MODE_PRECISION (f_orig->mode), sgn),
sgn);
real_arithmetic (&fixed_value, RDIV_EXPR, &real_value, &base_value);
real_to_decimal (str, &fixed_value, buf_size, 0, 1);
}
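Both directions of the fixed-point conversion above are the same scaling relation: the raw integer encodes the real value times 2^fbit. A hypothetical helper showing the relation with a plain double (not GCC code):

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>

// A fixed-point value with FBIT fractional bits encodes raw / 2^fbit,
// which is what the real_arithmetic (RDIV_EXPR, ...) call computes.
static double
fixed_raw_to_double (int64_t raw, int fbit)
{
  return (double) raw / std::ldexp (1.0, fbit);
}
```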
@@ -1041,12 +1048,17 @@ fixed_convert_from_real (FIXED_VALUE_TYPE *f, enum machine_mode mode,
int i_f_bits = GET_MODE_IBIT (mode) + GET_MODE_FBIT (mode);
unsigned int fbit = GET_MODE_FBIT (mode);
enum fixed_value_range_code temp;
bool fail;
real_value = *a;
f->mode = mode;
real_2expN (&base_value, fbit, mode);
real_arithmetic (&fixed_value, MULT_EXPR, &real_value, &base_value);
real_to_integer2 ((HOST_WIDE_INT *)&f->data.low, &f->data.high, &fixed_value);
wide_int w = real_to_integer (&fixed_value, &fail,
GET_MODE_PRECISION (mode));
f->data.low = w.elt (0);
f->data.high = w.elt (1);
temp = check_real_for_fixed_mode (&real_value, mode);
if (temp == FIXED_UNDERFLOW) /* Minimum. */
{
@@ -1091,9 +1103,11 @@ real_convert_from_fixed (REAL_VALUE_TYPE *r, enum machine_mode mode,
{
REAL_VALUE_TYPE base_value, fixed_value, real_value;
signop sgn = UNSIGNED_FIXED_POINT_MODE_P (f->mode) ? UNSIGNED : SIGNED;
real_2expN (&base_value, GET_MODE_FBIT (f->mode), f->mode);
real_from_integer (&fixed_value, VOIDmode, f->data.low, f->data.high,
UNSIGNED_FIXED_POINT_MODE_P (f->mode));
real_from_integer (&fixed_value, VOIDmode,
wide_int::from (f->data, GET_MODE_PRECISION (f->mode),
sgn), sgn);
real_arithmetic (&real_value, RDIV_EXPR, &fixed_value, &base_value);
real_convert (r, mode, &real_value);
}

File diff suppressed because it is too large


@@ -118,10 +118,10 @@ extern tree fold_indirect_ref_loc (location_t, tree);
extern tree build_simple_mem_ref_loc (location_t, tree);
#define build_simple_mem_ref(T)\
build_simple_mem_ref_loc (UNKNOWN_LOCATION, T)
extern double_int mem_ref_offset (const_tree);
extern offset_int mem_ref_offset (const_tree);
extern tree build_invariant_address (tree, tree, HOST_WIDE_INT);
extern tree constant_boolean_node (bool, tree);
extern tree div_if_zero_remainder (enum tree_code, const_tree, const_tree);
extern tree div_if_zero_remainder (const_tree, const_tree);
extern bool tree_swap_operands_p (const_tree, const_tree, bool);
extern enum tree_code swap_tree_comparison (enum tree_code);


@@ -32,6 +32,7 @@ along with GCC; see the file COPYING3. If not see
#include "trans-const.h"
#include "trans-types.h"
#include "target-memory.h"
#include "wide-int.h"
/* --------------------------------------------------------------- */
/* Calculate the size of an expression. */
@@ -430,7 +431,7 @@ gfc_interpret_logical (int kind, unsigned char *buffer, size_t buffer_size,
{
tree t = native_interpret_expr (gfc_get_logical_type (kind), buffer,
buffer_size);
*logical = tree_to_double_int (t).is_zero () ? 0 : 1;
*logical = wi::eq_p (t, 0) ? 0 : 1;
return size_logical (kind);
}


@@ -90,6 +90,7 @@ along with GCC; see the file COPYING3. If not see
#include "trans-array.h"
#include "trans-const.h"
#include "dependency.h"
#include "wide-int.h"
static bool gfc_get_array_constructor_size (mpz_t *, gfc_constructor_base);
@@ -5380,9 +5381,8 @@ gfc_conv_array_initializer (tree type, gfc_expr * expr)
{
gfc_constructor *c;
tree tmp;
offset_int wtmp;
gfc_se se;
HOST_WIDE_INT hi;
unsigned HOST_WIDE_INT lo;
tree index, range;
vec<constructor_elt, va_gc> *v = NULL;
@@ -5404,20 +5404,13 @@ gfc_conv_array_initializer (tree type, gfc_expr * expr)
else
gfc_conv_structure (&se, expr, 1);
tmp = TYPE_MAX_VALUE (TYPE_DOMAIN (type));
gcc_assert (tmp && INTEGER_CST_P (tmp));
hi = TREE_INT_CST_HIGH (tmp);
lo = TREE_INT_CST_LOW (tmp);
lo++;
if (lo == 0)
hi++;
wtmp = wi::to_offset (TYPE_MAX_VALUE (TYPE_DOMAIN (type))) + 1;
gcc_assert (wtmp != 0);
/* This will probably eat buckets of memory for large arrays. */
while (hi != 0 || lo != 0)
while (wtmp != 0)
{
CONSTRUCTOR_APPEND_ELT (v, NULL_TREE, se.expr);
if (lo == 0)
hi--;
lo--;
wtmp -= 1;
}
break;
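The deleted hi/lo code open-coded a double-word counter; for reference, the borrow logic it replaced looks like this (a sketch, not the original source):

```cpp
#include <cassert>
#include <cstdint>

// Decrement a 128-bit counter held as a hi/lo pair of 64-bit words,
// borrowing from the high word when the low word wraps -- the pattern
// the wtmp-based loop above no longer needs.
static void
dec_128 (uint64_t &hi, uint64_t &lo)
{
  if (lo == 0)
    --hi;       // borrow from the high word
  --lo;         // wraps to UINT64_MAX when lo was 0
}
```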


@@ -33,6 +33,7 @@ along with GCC; see the file COPYING3. If not see
#include "trans-const.h"
#include "trans-types.h"
#include "target-memory.h"
#include "wide-int.h"
tree gfc_rank_cst[GFC_MAX_DIMENSIONS + 1];
@@ -145,8 +146,7 @@ gfc_conv_string_init (tree length, gfc_expr * expr)
gcc_assert (expr->expr_type == EXPR_CONSTANT);
gcc_assert (expr->ts.type == BT_CHARACTER);
gcc_assert (INTEGER_CST_P (length));
gcc_assert (TREE_INT_CST_HIGH (length) == 0);
gcc_assert (tree_fits_uhwi_p (length));
len = TREE_INT_CST_LOW (length);
slen = expr->value.character.length;
@@ -201,8 +201,8 @@ gfc_init_constants (void)
tree
gfc_conv_mpz_to_tree (mpz_t i, int kind)
{
double_int val = mpz_get_double_int (gfc_get_int_type (kind), i, true);
return double_int_to_tree (gfc_get_int_type (kind), val);
wide_int val = wi::from_mpz (gfc_get_int_type (kind), i, true);
return wide_int_to_tree (gfc_get_int_type (kind), val);
}
/* Converts a backend tree into a GMP integer. */
@@ -210,8 +210,7 @@ gfc_conv_mpz_to_tree (mpz_t i, int kind)
void
gfc_conv_tree_to_mpz (mpz_t i, tree source)
{
double_int val = tree_to_double_int (source);
mpz_set_double_int (i, val, TYPE_UNSIGNED (TREE_TYPE (source)));
wi::to_mpz (source, i, TYPE_SIGN (TREE_TYPE (source)));
}
/* Converts a real constant into backend form. */


@@ -406,7 +406,7 @@ gfc_can_put_var_on_stack (tree size)
if (gfc_option.flag_max_stack_var_size < 0)
return 1;
if (TREE_INT_CST_HIGH (size) != 0)
if (!tree_fits_uhwi_p (size))
return 0;
low = TREE_INT_CST_LOW (size);


@@ -40,7 +40,7 @@ along with GCC; see the file COPYING3. If not see
#include "trans-stmt.h"
#include "dependency.h"
#include "gimplify.h"
#include "wide-int.h"
/* Convert a scalar to an array descriptor. To be used for assumed-rank
arrays. */
@@ -2112,13 +2112,14 @@ gfc_conv_cst_int_power (gfc_se * se, tree lhs, tree rhs)
HOST_WIDE_INT m;
unsigned HOST_WIDE_INT n;
int sgn;
wide_int wrhs = rhs;
/* If exponent is too large, we won't expand it anyway, so don't bother
with large integer values. */
if (!TREE_INT_CST (rhs).fits_shwi ())
if (!wi::fits_shwi_p (wrhs))
return 0;
m = TREE_INT_CST (rhs).to_shwi ();
m = wrhs.to_shwi ();
/* There's no ABS for HOST_WIDE_INT, so here we go. It also takes care
of the asymmetric range of the integer type. */
n = (unsigned HOST_WIDE_INT) (m < 0 ? -m : m);
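The surrounding comment notes there is no ABS for HOST_WIDE_INT and that the cast "takes care of the asymmetric range". The overflow-safe way to express that negation is in the unsigned domain (a sketch; the committed code casts after a signed negate):

```cpp
#include <cassert>
#include <cstdint>

// |m| for a 64-bit signed value without overflowing on INT64_MIN:
// negate in the unsigned domain, where wraparound is well defined.
static uint64_t
hwi_abs (int64_t m)
{
  return m < 0 ? 0 - (uint64_t) m : (uint64_t) m;
}
```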
@@ -2657,7 +2658,7 @@ gfc_string_to_single_character (tree len, tree str, int kind)
{
if (len == NULL
|| !INTEGER_CST_P (len) || TREE_INT_CST_HIGH (len) != 0
|| !tree_fits_uhwi_p (len)
|| !POINTER_TYPE_P (TREE_TYPE (str)))
return NULL_TREE;
@@ -2771,8 +2772,9 @@ gfc_optimize_len_trim (tree len, tree str, int kind)
&& TREE_CODE (TREE_OPERAND (TREE_OPERAND (str, 0), 0)) == STRING_CST
&& array_ref_low_bound (TREE_OPERAND (str, 0))
== TREE_OPERAND (TREE_OPERAND (str, 0), 1)
&& TREE_INT_CST_LOW (len) >= 1
&& TREE_INT_CST_LOW (len)
&& tree_fits_uhwi_p (len)
&& tree_to_uhwi (len) >= 1
&& tree_to_uhwi (len)
== (unsigned HOST_WIDE_INT)
TREE_STRING_LENGTH (TREE_OPERAND (TREE_OPERAND (str, 0), 0)))
{


@@ -43,6 +43,7 @@ along with GCC; see the file COPYING3. If not see
/* Only for gfc_trans_assign and gfc_trans_pointer_assign. */
#include "trans-stmt.h"
#include "tree-nested.h"
#include "wide-int.h"
/* This maps Fortran intrinsic math functions to external library or GCC
builtin functions. */
@@ -987,12 +988,8 @@ trans_this_image (gfc_se * se, gfc_expr *expr)
if (INTEGER_CST_P (dim_arg))
{
int hi, co_dim;
hi = TREE_INT_CST_HIGH (dim_arg);
co_dim = TREE_INT_CST_LOW (dim_arg);
if (hi || co_dim < 1
|| co_dim > GFC_TYPE_ARRAY_CORANK (TREE_TYPE (desc)))
if (wi::ltu_p (dim_arg, 1)
|| wi::gtu_p (dim_arg, GFC_TYPE_ARRAY_CORANK (TREE_TYPE (desc))))
gfc_error ("'dim' argument of %s intrinsic at %L is not a valid "
"dimension index", expr->value.function.isym->name,
&expr->where);
@@ -1352,14 +1349,9 @@ gfc_conv_intrinsic_bound (gfc_se * se, gfc_expr * expr, int upper)
if (INTEGER_CST_P (bound))
{
int hi, low;
hi = TREE_INT_CST_HIGH (bound);
low = TREE_INT_CST_LOW (bound);
if (hi || low < 0
|| ((!as || as->type != AS_ASSUMED_RANK)
&& low >= GFC_TYPE_ARRAY_RANK (TREE_TYPE (desc)))
|| low > GFC_MAX_DIMENSIONS)
if (((!as || as->type != AS_ASSUMED_RANK)
&& wi::geu_p (bound, GFC_TYPE_ARRAY_RANK (TREE_TYPE (desc))))
|| wi::gtu_p (bound, GFC_MAX_DIMENSIONS))
gfc_error ("'dim' argument of %s intrinsic at %L is not a valid "
"dimension index", upper ? "UBOUND" : "LBOUND",
&expr->where);
@@ -1554,11 +1546,8 @@ conv_intrinsic_cobound (gfc_se * se, gfc_expr * expr)
if (INTEGER_CST_P (bound))
{
int hi, low;
hi = TREE_INT_CST_HIGH (bound);
low = TREE_INT_CST_LOW (bound);
if (hi || low < 1 || low > GFC_TYPE_ARRAY_CORANK (TREE_TYPE (desc)))
if (wi::ltu_p (bound, 1)
|| wi::gtu_p (bound, GFC_TYPE_ARRAY_CORANK (TREE_TYPE (desc))))
gfc_error ("'dim' argument of %s intrinsic at %L is not a valid "
"dimension index", expr->value.function.isym->name,
&expr->where);


@@ -863,8 +863,6 @@ gfc_init_types (void)
int index;
tree type;
unsigned n;
unsigned HOST_WIDE_INT hi;
unsigned HOST_WIDE_INT lo;
/* Create and name the types. */
#define PUSH_TYPE(name, node) \
@@ -956,13 +954,10 @@ gfc_init_types (void)
descriptor. */
n = TYPE_PRECISION (gfc_array_index_type) - GFC_DTYPE_SIZE_SHIFT;
lo = ~ (unsigned HOST_WIDE_INT) 0;
if (n > HOST_BITS_PER_WIDE_INT)
hi = lo >> (2*HOST_BITS_PER_WIDE_INT - n);
else
hi = 0, lo >>= HOST_BITS_PER_WIDE_INT - n;
gfc_max_array_element_size
= build_int_cst_wide (long_unsigned_type_node, lo, hi);
= wide_int_to_tree (long_unsigned_type_node,
wi::mask (n, UNSIGNED,
TYPE_PRECISION (long_unsigned_type_node)));
boolean_type_node = gfc_get_logical_type (gfc_default_logical_kind);
boolean_true_node = build_int_cst (boolean_type_node, 1);
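The wi::mask call above computes "all ones in the low n bits" at the precision of long_unsigned_type_node, replacing the shifted hi/lo pair. On a single 64-bit word the same mask is (a hypothetical helper):

```cpp
#include <cassert>
#include <cstdint>

// Low-N-bits mask, the single-word analogue of wi::mask (n, UNSIGNED, prec).
static uint64_t
low_mask (unsigned n)
{
  return n >= 64 ? ~uint64_t{0} : (uint64_t{1} << n) - 1;
}
```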
@@ -1902,7 +1897,7 @@ gfc_get_array_type_bounds (tree etype, int dimen, int codimen, tree * lbound,
if (stride)
rtype = build_range_type (gfc_array_index_type, gfc_index_zero_node,
int_const_binop (MINUS_EXPR, stride,
integer_one_node));
build_int_cst (TREE_TYPE (stride), 1)));
else
rtype = gfc_array_range_type;
arraytype = build_array_type (etype, rtype);


@@ -17,6 +17,9 @@ You should have received a copy of the GNU General Public License
along with GCC; see the file COPYING3. If not see
<http://www.gnu.org/licenses/>. */
/* We don't have insn-modes.h, but we include tm.h. */
#define BITS_PER_UNIT 8
#include "bconfig.h"
#include "system.h"
#include "coretypes.h"


@@ -204,6 +204,7 @@ gen_exp (rtx x, enum rtx_code subroutine_type, char *used)
case CONST_DOUBLE:
case CONST_FIXED:
case CONST_WIDE_INT:
/* These shouldn't be written in MD files. Instead, the appropriate
routines in varasm.c should be called. */
gcc_unreachable ();


@@ -142,6 +142,7 @@ static int
excluded_rtx (int idx)
{
return ((strcmp (defs[idx].enumname, "CONST_DOUBLE") == 0)
|| (strcmp (defs[idx].enumname, "CONST_WIDE_INT") == 0)
|| (strcmp (defs[idx].enumname, "CONST_FIXED") == 0));
}


@@ -57,7 +57,7 @@ ITYPE {IWORD}({WS}{IWORD})*
/* Include '::' in identifiers to capture C++ scope qualifiers. */
ID {CID}({HWS}::{HWS}{CID})*
EOID [^[:alnum:]_]
CXX_KEYWORD inline|public:|private:|protected:|template|operator|friend
CXX_KEYWORD inline|public:|private:|protected:|template|operator|friend|static
%x in_struct in_struct_comment in_comment
%option warn noyywrap nounput nodefault perf-report
@@ -110,6 +110,7 @@ CXX_KEYWORD inline|public:|private:|protected:|template|operator|friend
"const"/{EOID} /* don't care */
{CXX_KEYWORD}/{EOID} |
"~" |
"^" |
"&" {
*yylval = XDUPVAR (const char, yytext, yyleng, yyleng + 1);
return IGNORABLE_CXX_KEYWORD;


@@ -197,6 +197,23 @@ require2 (int t1, int t2)
return v;
}
/* If the next token does not have one of the codes T1, T2 or T3, report a
parse error; otherwise return the token's value. */
static const char *
require3 (int t1, int t2, int t3)
{
int u = token ();
const char *v = advance ();
if (u != t1 && u != t2 && u != t3)
{
parse_error ("expected %s, %s or %s, have %s",
print_token (t1, 0), print_token (t2, 0),
print_token (t3, 0), print_token (u, v));
return 0;
}
return v;
}
/* Near-terminals. */
/* C-style string constant concatenation: STRING+
@@ -243,18 +260,45 @@ require_template_declaration (const char *tmpl_name)
str = concat (tmpl_name, "<", (char *) 0);
/* Read the comma-separated list of identifiers. */
while (token () != '>')
int depth = 1;
while (depth > 0)
{
const char *id = require2 (ID, ',');
if (token () == ENUM)
{
advance ();
str = concat (str, "enum ", (char *) 0);
continue;
}
if (token () == NUM)
{
str = concat (str, advance (), (char *) 0);
continue;
}
if (token () == ':')
{
advance ();
str = concat (str, ":", (char *) 0);
continue;
}
if (token () == '<')
{
advance ();
str = concat (str, "<", (char *) 0);
depth += 1;
continue;
}
if (token () == '>')
{
advance ();
str = concat (str, ">", (char *) 0);
depth -= 1;
continue;
}
const char *id = require3 (SCALAR, ID, ',');
if (id == NULL)
id = ",";
str = concat (str, id, (char *) 0);
}
/* Recognize the closing '>'. */
require ('>');
str = concat (str, ">", (char *) 0);
return str;
}
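The rewritten loop above tracks template nesting with an explicit depth counter instead of stopping at the first '>'. The core of that idea, isolated (hypothetical helper, not the gengtype parser):

```cpp
#include <cassert>
#include <cstddef>
#include <string>

// Copy a template argument list starting just past its opening '<',
// balancing nested '<'/'>' pairs the way the depth counter above does.
static std::string
scan_template_args (const std::string &s, size_t pos)
{
  int depth = 1;                // the caller already consumed the '<'
  std::string out = "<";
  while (depth > 0 && pos < s.size ())
    {
      char c = s[pos++];
      if (c == '<')
        ++depth;
      else if (c == '>')
        --depth;
      out += c;
    }
  return out;
}
```

This is why `vec<int> >`-style nested templates now survive GTY parsing: the loop only stops at the '>' that balances the opening one.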


@@ -30,7 +30,6 @@
#endif
#include "system.h"
#include "errors.h" /* For fatal. */
#include "double-int.h"
#include "hashtab.h"
#include "version.h" /* For version_string & pkgversion_string. */
#include "obstack.h"


@@ -25,7 +25,6 @@
#include "system.h"
#include "errors.h" /* for fatal */
#include "getopt.h"
#include "double-int.h"
#include "version.h" /* for version_string & pkgversion_string. */
#include "hashtab.h"
#include "xregex.h"
@@ -535,7 +534,7 @@ do_typedef (const char *s, type_p t, struct fileloc *pos)
for (p = typedefs; p != NULL; p = p->next)
if (strcmp (p->name, s) == 0)
{
if (p->type != t)
if (p->type != t && strcmp (s, "result_type") != 0)
{
error_at_line (pos, "type `%s' previously defined", s);
error_at_line (&p->line, "previously defined here");
@@ -1766,7 +1765,7 @@ open_base_files (void)
static const char *const ifiles[] = {
"config.h", "system.h", "coretypes.h", "tm.h",
"hashtab.h", "splay-tree.h", "obstack.h", "bitmap.h", "input.h",
"tree.h", "rtl.h", "function.h", "insn-config.h", "expr.h",
"tree.h", "rtl.h", "wide-int.h", "function.h", "insn-config.h", "expr.h",
"hard-reg-set.h", "basic-block.h", "cselib.h", "insn-addr.h",
"optabs.h", "libfuncs.h", "debug.h", "ggc.h", "cgraph.h",
"pointer-set.h", "hash-table.h", "vec.h", "ggc.h", "basic-block.h",
@@ -5670,6 +5669,8 @@ main (int argc, char **argv)
POS_HERE (do_scalar_typedef ("REAL_VALUE_TYPE", &pos));
POS_HERE (do_scalar_typedef ("FIXED_VALUE_TYPE", &pos));
POS_HERE (do_scalar_typedef ("double_int", &pos));
POS_HERE (do_scalar_typedef ("offset_int", &pos));
POS_HERE (do_scalar_typedef ("widest_int", &pos));
POS_HERE (do_scalar_typedef ("uint64_t", &pos));
POS_HERE (do_scalar_typedef ("uint8", &pos));
POS_HERE (do_scalar_typedef ("uintptr_t", &pos));


@@ -612,7 +612,7 @@ write_one_predicate_function (struct pred_data *p)
add_mode_tests (p);
/* A normal predicate can legitimately not look at enum machine_mode
if it accepts only CONST_INTs and/or CONST_DOUBLEs. */
if it accepts only CONST_INTs and/or CONST_WIDE_INT and/or CONST_DOUBLEs. */
printf ("int\n%s (rtx op, enum machine_mode mode ATTRIBUTE_UNUSED)\n{\n",
p->name);
write_predicate_stmts (p->exp);
@@ -1075,12 +1075,17 @@ write_tm_constrs_h (void)
if (needs_ival)
puts (" if (CONST_INT_P (op))\n"
" ival = INTVAL (op);");
#if TARGET_SUPPORTS_WIDE_INT
if (needs_lval || needs_hval)
error ("you can't use lval or hval");
#else
if (needs_hval)
puts (" if (GET_CODE (op) == CONST_DOUBLE && mode == VOIDmode)"
" hval = CONST_DOUBLE_HIGH (op);");
if (needs_lval)
puts (" if (GET_CODE (op) == CONST_DOUBLE && mode == VOIDmode)"
" lval = CONST_DOUBLE_LOW (op);");
#endif
if (needs_rval)
puts (" if (GET_CODE (op) == CONST_DOUBLE && mode != VOIDmode)"
" rval = CONST_DOUBLE_REAL_VALUE (op);");


@@ -586,6 +586,7 @@ validate_pattern (rtx pattern, rtx insn, rtx set, int set_code)
&& GET_CODE (src) != PC
&& GET_CODE (src) != CC0
&& !CONST_INT_P (src)
&& !CONST_WIDE_INT_P (src)
&& GET_CODE (src) != CALL)
{
const char *which;
@@ -770,13 +771,14 @@ add_to_sequence (rtx pattern, struct decision_head *last,
We can optimize the generated code a little if either
(a) the predicate only accepts one code, or (b) the
predicate does not allow CONST_INT, in which case it
can match only if the modes match. */
predicate does not allow CONST_INT or CONST_WIDE_INT,
in which case it can match only if the modes match. */
pred = lookup_predicate (pred_name);
if (pred)
{
test->u.pred.data = pred;
allows_const_int = pred->codes[CONST_INT];
allows_const_int = (pred->codes[CONST_INT]
|| pred->codes[CONST_WIDE_INT]);
if (was_code == MATCH_PARALLEL
&& pred->singleton != PARALLEL)
error_with_line (pattern_lineno,


@@ -2806,7 +2806,12 @@ static const struct std_pred_table std_preds[] = {
{"scratch_operand", false, false, {SCRATCH, REG}},
{"immediate_operand", false, true, {UNKNOWN}},
{"const_int_operand", false, false, {CONST_INT}},
#if TARGET_SUPPORTS_WIDE_INT
{"const_scalar_int_operand", false, false, {CONST_INT, CONST_WIDE_INT}},
{"const_double_operand", false, false, {CONST_DOUBLE}},
#else
{"const_double_operand", false, false, {CONST_INT, CONST_DOUBLE}},
#endif
{"nonimmediate_operand", false, false, {SUBREG, REG, MEM}},
{"nonmemory_operand", false, true, {SUBREG, REG}},
{"push_operand", false, false, {MEM}},


@@ -2836,7 +2836,7 @@ get_base_constructor (tree base, HOST_WIDE_INT *bit_offset,
{
if (!tree_fits_shwi_p (TREE_OPERAND (base, 1)))
return NULL_TREE;
*bit_offset += (mem_ref_offset (base).low
*bit_offset += (mem_ref_offset (base).to_short_addr ()
* BITS_PER_UNIT);
}
@@ -2931,9 +2931,10 @@ fold_array_ctor_reference (tree type, tree ctor,
{
unsigned HOST_WIDE_INT cnt;
tree cfield, cval;
double_int low_bound, elt_size;
double_int index, max_index;
double_int access_index;
offset_int low_bound;
offset_int elt_size;
offset_int index, max_index;
offset_int access_index;
tree domain_type = NULL_TREE, index_type = NULL_TREE;
HOST_WIDE_INT inner_offset;
@@ -2945,32 +2946,30 @@ fold_array_ctor_reference (tree type, tree ctor,
/* Static constructors for variably sized objects makes no sense. */
gcc_assert (TREE_CODE (TYPE_MIN_VALUE (domain_type)) == INTEGER_CST);
index_type = TREE_TYPE (TYPE_MIN_VALUE (domain_type));
low_bound = tree_to_double_int (TYPE_MIN_VALUE (domain_type));
low_bound = wi::to_offset (TYPE_MIN_VALUE (domain_type));
}
else
low_bound = double_int_zero;
low_bound = 0;
/* Static constructors for variably sized objects makes no sense. */
gcc_assert (TREE_CODE (TYPE_SIZE_UNIT (TREE_TYPE (TREE_TYPE (ctor))))
== INTEGER_CST);
elt_size =
tree_to_double_int (TYPE_SIZE_UNIT (TREE_TYPE (TREE_TYPE (ctor))));
elt_size = wi::to_offset (TYPE_SIZE_UNIT (TREE_TYPE (TREE_TYPE (ctor))));
/* We can handle only constantly sized accesses that are known to not
be larger than size of array element. */
if (!TYPE_SIZE_UNIT (type)
|| TREE_CODE (TYPE_SIZE_UNIT (type)) != INTEGER_CST
|| elt_size.slt (tree_to_double_int (TYPE_SIZE_UNIT (type)))
|| elt_size.is_zero ())
|| wi::lts_p (elt_size, wi::to_offset (TYPE_SIZE_UNIT (type)))
|| elt_size == 0)
return NULL_TREE;
/* Compute the array index we look for. */
access_index = double_int::from_uhwi (offset / BITS_PER_UNIT)
.udiv (elt_size, TRUNC_DIV_EXPR);
access_index = wi::udiv_trunc (offset_int (offset / BITS_PER_UNIT),
elt_size);
access_index += low_bound;
if (index_type)
access_index = access_index.ext (TYPE_PRECISION (index_type),
TYPE_UNSIGNED (index_type));
access_index = wi::ext (access_index, TYPE_PRECISION (index_type),
TYPE_SIGN (index_type));
/* And offset within the access. */
inner_offset = offset % (elt_size.to_uhwi () * BITS_PER_UNIT);
@@ -2980,9 +2979,10 @@ fold_array_ctor_reference (tree type, tree ctor,
if (inner_offset + size > elt_size.to_uhwi () * BITS_PER_UNIT)
return NULL_TREE;
index = low_bound - double_int_one;
index = low_bound - 1;
if (index_type)
index = index.ext (TYPE_PRECISION (index_type), TYPE_UNSIGNED (index_type));
index = wi::ext (index, TYPE_PRECISION (index_type),
TYPE_SIGN (index_type));
FOR_EACH_CONSTRUCTOR_ELT (CONSTRUCTOR_ELTS (ctor), cnt, cfield, cval)
{
@@ -2992,26 +2992,26 @@ fold_array_ctor_reference (tree type, tree ctor,
if (cfield)
{
if (TREE_CODE (cfield) == INTEGER_CST)
max_index = index = tree_to_double_int (cfield);
max_index = index = wi::to_offset (cfield);
else
{
gcc_assert (TREE_CODE (cfield) == RANGE_EXPR);
index = tree_to_double_int (TREE_OPERAND (cfield, 0));
max_index = tree_to_double_int (TREE_OPERAND (cfield, 1));
index = wi::to_offset (TREE_OPERAND (cfield, 0));
max_index = wi::to_offset (TREE_OPERAND (cfield, 1));
}
}
else
{
index += double_int_one;
index += 1;
if (index_type)
index = index.ext (TYPE_PRECISION (index_type),
TYPE_UNSIGNED (index_type));
index = wi::ext (index, TYPE_PRECISION (index_type),
TYPE_SIGN (index_type));
max_index = index;
}
/* Do we have match? */
if (access_index.cmp (index, 1) >= 0
&& access_index.cmp (max_index, 1) <= 0)
if (wi::cmpu (access_index, index) >= 0
&& wi::cmpu (access_index, max_index) <= 0)
return fold_ctor_reference (type, cval, inner_offset, size,
from_decl);
}
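The index arithmetic these hunks convert to offset_int is: byte offset, truncating unsigned division by the element size, plus the domain's lower bound. Isolated on host integers (a sketch that ignores the precision extension wi::ext performs):

```cpp
#include <cassert>
#include <cstdint>

// Which array element covers constant bit offset BIT_OFFSET?  Mirrors the
// wi::udiv_trunc + low_bound computation in fold_array_ctor_reference,
// assuming BITS_PER_UNIT == 8.
static int64_t
access_index (uint64_t bit_offset, uint64_t elt_size_bytes, int64_t low_bound)
{
  return (int64_t) ((bit_offset / 8) / elt_size_bytes) + low_bound;
}
```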
@@ -3038,10 +3038,8 @@ fold_nonarray_ctor_reference (tree type, tree ctor,
tree byte_offset = DECL_FIELD_OFFSET (cfield);
tree field_offset = DECL_FIELD_BIT_OFFSET (cfield);
tree field_size = DECL_SIZE (cfield);
double_int bitoffset;
double_int byte_offset_cst = tree_to_double_int (byte_offset);
double_int bits_per_unit_cst = double_int::from_uhwi (BITS_PER_UNIT);
double_int bitoffset_end, access_end;
offset_int bitoffset;
offset_int bitoffset_end, access_end;
/* Variable sized objects in static constructors makes no sense,
but field_size can be NULL for flexible array members. */
@@ -3052,30 +3050,30 @@ fold_nonarray_ctor_reference (tree type, tree ctor,
: TREE_CODE (TREE_TYPE (cfield)) == ARRAY_TYPE));
/* Compute bit offset of the field. */
bitoffset = tree_to_double_int (field_offset)
+ byte_offset_cst * bits_per_unit_cst;
bitoffset = (wi::to_offset (field_offset)
+ wi::lshift (wi::to_offset (byte_offset),
LOG2_BITS_PER_UNIT));
/* Compute bit offset where the field ends. */
if (field_size != NULL_TREE)
bitoffset_end = bitoffset + tree_to_double_int (field_size);
bitoffset_end = bitoffset + wi::to_offset (field_size);
else
bitoffset_end = double_int_zero;
bitoffset_end = 0;
access_end = double_int::from_uhwi (offset)
+ double_int::from_uhwi (size);
access_end = offset_int (offset) + size;
/* Is there any overlap between [OFFSET, OFFSET+SIZE) and
[BITOFFSET, BITOFFSET_END)? */
if (access_end.cmp (bitoffset, 0) > 0
if (wi::cmps (access_end, bitoffset) > 0
&& (field_size == NULL_TREE
|| double_int::from_uhwi (offset).slt (bitoffset_end)))
|| wi::lts_p (offset, bitoffset_end)))
{
double_int inner_offset = double_int::from_uhwi (offset) - bitoffset;
offset_int inner_offset = offset_int (offset) - bitoffset;
/* We do have overlap. Now see if field is large enough to
cover the access. Give up for accesses spanning multiple
fields. */
if (access_end.cmp (bitoffset_end, 0) > 0)
if (wi::cmps (access_end, bitoffset_end) > 0)
return NULL_TREE;
if (double_int::from_uhwi (offset).slt (bitoffset))
if (wi::lts_p (offset, bitoffset))
return NULL_TREE;
return fold_ctor_reference (type, cval,
inner_offset.to_uhwi (), size,
@ -3166,37 +3164,42 @@ fold_const_aggregate_ref_1 (tree t, tree (*valueize) (tree))
&& TREE_CODE (idx) == INTEGER_CST)
{
tree low_bound, unit_size;
double_int doffset;
/* If the resulting bit-offset is constant, track it. */
if ((low_bound = array_ref_low_bound (t),
TREE_CODE (low_bound) == INTEGER_CST)
&& (unit_size = array_ref_element_size (t),
tree_fits_uhwi_p (unit_size))
&& (doffset = (TREE_INT_CST (idx) - TREE_INT_CST (low_bound))
.sext (TYPE_PRECISION (TREE_TYPE (idx))),
doffset.fits_shwi ()))
tree_fits_uhwi_p (unit_size)))
{
offset = doffset.to_shwi ();
offset *= tree_to_uhwi (unit_size);
offset *= BITS_PER_UNIT;
offset_int woffset
= wi::sext (wi::to_offset (idx) - wi::to_offset (low_bound),
TYPE_PRECISION (TREE_TYPE (idx)));
base = TREE_OPERAND (t, 0);
ctor = get_base_constructor (base, &offset, valueize);
/* Empty constructor. Always fold to 0. */
if (ctor == error_mark_node)
return build_zero_cst (TREE_TYPE (t));
/* Out of bound array access. Value is undefined,
but don't fold. */
if (offset < 0)
return NULL_TREE;
/* We can not determine ctor. */
if (!ctor)
return NULL_TREE;
return fold_ctor_reference (TREE_TYPE (t), ctor, offset,
tree_to_uhwi (unit_size)
* BITS_PER_UNIT,
base);
if (wi::fits_shwi_p (woffset))
{
offset = woffset.to_shwi ();
/* TODO: This code seems wrong, multiply then check
to see if it fits. */
offset *= tree_to_uhwi (unit_size);
offset *= BITS_PER_UNIT;
base = TREE_OPERAND (t, 0);
ctor = get_base_constructor (base, &offset, valueize);
/* Empty constructor. Always fold to 0. */
if (ctor == error_mark_node)
return build_zero_cst (TREE_TYPE (t));
/* Out of bound array access. Value is undefined,
but don't fold. */
if (offset < 0)
return NULL_TREE;
/* We can not determine ctor. */
if (!ctor)
return NULL_TREE;
return fold_ctor_reference (TREE_TYPE (t), ctor, offset,
tree_to_uhwi (unit_size)
* BITS_PER_UNIT,
base);
}
}
}
/* Fallthru. */
@ -3503,7 +3506,7 @@ gimple_val_nonnegative_real_p (tree val)
if ((n & 1) == 0)
{
REAL_VALUE_TYPE cint;
real_from_integer (&cint, VOIDmode, n, n < 0 ? -1 : 0, 0);
real_from_integer (&cint, VOIDmode, n, SIGNED);
if (real_identical (&c, &cint))
return true;
}
@ -3616,9 +3619,7 @@ gimple_fold_indirect_ref (tree t)
|| DECL_P (TREE_OPERAND (addr, 0)))
return fold_build2 (MEM_REF, type,
addr,
build_int_cst_wide (ptype,
TREE_INT_CST_LOW (off),
TREE_INT_CST_HIGH (off)));
wide_int_to_tree (ptype, off));
}
/* *(foo *)fooarrptr => (*fooarrptr)[0] */
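Several hunks in this file replace a `double_int` multiply by `BITS_PER_UNIT` with `wi::lshift (..., LOG2_BITS_PER_UNIT)` when converting a byte offset to a bit offset. A standalone sketch of why the two forms are equivalent, using plain 64-bit integers rather than the real `offset_int` class (the constants are stand-ins for a byte-addressed target):

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical stand-ins for GCC's target macros on a typical target.
constexpr int64_t BITS_PER_UNIT = 8;
constexpr int LOG2_BITS_PER_UNIT = 3;

// Bit offset of a field: its bit offset within the byte plus its byte
// offset scaled to bits.  Shifting left by log2(BITS_PER_UNIT) equals
// multiplying by BITS_PER_UNIT because BITS_PER_UNIT is a power of 2.
int64_t field_bit_offset (int64_t field_bit_off, int64_t byte_off)
{
  return field_bit_off + (byte_off << LOG2_BITS_PER_UNIT);
}
```

The shift form avoids a general multiply on the multi-word representation, which is the point of the rewrite.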


@ -1755,7 +1755,7 @@ dump_ssaname_info (pretty_printer *buffer, tree node, int spc)
if (!POINTER_TYPE_P (TREE_TYPE (node))
&& SSA_NAME_RANGE_INFO (node))
{
double_int min, max, nonzero_bits;
wide_int min, max, nonzero_bits;
value_range_type range_type = get_range_info (node, &min, &max);
if (range_type == VR_VARYING)
@ -1764,22 +1764,16 @@ dump_ssaname_info (pretty_printer *buffer, tree node, int spc)
{
pp_printf (buffer, "# RANGE ");
pp_printf (buffer, "%s[", range_type == VR_RANGE ? "" : "~");
pp_double_int (buffer, min, TYPE_UNSIGNED (TREE_TYPE (node)));
pp_wide_int (buffer, min, TYPE_SIGN (TREE_TYPE (node)));
pp_printf (buffer, ", ");
pp_double_int (buffer, max, TYPE_UNSIGNED (TREE_TYPE (node)));
pp_wide_int (buffer, max, TYPE_SIGN (TREE_TYPE (node)));
pp_printf (buffer, "]");
}
nonzero_bits = get_nonzero_bits (node);
if (nonzero_bits != double_int_minus_one
&& (nonzero_bits
!= double_int::mask (TYPE_PRECISION (TREE_TYPE (node)))))
if (nonzero_bits != -1)
{
pp_string (buffer, " NONZERO ");
sprintf (pp_buffer (buffer)->digit_buffer,
HOST_WIDE_INT_PRINT_DOUBLE_HEX,
(unsigned HOST_WIDE_INT) nonzero_bits.high,
nonzero_bits.low);
pp_string (buffer, pp_buffer (buffer)->digit_buffer);
pp_wide_int (buffer, nonzero_bits, UNSIGNED);
}
newline_and_indent (buffer, spc);
}
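In the `dump_ssaname_info` hunk above, the old code had to compare `nonzero_bits` against both `double_int_minus_one` and an explicit mask of the type's precision; with wide_int a single `!= -1` suffices, because a wide_int carries its precision and -1 is the all-ones value at that precision. A rough illustration with a fixed 8-bit "precision" on ordinary C++ integers (an assumed stand-in, not the wide_int class):

```cpp
#include <cassert>
#include <cstdint>

// With an explicit precision, "all bits set" and "-1" denote the same
// stored value, so the two old comparisons collapse into one.
constexpr unsigned prec = 8;
constexpr uint64_t mask = (1ull << prec) - 1;  // all-ones at this precision

// Truncate a host value to the precision, as a wide_int would store it.
uint64_t to_prec (int64_t v) { return (uint64_t) v & mask; }
```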


@ -63,6 +63,7 @@ along with GCC; see the file COPYING3. If not see
#include "params.h"
#include "tree-ssa-address.h"
#include "tree-affine.h"
#include "wide-int-print.h"
/* Information about a strength reduction candidate. Each statement
in the candidate table represents an expression of one of the
@ -244,7 +245,7 @@ struct slsr_cand_d
tree stride;
/* The index constant i. */
double_int index;
widest_int index;
/* The type of the candidate. This is normally the type of base_expr,
but casts may have occurred when combining feeding instructions.
@ -319,7 +320,7 @@ typedef const struct cand_chain_d *const_cand_chain_t;
struct incr_info_d
{
/* The increment that relates a candidate to its basis. */
double_int incr;
widest_int incr;
/* How many times the increment occurs in the candidate tree. */
unsigned count;
@ -454,7 +455,7 @@ get_alternative_base (tree base)
tree_to_aff_combination_expand (base, TREE_TYPE (base),
&aff, &name_expansions);
aff.offset = tree_to_double_int (integer_zero_node);
aff.offset = 0;
expr = aff_combination_to_tree (&aff);
result = (tree *) pointer_map_insert (alt_base_map, base);
@ -627,7 +628,7 @@ record_potential_basis (slsr_cand_t c, tree base)
static slsr_cand_t
alloc_cand_and_find_basis (enum cand_kind kind, gimple gs, tree base,
double_int index, tree stride, tree ctype,
const widest_int &index, tree stride, tree ctype,
unsigned savings)
{
slsr_cand_t c = (slsr_cand_t) obstack_alloc (&cand_obstack,
@ -824,8 +825,8 @@ slsr_process_phi (gimple phi, bool speed)
CAND_PHI. */
base_type = TREE_TYPE (arg0_base);
c = alloc_cand_and_find_basis (CAND_PHI, phi, arg0_base, double_int_zero,
integer_one_node, base_type, savings);
c = alloc_cand_and_find_basis (CAND_PHI, phi, arg0_base,
0, integer_one_node, base_type, savings);
/* Add the candidate to the statement-candidate mapping. */
add_cand_for_stmt (phi, c);
@ -842,7 +843,7 @@ slsr_process_phi (gimple phi, bool speed)
int (i * S).
Otherwise, just return double int zero. */
static double_int
static widest_int
backtrace_base_for_ref (tree *pbase)
{
tree base_in = *pbase;
@ -858,19 +859,19 @@ backtrace_base_for_ref (tree *pbase)
base_in = get_unwidened (base_in, NULL_TREE);
if (TREE_CODE (base_in) != SSA_NAME)
return tree_to_double_int (integer_zero_node);
return 0;
base_cand = base_cand_from_table (base_in);
while (base_cand && base_cand->kind != CAND_PHI)
{
if (base_cand->kind == CAND_ADD
&& base_cand->index.is_one ()
&& base_cand->index == 1
&& TREE_CODE (base_cand->stride) == INTEGER_CST)
{
/* X = B + (1 * S), S is integer constant. */
*pbase = base_cand->base_expr;
return tree_to_double_int (base_cand->stride);
return wi::to_widest (base_cand->stride);
}
else if (base_cand->kind == CAND_ADD
&& TREE_CODE (base_cand->stride) == INTEGER_CST
@ -887,7 +888,7 @@ backtrace_base_for_ref (tree *pbase)
base_cand = NULL;
}
return tree_to_double_int (integer_zero_node);
return 0;
}
/* Look for the following pattern:
@ -917,38 +918,35 @@ backtrace_base_for_ref (tree *pbase)
*PINDEX: C1 + (C2 * C3) + C4 + (C5 * C3) */
static bool
restructure_reference (tree *pbase, tree *poffset, double_int *pindex,
restructure_reference (tree *pbase, tree *poffset, widest_int *pindex,
tree *ptype)
{
tree base = *pbase, offset = *poffset;
double_int index = *pindex;
double_int bpu = double_int::from_uhwi (BITS_PER_UNIT);
tree mult_op0, mult_op1, t1, t2, type;
double_int c1, c2, c3, c4, c5;
widest_int index = *pindex;
tree mult_op0, t1, t2, type;
widest_int c1, c2, c3, c4, c5;
if (!base
|| !offset
|| TREE_CODE (base) != MEM_REF
|| TREE_CODE (offset) != MULT_EXPR
|| TREE_CODE (TREE_OPERAND (offset, 1)) != INTEGER_CST
|| !index.umod (bpu, FLOOR_MOD_EXPR).is_zero ())
|| wi::umod_floor (index, BITS_PER_UNIT) != 0)
return false;
t1 = TREE_OPERAND (base, 0);
c1 = mem_ref_offset (base);
c1 = widest_int::from (mem_ref_offset (base), SIGNED);
type = TREE_TYPE (TREE_OPERAND (base, 1));
mult_op0 = TREE_OPERAND (offset, 0);
mult_op1 = TREE_OPERAND (offset, 1);
c3 = tree_to_double_int (mult_op1);
c3 = wi::to_widest (TREE_OPERAND (offset, 1));
if (TREE_CODE (mult_op0) == PLUS_EXPR)
if (TREE_CODE (TREE_OPERAND (mult_op0, 1)) == INTEGER_CST)
{
t2 = TREE_OPERAND (mult_op0, 0);
c2 = tree_to_double_int (TREE_OPERAND (mult_op0, 1));
c2 = wi::to_widest (TREE_OPERAND (mult_op0, 1));
}
else
return false;
@ -958,7 +956,7 @@ restructure_reference (tree *pbase, tree *poffset, double_int *pindex,
if (TREE_CODE (TREE_OPERAND (mult_op0, 1)) == INTEGER_CST)
{
t2 = TREE_OPERAND (mult_op0, 0);
c2 = -tree_to_double_int (TREE_OPERAND (mult_op0, 1));
c2 = -wi::to_widest (TREE_OPERAND (mult_op0, 1));
}
else
return false;
@ -966,15 +964,15 @@ restructure_reference (tree *pbase, tree *poffset, double_int *pindex,
else
{
t2 = mult_op0;
c2 = double_int_zero;
c2 = 0;
}
c4 = index.udiv (bpu, FLOOR_DIV_EXPR);
c4 = wi::lrshift (index, LOG2_BITS_PER_UNIT);
c5 = backtrace_base_for_ref (&t2);
*pbase = t1;
*poffset = fold_build2 (MULT_EXPR, sizetype, fold_convert (sizetype, t2),
double_int_to_tree (sizetype, c3));
wide_int_to_tree (sizetype, c3));
*pindex = c1 + c2 * c3 + c4 + c5 * c3;
*ptype = type;
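The `restructure_reference` hunk above now rejects misaligned indexes with `wi::umod_floor (index, BITS_PER_UNIT) != 0` and converts bits to bytes with `wi::lrshift (index, LOG2_BITS_PER_UNIT)`, replacing the old `udiv`/`umod` calls. For a power-of-two divisor these reduce to the familiar mask-and-shift identities, sketched here on unsigned 64-bit values (illustrative helpers, not the wide-int implementation):

```cpp
#include <cassert>
#include <cstdint>

constexpr uint64_t BITS_PER_UNIT = 8;   // assumed byte width
constexpr int LOG2_BITS_PER_UNIT = 3;

// Unsigned floor modulus; for unsigned operands C++'s % already floors.
uint64_t umod_floor (uint64_t x, uint64_t d) { return x % d; }

// Logical right shift; for a power-of-two divisor, x >> log2(d) == x / d.
uint64_t lrshift (uint64_t x, int n) { return x >> n; }
```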
@ -991,7 +989,6 @@ slsr_process_ref (gimple gs)
HOST_WIDE_INT bitsize, bitpos;
enum machine_mode mode;
int unsignedp, volatilep;
double_int index;
slsr_cand_t c;
if (gimple_vdef (gs))
@ -1007,7 +1004,7 @@ slsr_process_ref (gimple gs)
base = get_inner_reference (ref_expr, &bitsize, &bitpos, &offset, &mode,
&unsignedp, &volatilep, false);
index = double_int::from_uhwi (bitpos);
widest_int index = bitpos;
if (!restructure_reference (&base, &offset, &index, &type))
return;
@ -1028,7 +1025,7 @@ static slsr_cand_t
create_mul_ssa_cand (gimple gs, tree base_in, tree stride_in, bool speed)
{
tree base = NULL_TREE, stride = NULL_TREE, ctype = NULL_TREE;
double_int index;
widest_int index;
unsigned savings = 0;
slsr_cand_t c;
slsr_cand_t base_cand = base_cand_from_table (base_in);
@ -1060,7 +1057,7 @@ create_mul_ssa_cand (gimple gs, tree base_in, tree stride_in, bool speed)
============================
X = B + ((i' * S) * Z) */
base = base_cand->base_expr;
index = base_cand->index * tree_to_double_int (base_cand->stride);
index = base_cand->index * wi::to_widest (base_cand->stride);
stride = stride_in;
ctype = base_cand->cand_type;
if (has_single_use (base_in))
@ -1079,7 +1076,7 @@ create_mul_ssa_cand (gimple gs, tree base_in, tree stride_in, bool speed)
/* No interpretations had anything useful to propagate, so
produce X = (Y + 0) * Z. */
base = base_in;
index = double_int_zero;
index = 0;
stride = stride_in;
ctype = TREE_TYPE (base_in);
}
@ -1098,7 +1095,7 @@ static slsr_cand_t
create_mul_imm_cand (gimple gs, tree base_in, tree stride_in, bool speed)
{
tree base = NULL_TREE, stride = NULL_TREE, ctype = NULL_TREE;
double_int index, temp;
widest_int index, temp;
unsigned savings = 0;
slsr_cand_t c;
slsr_cand_t base_cand = base_cand_from_table (base_in);
@ -1114,13 +1111,12 @@ create_mul_imm_cand (gimple gs, tree base_in, tree stride_in, bool speed)
X = Y * c
============================
X = (B + i') * (S * c) */
temp = tree_to_double_int (base_cand->stride)
* tree_to_double_int (stride_in);
if (double_int_fits_to_tree_p (TREE_TYPE (stride_in), temp))
temp = wi::to_widest (base_cand->stride) * wi::to_widest (stride_in);
if (wi::fits_to_tree_p (temp, TREE_TYPE (stride_in)))
{
base = base_cand->base_expr;
index = base_cand->index;
stride = double_int_to_tree (TREE_TYPE (stride_in), temp);
stride = wide_int_to_tree (TREE_TYPE (stride_in), temp);
ctype = base_cand->cand_type;
if (has_single_use (base_in))
savings = (base_cand->dead_savings
@ -1142,7 +1138,7 @@ create_mul_imm_cand (gimple gs, tree base_in, tree stride_in, bool speed)
+ stmt_cost (base_cand->cand_stmt, speed));
}
else if (base_cand->kind == CAND_ADD
&& base_cand->index.is_one ()
&& base_cand->index == 1
&& TREE_CODE (base_cand->stride) == INTEGER_CST)
{
/* Y = B + (1 * S), S constant
@ -1150,7 +1146,7 @@ create_mul_imm_cand (gimple gs, tree base_in, tree stride_in, bool speed)
===========================
X = (B + S) * c */
base = base_cand->base_expr;
index = tree_to_double_int (base_cand->stride);
index = wi::to_widest (base_cand->stride);
stride = stride_in;
ctype = base_cand->cand_type;
if (has_single_use (base_in))
@ -1169,7 +1165,7 @@ create_mul_imm_cand (gimple gs, tree base_in, tree stride_in, bool speed)
/* No interpretations had anything useful to propagate, so
produce X = (Y + 0) * c. */
base = base_in;
index = double_int_zero;
index = 0;
stride = stride_in;
ctype = TREE_TYPE (base_in);
}
@ -1232,7 +1228,7 @@ create_add_ssa_cand (gimple gs, tree base_in, tree addend_in,
bool subtract_p, bool speed)
{
tree base = NULL_TREE, stride = NULL_TREE, ctype = NULL;
double_int index;
widest_int index;
unsigned savings = 0;
slsr_cand_t c;
slsr_cand_t base_cand = base_cand_from_table (base_in);
@ -1243,7 +1239,7 @@ create_add_ssa_cand (gimple gs, tree base_in, tree addend_in,
while (addend_cand && !base && addend_cand->kind != CAND_PHI)
{
if (addend_cand->kind == CAND_MULT
&& addend_cand->index.is_zero ()
&& addend_cand->index == 0
&& TREE_CODE (addend_cand->stride) == INTEGER_CST)
{
/* Z = (B + 0) * S, S constant
@ -1251,7 +1247,7 @@ create_add_ssa_cand (gimple gs, tree base_in, tree addend_in,
===========================
X = Y + ((+/-1 * S) * B) */
base = base_in;
index = tree_to_double_int (addend_cand->stride);
index = wi::to_widest (addend_cand->stride);
if (subtract_p)
index = -index;
stride = addend_cand->base_expr;
@ -1270,7 +1266,7 @@ create_add_ssa_cand (gimple gs, tree base_in, tree addend_in,
while (base_cand && !base && base_cand->kind != CAND_PHI)
{
if (base_cand->kind == CAND_ADD
&& (base_cand->index.is_zero ()
&& (base_cand->index == 0
|| operand_equal_p (base_cand->stride,
integer_zero_node, 0)))
{
@ -1279,7 +1275,7 @@ create_add_ssa_cand (gimple gs, tree base_in, tree addend_in,
============================
X = B + (+/-1 * Z) */
base = base_cand->base_expr;
index = subtract_p ? double_int_minus_one : double_int_one;
index = subtract_p ? -1 : 1;
stride = addend_in;
ctype = base_cand->cand_type;
if (has_single_use (base_in))
@ -1293,7 +1289,7 @@ create_add_ssa_cand (gimple gs, tree base_in, tree addend_in,
while (subtrahend_cand && !base && subtrahend_cand->kind != CAND_PHI)
{
if (subtrahend_cand->kind == CAND_MULT
&& subtrahend_cand->index.is_zero ()
&& subtrahend_cand->index == 0
&& TREE_CODE (subtrahend_cand->stride) == INTEGER_CST)
{
/* Z = (B + 0) * S, S constant
@ -1301,7 +1297,7 @@ create_add_ssa_cand (gimple gs, tree base_in, tree addend_in,
===========================
Value: X = Y + ((-1 * S) * B) */
base = base_in;
index = tree_to_double_int (subtrahend_cand->stride);
index = wi::to_widest (subtrahend_cand->stride);
index = -index;
stride = subtrahend_cand->base_expr;
ctype = TREE_TYPE (base_in);
@ -1328,7 +1324,7 @@ create_add_ssa_cand (gimple gs, tree base_in, tree addend_in,
/* No interpretations had anything useful to propagate, so
produce X = Y + (1 * Z). */
base = base_in;
index = subtract_p ? double_int_minus_one : double_int_one;
index = subtract_p ? -1 : 1;
stride = addend_in;
ctype = TREE_TYPE (base_in);
}
@ -1343,22 +1339,23 @@ create_add_ssa_cand (gimple gs, tree base_in, tree addend_in,
about BASE_IN into the new candidate. Return the new candidate. */
static slsr_cand_t
create_add_imm_cand (gimple gs, tree base_in, double_int index_in, bool speed)
create_add_imm_cand (gimple gs, tree base_in, const widest_int &index_in,
bool speed)
{
enum cand_kind kind = CAND_ADD;
tree base = NULL_TREE, stride = NULL_TREE, ctype = NULL_TREE;
double_int index, multiple;
widest_int index, multiple;
unsigned savings = 0;
slsr_cand_t c;
slsr_cand_t base_cand = base_cand_from_table (base_in);
while (base_cand && !base && base_cand->kind != CAND_PHI)
{
bool unsigned_p = TYPE_UNSIGNED (TREE_TYPE (base_cand->stride));
signop sign = TYPE_SIGN (TREE_TYPE (base_cand->stride));
if (TREE_CODE (base_cand->stride) == INTEGER_CST
&& index_in.multiple_of (tree_to_double_int (base_cand->stride),
unsigned_p, &multiple))
&& wi::multiple_of_p (index_in, wi::to_widest (base_cand->stride),
sign, &multiple))
{
/* Y = (B + i') * S, S constant, c = kS for some integer k
X = Y + c
@ -1443,10 +1440,8 @@ slsr_process_add (gimple gs, tree rhs1, tree rhs2, bool speed)
}
else
{
double_int index;
/* Record an interpretation for the add-immediate. */
index = tree_to_double_int (rhs2);
widest_int index = wi::to_widest (rhs2);
if (subtract_p)
index = -index;
@ -1594,10 +1589,10 @@ slsr_process_cast (gimple gs, tree rhs1, bool speed)
The first of these is somewhat arbitrary, but the choice of
1 for the stride simplifies the logic for propagating casts
into their uses. */
c = alloc_cand_and_find_basis (CAND_ADD, gs, rhs1, double_int_zero,
integer_one_node, ctype, 0);
c2 = alloc_cand_and_find_basis (CAND_MULT, gs, rhs1, double_int_zero,
integer_one_node, ctype, 0);
c = alloc_cand_and_find_basis (CAND_ADD, gs, rhs1,
0, integer_one_node, ctype, 0);
c2 = alloc_cand_and_find_basis (CAND_MULT, gs, rhs1,
0, integer_one_node, ctype, 0);
c->next_interp = c2->cand_num;
}
@ -1651,10 +1646,10 @@ slsr_process_copy (gimple gs, tree rhs1, bool speed)
The first of these is somewhat arbitrary, but the choice of
1 for the stride simplifies the logic for propagating casts
into their uses. */
c = alloc_cand_and_find_basis (CAND_ADD, gs, rhs1, double_int_zero,
integer_one_node, TREE_TYPE (rhs1), 0);
c2 = alloc_cand_and_find_basis (CAND_MULT, gs, rhs1, double_int_zero,
integer_one_node, TREE_TYPE (rhs1), 0);
c = alloc_cand_and_find_basis (CAND_ADD, gs, rhs1,
0, integer_one_node, TREE_TYPE (rhs1), 0);
c2 = alloc_cand_and_find_basis (CAND_MULT, gs, rhs1,
0, integer_one_node, TREE_TYPE (rhs1), 0);
c->next_interp = c2->cand_num;
}
@ -1771,7 +1766,7 @@ dump_candidate (slsr_cand_t c)
fputs (" MULT : (", dump_file);
print_generic_expr (dump_file, c->base_expr, 0);
fputs (" + ", dump_file);
dump_double_int (dump_file, c->index, false);
print_decs (c->index, dump_file);
fputs (") * ", dump_file);
print_generic_expr (dump_file, c->stride, 0);
fputs (" : ", dump_file);
@ -1780,7 +1775,7 @@ dump_candidate (slsr_cand_t c)
fputs (" ADD : ", dump_file);
print_generic_expr (dump_file, c->base_expr, 0);
fputs (" + (", dump_file);
dump_double_int (dump_file, c->index, false);
print_decs (c->index, dump_file);
fputs (" * ", dump_file);
print_generic_expr (dump_file, c->stride, 0);
fputs (") : ", dump_file);
@ -1791,7 +1786,7 @@ dump_candidate (slsr_cand_t c)
fputs (" + (", dump_file);
print_generic_expr (dump_file, c->stride, 0);
fputs (") + ", dump_file);
dump_double_int (dump_file, c->index, false);
print_decs (c->index, dump_file);
fputs (" : ", dump_file);
break;
case CAND_PHI:
@ -1870,7 +1865,7 @@ dump_incr_vec (void)
for (i = 0; i < incr_vec_len; i++)
{
fprintf (dump_file, "%3d increment: ", i);
dump_double_int (dump_file, incr_vec[i].incr, false);
print_decs (incr_vec[i].incr, dump_file);
fprintf (dump_file, "\n count: %d", incr_vec[i].count);
fprintf (dump_file, "\n cost: %d", incr_vec[i].cost);
fputs ("\n initializer: ", dump_file);
@ -1901,7 +1896,7 @@ replace_ref (tree *expr, slsr_cand_t c)
add_expr = fold_build2 (POINTER_PLUS_EXPR, TREE_TYPE (c->base_expr),
c->base_expr, c->stride);
mem_ref = fold_build2 (MEM_REF, acc_type, add_expr,
double_int_to_tree (c->cand_type, c->index));
wide_int_to_tree (c->cand_type, c->index));
/* Gimplify the base addressing expression for the new MEM_REF tree. */
gimple_stmt_iterator gsi = gsi_for_stmt (c->cand_stmt);
@ -1969,7 +1964,7 @@ phi_dependent_cand_p (slsr_cand_t c)
/* Calculate the increment required for candidate C relative to
its basis. */
static double_int
static widest_int
cand_increment (slsr_cand_t c)
{
slsr_cand_t basis;
@ -1992,12 +1987,12 @@ cand_increment (slsr_cand_t c)
for this candidate, return the absolute value of that increment
instead. */
static inline double_int
static inline widest_int
cand_abs_increment (slsr_cand_t c)
{
double_int increment = cand_increment (c);
widest_int increment = cand_increment (c);
if (!address_arithmetic_p && increment.is_negative ())
if (!address_arithmetic_p && wi::neg_p (increment))
increment = -increment;
return increment;
@ -2016,7 +2011,7 @@ cand_already_replaced (slsr_cand_t c)
replace_conditional_candidate. */
static void
replace_mult_candidate (slsr_cand_t c, tree basis_name, double_int bump)
replace_mult_candidate (slsr_cand_t c, tree basis_name, widest_int bump)
{
tree target_type = TREE_TYPE (gimple_assign_lhs (c->cand_stmt));
enum tree_code cand_code = gimple_assign_rhs_code (c->cand_stmt);
@ -2026,7 +2021,7 @@ replace_mult_candidate (slsr_cand_t c, tree basis_name, double_int bump)
in this case. This does not affect siblings or dependents
of C. Restriction to signed HWI is conservative for unsigned
types but allows for safe negation without twisted logic. */
if (bump.fits_shwi ()
if (wi::fits_shwi_p (bump)
&& bump.to_shwi () != HOST_WIDE_INT_MIN
/* It is not useful to replace casts, copies, or adds of
an SSA name and a constant. */
@ -2044,13 +2039,13 @@ replace_mult_candidate (slsr_cand_t c, tree basis_name, double_int bump)
types, introduce a cast. */
if (!useless_type_conversion_p (target_type, TREE_TYPE (basis_name)))
basis_name = introduce_cast_before_cand (c, target_type, basis_name);
if (bump.is_negative ())
if (wi::neg_p (bump))
{
code = MINUS_EXPR;
bump = -bump;
}
bump_tree = double_int_to_tree (target_type, bump);
bump_tree = wide_int_to_tree (target_type, bump);
if (dump_file && (dump_flags & TDF_DETAILS))
{
@ -2058,7 +2053,7 @@ replace_mult_candidate (slsr_cand_t c, tree basis_name, double_int bump)
print_gimple_stmt (dump_file, c->cand_stmt, 0, 0);
}
if (bump.is_zero ())
if (bump == 0)
{
tree lhs = gimple_assign_lhs (c->cand_stmt);
gimple copy_stmt = gimple_build_assign (lhs, basis_name);
@ -2119,14 +2114,12 @@ static void
replace_unconditional_candidate (slsr_cand_t c)
{
slsr_cand_t basis;
double_int stride, bump;
if (cand_already_replaced (c))
return;
basis = lookup_cand (c->basis);
stride = tree_to_double_int (c->stride);
bump = cand_increment (c) * stride;
widest_int bump = cand_increment (c) * wi::to_widest (c->stride);
replace_mult_candidate (c, gimple_assign_lhs (basis->cand_stmt), bump);
}
@ -2136,7 +2129,7 @@ replace_unconditional_candidate (slsr_cand_t c)
MAX_INCR_VEC_LEN increments have been found. */
static inline int
incr_vec_index (double_int increment)
incr_vec_index (const widest_int &increment)
{
unsigned i;
@ -2156,7 +2149,7 @@ incr_vec_index (double_int increment)
static tree
create_add_on_incoming_edge (slsr_cand_t c, tree basis_name,
double_int increment, edge e, location_t loc,
widest_int increment, edge e, location_t loc,
bool known_stride)
{
basic_block insert_bb;
@ -2167,7 +2160,7 @@ create_add_on_incoming_edge (slsr_cand_t c, tree basis_name,
/* If the add candidate along this incoming edge has the same
index as C's hidden basis, the hidden basis represents this
edge correctly. */
if (increment.is_zero ())
if (increment == 0)
return basis_name;
basis_type = TREE_TYPE (basis_name);
@ -2177,21 +2170,21 @@ create_add_on_incoming_edge (slsr_cand_t c, tree basis_name,
{
tree bump_tree;
enum tree_code code = PLUS_EXPR;
double_int bump = increment * tree_to_double_int (c->stride);
if (bump.is_negative ())
widest_int bump = increment * wi::to_widest (c->stride);
if (wi::neg_p (bump))
{
code = MINUS_EXPR;
bump = -bump;
}
bump_tree = double_int_to_tree (basis_type, bump);
bump_tree = wide_int_to_tree (basis_type, bump);
new_stmt = gimple_build_assign_with_ops (code, lhs, basis_name,
bump_tree);
}
else
{
int i;
bool negate_incr = (!address_arithmetic_p && increment.is_negative ());
bool negate_incr = (!address_arithmetic_p && wi::neg_p (increment));
i = incr_vec_index (negate_incr ? -increment : increment);
gcc_assert (i >= 0);
@ -2201,10 +2194,10 @@ create_add_on_incoming_edge (slsr_cand_t c, tree basis_name,
new_stmt = gimple_build_assign_with_ops (code, lhs, basis_name,
incr_vec[i].initializer);
}
else if (increment.is_one ())
else if (increment == 1)
new_stmt = gimple_build_assign_with_ops (PLUS_EXPR, lhs, basis_name,
c->stride);
else if (increment.is_minus_one ())
else if (increment == -1)
new_stmt = gimple_build_assign_with_ops (MINUS_EXPR, lhs, basis_name,
c->stride);
else
@ -2265,11 +2258,11 @@ create_phi_basis (slsr_cand_t c, gimple from_phi, tree basis_name,
/* If the phi argument is the base name of the CAND_PHI, then
this incoming arc should use the hidden basis. */
if (operand_equal_p (arg, phi_cand->base_expr, 0))
if (basis->index.is_zero ())
if (basis->index == 0)
feeding_def = gimple_assign_lhs (basis->cand_stmt);
else
{
double_int incr = -basis->index;
widest_int incr = -basis->index;
feeding_def = create_add_on_incoming_edge (c, basis_name, incr,
e, loc, known_stride);
}
@ -2286,7 +2279,7 @@ create_phi_basis (slsr_cand_t c, gimple from_phi, tree basis_name,
else
{
slsr_cand_t arg_cand = base_cand_from_table (arg);
double_int diff = arg_cand->index - basis->index;
widest_int diff = arg_cand->index - basis->index;
feeding_def = create_add_on_incoming_edge (c, basis_name, diff,
e, loc, known_stride);
}
@ -2332,7 +2325,6 @@ replace_conditional_candidate (slsr_cand_t c)
tree basis_name, name;
slsr_cand_t basis;
location_t loc;
double_int stride, bump;
/* Look up the LHS SSA name from C's basis. This will be the
RHS1 of the adds we will introduce to create new phi arguments. */
@ -2345,8 +2337,7 @@ replace_conditional_candidate (slsr_cand_t c)
name = create_phi_basis (c, lookup_cand (c->def_phi)->cand_stmt,
basis_name, loc, KNOWN_STRIDE);
/* Replace C with an add of the new basis phi and a constant. */
stride = tree_to_double_int (c->stride);
bump = c->index * stride;
widest_int bump = c->index * wi::to_widest (c->stride);
replace_mult_candidate (c, name, bump);
}
@ -2478,14 +2469,14 @@ count_candidates (slsr_cand_t c)
candidates with the same increment, also record T_0 for subsequent use. */
static void
record_increment (slsr_cand_t c, double_int increment, bool is_phi_adjust)
record_increment (slsr_cand_t c, widest_int increment, bool is_phi_adjust)
{
bool found = false;
unsigned i;
/* Treat increments that differ only in sign as identical so as to
share initializers, unless we are generating pointer arithmetic. */
if (!address_arithmetic_p && increment.is_negative ())
if (!address_arithmetic_p && wi::neg_p (increment))
increment = -increment;
for (i = 0; i < incr_vec_len; i++)
@ -2529,8 +2520,8 @@ record_increment (slsr_cand_t c, double_int increment, bool is_phi_adjust)
if (c->kind == CAND_ADD
&& !is_phi_adjust
&& c->index == increment
&& (increment.sgt (double_int_one)
|| increment.slt (double_int_minus_one))
&& (wi::gts_p (increment, 1)
|| wi::lts_p (increment, -1))
&& (gimple_assign_rhs_code (c->cand_stmt) == PLUS_EXPR
|| gimple_assign_rhs_code (c->cand_stmt) == POINTER_PLUS_EXPR))
{
@ -2588,7 +2579,7 @@ record_phi_increments (slsr_cand_t basis, gimple phi)
else
{
slsr_cand_t arg_cand = base_cand_from_table (arg);
double_int diff = arg_cand->index - basis->index;
widest_int diff = arg_cand->index - basis->index;
record_increment (arg_cand, diff, PHI_ADJUST);
}
}
@ -2639,7 +2630,7 @@ record_increments (slsr_cand_t c)
uses. */
static int
phi_incr_cost (slsr_cand_t c, double_int incr, gimple phi, int *savings)
phi_incr_cost (slsr_cand_t c, const widest_int &incr, gimple phi, int *savings)
{
unsigned i;
int cost = 0;
@ -2664,7 +2655,7 @@ phi_incr_cost (slsr_cand_t c, double_int incr, gimple phi, int *savings)
else
{
slsr_cand_t arg_cand = base_cand_from_table (arg);
double_int diff = arg_cand->index - basis->index;
widest_int diff = arg_cand->index - basis->index;
if (incr == diff)
{
@ -2729,10 +2720,10 @@ optimize_cands_for_speed_p (slsr_cand_t c)
static int
lowest_cost_path (int cost_in, int repl_savings, slsr_cand_t c,
double_int incr, bool count_phis)
const widest_int &incr, bool count_phis)
{
int local_cost, sib_cost, savings = 0;
double_int cand_incr = cand_abs_increment (c);
widest_int cand_incr = cand_abs_increment (c);
if (cand_already_replaced (c))
local_cost = cost_in;
@ -2775,11 +2766,11 @@ lowest_cost_path (int cost_in, int repl_savings, slsr_cand_t c,
would go dead. */
static int
total_savings (int repl_savings, slsr_cand_t c, double_int incr,
total_savings (int repl_savings, slsr_cand_t c, const widest_int &incr,
bool count_phis)
{
int savings = 0;
double_int cand_incr = cand_abs_increment (c);
widest_int cand_incr = cand_abs_increment (c);
if (incr == cand_incr && !cand_already_replaced (c))
savings += repl_savings + c->dead_savings;
@ -2829,7 +2820,7 @@ analyze_increments (slsr_cand_t first_dep, enum machine_mode mode, bool speed)
/* If somehow this increment is bigger than a HWI, we won't
be optimizing candidates that use it. And if the increment
has a count of zero, nothing will be done with it. */
if (!incr_vec[i].incr.fits_shwi () || !incr_vec[i].count)
if (!wi::fits_shwi_p (incr_vec[i].incr) || !incr_vec[i].count)
incr_vec[i].cost = COST_INFINITE;
/* Increments of 0, 1, and -1 are always profitable to replace,
@ -2983,7 +2974,7 @@ ncd_for_two_cands (basic_block bb1, basic_block bb2,
candidates, return the earliest candidate in the block in *WHERE. */
static basic_block
ncd_with_phi (slsr_cand_t c, double_int incr, gimple phi,
ncd_with_phi (slsr_cand_t c, const widest_int &incr, gimple phi,
basic_block ncd, slsr_cand_t *where)
{
unsigned i;
@ -3003,7 +2994,7 @@ ncd_with_phi (slsr_cand_t c, double_int incr, gimple phi,
else
{
slsr_cand_t arg_cand = base_cand_from_table (arg);
double_int diff = arg_cand->index - basis->index;
widest_int diff = arg_cand->index - basis->index;
basic_block pred = gimple_phi_arg_edge (phi, i)->src;
if ((incr == diff) || (!address_arithmetic_p && incr == -diff))
@ -3022,7 +3013,7 @@ ncd_with_phi (slsr_cand_t c, double_int incr, gimple phi,
return the earliest candidate in the block in *WHERE. */
static basic_block
ncd_of_cand_and_phis (slsr_cand_t c, double_int incr, slsr_cand_t *where)
ncd_of_cand_and_phis (slsr_cand_t c, const widest_int &incr, slsr_cand_t *where)
{
basic_block ncd = NULL;
@ -3047,7 +3038,7 @@ ncd_of_cand_and_phis (slsr_cand_t c, double_int incr, slsr_cand_t *where)
*WHERE. */
static basic_block
nearest_common_dominator_for_cands (slsr_cand_t c, double_int incr,
nearest_common_dominator_for_cands (slsr_cand_t c, const widest_int &incr,
slsr_cand_t *where)
{
basic_block sib_ncd = NULL, dep_ncd = NULL, this_ncd = NULL, ncd;
@ -3123,13 +3114,13 @@ insert_initializers (slsr_cand_t c)
slsr_cand_t where = NULL;
gimple init_stmt;
tree stride_type, new_name, incr_tree;
double_int incr = incr_vec[i].incr;
widest_int incr = incr_vec[i].incr;
if (!profitable_increment_p (i)
|| incr.is_one ()
|| (incr.is_minus_one ()
|| incr == 1
|| (incr == -1
&& gimple_assign_rhs_code (c->cand_stmt) != POINTER_PLUS_EXPR)
|| incr.is_zero ())
|| incr == 0)
continue;
/* We may have already identified an existing initializer that
@ -3158,7 +3149,7 @@ insert_initializers (slsr_cand_t c)
/* Create the initializer and insert it in the latest possible
dominating position. */
incr_tree = double_int_to_tree (stride_type, incr);
incr_tree = wide_int_to_tree (stride_type, incr);
init_stmt = gimple_build_assign_with_ops (MULT_EXPR, new_name,
c->stride, incr_tree);
if (where)
@ -3215,9 +3206,9 @@ all_phi_incrs_profitable (slsr_cand_t c, gimple phi)
{
int j;
slsr_cand_t arg_cand = base_cand_from_table (arg);
double_int increment = arg_cand->index - basis->index;
widest_int increment = arg_cand->index - basis->index;
if (!address_arithmetic_p && increment.is_negative ())
if (!address_arithmetic_p && wi::neg_p (increment))
increment = -increment;
j = incr_vec_index (increment);
@ -3228,7 +3219,7 @@ all_phi_incrs_profitable (slsr_cand_t c, gimple phi)
c->cand_num);
print_gimple_stmt (dump_file, phi, 0, 0);
fputs (" increment: ", dump_file);
dump_double_int (dump_file, increment, false);
print_decs (increment, dump_file);
if (j < 0)
fprintf (dump_file,
"\n Not replaced; incr_vec overflow.\n");
@@ -3323,7 +3314,7 @@ replace_one_candidate (slsr_cand_t c, unsigned i, tree basis_name)
tree orig_rhs1, orig_rhs2;
tree rhs2;
enum tree_code orig_code, repl_code;
double_int cand_incr;
widest_int cand_incr;
orig_code = gimple_assign_rhs_code (c->cand_stmt);
orig_rhs1 = gimple_assign_rhs1 (c->cand_stmt);
@@ -3371,7 +3362,7 @@ replace_one_candidate (slsr_cand_t c, unsigned i, tree basis_name)
from the basis name, or an add of the stride to the basis
name, respectively. It may be necessary to introduce a
cast (or reuse an existing cast). */
else if (cand_incr.is_one ())
else if (cand_incr == 1)
{
tree stride_type = TREE_TYPE (c->stride);
tree orig_type = TREE_TYPE (orig_rhs2);
@@ -3386,7 +3377,7 @@ replace_one_candidate (slsr_cand_t c, unsigned i, tree basis_name)
c);
}
else if (cand_incr.is_minus_one ())
else if (cand_incr == -1)
{
tree stride_type = TREE_TYPE (c->stride);
tree orig_type = TREE_TYPE (orig_rhs2);
@@ -3413,7 +3404,7 @@ replace_one_candidate (slsr_cand_t c, unsigned i, tree basis_name)
fputs (" (duplicate, not actually replacing)\n", dump_file);
}
else if (cand_incr.is_zero ())
else if (cand_incr == 0)
{
tree lhs = gimple_assign_lhs (c->cand_stmt);
tree lhs_type = TREE_TYPE (lhs);
@@ -3463,7 +3454,7 @@ replace_profitable_candidates (slsr_cand_t c)
{
if (!cand_already_replaced (c))
{
double_int increment = cand_abs_increment (c);
widest_int increment = cand_abs_increment (c);
enum tree_code orig_code = gimple_assign_rhs_code (c->cand_stmt);
int i;


@@ -2777,11 +2777,7 @@ preprocess_case_label_vec_for_gimple (vec<tree> labels,
low = CASE_HIGH (labels[i - 1]);
if (!low)
low = CASE_LOW (labels[i - 1]);
if ((TREE_INT_CST_LOW (low) + 1
!= TREE_INT_CST_LOW (high))
|| (TREE_INT_CST_HIGH (low)
+ (TREE_INT_CST_LOW (high) == 0)
!= TREE_INT_CST_HIGH (high)))
if (wi::add (low, 1) != high)
break;
}
if (i == len)


@@ -1067,8 +1067,7 @@ Gcc_backend::type_size(Btype* btype)
if (t == error_mark_node)
return 1;
t = TYPE_SIZE_UNIT(t);
gcc_assert(TREE_CODE(t) == INTEGER_CST);
gcc_assert(TREE_INT_CST_HIGH(t) == 0);
gcc_assert(tree_fits_uhwi_p (t));
unsigned HOST_WIDE_INT val_wide = TREE_INT_CST_LOW(t);
size_t ret = static_cast<size_t>(val_wide);
gcc_assert(ret == val_wide);


@@ -36,6 +36,7 @@ along with GCC; see the file COPYING3.  If not see
#include "pointer-set.h"
#include "obstack.h"
#include "debug.h"
#include "wide-int-print.h"
/* We dump this information from the debug hooks. This gives us a
stable and maintainable API to hook into. In order to work
@@ -961,7 +962,7 @@ go_output_typedef (struct godump_container *container, tree decl)
const char *name;
struct macro_hash_value *mhval;
void **slot;
char buf[100];
char buf[WIDE_INT_PRINT_BUFFER_SIZE];
name = IDENTIFIER_POINTER (TREE_PURPOSE (element));
@@ -982,10 +983,7 @@ go_output_typedef (struct godump_container *container, tree decl)
snprintf (buf, sizeof buf, HOST_WIDE_INT_PRINT_UNSIGNED,
tree_to_uhwi (TREE_VALUE (element)));
else
snprintf (buf, sizeof buf, HOST_WIDE_INT_PRINT_DOUBLE_HEX,
((unsigned HOST_WIDE_INT)
TREE_INT_CST_HIGH (TREE_VALUE (element))),
TREE_INT_CST_LOW (TREE_VALUE (element)));
print_hex (element, buf);
mhval->value = xstrdup (buf);
*slot = mhval;


@@ -75,14 +75,13 @@ gmp_cst_to_tree (tree type, mpz_t val)
{
tree t = type ? type : integer_type_node;
mpz_t tmp;
double_int di;
mpz_init (tmp);
mpz_set (tmp, val);
di = mpz_get_double_int (t, tmp, true);
wide_int wi = wi::from_mpz (t, tmp, true);
mpz_clear (tmp);
return double_int_to_tree (t, di);
return wide_int_to_tree (t, wi);
}
/* Sets RES to the min of V1 and V2. */


@@ -73,8 +73,7 @@ along with GCC; see the file COPYING3.  If not see
static inline void
tree_int_to_gmp (tree t, mpz_t res)
{
double_int di = tree_to_double_int (t);
mpz_set_double_int (res, di, TYPE_UNSIGNED (TREE_TYPE (t)));
wi::to_mpz (t, res, TYPE_SIGN (TREE_TYPE (t)));
}
/* Returns the index of the PHI argument defined in the outermost
@@ -1025,7 +1024,7 @@ build_loop_iteration_domains (scop_p scop, struct loop *loop,
/* loop_i <= expr_nb_iters */
else if (!chrec_contains_undetermined (nb_iters))
{
double_int nit;
widest_int nit;
isl_pw_aff *aff;
isl_set *valid;
isl_local_space *ls;
@@ -1061,7 +1060,7 @@ build_loop_iteration_domains (scop_p scop, struct loop *loop,
isl_constraint *c;
mpz_init (g);
mpz_set_double_int (g, nit, false);
wi::to_mpz (nit, g, SIGNED);
mpz_sub_ui (g, g, 1);
approx = extract_affine_gmp (g, isl_set_get_space (inner));
x = isl_pw_aff_ge_set (approx, aff);


@@ -332,7 +332,8 @@ hook_bool_rtx_int_int_int_intp_bool_false (rtx a ATTRIBUTE_UNUSED,
}
bool
hook_bool_dint_dint_uint_bool_true (double_int, double_int, unsigned int, bool)
hook_bool_wint_wint_uint_bool_true (const widest_int &, const widest_int &,
unsigned int, bool)
{
return true;
}


@@ -23,7 +23,7 @@
#define GCC_HOOKS_H
#include "machmode.h"
#include "double-int.h"
#include "wide-int.h"
extern bool hook_bool_void_false (void);
extern bool hook_bool_void_true (void);
@@ -61,7 +61,8 @@ extern bool hook_bool_rtx_int_int_int_intp_bool_false (rtx, int, int, int,
extern bool hook_bool_tree_tree_false (tree, tree);
extern bool hook_bool_tree_tree_true (tree, tree);
extern bool hook_bool_tree_bool_false (tree, bool);
extern bool hook_bool_dint_dint_uint_bool_true (double_int, double_int,
extern bool hook_bool_wint_wint_uint_bool_true (const widest_int &,
const widest_int &,
unsigned int, bool);
extern void hook_void_void (void);


@@ -239,12 +239,12 @@ ubsan_expand_si_overflow_addsub_check (tree_code code, gimple stmt)
;
else if (code == PLUS_EXPR && TREE_CODE (arg0) == SSA_NAME)
{
double_int arg0_min, arg0_max;
wide_int arg0_min, arg0_max;
if (get_range_info (arg0, &arg0_min, &arg0_max) == VR_RANGE)
{
if (!arg0_min.is_negative ())
if (!wi::neg_p (arg0_min, TYPE_SIGN (TREE_TYPE (arg0))))
pos_neg = 1;
else if (arg0_max.is_negative ())
else if (wi::neg_p (arg0_max, TYPE_SIGN (TREE_TYPE (arg0))))
pos_neg = 2;
}
if (pos_neg != 3)
@@ -256,12 +256,12 @@ ubsan_expand_si_overflow_addsub_check (tree_code code, gimple stmt)
}
if (pos_neg == 3 && !CONST_INT_P (op1) && TREE_CODE (arg1) == SSA_NAME)
{
double_int arg1_min, arg1_max;
wide_int arg1_min, arg1_max;
if (get_range_info (arg1, &arg1_min, &arg1_max) == VR_RANGE)
{
if (!arg1_min.is_negative ())
if (!wi::neg_p (arg1_min, TYPE_SIGN (TREE_TYPE (arg1))))
pos_neg = 1;
else if (arg1_max.is_negative ())
else if (wi::neg_p (arg1_max, TYPE_SIGN (TREE_TYPE (arg1))))
pos_neg = 2;
}
}
@@ -478,7 +478,7 @@ ubsan_expand_si_overflow_mul_check (gimple stmt)
rtx do_overflow = gen_label_rtx ();
rtx hipart_different = gen_label_rtx ();
int hprec = GET_MODE_PRECISION (hmode);
unsigned int hprec = GET_MODE_PRECISION (hmode);
rtx hipart0 = expand_shift (RSHIFT_EXPR, mode, op0, hprec,
NULL_RTX, 0);
hipart0 = gen_lowpart (hmode, hipart0);
@@ -510,37 +510,35 @@ ubsan_expand_si_overflow_mul_check (gimple stmt)
if (TREE_CODE (arg0) == SSA_NAME)
{
double_int arg0_min, arg0_max;
wide_int arg0_min, arg0_max;
if (get_range_info (arg0, &arg0_min, &arg0_max) == VR_RANGE)
{
if (arg0_max.sle (double_int::max_value (hprec, false))
&& double_int::min_value (hprec, false).sle (arg0_min))
unsigned int mprec0 = wi::min_precision (arg0_min, SIGNED);
unsigned int mprec1 = wi::min_precision (arg0_max, SIGNED);
if (mprec0 <= hprec && mprec1 <= hprec)
op0_small_p = true;
else if (arg0_max.sle (double_int::max_value (hprec, true))
&& (~double_int::max_value (hprec,
true)).sle (arg0_min))
else if (mprec0 <= hprec + 1 && mprec1 <= hprec + 1)
op0_medium_p = true;
if (!arg0_min.is_negative ())
if (!wi::neg_p (arg0_min, TYPE_SIGN (TREE_TYPE (arg0))))
op0_sign = 0;
else if (arg0_max.is_negative ())
else if (wi::neg_p (arg0_max, TYPE_SIGN (TREE_TYPE (arg0))))
op0_sign = -1;
}
}
if (TREE_CODE (arg1) == SSA_NAME)
{
double_int arg1_min, arg1_max;
wide_int arg1_min, arg1_max;
if (get_range_info (arg1, &arg1_min, &arg1_max) == VR_RANGE)
{
if (arg1_max.sle (double_int::max_value (hprec, false))
&& double_int::min_value (hprec, false).sle (arg1_min))
unsigned int mprec0 = wi::min_precision (arg1_min, SIGNED);
unsigned int mprec1 = wi::min_precision (arg1_max, SIGNED);
if (mprec0 <= hprec && mprec1 <= hprec)
op1_small_p = true;
else if (arg1_max.sle (double_int::max_value (hprec, true))
&& (~double_int::max_value (hprec,
true)).sle (arg1_min))
else if (mprec0 <= hprec + 1 && mprec1 <= hprec + 1)
op1_medium_p = true;
if (!arg1_min.is_negative ())
if (!wi::neg_p (arg1_min, TYPE_SIGN (TREE_TYPE (arg1))))
op1_sign = 0;
else if (arg1_max.is_negative ())
else if (wi::neg_p (arg1_max, TYPE_SIGN (TREE_TYPE (arg1))))
op1_sign = -1;
}
}


@@ -1362,7 +1362,7 @@ get_polymorphic_call_info (tree fndecl,
{
base_pointer = TREE_OPERAND (base, 0);
context->offset
+= offset2 + mem_ref_offset (base).low * BITS_PER_UNIT;
+= offset2 + mem_ref_offset (base).to_short_addr () * BITS_PER_UNIT;
context->outer_type = NULL;
}
/* We found base object. In this case the outer_type


@@ -1099,7 +1099,7 @@ compute_complex_assign_jump_func (struct ipa_node_params *info,
|| max_size == -1
|| max_size != size)
return;
offset += mem_ref_offset (base).low * BITS_PER_UNIT;
offset += mem_ref_offset (base).to_short_addr () * BITS_PER_UNIT;
ssa = TREE_OPERAND (base, 0);
if (TREE_CODE (ssa) != SSA_NAME
|| !SSA_NAME_IS_DEFAULT_DEF (ssa)
@@ -1159,7 +1159,7 @@ get_ancestor_addr_info (gimple assign, tree *obj_p, HOST_WIDE_INT *offset)
|| TREE_CODE (SSA_NAME_VAR (parm)) != PARM_DECL)
return NULL_TREE;
*offset += mem_ref_offset (expr).low * BITS_PER_UNIT;
*offset += mem_ref_offset (expr).to_short_addr () * BITS_PER_UNIT;
*obj_p = obj;
return expr;
}
@@ -3787,8 +3787,7 @@ ipa_modify_call_arguments (struct cgraph_edge *cs, gimple stmt,
if (TYPE_ALIGN (type) > align)
align = TYPE_ALIGN (type);
}
misalign += (tree_to_double_int (off)
.sext (TYPE_PRECISION (TREE_TYPE (off))).low
misalign += (offset_int::from (off, SIGNED).to_short_addr ()
* BITS_PER_UNIT);
misalign = misalign & (align - 1);
if (misalign != 0)
@@ -3994,7 +3993,7 @@ ipa_get_adjustment_candidate (tree **expr, bool *convert,
if (TREE_CODE (base) == MEM_REF)
{
offset += mem_ref_offset (base).low * BITS_PER_UNIT;
offset += mem_ref_offset (base).to_short_addr () * BITS_PER_UNIT;
base = TREE_OPERAND (base, 0);
}


@@ -32,8 +32,9 @@ The Free Software Foundation is independent of Sun Microsystems, Inc. */
#include "java-tree.h"
#include "parse.h"
#include "diagnostic-core.h"
#include "wide-int.h"
static void mark_reference_fields (tree, double_int *, unsigned int,
static void mark_reference_fields (tree, wide_int *, unsigned int,
int *, int *, int *, HOST_WIDE_INT *);
/* A procedure-based object descriptor. We know that our
@@ -47,7 +48,7 @@ static void mark_reference_fields (tree, double_int *, unsigned int,
/* Recursively mark reference fields. */
static void
mark_reference_fields (tree field,
double_int *mask,
wide_int *mask,
unsigned int ubit,
int *pointer_after_end,
int *all_bits_set,
@@ -107,7 +108,7 @@ mark_reference_fields (tree field,
bits for all words in the record. This is conservative, but the
size_words != 1 case is impossible in regular java code. */
for (i = 0; i < size_words; ++i)
*mask = (*mask).set_bit (ubit - count - i - 1);
*mask = wi::set_bit (*mask, ubit - count - i - 1);
if (count >= ubit - 2)
*pointer_after_end = 1;
@@ -136,16 +137,15 @@ get_boehm_type_descriptor (tree type)
int last_set_index = 0;
HOST_WIDE_INT last_view_index = -1;
int pointer_after_end = 0;
double_int mask;
tree field, value, value_type;
mask = double_int_zero;
/* If the GC wasn't requested, just use a null pointer. */
if (! flag_use_boehm_gc)
return null_pointer_node;
value_type = java_type_for_mode (ptr_mode, 1);
wide_int mask = wi::zero (TYPE_PRECISION (value_type));
/* If we have a type of unknown size, use a proc. */
if (int_size_in_bytes (type) == -1)
goto procedure_object_descriptor;
@@ -194,22 +194,22 @@ get_boehm_type_descriptor (tree type)
that we don't have to emit reflection data for run time
marking. */
count = 0;
mask = double_int_zero;
mask = wi::zero (TYPE_PRECISION (value_type));
++last_set_index;
while (last_set_index)
{
if ((last_set_index & 1))
mask = mask.set_bit (log2_size + count);
mask = wi::set_bit (mask, log2_size + count);
last_set_index >>= 1;
++count;
}
value = double_int_to_tree (value_type, mask);
value = wide_int_to_tree (value_type, mask);
}
else if (! pointer_after_end)
{
/* Bottom two bits for bitmap mark type are 01. */
mask = mask.set_bit (0);
value = double_int_to_tree (value_type, mask);
mask = wi::set_bit (mask, 0);
value = wide_int_to_tree (value_type, mask);
}
else
{
