[Gc] Re: libatomic_ops aarch64 support
ivmai at mail.ru
Mon Mar 4 13:12:17 PST 2013
I've applied your patch (to the add-aarch64-support branch) plus some minor changes of mine. Please retest.
* Could we discard stxp in double_load, as is done for 32-bit ARM?
* The clobber lists of all the asm statements are empty; is that OK?
Thursday, 28 February 2013, 22:57 +01:00, from Yvan Roux <yvan.roux at linaro.org>:
>I finally fixed the double_[load|store|compare_and_swap] AArch64
>support. I defined double_ptr_storage as an unsigned __int128 and used
>the load and store exclusive pair of registers instructions of the A64
>ISA. The testsuite is now fine (note that the failures with stack
>and malloc were due to the previous compare_and_swap implementation). I
>kept the generic implementations guarded by an ifndef, but maybe they
>could be moved into something like a double_generic.h.
>On 15 February 2013 15:51, Yvan Roux <yvan.roux at linaro.org> wrote:
>> the native build is fine after the small fix below to your last commit
>> (the issue I encountered was on my side); we just have to fix the
>> correctness ;)
>> return (int)__atomic_compare_exchange_n(&addr->AO_whole,
>> - &old_val->AO_whole /* p_expected */,
>> + &old_val.AO_whole /* p_expected */,