Over on the AtariAge forums, Lucien2 is finding all sorts of neat stuff that's broken.
Here's the latest:
int test()
{
    return( (*(char*)0x837C) & 31);
}
This gets converted to:
test
        movb @>837C, r1
        andi r1, >1F
        b *r11
GCC is incorrectly assuming that the byte value ends up in the low-order byte of R1, but on the TMS9900 a MOVB into a register puts it in the high-order byte. The cast's conversion to int is being optimized away, so the ANDI masks the wrong half of the register and we get bad code.
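For comparison, here's a rough sketch of what sane output for that function might look like, with the byte shifted down into the low half of R1 before the mask (the register use and formatting here are just my guess, not real compiler output):

test
        movb @>837C, r1     byte lands in the high half of r1
        srl  r1, 8          shift it down, zero-filling the top
        andi r1, >1F        now the mask hits the right bits
        b    *r11           result returned in r1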
I've changed all the constant address code to use hex values instead of decimal. It looks much better now.
To keep the output consistent, I should convert the shift-by-eight counts to hex as well.
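Just to illustrate the difference (this line is made up for the example, not actual compiler output): an address reference that used to print something like

        movb @33660, r1

now prints as

        movb @>837C, r1

which is a lot easier to read against a memory map.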
Another odd thing:
int test()
{
    return( (*(char*)0x837C) & (*(char*)0x820C));
}
Gets converted to this:
test
        inv @>837C
        movb @>820C, r1
        szcb @>837C, r1
Technically, we do need an inverted copy of one operand before using SZCB, but the compiler shouldn't get it by inverting memory: that INV rewrites the word at >837C, so evaluating a read-only expression now has a side effect. The invert should be done in a register.
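Here's a sketch of the kind of code I'd rather see, with the invert done on a copy in a scratch register (the register choice is just for illustration, and this only addresses the memory write; the byte-in-the-high-half problem from the first example is a separate issue):

test
        movb @>837C, r2     copy the first byte into a scratch register
        inv  r2             invert the copy; memory at >837C is untouched
        movb @>820C, r1     load the second byte
        szcb r2, r1         r1 = r1 AND NOT r2, which is the AND of the two bytes
        b    *r11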