On Thu, 31 Aug 2000, Linda Walsh wrote:
> > It is probably from reasoning of:
> >
> > "there is really no point in it, as on 32-bit systems
> > int and long are the same size, so the same limit comes
> > with both types."
> >
> > On 64-bit machines there is, of course, a definite difference.
> ---
> There ya go again -- confusing the issue with facts. Compare
> and contrast the appropriate 32- and 64-bit values, using a 'long long'
> on ia32 vs. an int for comparison.
Just put this in a loop and time it, then change SIZE to long long and do
it again (a rough timing harness follows the function below). It doesn't
scale well: the long long code is nearly 10 times slower! You can do
`gcc -S -o xxx name.c` and look at the generated assembly to see why.
#define SIZE long

SIZE *foo()
{
        SIZE one;
        SIZE two;
        static SIZE answer[8];  /* static, so the pointer stays valid after return */

        one = 1;
        two = 2;
        answer[0] = one - two;
        answer[1] = two - one;
        answer[2] = one + two;
        answer[3] = two + one;
        answer[4] = two / one;
        answer[5] = one / two;
        answer[6] = one * two;
        answer[7] = two * one;
        return answer;
}
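
For reference, a minimal sketch of what "put this in a loop and time it"
might look like, appended to the same file after foo() above. The
iteration count and the use of clock() are my own arbitrary choices for
illustration; real numbers depend on compiler flags and CPU.

        #include <stdio.h>
        #include <time.h>

        int main(void)
        {
                const long iterations = 10000000L;
                volatile SIZE sink;     /* keep the compiler from dropping the calls */
                clock_t start, stop;
                long i;

                start = clock();
                for (i = 0; i < iterations; i++)
                        sink = foo()[0];
                stop = clock();

                printf("%ld calls took %.2f seconds\n", iterations,
                       (double)(stop - start) / CLOCKS_PER_SEC);
                return 0;
        }

Build it once with `#define SIZE long` and once with `#define SIZE long long`
and compare the times.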
Long long things, even if they work well, are not very nice on 32-bit
machines. For the time being, I'd advise increasing the cluster size rather
than using 64-bit values.
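
For what it's worth, here is a back-of-the-envelope sketch of why a bigger
cluster size buys headroom without 64-bit math in the fast path: a 32-bit
block number can address 2^32 clusters, so the reachable size scales
linearly with the cluster size. The cluster sizes below are illustrative
only, not tied to any particular filesystem, and the long long here is just
host-side arithmetic for printing the table.

        #include <stdio.h>

        int main(void)
        {
                /* 2^32 addressable clusters with a 32-bit block number */
                unsigned long long clusters = 1ULL << 32;
                unsigned int size[] = { 1024, 4096, 16384, 65536 }; /* bytes per cluster */
                int i;

                for (i = 0; i < 4; i++)
                        printf("%6u-byte clusters -> %8llu GB addressable\n",
                               size[i],
                               clusters * size[i] / (1024ULL * 1024 * 1024));
                return 0;
        }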
In a few years, even Intel machines will be 64 bits. int will still be the
same size as long, but it will be 64 bits, and they will autoscale for
backwards compatibility (one new instruction, used once, for operand size).
Cheers,
Dick Johnson
Penguin : Linux version 2.2.15 on an i686 machine (797.90 BogoMips).
"Memory is like gasoline. You use it up when you are running. Of
course you get it all back when you reboot..."; Actual explanation
obtained from the Micro$oft help desk.