Hi,
A few mins ago I was reading some reviews of Andre's new book on Amazon. Some guy was denouncing all of Andre's claims, one of them being that 'bit shifting is faster than multiplying'. After years of being told to _always_ bit shift, I laughed.
I thought I'd check it out though, having never actually compared normal multiplication and bit shifting myself; I'd just taken it on faith that shifting is faster... anyway.
I used the following code for bit shifting:

DWORD temp;
for (DWORD i = 0; i < 10000000; i++)
    temp = 6545345 << 1;

And this for normal multiplication:

for (DWORD i = 0; i < 10000000; i++)
    temp = 6545345 * 2;
10,000,000 seemed a large enough number of iterations.
Upon checking the frame rates, I got 3.15 fps for the standard multiplication and about 3.12 fps for the bit shifting... HMM.
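Thinking about it, frame rate is a pretty indirect way to measure this; timing the loops directly would probably say more. Something like the below is what I'd try, using clock() from the standard library (just a rough harness of my own, with unsigned long standing in for DWORD so it builds outside Windows):

#include <stdio.h>
#include <time.h>

int main(void)
{
    /* volatile so the stores aren't optimized away entirely */
    volatile unsigned long temp;
    unsigned long i;
    clock_t t0, t1, t2;

    t0 = clock();
    for (i = 0; i < 10000000UL; i++)
        temp = 6545345 << 1;    /* constant expression */
    t1 = clock();
    for (i = 0; i < 10000000UL; i++)
        temp = 6545345 * 2;     /* constant expression */
    t2 = clock();

    printf("shift:    %.3f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);
    printf("multiply: %.3f s\n", (double)(t2 - t1) / CLOCKS_PER_SEC);
    return 0;
}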
Is this just me? Bear in mind I'm on a lowly P200 MMX, hence the slow fps.
But how come bit shifting actually came out slower than standard multiplication?
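One thought: since both operands are constants, the compiler is probably folding 6545345 << 1 and 6545345 * 2 down to the same value at compile time, so both loops may be measuring nothing but loop overhead. If that's right, the only way to get a real shift or multiply each pass is to keep the operand out of the compiler's reach. Here's a sketch of what I mean (volatile is my guess at how to defeat the optimizer, and even then the compiler may strength-reduce the * 2 into a shift or add on its own):

#include <stdio.h>

/* volatile forces a memory read each pass instead of a folded constant */
static volatile unsigned long operand = 6545345;
static volatile unsigned long temp;

int main(void)
{
    unsigned long i;

    for (i = 0; i < 10000000UL; i++)
        temp = operand << 1;    /* a real shift every iteration */

    for (i = 0; i < 10000000UL; i++)
        temp = operand * 2;     /* may still compile to a shift or add */

    printf("%lu\n", temp);
    return 0;
}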
Is this all just me?
Comments/Advice/Flames... 
--
Russ