On plain Pentiums, shifting was 0.5 cycle in the U pipe only (so there was a 0.5 cycle latency if it ended up in the V pipe). Integer multiplications were around 10 cycles. Nowadays, even if the rules have changed, you shouldn't discard bitshifts:
 
1) you never know where your program will run in the end
2) using a bitshift for masks and similar things is conceptually clearer anyway. Keep your code clean!
3) you probably code in C/C++, and most of the time the compiler handles this for you anyway (*)
4) there's more to optimization than this :)
5) you can't do the same as SHLD / SHRD with muls, ah!
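On that last point, here is a minimal C sketch of what SHLD does: a double-precision shift that pulls the vacated bits from a second operand, which is exactly the kind of thing a single MUL can't replicate. The function name `shld32` is mine, just for illustration:

```c
#include <stdint.h>

/* Double-precision left shift, as performed by x86 SHLD:
 * shift `hi` left by n bits, filling the vacated low bits
 * with the top n bits of `lo`. n must be in [0, 31]. */
uint32_t shld32(uint32_t hi, uint32_t lo, unsigned n)
{
    if (n == 0)
        return hi;                      /* avoid the undefined shift by 32 */
    return (hi << n) | (lo >> (32 - n));
}
```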
 
(*) As a reminder, even "slow" things like:
 
    int Value;
    ...
    Value*=5;
 
...are cleverly compiled using the good old LEA instruction (that is: without shifts or muls). Nowadays I would leave all those things to the compiler.
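For the curious, the trick is that LEA can compute base + index*scale in a single instruction, so x*5 decomposes as x + x*4. A sketch of the same decomposition in portable C (the function name is mine; unsigned avoids the murky corners of shifting signed values):

```c
#include <stdint.h>

/* What a compiler typically emits for Value *= 5 is a single
 * lea eax, [eax+eax*4] -- i.e. x + (x << 2), no MUL needed. */
uint32_t times5(uint32_t x)
{
    return x + (x << 2);    /* x*4 + x == x*5 */
}
```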
 
 
Pierre Terdiman      *   Home: p.terdiman@wanadoo.fr
Coder in the dark    *   Zappy's Lair:  www.codercorner.com
 
 
----- Original Message -----
From: Alex Morano
To: gdalgorithms-list@lists.sourceforge.net
Sent: Saturday, December 23, 2000 9:38 PM
Subject: [Algorithms] Shifting or Multiplication....

    I was wondering if anyone could clear up something. Algorithmically, is bitshifting or using multiplication faster? Now, I know this sounds like a subjective question, as some will undoubtedly say it depends on the compiler and what it translates the code into. Let's take it down to the ASM level.
 
    Shifting, to me, looks like a 1-cycle operation.
 
    In this age, if the CPU can handle most MULs almost as rapidly, is there a place for bitshifts? If so, then for what kinds of algo applications? Do we still need them to mask off bits in pixel colors?
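On the pixel question: yes, shifts and masks are still the natural way to pull channels out of a packed pixel. A quick sketch, assuming a 32-bit 0xAARRGGBB layout (the layout and function names are just for illustration):

```c
#include <stdint.h>

/* Extract the 8-bit channels of a packed 0xAARRGGBB pixel
 * with shifts and masks -- a job no multiply can do. */
static uint8_t alpha(uint32_t p) { return (uint8_t)( p >> 24);         }
static uint8_t red  (uint32_t p) { return (uint8_t)((p >> 16) & 0xFF); }
static uint8_t green(uint32_t p) { return (uint8_t)((p >>  8) & 0xFF); }
static uint8_t blue (uint32_t p) { return (uint8_t)( p        & 0xFF); }
```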