Re: [ooc-compiler] Implementation of SYSTEM.LSH
From: Stewart G. <sgr...@ii...> - 2005-12-05 04:13:06
Hi August,

The oo2c ASH/LSH implementation (in __oo2c.h) looks like this:

    /* ASH(x,n) */
    #define _ashl(_x,_n) (_x << _n)
    #define _ashr(_x,_n) (_x >> _n) | ((_x >= 0) ? 0 : ~(~(OOC_INT32)0 >> _n))
    #define _ash(_x,_n) (_n >= 0) ? _ashl(_x,_n) : _ashr(_x,- _n)

    /* SYSTEM.LSH(x,n) */
    #define _lshl(_x,_n,_type) ((_type)(_x << _n))
    #define _lshr(_x,_n,_type) ((_type)(_x >> _n))
    #define _lsh(_type,_x,_n) ((_n >= 0) ? _lshl(_x,_n,_type) : _lshr(_x,- _n,_type))

For LSH, it just uses the shift operator. For ASH, it explicitly adds the sign extension bits. According to my "C" definition (Harbison & Steele, 1984), the C shift operator does an arithmetic shift for signed integer types and a logical shift (i.e. fill with zero) for unsigned integer types. So there are a few problems with the above implementation:

1) _ashr will work for signed integers, but for those the sign test is redundant (>> already does the right thing). For unsigned integers, the sign test always fails (i.e. they are all non-negative), so the sign extension is never done. I suspect it also fails for negative 64-bit integers, since the sign extension bits will be in the wrong place.

2) _lshr will only work correctly for unsigned types. For signed types, it does an arithmetic shift.

Ideally, what we need is something like this:

    /* ASH(x,n) */
    #define _ashl(_x,_n) (ST(_x) << _n)
    #define _ashr(_x,_n) (ST(_x) >> _n)
    #define _ash(_x,_n) (_n >= 0) ? _ashl(_x,_n) : _ashr(_x,- _n)

    /* SYSTEM.LSH(x,n) */
    #define _lshl(_x,_n,_type) ((_type)(UT(_x) << _n))
    #define _lshr(_x,_n,_type) ((_type)(UT(_x) >> _n))
    #define _lsh(_type,_x,_n) ((_n >= 0) ? _lshl(_x,_n,_type) : _lshr(_x,- _n,_type))

where ST(_x) and UT(_x) cast _x to the signed and unsigned (respectively) type of the same size. Unfortunately, "C" doesn't provide such an operator, so to make this work OOC would have to pass the type as an argument to the ASH/LSH function. One option (i.e.
hack) would be to do:

    #define ST(_x) ((signed) _x)
    #define UT(_x) ((unsigned) _x)

but that would only work properly for arguments up to the size of the host "int" type (probably 32 bits). Another option is to cast integers to the corresponding CHAR type before doing LSH. This also limits you to 32 bits, since there is no 64-bit CHAR in OOC.

Cheers,
Stewart

August Karlstrom wrote:
> Hi,
>
> For integers, SYSTEM.LSH seems to be implemented as ASH, e.g. the result
> of LSH(-1, -1) is -1 rather than MAX(SHORTINT). Why?
>
> Here is what I expect:
>
> LSH(-1, -1): 1111 1111 -> 0111 1111
> ASH(-1, -1): 1111 1111 -> 1111 1111
>
> Regards,
>
> August
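[Editor's note: a minimal sketch of the fix Stewart describes, using C99 stdint.h fixed-width types in place of the ST()/UT() macros. The names ash8/lsh8 are invented here for an 8-bit (SHORTINT-sized) illustration; they are not part of oo2c.]

```c
#include <stdint.h>

/* Hypothetical 8-bit ASH: a right shift is done on the value as a
 * signed type, so it sign-extends (strictly implementation-defined in
 * ISO C, but arithmetic on the usual compilers, as Harbison & Steele
 * describe). The left shift goes through the unsigned type to avoid
 * undefined behaviour on negative operands. */
static int8_t ash8(int8_t x, int n)
{
    return n >= 0 ? (int8_t)((uint8_t)x << n)
                  : (int8_t)(x >> -n);
}

/* Hypothetical 8-bit LSH: a right shift goes through the unsigned type
 * of the same width, so it fills with zeros. */
static int8_t lsh8(int8_t x, int n)
{
    return n >= 0 ? (int8_t)((uint8_t)x << n)
                  : (int8_t)((uint8_t)x >> -n);
}

/* With these, August's example comes out as expected:
 * lsh8(-1, -1) == 127  (0xFF >> 1 = 0x7F = MAX(SHORTINT))
 * ash8(-1, -1) == -1   (sign bits shifted in) */
```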