From: Per P. <per...@ma...> - 2005-05-15 19:19:30
On May 14, 2005, at 23:05, Alan W. Irwin wrote:

> Here is a C language question: in case we want to change PL_FCI_MARK
> again, how can I specify our internal font codes to take this into
> account? For example, right now we have
>
>     const FCI_to_FontName_Table Type1Lookup[N_Type1Lookup] = {
>         {0x10000000, "Helvetica"},
>         {0x10000001, "Times-Roman"},
>         ...
>
> in include/plfci-type1.h. Can I use
>
>     {PL_FCI_MARK+0x0, "Helvetica"},
>     {PL_FCI_MARK+0x1, "Times-Roman"},
>
> etc.?

Yes, or even better IMHO, use the bitwise OR operator:

    {PL_FCI_MARK | 0x0, "Helvetica"},

> Or would it be better to use the following idea?
>
>     #define PL_FCI_MARK_HEX 0x1 (or whatever the case may be?)
>     #define PL_FCI_MARK PL_FCI_MARK_HEX0000000
>
> and
>
>     {PL_FCI_MARK_HEX0000000, "Helvetica"},
>     {PL_FCI_MARK_HEX0000001, "Times-Roman"},
>
> etc.?

No, no, no... please, my eyes! I'm going blind.... ;-)

Seriously, no. That is completely unreadable preprocessor abuse. A
common pattern when building an unsigned 32-bit value from a set of
options is something like

    #define PL_FCI_MARK   0x80000000
    #define FAMILY(a)     ((a) << 0)
    #define TRAIT(a)      ((a) << 8)
    #define MORE_OPTS(a)  ((a) << 16)

    typedef enum { SERIF = 0, SANS_SERIF, MONO } font_family;
    typedef enum { ROMAN = 0, ITALIC } font_trait;

    uint32_t my_fci = PL_FCI_MARK | FAMILY(SERIF) | TRAIT(ITALIC)
                      | MORE_OPTS(BOLD);

You'll notice that I didn't look up the actual definition of an FCI,
but I hope you get the essence of what I'm trying to convey.

/Per