From: Brian P. <br...@va...> - 2000-10-26 03:24:39
luc-eric rousseau wrote:
> Hello,
> For research, I would like to modify Mesa to support 16-bit-per-channel
> textures, as well as higher-precision render buffers, either 16-bit or
> single-precision floating point.
>
> The goal here is to use Mesa to render (non-interactively), in software,
> high-precision 3D images made from 16-bit texture sources, without loss
> of quality. The current precision of the vertex coordinates is fine; it's
> the precision of the shading and the alpha blending that matters. The
> rendered images are not meant for PC monitors (which are only 8 bits per
> channel at most).
>
> My question is: how much trouble would it be? Of course, I've already
> looked at the source, but I would like to get an idea of how much of the
> code I would have to change, and whether there are roadblocks I would hit.
> I should also point out that I'm lazy :^)

Unfortunately, it will be a lot of work. It's something that I've had in
the back of my head for a couple of years now.

If you look in src/config.h and src/types.h you'll see a #define for
CHAN_BITS and a GLchan datatype. The idea is that you could #define
CHAN_BITS to 16, GLchan would become a GLushort, and all the internal
pixel paths would have 16 bits of precision per color channel.

Changing all occurrences of GLubyte and the magic number 255 into GLchan
and (1 << CHAN_BITS) - 1 is a step in the right direction, but I'm sure
there are many more issues to fix. It would probably require a
line-by-line audit of the entire code base to get it done properly.

-Brian
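
For reference, a minimal sketch of the CHAN_BITS / GLchan arrangement
described above might look like the following. This is a hypothetical
layout, not the actual contents of Mesa's config.h/types.h; the CHAN_MAX
name is an assumed helper standing in for the (1 << CHAN_BITS) - 1 value,
and GLubyte/GLushort are taken to come from the usual GL headers.

    /* Hypothetical sketch only -- real Mesa headers may differ. */
    #define CHAN_BITS 16              /* 8 today; 16 for a high-precision build */

    #if CHAN_BITS == 8
    typedef GLubyte  GLchan;          /* channel range 0..255 */
    #elif CHAN_BITS == 16
    typedef GLushort GLchan;          /* channel range 0..65535 */
    #endif

    /* Assumed helper name for the maximum channel value. */
    #define CHAN_MAX ((1 << CHAN_BITS) - 1)

    /* Code that currently hard-codes GLubyte and 255 would then be
       rewritten in terms of GLchan and CHAN_MAX, e.g.: */
    GLchan r = CHAN_MAX;              /* instead of: GLubyte r = 255; */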