From: luc-eric r. <lu...@ho...> - 2000-10-26 03:08:42
Hello,

For research, I would like to modify Mesa to support 16-bit-per-channel textures, as well as higher-precision render buffers, either 16-bit or single-precision floating point.

The goal here is to use Mesa to render (non-interactively, in software) high-precision 3D images made from 16-bit texture sources, without loss of quality. The current precision of the vertex coordinates is fine; it's the precision of the shading and the alpha blending that matters. The rendered images are not meant for PC monitors (which are only 8 bits per channel at most).

My question is: how much trouble would it be? Of course, I've already looked at the source, but I would like to get an idea of how much of the code I would have to change, and whether there are roadblocks I would hit. I should also point out that I'm lazy :^)
From: Brian P. <br...@va...> - 2000-10-26 03:24:39
luc-eric rousseau wrote:
> Hello,
> For research, I would like to modify Mesa to support 16-bit-per-channel
> textures, as well as higher-precision render buffers, either 16-bit or
> single-precision floating point.
> [...]
> My question is: how much trouble would it be?

Unfortunately, it will be a lot of work. It's something that I've had in the back of my head for a couple of years now.

If you look in src/config.h and src/types.h you'll see a #define for CHAN_BITS and a GLchan datatype. The idea is that you could #define CHAN_BITS to 16, GLchan would become a GLushort, and all the internal pixel paths would have 16 bits of precision per color channel.

Changing all occurrences of GLubyte and the magic number 255 into GLchan and (1 << CHAN_BITS) - 1 is a step in the right direction, but I'm sure there are many more issues to fix. It would probably require a line-by-line audit of the entire code base to get it done properly.

-Brian
From: Allen A. <ak...@po...> - 2000-10-26 06:17:46
On Wed, Oct 25, 2000 at 11:08:37PM -0400, luc-eric rousseau wrote:
| ... and if there are roadblocks I would hit. ...

One thing that I would expect to cause trouble is the precision with which you'll need to represent intermediate results when interpolating vertex attributes. 32-bit fixed point is marginal for 16-bit interpolants. You might need to go to 64-bit fixed point or double-precision float. Check out the math in tritemp.h.

Allen
From: Keith W. <ke...@va...> - 2000-10-26 16:10:13
luc-eric rousseau wrote:
> Hello,
> For research, I would like to modify Mesa to support 16-bit-per-channel
> textures, as well as higher-precision render buffers, either 16-bit or
> single-precision floating point.
> [...]

It will certainly be trouble, but a lot of people are interested in seeing it happen. I'm going to be doing some reorganization work in the 3.5 branch over the next couple of weeks, and one thing I'll be doing is isolating the software rasterizer in its own directory. After that has been done, it should be a little clearer which code needs to be touched to achieve this goal.

Keith
From: Brian P. <br...@va...> - 2001-03-08 15:38:21
Back in October, luc-eric rousseau wrote:
> Hello,
> For research, I would like to modify Mesa to support 16-bit-per-channel
> textures, as well as higher-precision render buffers, either 16-bit or
> single-precision floating point.
> [...]
> My question is: how much trouble would it be?

I've got some good news for you. Mesa now has support for 16-bit color channels. There are probably a number of bugs to fix yet, but I ran some tests last night and it seems to work. I haven't exercised texturing yet, so there could be problems there.

If you want to play with it, here's what you have to do:

1. Get the latest Mesa CVS trunk code.
2. Edit src/config.h and change CHAN_BITS to 16.
3. Edit src/Makefile.X11 and remove DRIVER_SOURCES from the OBJECTS assignment.
4. make -f Makefile.X11 linux-debug (this'll help if you find bugs)
5. Mesa/tests/osdemo16.c is a modified version of the osdemo.c program which renders a 16-bit/channel image and writes it as an 8-bit/channel Targa file (dropping the least significant bits).

Only the OSMesa interface supports 16-bit channels at this time.

Let me know how it works if you try it.

-Brian