From: Brian P. <bri...@tu...> - 2007-10-09 21:37:41
Jonathan Richard wrote:
> On 10/9/07, Brian Paul <bri...@tu...> wrote:
> > Jonathan Richard wrote:
> > > Hi, I use Mesa3D for 16-bit offscreen rendering and it seems to be
> > > working well. Unfortunately, I also need to display a 3D model on the
> > > screen. However, since I linked with the 16-bits-per-channel version of
> > > opengl32.dll, the colors that are displayed are not the right ones.
> > > How is the conversion from 16 bits to 8 bits performed? What can I do
> > > to get the right 8-bit colors? Thanks.
> >
> > If Mesa is compiled for deep (16/32-bit) color channels and it sees that
> > you're rendering into a shallower renderbuffer, it should down-convert
> > the colors as needed. This would take place in the convert_color_type()
> > function in src/mesa/swrast/s_span.c.
> >
> > Perhaps you can use your debugger to see if that function is getting
> > called.
> >
> > -Brian
>
> Hi Brian, I forgot to say that I'm developing on MS Windows. Does that
> change your answer? You should take a look at this message written by
> Karl Schultz:
>
> "The current GDI driver expects only 8-bit-per-channel data from Mesa.
> You would have to modify the GDI driver to convert the 16-bit-per-channel
> image generated by Mesa to 8 bits per channel, keeping the upper 8 bits.
> This should not be too hard to do. If you look in drivers/windows/gdi,
> I think most of the code is in wmesa.c. There is probably a call to a
> BitBlt function. Just before that call, you would walk through the
> source pixel array and convert it in place to 8 bits per channel.
> Hopefully there is a data structure someplace where you can determine
> the component size and decide when it is appropriate to do this."
>
> Is he right?

He was right, as of about 6-12 months ago. At some point I added the
convert_color_type() code to handle the situation you describe. Karl
may not be aware of it.

-Brian