From: Ian R. <id...@us...> - 2003-10-28 00:37:13
Felix Kuehling wrote:
> Log message:
> Improved internal texture format selection in mga, r128, r200 and radeon:
> - try to choose a texture format with same color depths as application
>   supplied data in order to avoid loss of color information (mga, r200
>   and radeon only)
> - don't override application's choice of internal color depth unless
>   force16bpt is GL_TRUE. For now it is always GL_FALSE, but will be
>   configurable soon

I've always felt that the various ChooseTextureFormat routines could be
refactored into a single generic routine. The way you restructured the
code gave me an idea how to do it.

For a card that supports every possible texture format, we would have a
2D matrix. One axis is the internalFormat as a concrete type (e.g.,
GL_RGB8), not as a generic type (e.g., GL_RGB). The other axis is the
type as a packed format (e.g., GL_UNSIGNED_SHORT_5_6_5), which might be
derived from the application-supplied type / format. There are some
made-up types on that axis as well. For example, {GL_RGB,
GL_UNSIGNED_BYTE} maps to a nonexistent GL_UNSIGNED_24_RGB. Each cell in
the matrix is a MESA_FORMAT_ value that meets the requirements and has
the least conversion cost.

Obviously, no card supports all of the formats. Each cell then becomes a
sorted list of MESA_FORMAT_ values. The added input to the generic
function is a list (probably as a bit-mask) of the formats supported by
the hardware. The generic routine would run through the array in the
selected cell until it finds a format that is supported by the card.

The value of DRI_CONF_TEXTURE_DEPTH32 would change the way generic
internal formats are converted to concrete formats, and
DRI_CONF_TEXTURE_DEPTH_FORCE_16 would select a subset of the available
texture formats (i.e., a bit-mask that only had 16-bit formats).

The other missing piece is support in Mesa for the byte-swapped formats
and CI4 (which is supported by Rage128, i830, & MGA hardware).

Anyone have time to work on such a generic routine? :)
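To make the idea concrete, here is a rough sketch of what the generic
lookup could look like. All names below (FMT_*, choose_format,
rgb888_cell) are made up for illustration, not real Mesa identifiers;
the real version would use the MESA_FORMAT_ enums and live somewhere in
the common texture code.

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative stand-ins for a few MESA_FORMAT_ values. */
enum fmt {
    FMT_NONE = 0,   /* list terminator / "nothing supported" */
    FMT_ARGB8888,
    FMT_RGB888,
    FMT_RGB565,
    FMT_ARGB1555,
};

#define FMT_BIT(f) (1u << (f))

/* Example cell from the 2D matrix: candidates for the
 * {GL_RGB, GL_UNSIGNED_BYTE} entry, sorted by ascending conversion
 * cost and terminated by FMT_NONE. */
static const enum fmt rgb888_cell[] = {
    FMT_RGB888, FMT_ARGB8888, FMT_RGB565, FMT_NONE
};

/* Walk the cell's preference list and return the first format the
 * hardware supports, where hw_mask is the driver-supplied bit-mask
 * of supported formats. */
static enum fmt choose_format(const enum fmt *candidates, uint32_t hw_mask)
{
    for (size_t i = 0; candidates[i] != FMT_NONE; i++) {
        if (hw_mask & FMT_BIT(candidates[i]))
            return candidates[i];
    }
    return FMT_NONE;   /* caller must fall back / report an error */
}
```

Under this scheme, DRI_CONF_TEXTURE_DEPTH_FORCE_16 needs no special
logic in the lookup itself: the driver would simply AND its hw_mask with
a mask containing only the 16-bit formats before calling, so a 32-bit
card restricted to 16bpp would fall through FMT_RGB888 and FMT_ARGB8888
and land on FMT_RGB565.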