On Fri, Jun 4, 2010 at 1:14 PM, Ian Mallett <geometrian@gmail.com> wrote:
What is the context of this problem? 

It's a false-color display on top of grayscale images. We have a grayscale image and a separate image that contains a per-pixel segment index.  Then we have a colormap which maps segment indexes to RGBA values.  The shader looks the color up in the colormap and then blends it with the grayscale value.
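As a rough sketch of the idea (GLSL, with made-up sampler/uniform names, and simplifying by treating the index texture's two channels as the colormap coordinates directly):

    uniform sampler2D grayTex;     // grayscale image
    uniform sampler2D indexTex;    // per-pixel segment index
    uniform sampler2D colormapTex; // 256x256 colormap of RGBA values
    uniform float blendFactor;     // strength of the false-color overlay

    void main()
    {
        float gray = texture2D(grayTex, gl_TexCoord[0].st).r;
        // Simplification: use the index texture's two channels as the
        // colormap coordinates directly.
        vec2 idx = texture2D(indexTex, gl_TexCoord[0].st).rg;
        vec4 seg = texture2D(colormapTex, idx);
        gl_FragColor = mix(vec4(vec3(gray), 1.0), seg, blendFactor * seg.a);
    }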

Initially we used LUMINANCE_16 images, so only 16-bit indexes, and a fixed 256x256 colormap.  We moved to RGB to hold up to 24-bit indexes because some images are large and have more than 65k segments.
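Recovering the integer index from an RGB texel is just byte arithmetic, something like the following (the byte order here is illustrative, assuming red is the low byte and blue the high byte):

    // Sketch: rebuild a 24-bit index from an 8-bit-per-channel RGB sample.
    float decodeIndex(vec3 rgb)
    {
        return rgb.r * 255.0
             + rgb.g * 255.0 * 256.0
             + rgb.b * 255.0 * 65536.0;
    }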
 
A 3D texture would certainly work, but these suffer from memory issues, complexity, and graphics-cards-generally-not-liking-them. 

That's not very encouraging...  so are "texture arrays" even worse?
 
You might do better to "unroll" the 3D texture into a 2D texture, although again, complexity.

That's what we do now, and it is a little complex. The shader needs to know the shape/size of the texture and do a bit of math to convert the RGB index to a UV coordinate.  With a 3D texture it seems we could use the RGB index directly as the STR coordinates of a 256x256xN texture.  The texture would only need to be as deep as the size of the colormap requires.
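Roughly, the two lookups compare like this (a sketch, not our actual code; the names, the W x H unrolled layout, and the half-texel corrections are all illustrative):

    uniform sampler2D colormap2D; // colormap unrolled into a W x H 2D texture
    uniform sampler3D colormap3D; // 256 x 256 x N colormap
    uniform vec2  mapSize;        // (W, H) of the unrolled colormap
    uniform float mapDepth;       // N, the depth of the 3D colormap

    // 2D case: take the decoded integer index and unroll it into a UV pair.
    vec4 lookup2D(float index)
    {
        vec2 uv = vec2(mod(index, mapSize.x), floor(index / mapSize.x));
        return texture2D(colormap2D, (uv + 0.5) / mapSize);
    }

    // 3D case: the RGB bytes map almost directly onto STR.
    vec4 lookup3D(vec3 rgb)
    {
        // rgb * 255.0 recovers the byte values; + 0.5 hits texel centers.
        vec3 str = (rgb * 255.0 + 0.5) / vec3(256.0, 256.0, mapDepth);
        return texture3D(colormap3D, str);
    }

The 2D path works on the decoded integer index; the 3D path mostly just rescales the raw bytes, which is the appeal.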

So is it that older cards don't like 3D textures, or does it continue to be a problem even on newer hardware?  We have only two dozen users, mostly with newish Quadro 3800 cards.

-Philip