[PyOpenGL-Users] Newbie question - multiplying gl matrices element-wise
From: Brendan S. <bre...@ya...> - 2008-06-02 00:33:14
Hi all,

Sorry for the basic level of this question; I'm absolutely new to GL programming. I'm trying to implement a very simple demosaicing algorithm on the graphics card, in order to render data from a really high-res Bayer-filtered CCD sensor to the screen in real time. I need to be able to do the following:

1) Given a "raw" image, copy the data (a matrix of unsigned bytes) to video memory three times.

2) Multiply each copy element-wise by a precomputed "mask" of 1s and 0s, so that the first copy keeps only the red values (all others become 0), the second only the green, and the third only the blue.

3) Interpolate the missing colour info. This step is different for green than for red/blue, but it goes something like this: take a crop of the green matrix exactly one row and one column narrower on all sides. Now add to that copy (again element-wise) another crop of the green matrix, this time shifted up one row. Then another shifted down one row, another shifted left one column, and one more shifted right one column. Divide the total (element-wise) by 4.

4) Copy the interpolated red, green and blue matrices to an OpenGL texture, map the texture to a 2D square, and render the square to the screen (resizing / rotating / flipping as desired).

I can figure out steps 1 and 4. Ideally steps 2 and 3 would be done in video memory, using OpenGL functions, but I can't find anything in the OpenGL stack that does element-wise multiplication, addition, or cropping of matrices. Can anyone point me in the right direction?

Thanks,
Brendan

--
Brendan Simons
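For reference, here is a CPU-side sketch of what I mean by steps 2 and 3, written with NumPy (the function names, the RGGB layout, and the "keep measured greens, fill the gaps with the 4-neighbour average" detail are my assumptions, not anything GL-specific):

```python
import numpy as np

def bayer_masks(shape):
    """Step 2: build 1/0 masks for each colour channel of a Bayer
    mosaic (assuming an RGGB layout; other layouts shift the offsets)."""
    r = np.zeros(shape, dtype=np.uint8)
    g = np.zeros(shape, dtype=np.uint8)
    b = np.zeros(shape, dtype=np.uint8)
    r[0::2, 0::2] = 1   # red on even rows, even columns
    g[0::2, 1::2] = 1   # green on even rows, odd columns...
    g[1::2, 0::2] = 1   # ...and on odd rows, even columns
    b[1::2, 1::2] = 1   # blue on odd rows, odd columns
    return r, g, b

def interpolate_green(raw, g_mask):
    """Step 3 for green: add four shifted crops of the masked green
    plane and divide by 4.  At a green site the four neighbours are
    all zero under the mask, so here the measured value is kept and
    only the gaps are filled with the neighbour average."""
    g = (raw * g_mask).astype(np.float32)
    total = g[0:-2, 1:-1].copy()   # crop shifted up one row
    total += g[2:, 1:-1]           # shifted down one row
    total += g[1:-1, 0:-2]         # shifted left one column
    total += g[1:-1, 2:]           # shifted right one column
    out = g.copy()
    gaps = g_mask[1:-1, 1:-1] == 0     # interior non-green sites
    out[1:-1, 1:-1][gaps] = total[gaps] / 4.0
    return out
```

On a checkerboard green pattern every interior gap has four green neighbours, so dividing by 4 is exact there; the one-pixel border is left alone, which matches the "one row and one column narrower" cropping above. What I'm after is the equivalent of these slice-shift-add operations done on the card.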