On Fri, Jan 27, 2012 at 10:06 AM, Howard <howard@renci.org> wrote:
On 1/27/12 3:39 AM, Ian Thomas wrote:
On 26 January 2012 19:36, Howard <howard@renci.org> wrote:
I'm rendering some images with about 3.5 million triangles into a 512x512 PNG file using tricontourf. I'm running this in a virtual machine, and I'm pretty sure that no graphics rendering hardware is being used. Is it possible, assuming the hardware were available, to make tricontourf use the rendering hardware? Will that happen by default?
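
(Not from the original message -- just a minimal sketch of the kind of call being described, using a small synthetic triangulation; the point counts, figure size, and file name are assumptions, and the Agg software backend stands in for the no-hardware VM setup.)

    import numpy as np
    import matplotlib
    matplotlib.use('Agg')            # pure-software rasterizer, no graphics hardware needed
    import matplotlib.pyplot as plt
    import matplotlib.tri as tri

    # Hypothetical unstructured point set and values to contour.
    np.random.seed(0)
    x = np.random.rand(10000)
    y = np.random.rand(10000)
    z = np.sin(4 * np.pi * x) * np.cos(4 * np.pi * y)

    triang = tri.Triangulation(x, y)                  # Delaunay triangulation of the points
    fig = plt.figure(figsize=(5.12, 5.12), dpi=100)   # 5.12 in * 100 dpi = 512 px
    ax = fig.add_subplot(111)
    ax.tricontourf(triang, z, 20)                     # filled contours over the triangulation
    ax.set_axis_off()
    fig.savefig('contours.png', dpi=100)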

You are correct: there is no graphics hardware rendering. Rendering is controlled by the various matplotlib backends, and to my knowledge no backends that use hardware rendering are currently available.
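
(Illustration, not part of the original reply: the backend is a software choice made before pyplot is imported, e.g. forcing the Agg CPU rasterizer; there is no hardware-accelerated backend to select here.)

    import matplotlib
    matplotlib.use('Agg')              # select the software (CPU) rasterizer explicitly
    import matplotlib.pyplot as plt    # must be imported after matplotlib.use()
    print(matplotlib.get_backend())    # reports the active backend, e.g. 'agg'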

There has been some work done on an OpenGL backend, but I am not sure of its status. The last time I checked it was pretty experimental. Perhaps someone involved with it can comment on where it currently stands.

Ian Thomas

Thanks very much for the reply. If it helps whoever is working on the OpenGL backend, I may be able to play with it a bit.


Howard


That would be the Glumpy project.

http://code.google.com/p/glumpy/

As stated in an email response a while back, glumpy is intended to be a testbed for developing the OpenGL backend for future inclusion into matplotlib.

Cheers!
Ben Root