From: Anton V. <vas...@ya...> - 2009-04-08 23:06:39
Wow Jeff! You saved me again! I remember looking at it last year and thinking it would be awesome if there were a Windows installer for it! I will install and play with it tonight! Thanks a lot!

Anton

________________________________
From: Jeff Whitaker <js...@fa...>
To: antonv <vas...@ya...>
Cc: mat...@li...
Sent: Wednesday, April 8, 2009 4:02:22 PM
Subject: Re: [Matplotlib-users] Computer specs for fast matplotlib and basemap processing

antonv wrote:
> I know that using the csv files is very slow, but I have no knowledge of
> working with the netcdf format and I was in a bit of a rush when I wrote
> this. I will take another look at it. How would you translate a grib into
> netcdf? Are there any specific applications, or is it done straight
> through numpy?
>
> As for pyngl, if I remember correctly I looked at it but it was not
> working on windows.
>
> Thanks,
> Anton

Anton:  If these are grib version 2 files, another option is
http://code.google.com/p/pygrib2.  I have made a windows installer.

-Jeff

> efiring wrote:
>> antonv wrote:
>>> I have a bit of experience programming and I am pretty sure I have my
>>> parts of the code pretty well optimized. I made sure that the loop
>>> contains only the stuff needed, and I load everything else beforehand.
>>>
>>> The biggest bottleneck is happening because I'm unpacking grib files
>>> to csv files using Degrib on the command line. That operation usually
>>> takes around half an
>> Instead of going to csv files--which are *very* inefficient to write,
>> store, and then read in again--why not convert directly to netcdf, and
>> then read your data in from netcdf as needed for plotting? I suspect
>> this will speed things up quite a bit. Numpy support for netcdf is very
>> good. Of course, direct numpy-enabled access to the grib files might be
>> even better, eliminating the translation phase entirely. Have you
>> looked into http://www.pyngl.ucar.edu/Nio.shtml?
>>
>> Eric
>>
>>> hour using no more than 50% of the processor, but it maxes out the
>>> memory usage, and it is definitely hard-drive intensive, as it ends up
>>> writing over 4 GB of data. I have also noticed that this runs faster
>>> on a lower-spec AMD desktop than on my P4 Intel laptop; my guess is
>>> that the laptop hdd is 5400 rpm and the desktop's is 7200 rpm.
>>>
>>> The next step is to take all those csv files and make images from
>>> them. For this one I haven't dug too deep to see what is happening,
>>> but it seems to be the other way around, using the cpu a lot more
>>> while keeping the memory usage high too.
>>>
>>> Thanks,
>>> Anton
>>
>> _______________________________________________
>> Matplotlib-users mailing list
>> Mat...@li...
>> https://lists.sourceforge.net/lists/listinfo/matplotlib-users

--
Jeffrey S. Whitaker         Phone  : (303)497-6313
Meteorologist               FAX    : (303)497-6449
NOAA/OAR/PSD  R/PSD1        Email  : Jef...@no...
325 Broadway                Office : Skaggs Research Cntr 1D-113
Boulder, CO, USA 80303-3328 Web    : http://tinyurl.com/5telg
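[Editor's note: Eric's suggestion in the thread above (convert once to netcdf, then read the data back as numpy arrays for plotting) can be sketched roughly as follows. This is an editorial illustration, not code from the thread: it uses scipy's `netcdf_file` reader/writer, and the file name `field.nc`, the variable name `temperature`, and the grid shape are all made up for the example.]

```python
import numpy as np
from scipy.io import netcdf_file

# Stand-in for one field decoded from a grib file (e.g. via Degrib or pygrib2).
field = np.random.rand(10, 20).astype(np.float32)

# Write the field once, with named dimensions, instead of dumping csv text.
with netcdf_file('field.nc', 'w') as nc:
    nc.createDimension('lat', field.shape[0])
    nc.createDimension('lon', field.shape[1])
    var = nc.createVariable('temperature', 'f', ('lat', 'lon'))
    var[:] = field

# Read it back as a plain numpy array, ready to hand to matplotlib/basemap.
# mmap=False so the data is fully loaded and the file can be closed cleanly.
with netcdf_file('field.nc', 'r', mmap=False) as nc:
    grid = nc.variables['temperature'][:].copy()
```

The point of the round trip is that the read side is a single binary slurp into a numpy array, with no text parsing in the plotting loop.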
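[Editor's note: to make Eric's point about csv overhead concrete, here is a small editorial illustration (again not code from the thread; the file names `grid.csv` and `grid.npy` are arbitrary) comparing the same array stored as text and as a binary `.npy` file.]

```python
import os
import numpy as np

arr = np.random.rand(200, 200)  # stand-in for one unpacked grib field

np.savetxt('grid.csv', arr, delimiter=',')  # text: every value printed, then re-parsed on read
np.save('grid.npy', arr)                    # binary: a small header plus the raw float64 bytes

csv_size = os.path.getsize('grid.csv')
npy_size = os.path.getsize('grid.npy')

restored = np.load('grid.npy')  # exact round trip, no parsing step
```

With the default `savetxt` format the text file is several times larger than the binary one, and reading it back means parsing every value again, which is where much of the csv-pipeline time goes.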