From: Eric Firing <efiring@ha...> - 2009-09-10 23:33:17
John [H2O] wrote:
> In mpl_toolkits.basemap there is a module cm.
>
> I have been using the cm.s3pcpn colormap for some plots which require
> logarithmic coloring. The cm.s3pcpn_l colormap is also available
> (apparently a linear version).
>
> I wanted to know whether there were any other logarithmic colormaps
> available... for example, I would like to use gist_rainbow or gist_ncar,
> but I haven't been able to so far using the example below. Suggestions?

In addition to specifying the cmap, specify the norm as a LogNorm instance:

    from matplotlib.colors import LogNorm
    im = imshow(..., cmap=..., norm=LogNorm(vmin=clevs[0], vmax=clevs[-1]))

Try something like that.

Eric

> The relevant section of code is here:
>
>     dmn = 0.00001
>     dmx = 1000000
>     logspace = 10.**np.linspace(dmn, dmx, 100)
>     clevs = logspace
>     im = m.imshow(topodat, cmap=cm.s3pcpn, vmin=clevs[0], vmax=clevs[-1])
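A minimal, self-contained sketch of Eric's suggestion, using synthetic data in
place of the basemap grid from the original post (the array, the colormap
choice, and the color levels below are illustrative assumptions, not taken
from the thread):

    import numpy as np
    import matplotlib.pyplot as plt
    from matplotlib.colors import LogNorm

    # Synthetic positive-valued data spanning several decades
    data = 10.0 ** (6 * np.random.rand(50, 50))

    # Logarithmically spaced color levels (illustrative choice)
    clevs = np.logspace(0, 6, 100)

    # Any colormap becomes "logarithmic" when paired with LogNorm:
    # the norm, not the colormap, maps data values to colors on a log scale.
    im = plt.imshow(data, cmap=plt.cm.gist_ncar,
                    norm=LogNorm(vmin=clevs[0], vmax=clevs[-1]))
    plt.colorbar(im)
    plt.show()

The same norm keyword should work with Basemap's imshow wrapper as well, since
extra keyword arguments are passed through to matplotlib.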
From: John [H2O] <washakie@gm...> - 2009-09-10 23:07:33
In mpl_toolkits.basemap there is a module cm.

I have been using the cm.s3pcpn colormap for some plots which require
logarithmic coloring. The cm.s3pcpn_l colormap is also available (apparently
a linear version).

I wanted to know whether there were any other logarithmic colormaps
available... for example, I would like to use gist_rainbow or gist_ncar, but
I haven't been able to so far using the example below. Suggestions?

The relevant section of code is here:

    dmn = 0.00001
    dmx = 1000000
    logspace = 10.**np.linspace(dmn, dmx, 100)
    clevs = logspace
    im = m.imshow(topodat, cmap=cm.s3pcpn, vmin=clevs[0], vmax=clevs[-1])

--
View this message in context:
http://www.nabble.com/logarithmic-colormaps-for-imshow-tp25392480p25392480.html
Sent from the matplotlib - users mailing list archive at Nabble.com.
From: Gökhan Sever <gokhansever@gm...> - 2009-09-10 22:37:06
On Thu, Sep 10, 2009 at 2:29 PM, Gökhan Sever <gokhansever@...> wrote:
> Hello,
>
> I have a simple bar-chart seen at
> http://img40.imageshack.us/img40/4889/barchart.png
>
> What is the way to plot each bar equally spaced apart from each other? Any
> simple way without defining custom ticks or manipulating the data?
>
> Homework season has just started here. Lots of matplotting to do...
>
> Thanks.
>
> --
> Gökhan

Self replying: see the resulting image at
http://img9.imageshack.us/img9/145/barchart2.png

I am thinking a keyword like "spacing" could be added to bar(). If it is set
to True, then an approach similar to the one shown in the 2nd version might
be followed to space the bars equally, instead of linear or log scaling. Any
other ideas, or a hidden keyword to achieve the same plotting?

The code to produce both figures:

    import numpy as np
    import matplotlib.pyplot as plt

    # Load data
    diameters, numbers = np.loadtxt('lab1-data', dtype='int8', skiprows=1).T

    # 1st version -- with linear spacing
    width = 1.0
    plt.bar(diameters, numbers, width=width, align='edge')
    plt.axis(xmin=diameters.min()-2, xmax=diameters.max()+2)
    plt.xlabel("Diameter (mm)", fontsize=16)
    plt.ylabel("Number of washers (#)", fontsize=16)
    plt.xticks(diameters+width/2, diameters)

    plt.figure()

    # 2nd version -- equal spacing
    zipped = zip(diameters, numbers)
    zipped.sort()
    # Unzipping
    diameters, numbers = zip(*zipped)
    width = 0.4
    plt.bar(range(len(diameters)), numbers, width=width, align='edge')
    plt.xlabel("Diameter (mm)", fontsize=16)
    plt.ylabel("Number of washers (#)", fontsize=16)
    plt.xticks(np.arange(len(diameters))+width/2, diameters)
    plt.axis(xmin=-width/2, xmax=len(diameters)-width)

    plt.show()

Data:

    Diameter(mm)  Number
    11            7
    7             44
    10            24
    51            1
    38            2
    35            3
    21            12
    28            16
    12            8
    16            8

--
Gökhan
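A sketch of what such a "spacing" option might do internally, assuming it
simply means one bar per category at consecutive integer positions with the
ticks relabelled; the helper name bar_equally_spaced and the trimmed-down data
are invented for illustration:

    import numpy as np
    import matplotlib.pyplot as plt

    def bar_equally_spaced(ax, x, heights, width=0.4, **kwargs):
        """Draw one bar per category at positions 0, 1, 2, ... (equal
        spacing) and label the ticks with the sorted original x values."""
        order = np.argsort(x)
        x = np.asarray(x)[order]
        heights = np.asarray(heights)[order]
        positions = np.arange(len(x))
        bars = ax.bar(positions, heights, width=width, align='edge', **kwargs)
        ax.set_xticks(positions + width / 2.0)   # center labels under bars
        ax.set_xticklabels([str(v) for v in x])
        ax.set_xlim(-width / 2.0, len(x) - width)
        return bars

    fig, ax = plt.subplots()
    # A few of the diameter/count pairs from the post
    bar_equally_spaced(ax, [7, 10, 11, 38, 51], [44, 24, 7, 2, 1])
    ax.set_xlabel("Diameter (mm)")
    ax.set_ylabel("Number of washers (#)")
    plt.show()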
From: Eric Firing <efiring@ha...> - 2009-09-10 19:52:22
Erik Wickstrom wrote:
> Hi all,
>
> Can matplotlib (or any other Python charting library) generate charts like
> this: (also attached if you prefer)
>
> http://imagebin.ca/view/iGhEQEE.html
>
> It's basically a moving average with the vertical lines being the
> difference between the average and the actual data point.

It looks like what you need is a simple modification of the present stem plot:

http://matplotlib.sourceforge.net/examples/pylab_examples/stem_plot.html

For now, you may be able to use the source -- the stem method of the Axes
class in matplotlib/lib/matplotlib/axes.py -- to come up with your own
function to do the job.

Longer term, you, I, or someone else should add this capability to that
method via a keyword argument giving the baseline as a constant, or as an
array of points corresponding to the input x variable. Even more options are
possible.

Eric

> Can anyone send me in the right direction?
>
> Thanks!
>
> Erik
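Until such a baseline keyword exists, the effect Erik describes can be
approximated with vlines, drawing each vertical segment from the moving
average to the data point. A minimal sketch on synthetic data (the window
length and the smoothing method are arbitrary assumptions):

    import numpy as np
    import matplotlib.pyplot as plt

    # Synthetic series and a simple moving average
    np.random.seed(0)
    x = np.arange(100)
    y = np.cumsum(np.random.randn(100))
    window = 10
    avg = np.convolve(y, np.ones(window) / window, mode='same')

    fig, ax = plt.subplots()
    ax.plot(x, avg, lw=2, label='moving average')
    # One vertical line per point, running from the average (the "baseline")
    # up or down to the actual data value.
    ax.vlines(x, avg, y, colors='0.5')
    ax.plot(x, y, '.', label='data')
    ax.legend()
    plt.show()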
From: Eric Firing <efiring@ha...> - 2009-09-10 19:37:09
Dave wrote:
> I upgraded my numpy to 1.4.0.dev7375 and scipy to 0.8.0.dev5920. After
> doing so I get a segfault upon calling the plot command (see below)
>
> I guess I need to compile from source but I'm not sure exactly how to do
> so - are there any good step-by-step instructions out there?
>
> Thanks,
> Dave
>
>     Python 2.5.4 (r254:67916, Dec 23 2008, 15:10:54) [MSC v.1310 32 bit (Intel)]
>     Type "copyright", "credits" or "license" for more information.
>
>     IPython 0.10 -- An enhanced Interactive Python.
>     ?         -> Introduction and overview of IPython's features.
>     %quickref -> Quick reference.
>     help      -> Python's own help system.
>     object?   -> Details about 'object'. ?object also works, ?? prints more.
>
>     *** Pasting of code with ">>>" or "..." has been enabled.
>
>     In [2]: from numpy import *
>
>     In [3]: from pylab import *
>
>     In [4]: import numpy; numpy.__version__
>     Out[4]: '1.4.0.dev7375'
>
>     In [5]: plot(randn(100))

What happens if you simply do

    x = randn(100)

or

    plot([1,2,3,2,1])

? My guess is that you are seeing a numpy installation problem, not a
matplotlib problem (that is, I expect the first trial above to fail and the
second to succeed), and that the problem may be that you did not delete the
build directory before rebuilding numpy from source. Distutils often fails to
rebuild components that need to be recompiled after a change to the source,
so the build and install appear to work, but the resulting numpy (or
matplotlib, for that matter) does not.

Eric
From: Gökhan Sever <gokhansever@gm...> - 2009-09-10 19:29:19
Hello,

I have a simple bar-chart seen at
http://img40.imageshack.us/img40/4889/barchart.png

What is the way to plot each bar equally spaced apart from each other? Any
simple way without defining custom ticks or manipulating the data?

Homework season has just started here. Lots of matplotting to do...

Thanks.

--
Gökhan
From: Christopher Barker <Chris.Barker@no...> - 2009-09-10 17:48:54
Farhan Sheikh wrote:
> when i installed python 2.6, i installed it into the /usr/local/lib folder

Maybe I wasn't clear -- the MPL installer is meant to be used with the binary
from python.org. You want to use the installer you get here:

http://www.python.org/download/

> python -c "import sys; print sys.path"
>
> the results i got were:
>
> ['', '/usr/local/lib/python26.zip', '/usr/local/lib/python2.6',
>  '/usr/local/lib/python2.6/plat-darwin',
>  '/usr/local/lib/python2.6/plat-mac',
>  '/usr/local/lib/python2.6/plat-mac/lib-scriptpackages',
>  '/usr/local/lib/python2.6/lib-tk', '/usr/local/lib/python2.6/lib-old',
>  '/usr/local/lib/python2.6/lib-dynload',
>  '/usr/local/lib/python2.6/site-packages']
>
> it seems like everything was downloaded to the correct directory.

For a unix-style install into /usr/local, yes. But it is not the kind of
install that the MPL binary is expecting.

There are WAY TOO MANY ways to install python on OS-X, but the "standard" way
is the one used by the installers found on python.org. It is called a
"Framework Build", and is the "Mac" way to do things. People have their
reasons for doing it other ways, but package distributors can only support so
much, so the python.org way is the one generally supported.

You can build MPL for your install if you want; it should be easy, except for
the dependencies -- I'm not sure what those are anymore, but a little reading
of the docs should tell you. Oh, and you'll end up having to build every
other extension, too - wxPython, QT, PIL, ???

Without a good reason to do otherwise, I'd just go with the python.org build.

-Chris

--
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA 98115        (206) 526-6317   main reception

Chris.Barker@...
From: Christopher Barker <Chris.Barker@no...> - 2009-09-10 16:59:16
Farhan Sheikh wrote:
> i have python 2.6 running on my mac osx 10.5, however when installing the
> binary file provided, it says i need to have python 2.6 on my machine. i
> dont understand why this is happening as when i open a new terminal and
> type 'python', python version 2.6.2 is the version that is run. My
> supervisor also had a look at this and could not figure it out. He linked
> python 2.6 with the python command but the install file still did not
> recognise python 2.6.
>
> anybody have the same issue? or know of how to fix this issue?

How did you install 2.6? The binary MPL installer is probably looking for the
python.org 2.6, which would be in /Library/Frameworks/.... If you are using a
macports python, for instance, the installer wouldn't find it.

"which python" at the command line might help clear this up. Or:

    python -c "import sys; print sys.path"

That will spew out a bunch of dirs, and where they are should tell you where
your python is installed.

-Chris

--
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA 98115        (206) 526-6317   main reception

Chris.Barker@...
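A quick way to see which interpreter is actually answering to the python
command, complementing the sys.path suggestion above (the framework path in
the comment is the typical python.org location on OS X, not a guarantee):

    import sys

    # A python.org framework build normally reports a prefix under
    # /Library/Frameworks/Python.framework/Versions/2.6, whereas a
    # custom /usr/local or MacPorts build reports a different prefix.
    print(sys.executable)   # the interpreter binary being run
    print(sys.prefix)       # the install prefix of this Python
    print(sys.version)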
From: <jason-sage@cr...> - 2009-09-10 16:40:15
Erik Wickstrom wrote:
> Hi all,
>
> Can matplotlib (or any other Python charting library) generate charts like
> this: (also attached if you prefer)
>
> http://imagebin.ca/view/iGhEQEE.html
>
> It's basically a moving average with the vertical lines being the
> difference between the average and the actual data point.
>
> Can anyone send me in the right direction?

Can you adapt one of the examples here:

http://matplotlib.sourceforge.net/examples/pylab_examples/errorbar_demo.html

Jason
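One way to adapt the errorbar demo to the moving-average chart in question is
to treat the gap between the average and the actual data point as an
asymmetric error bar. A small sketch on synthetic data (the window length is
an arbitrary assumption):

    import numpy as np
    import matplotlib.pyplot as plt

    # Synthetic data: a noisy series and its moving average
    np.random.seed(1)
    y = np.cumsum(np.random.randn(50))
    avg = np.convolve(y, np.ones(5) / 5.0, mode='same')

    # errorbar draws a marker at `avg` with an asymmetric "error" reaching
    # to the actual data point, which visually resembles the linked chart.
    lower = np.where(y < avg, avg - y, 0)
    upper = np.where(y > avg, y - avg, 0)
    plt.errorbar(np.arange(50), avg, yerr=[lower, upper],
                 fmt='o-', ecolor='gray')
    plt.show()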
From: Werner F. Bruhin <werner.bruhin@fr...> - 2009-09-10 16:19:30
Jouni K. Seppänen wrote:
> "Werner F. Bruhin" <werner.bruhin@...> writes:
>
>>> I think this has been fixed on the trunk for good, by changing all
>>> docstring modifications to use decorators (defined in docstring.py) that
>>> check for nonexistent docstrings. The changes are perhaps too big to
>>> apply on the 0.99 branch.
>>
>> As it is fixed in trunk, is there a need/much use for a patch to 0.99?
>> I.e. how far off is the next release which includes whatever is in trunk?
>
> I think John Hunter said somewhere that he plans to release 1.0 in a few
> months, but software release schedules are notoriously difficult to
> estimate, and with volunteer-driven open-source software even more so. I
> think the answer depends on how many people use py2exe, about which I
> have absolutely no idea.

I don't think it is worth a patch, as the workaround is very easy. I
documented what is needed for a py2exe setup.py with mpl 0.99 on the py2exe
site, at the end of the following page:
http://www.py2exe.org/index.cgi/MatPlotLib

While doing this I noted one more problem: if one uses backend_wx, the
wxPython version check will always fail if py2exe'd. I added
"if not hasattr(sys, 'frozen'):" on line 113 and indented lines 114 to 131.
Hopefully someone could make this change for the next version of mpl.

Werner
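The guard Werner describes follows a common pattern for code that must behave
differently inside a py2exe (or similar) bundle; a generic sketch, in which
check_wx_version is a stand-in rather than the actual code from backend_wx:

    import sys

    def check_wx_version():
        # Stand-in for the kind of import-time version check that can fail
        # in a frozen app; the real check in backend_wx inspects wxPython.
        import wx
        print("wxPython version:", wx.__version__)

    # py2exe (like other freezers) sets sys.frozen on the bundled
    # interpreter, so the check can simply be skipped in that case.
    if not hasattr(sys, 'frozen'):
        check_wx_version()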
From: Armin Moser <armin.moser@st...> - 2009-09-10 16:06:44
Erik Wickstrom wrote:
> Hi all,
>
> Can matplotlib (or any other Python charting library) generate charts like
> this: (also attached if you prefer)

Here is a demo script. The second way is inspired by matlab. Can this be done
more easily in python? Can X and Y be built more elegantly with numpy?

Armin

    from pylab import *

    # data generation
    x = linspace(0, 10, 100)
    y = exp(-x)
    ye = y + rand(y.size) - 0.5

    # plot the vertical lines with a loop
    for xl, yl, yel in zip(x, y, ye):
        plot([xl, xl], [yl, yel], 'r')
    plot(x, y, x, ye, 'd')

    # plot by separating with NaN
    figure()
    X = zeros((x.size, 3))
    Y = zeros((x.size, 3))
    X[:, 0], X[:, 1], X[:, 2] = x, x, NaN
    Y[:, 0], Y[:, 1], Y[:, 2] = y, ye, NaN
    plot(X.flatten(), Y.flatten(), x, y, x, ye, 'd')

    show()
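One possible answer to the closing question: the NaN-separated arrays can be
built without preallocating and assigning column by column, by stacking the
three columns and flattening. A small sketch (column_stack is just one idiom
among several; a LineCollection would be another option entirely):

    import numpy as np
    import matplotlib.pyplot as plt

    x = np.linspace(0, 10, 100)
    y = np.exp(-x)
    ye = y + np.random.rand(y.size) - 0.5

    # Each row is (x_i, x_i, NaN) / (y_i, ye_i, NaN); flattening row by row
    # yields one long NaN-separated polyline of vertical segments.
    nan_col = np.nan * np.ones_like(x)
    X = np.column_stack([x, x, nan_col]).ravel()
    Y = np.column_stack([y, ye, nan_col]).ravel()

    plt.plot(X, Y, 'r')
    plt.plot(x, y, x, ye, 'd')
    plt.show()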
From: Erik Wickstrom <erik@er...> - 2009-09-10 15:39:36
Hi all,

Can matplotlib (or any other Python charting library) generate charts like
this: (also attached if you prefer)

http://imagebin.ca/view/iGhEQEE.html

It's basically a moving average with the vertical lines being the difference
between the average and the actual data point.

Can anyone send me in the right direction?

Thanks!

Erik
From: Dave <dave.hirschfeld@gm...> - 2009-09-10 15:10:17
I upgraded my numpy to 1.4.0.dev7375 and scipy to 0.8.0.dev5920. After doing
so I get a segfault upon calling the plot command (see below)

I guess I need to compile from source but I'm not sure exactly how to do so -
are there any good step-by-step instructions out there?

Thanks,
Dave

    Python 2.5.4 (r254:67916, Dec 23 2008, 15:10:54) [MSC v.1310 32 bit (Intel)]
    Type "copyright", "credits" or "license" for more information.

    IPython 0.10 -- An enhanced Interactive Python.
    ?         -> Introduction and overview of IPython's features.
    %quickref -> Quick reference.
    help      -> Python's own help system.
    object?   -> Details about 'object'. ?object also works, ?? prints more.

    *** Pasting of code with ">>>" or "..." has been enabled.

    In [2]: from numpy import *

    In [3]: from pylab import *

    In [4]: import numpy; numpy.__version__
    Out[4]: '1.4.0.dev7375'

    In [5]: plot(randn(100))

    C:\dev\bin\pythonxy\console>
From: Farhan Sheikh <fas@cs...> - 2009-09-10 14:57:29
Dear all,

I have python 2.6 running on my mac osx 10.5, however when installing the
binary file provided, it says I need to have python 2.6 on my machine. I
don't understand why this is happening, as when I open a new terminal and
type 'python', python version 2.6.2 is the version that is run. My supervisor
also had a look at this and could not figure it out. He linked python 2.6
with the python command but the install file still did not recognise python
2.6.

Anybody have the same issue? Or know of how to fix this issue?

Thank You

Farhan

--
Academic Excellence at the Heart of Scotland. The University of Stirling is a
charity registered in Scotland, number SC 011159.
From: Jeff Whitaker <jswhit@fa...> - 2009-09-10 11:26:45
Arthur M. Greene wrote:
> Just to add a little info:
>
> I've been poking around various OPeNDAP servers looking for files to try
> and open (and read), and have had a little success, so the module does
> seem to work, if not all the time for my purposes. At the moment I'm on a
> 64-bit machine (Fedora 10), so this is encouraging. Some details:
>
> I tried several of the IPCC AR4 models at PCMDI, with results similar to
> what I reported earlier. The time object appears with neither units nor a
> calendar. Looking at the metadata shows this not to be correct, for at
> least the three models I investigated (gfdl_cm2_1, mpi_echam5,
> ncar_ccsm3_0). I believe the inclusion of units and a calendar are
> standard procedure for all of these models, and would probably cause the
> dataset to be flagged if they weren't present. Many users (like hundreds)
> have downloaded and analyzed these files.
>
> The IRI data library (http://iridl.ldeo.columbia.edu/) has a large
> collection of datasets, all available using opendap. But I had problems
> with the ones I tried because the calendars seem all to be given as
> "360", rather than "360_day". (Perhaps someone is cutting corners with
> the typing, I can't say...) I couldn't correct this by setting
> timedata.calendar='360_day' because the files are opened read-only. There
> must be files on this server with differently-defined calendars, since
> the data come from many different sources. I'll have to root around some
> more to turn some up.

Arthur: It's only the time manipulation functions (date2index, num2date,
date2num) that require the time attributes to be CF compliant. You can still
read the data with the netCDF module, even if the time attributes are not CF
compliant, or not there at all.

It is odd that the time attributes appear missing. That does appear to be a
bug in the client, or could it have something to do with the fact that the
IPCC openDAP pages at LLNL appear to be password protected? I'll email the
unidata folks and see what they think.

Thanks for all your testing.

-Jeff
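The distinction Jeff draws (only the date helpers need CF-compliant time
attributes) is easy to see in a minimal netCDF4-python snippet; the filename
and variable name below are placeholders, not taken from the thread:

    from datetime import datetime
    from netCDF4 import Dataset, date2index, num2date

    # 'example.nc' is a placeholder for any CF-compliant file whose time
    # variable carries 'units' and 'calendar' attributes.
    nc = Dataset('example.nc')
    time = nc.variables['time']

    # Reading raw values needs no attributes at all...
    raw = time[:]

    # ...but the date helpers read time.units and time.calendar, which is
    # why they fail when a server drops those attributes.
    dates = num2date(raw, time.units, calendar=time.calendar)
    i = date2index(datetime(1951, 1, 16, 12), time, select='nearest')
    print(i, dates[i])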
From: Arthur M. Greene <amg@ir...> - 2009-09-10 03:55:13
Just to add a little info:

I've been poking around various OPeNDAP servers looking for files to try and
open (and read), and have had a little success, so the module does seem to
work, if not all the time for my purposes. At the moment I'm on a 64-bit
machine (Fedora 10), so this is encouraging. Some details:

I tried several of the IPCC AR4 models at PCMDI, with results similar to what
I reported earlier. The time object appears with neither units nor a
calendar. Looking at the metadata shows this not to be correct, for at least
the three models I investigated (gfdl_cm2_1, mpi_echam5, ncar_ccsm3_0). I
believe the inclusion of units and a calendar are standard procedure for all
of these models, and their absence would probably cause the dataset to be
flagged. Many users (like hundreds) have downloaded and analyzed these files.

The IRI data library (http://iridl.ldeo.columbia.edu/) has a large collection
of datasets, all available using opendap. But I had problems with the ones I
tried because the calendars seem all to be given as "360", rather than
"360_day". (Perhaps someone is cutting corners with the typing, I can't
say...) I couldn't correct this by setting timedata.calendar='360_day'
because the files are opened read-only. There must be files on this server
with differently-defined calendars, since the data come from many different
sources. I'll have to root around some more to turn some up.

Similarly, the time units in http://test.opendap.org/opendap/data/nc/data.nc
are given simply as 'hour', so num2date can't figure out what dates the time
values refer to. I wouldn't have expected this at this URL, but maybe it's a
test? Aside from the fact that "... since" was missing, netcdf4 also
complained that 'hour' was not an acceptable unit. Only 'hours' will do. (No
'months' or 'years' either, it seems.)

http://test.opendap.org/opendap/data/nc/coads_climatology.nc seems to
download OK, and there are units, but it's a 12-month climatology, so
calendar is irrelevant. I could plot the data, although it appeared reversed
left-to-right. (I didn't add axes, but just plotted it raw.)

The conclusion seems to be that (1) there may be a lot of non-conforming
datasets out there (and netcdf4 may be a little fussy about what time units
it will accept, too), but (2) since there seems to be some discordance w.r.t.
the IPCC data (where we believe the units and calendar must actually be
present) one cannot be absolutely sure that all of the problems experienced
are solely due to malformed data descriptions. Evidently more detective work
will be required to sort everything out...

Best, thanks again for the assistance. I've been up too late chasing around
the web...

Arthur


Jeff Whitaker wrote:
> Arthur M. Greene wrote:
>> Thanks much. I am able to replicate your results using netcdf4.
>>
>> FYI, I don't believe the xml file is a CDAT creation; rather, it is
>> probably written using CMOR (http://www2-pcmdi.llnl.gov/cmor), which was
>> used to standardize the IPCC model output files, presumably so they
>> could be accessed by a variety of applications via OpenDAP. Hmmmm...
>>
>> At any rate, I can access the remote data object with netcdf4, but no
>> luck retrieving either data or a time index.
>>
>>     In [94]: datobj = ncf(fname)
>>     In [95]: timedata = datobj.variables['time']
>>     In [97]: taxvals = timedata[1070:1090]
>>     In [99]: print taxvals
>>     [ 32559.5  32590.   32620.5  32651.   32681.5  32712.5  32743.
>>       32773.5  32804.   32834.5  32865.5  32895.   32924.5  32955.
>>       32985.5  33016.   33046.5  33077.5  33108.   33138.5]
>>     In [100]: print date2index(date0,timedata.units,timedata.calendar,select='nearest')
>
> Arthur: That's because the timedata variable has no attributes (no
> calendar or units), and the date2index function looks for these
> attributes. That's weird though, since that dataset is supposed to be CF
> compliant. I wonder if openDAP is not handling that xml file correctly.
>
> -Jeff
>
>>     AttributeError: NetCDF: Attribute not found
>>
>>     In [96]: print datobj.variables['tas'].shape
>>     (1680, 90, 144)
>>     In [101]: testdat = datobj.variables['tas'][0,:,:]
>>     RuntimeError: NetCDF: Variable has no data in DAP request
>>
>> Well, at least the error messages are different...
>>
>> Thanks again for all the assistance. It would be useful to access the
>> IPCC output with OpenDap at some point.
>>
>> Best,
>>
>> Arthur
>>
>> Jeff Whitaker wrote:
>>> Arthur: I forgot to update the wrapper function in __init__.py - that's
>>> fixed now if you do an svn update. Concerning your other problems
>>> below, using your test case exposed a couple of other bugs, but it
>>> still doesn't work. The basic problem is that the date2index function
>>> was designed to work with netCDF4 variable objects
>>> (http://code.google.com/p/netcdf4-python), and the netcdf file/variable
>>> objects that are produced by pupynere/pydap (the pure python netcdf/dap
>>> reader included in basemap) don't quite behave the same way. Using
>>> netCDF4, I can get your gfdl_test.nc case to work with
>>>
>>>     > cat testdate2index.py
>>>     #from mpl_toolkits.basemap import date2index,num2date,NetCDFFile as ncf
>>>     from netCDF4 import Dataset as ncf
>>>     from netCDF4 import date2index, num2date
>>>     from mpl_toolkits import basemap
>>>     fname0 = 'http://esgcet.llnl.gov/dap/'
>>>     fname1 = 'ipcc4/20c3m/gfdl_cm2_1/pcmdi.ipcc4.gfdl_cm2_1.20c3m.run1.atm.mo.xml'
>>>     fname = fname0+fname1
>>>     #fname = 'gfdl_test.nc'
>>>     print fname
>>>     datobj = ncf(fname)
>>>     print datobj.variables['tas'].shape
>>>     timedata = datobj.variables['time']
>>>     from datetime import datetime as dt
>>>     date0 = dt(1951,1,16,12,0,0)
>>>     print num2date(timedata[:],timedata.units,calendar=timedata.calendar)
>>>     print date0
>>>     nt0 = date2index(date0,timedata,select='nearest')
>>>     print nt0
>>>     print timedata[nt0],num2date(timedata[nt0],timedata.units,calendar=timedata.calendar)
>>>
>>>     > python testdate2index.py
>>>     gfdl_test.nc
>>>     (13, 31, 29)
>>>     [1950-08-16 12:00:00 1950-09-16 00:00:00 1950-10-16 12:00:00
>>>      1950-11-16 00:00:00 1950-12-16 12:00:00 1951-01-16 12:00:00
>>>      1951-02-15 00:00:00 1951-03-16 12:00:00 1951-04-16 00:00:00
>>>      1951-05-16 12:00:00 1951-06-16 00:00:00 1951-07-16 12:00:00
>>>      1951-08-16 12:00:00]
>>>     1951-01-16 12:00:00
>>>     5
>>>     [ 32865.5] [1951-01-16 12:00:00]
>>>
>>> Your original example doesn't work because the URL is not an opendap
>>> server, it's some kind of CDAT xml file that presumably only CDAT
>>> understands.
>>>
>>> We will see if we can fix the date2index function included in basemap
>>> (if not I will remove it), but for now I recommend using
>>> netcdf4-python. It's really a much more robust and feature-rich
>>> solution for netcdf reading and writing.
>>>
>>> -Jeff
>>>>>> http://p.sf.net/sfu/bobj-july >>>>>> ------------------------------------------------------------------------ >>>>>> >>>>>> >>>>>> _______________________________________________ >>>>>> Matplotlib-users mailing list >>>>>> Matplotlib-users@... >>>>>> https://lists.sourceforge.net/lists/listinfo/matplotlib-users >>>>>> >>>> >>> >>> >> > > -- *^*~*^*~*^*~*^*~*^*~*^*~*^*~*^*~*^*~*^*~*^*~*^*~*^*~*^*~*^*~* Arthur M. Greene, Ph.D. The International Research Institute for Climate and Society (IRI) The Earth Institute, Columbia University, Lamont Campus Monell Building, 61 Route 9W, Palisades, NY 10964-8000 USA amg at iri dot columbia dot edu | http://iri.columbia.edu *^*~*^*~*^*~*^*~*^*~*^*~*^*~*^*~*^*~*^*~*^*~*^*~*^*~*^*~*^*~* |
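For reference, a minimal sketch of the workflow suggested in David's reply, assuming a basemap/netcdftime recent enough that date2index accepts the select keyword; the file name reuses the local test slab linked above, and the 'tas' slice indices in the final comment are placeholders:

from datetime import datetime
from mpl_toolkits.basemap import NetCDFFile, date2index, num2date

datobj = NetCDFFile('gfdl_test.nc')      # the local test slab linked above
timevar = datobj.variables['time']

# select='nearest' returns the closest model time point instead of raising
# when the requested date does not sit exactly on the time axis
# (e.g. 1951-01-01 00:00 vs. the actual 1951-01-16 12:00 time point).
nt0 = date2index(datetime(1951, 1, 1), timevar, select='nearest')
nt1 = date2index(datetime(1951, 7, 1), timevar, select='nearest')
print num2date(timevar[nt0], timevar.units, timevar.calendar)

# The resolved indices can then be used for ordinary index-based slicing, e.g.
# slab = datobj.variables['tas'][nt0:nt1+1, lat0:lat1, lon0:lon1]

This only resolves dates to indices; it does not address the "months since" / "years since" units question, which would still need support in netcdftime itself.
|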
From: nbv4 <cp368202@oh...> - 2009-09-10 02:45:47
|
I have a large array that looks like this:

vals = [0,0,1,3,2,1,0,4,2,4,2...]
dates = [datetime.date(...), datetime.date(...)...]

which is then transformed into a cumulative sum:

acc_vals = np.cumsum(vals)

and then that is sent to matplotlib to be graphed. The resulting graph looks like this: http://img171.imageshack.us/img171/8589/linetotal.png The blue line is the cumsum data, and the red line is the raw values (just for illustrative purposes). What I want is for the red line to represent the rate of change over the previous month. That is to say, at each point in the graph the value on the red line should represent the total amount of flight time accumulated over the previous 30 days. Is this something that matplotlib can handle on its own? Or am I going to have to write my own number-crunching method to get it working?

--
View this message in context: http://www.nabble.com/rate-of-change-for-a-splice-of-data-tp25376306p25376306.html
Sent from the matplotlib - users mailing list archive at Nabble.com.
|
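A minimal sketch of one way to compute that trailing 30-day total before handing it to matplotlib, assuming dates is sorted in increasing order; the sample data below is purely illustrative stand-ins for the poster's arrays:

import bisect
import datetime
import numpy as np

# Illustrative stand-ins for the arrays described in the message.
dates = [datetime.date(2009, 1, 1) + datetime.timedelta(days=i) for i in range(120)]
vals = np.random.randint(0, 5, size=len(dates))

acc_vals = np.cumsum(vals)

# For each date, find the first sample inside the trailing 30-day window and
# subtract the cumulative total just before it; the difference is the amount
# accumulated over that window.
window = datetime.timedelta(days=30)
trailing = np.empty(len(dates))
for i, d in enumerate(dates):
    k = bisect.bisect_left(dates, d - window)
    trailing[i] = acc_vals[i] - (acc_vals[k - 1] if k > 0 else 0)

# trailing can then be plotted against dates alongside acc_vals, e.g.
# plt.plot_date(dates, acc_vals, '-'); plt.plot_date(dates, trailing, '-')

If the samples happen to be exactly one per day with no gaps, np.convolve(vals, np.ones(30))[:len(vals)] gives the same trailing sum more compactly. As far as I know, matplotlib itself does not provide calendar-window aggregation, so some small helper along these lines is needed either way.
|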