From: Francesc A. <fa...@ca...> - 2005-01-27 14:17:06
Hi Antonio,

I've finally found some time to work on this issue. The problem was that
native chunked datasets were not supported by PyTables. The good news is
that I've been able to support them and map them to EArray objects :)

With the new version, you should be able to read chunked datasets without
problems (at least I can with the sample you sent me). However, there is
only limited support for datasets with multiple extendable dimensions set
simultaneously, because PyTables does not support that. This means that
you can still extend these native HDF5 datasets, but only along the
*first* (counting from left to right) extendable dimension found in the
dataset.

As we do not offer public access to the SVN repository yet, you should
wait until tomorrow to get a snapshot with this patch, which will appear
at:

http://www.carabos.com/downloads/pytables/snapshots/

Please tell me if this works for you.

Cheers,

On Sunday 16 January 2005 16:19, Antonio Valentino wrote:
> hi,
> I'm not an expert user and I'm having some problems trying to open an
> hdf5 file containing a chunked dataset.
> Here is some info:
>
> Python 2.3.4
> [GCC 3.4.2 20041017 (Red Hat 3.4.2-6.fc3)] on linux2
> numarray v. 1.1.1
> tables v. 0.9.1
> hdf5 v. 1.6.3-patch
>
> This is the test program:
>
> # BEGIN file test-uchar.py
> import tables
> h5file = tables.openFile('data.h5')
> print h5file
> # END file test-uchar.py
>
> and this is the data:
>
> # chunk (128x128x2)
>
> [antonio@m6n h5]$ h5dump -A data.h5
> HDF5 "data.h5" {
> GROUP "/" {
>    DATASET "ComplexUCharArray" {
>       DATATYPE  H5T_STD_U8LE
>       DATASPACE  SIMPLE { ( 200, 150, 2 ) / ( 200, 150, 2 ) }
>    }
> }
> }
>
> When I run the test program I get a segfault:
>
> [antonio@m6n h5]$ python test-uchar.py
> /usr/lib/python2.3/site-packages/tables/File.py:192: UserWarning:
> 'data.h5' does exist, is an HDF5 file, but has not a PyTables format.
> Trying to guess what's there using HDF5 metadata. I can't promise you
> getting the correct objects, but I will do my best!.
>   path, UserWarning)
> Segmentation fault
>
>
> If I try it with a *non* chunked dataset ...
>
> [antonio@m6n h5]$ python test-uchar.py
> /usr/lib/python2.3/site-packages/tables/File.py:192: UserWarning:
> 'data.h5' does exist, is an HDF5 file, but has not a PyTables format.
> Trying to guess what's there using HDF5 metadata. I can't promise you
> getting the correct objects, but I will do my best!.
>   path, UserWarning)
> Traceback (most recent call last):
>   File "test-uchar.py", line 6, in ?
>     print h5file
>   File "/usr/lib/python2.3/site-packages/tables/File.py", line 1000, in __str__
>     astring += str(leaf) + '\n'
>   File "/usr/lib/python2.3/site-packages/tables/Leaf.py", line 472, in __str__
>     title = self.attrs.TITLE
>   File "/usr/lib/python2.3/site-packages/tables/AttributeSet.py", line 166, in __getattr__
>     raise AttributeError, \
>
> [SNIP]
>
>   File "/usr/lib/python2.3/site-packages/tables/AttributeSet.py", line 166, in __getattr__
>     raise AttributeError, \
>   File "/usr/lib/python2.3/site-packages/tables/Leaf.py", line 472, in __str__
>     title = self.attrs.TITLE
>   File "/usr/lib/python2.3/site-packages/tables/Leaf.py", line 189, in _get_attrs
>     return AttributeSet(self)
> RuntimeError: maximum recursion depth exceeded
>
> In this case the file seems to be correctly opened, but some problem is
> hit in the print statement.
>
> antonio

-- 
>qo< Francesc Altet     http://www.carabos.com/
V  V Cárabos Coop. V.   Enjoy Data
 ""
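The "extend only along the first extendable dimension" behaviour that Francesc
describes can be sketched as follows. This is only an illustrative example, not
the patch itself; it uses the modern PyTables method names (`open_file` /
`create_earray`, which later replaced the `openFile` / `createEArray` spelling
of the 0.9 era), and the file name `chunked.h5` and node name `data` are
invented for illustration:

```python
# A minimal sketch (assumed names, modern PyTables API): create an EArray
# with one extendable dimension, append along it, then read it back.
import numpy as np
import tables

with tables.open_file("chunked.h5", mode="w") as h5:
    # shape=(0, 150, 2): the 0-length axis is the single extendable one,
    # matching the (200, 150, 2) / chunk (128, 128, 2) layout in the report.
    arr = h5.create_earray(h5.root, "data",
                           atom=tables.UInt8Atom(),
                           shape=(0, 150, 2),
                           chunkshape=(128, 128, 2))
    # Extending works only along the (first) extendable dimension.
    arr.append(np.zeros((200, 150, 2), dtype="uint8"))

with tables.open_file("chunked.h5") as h5:
    node = h5.root.data
    node_kind, node_shape = type(node).__name__, tuple(node.shape)
```

A native HDF5 chunked dataset whose maximum dimensions allow growth is mapped
to the same EArray node type when the file is reopened, which is what the patch
described in this message enables.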