From: Antonio V. <val...@co...> - 2005-01-16 15:19:52
|
hi,
I'm not an expert user and I'm having some problems trying to open an
hdf5 file containing a chunked dataset. Here is some info:

Python 2.3.4
[GCC 3.4.2 20041017 (Red Hat 3.4.2-6.fc3)] on linux2
numarray v. 1.1.1
tables v. 0.9.1
hdf5 v. 1.6.3-patch

This is the test program:

# BEGIN file test-uchar.py
import tables
h5file = tables.openFile('data.h5')
print h5file
# END file test-uchar.py

and this is the data:

# chunk (128x128x2)

[antonio@m6n h5]$ h5dump -A data.h5
HDF5 "data.h5" {
GROUP "/" {
   DATASET "ComplexUCharArray" {
      DATATYPE  H5T_STD_U8LE
      DATASPACE  SIMPLE { ( 200, 150, 2 ) / ( 200, 150, 2 ) }
   }
}
}

When I run the test program I get a segfault:

[antonio@m6n h5]$ python test-uchar.py
/usr/lib/python2.3/site-packages/tables/File.py:192: UserWarning: 'data.h5'
does exist, is an HDF5 file, but has not a PyTables format. Trying to guess
what's there using HDF5 metadata. I can't promise you getting the correct
objects, but I will do my best!.
  path, UserWarning)
Segmentation fault

If I try it with a *non* chunked dataset ...

[antonio@m6n h5]$ python test-uchar.py
/usr/lib/python2.3/site-packages/tables/File.py:192: UserWarning: 'data.h5'
does exist, is an HDF5 file, but has not a PyTables format. Trying to guess
what's there using HDF5 metadata. I can't promise you getting the correct
objects, but I will do my best!.
  path, UserWarning)
Traceback (most recent call last):
  File "test-uchar.py", line 6, in ?
    print h5file
  File "/usr/lib/python2.3/site-packages/tables/File.py", line 1000, in __str__
    astring += str(leaf) + '\n'
  File "/usr/lib/python2.3/site-packages/tables/Leaf.py", line 472, in __str__
    title = self.attrs.TITLE
  File "/usr/lib/python2.3/site-packages/tables/AttributeSet.py", line 166, in __getattr__
    raise AttributeError, \
[SNIP]
  File "/usr/lib/python2.3/site-packages/tables/AttributeSet.py", line 166, in __getattr__
    raise AttributeError, \
  File "/usr/lib/python2.3/site-packages/tables/Leaf.py", line 472, in __str__
    title = self.attrs.TITLE
  File "/usr/lib/python2.3/site-packages/tables/Leaf.py", line 189, in _get_attrs
    return AttributeSet(self)
RuntimeError: maximum recursion depth exceeded

In this case the file seems to be correctly opened, but some problem is met
in the print statement.

antonio
--
Antonio Valentino
Consorzio Innova S.r.l.
via della Scienza - Zona Paip I
75100 Matera (MT) Italy
Tel.: +39 0835 309180
Fax: +39 0835 264705
Home Page: www.consorzio-innova.it
Email: val...@co... |
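The "maximum recursion depth exceeded" above is a classic mutual-recursion trap. A minimal Python 3 sketch (the class names are hypothetical stand-ins mirroring the traceback, not the real PyTables code): `__getattr__` builds its error message by stringifying the leaf, and stringifying the leaf looks up the missing attribute again.

```python
class AttributeSet:
    def __init__(self, leaf):
        self._leaf = leaf  # found in the instance dict, so no __getattr__ here

    def __getattr__(self, name):
        # Formatting the message calls str(self._leaf), i.e. Leaf.__str__,
        # which looks up the missing .TITLE again -> unbounded recursion.
        raise AttributeError("%s has no attribute %s" % (self._leaf, name))

class Leaf:
    @property
    def attrs(self):
        return AttributeSet(self)

    def __str__(self):
        return self.attrs.TITLE  # TITLE is missing -> __getattr__ fires

try:
    str(Leaf())
    hit_recursion = False
except RecursionError:  # plain RuntimeError on the Python 2.3 of the thread
    hit_recursion = True

print(hit_recursion)  # True
```

The interpreter never reaches the `raise`: the recursion blows up while the error message is still being formatted, which is why the traceback alternates between `__str__` and `__getattr__` frames.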
From: Francesc A. <fa...@ca...> - 2005-01-17 16:19:03
|
Hi Antonio,

Can you send me privately both files so as to see what's going on with
them?

Cheers,

On Sunday 16 January 2005 16:19, Antonio Valentino wrote:
> hi,
> I'm not an expert user and I'm having some problems trying to open an
> hdf5 file containing a chunked dataset.
> [quoted message snipped]

--
>OO<   Francesc Altet || http://www.carabos.com/
V  V   Carabos Coop. V. || Who is your data daddy? PyTables
""  |
From: Antonio V. <val...@co...> - 2005-01-19 08:15:20
|
Ok, I sent the data you asked for yesterday.

I ran some more tests:

Python 2.3.4
[GCC 3.4.2 20041017 (Red Hat 3.4.2-6.fc3)] on linux2
numarray v. 1.1.1
tables v. 0.9.1
hdf5 v. 1.6.2

and

Python 2.4
Win XP SP2
numarray v. 1.1.1
tables v. 0.9.1
hdf5 v. 1.6.3-patch

The result is always the same.

bye

At 17:18 on Monday 17 January 2005, Francesc Altet wrote:
> Hi Antonio,
>
> Can you send me privately both files so as to see what's going on with
> them?
> [quoted message snipped]

--
Antonio Valentino
Consorzio Innova S.r.l.
Home Page: www.consorzio-innova.it |
From: Francesc A. <fa...@ca...> - 2005-01-27 14:17:06
|
Hi Antonio,

I've finally found some time to work on this issue. The problem was that
native chunked datasets were not supported by PyTables. The good news is
that I've been able to support them and map them to EArray objects :)

With the new version, you should be able to read chunked datasets
without problems (at least I can with the sample you sent me). However,
support for datasets with multiple extendable dimensions is limited,
because PyTables does not support them. That means you can still extend
these native HDF5 datasets, but only along the *first* (counting from
left to right) extendable dimension found in the dataset.

As we do not offer public access to the SVN repository yet, you should
wait until tomorrow to get the snapshot with this patch, which will
appear at:

http://www.carabos.com/downloads/pytables/snapshots/

Please tell me if this works for you.

Cheers,

On Sunday 16 January 2005 16:19, Antonio Valentino wrote:
> hi,
> I'm not an expert user and I'm having some problems trying to open an
> hdf5 file containing a chunked dataset.
> [quoted message snipped]

--
>qo< Francesc Altet     http://www.carabos.com/
V  V Cárabos Coop. V.   Enjoy Data
""  |
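The single-extendable-dimension mapping Francesc describes can be sketched briefly. This uses the modern PyTables API (`open_file` / `create_earray`; the 2005-era names were `openFile` / `createEArray`), so it illustrates the idea rather than the thread-era code: exactly one axis of the shape is 0, and that is the one dimension you can append along.

```python
import os
import tempfile

import numpy as np
import tables

path = os.path.join(tempfile.mkdtemp(), 'earray.h5')
with tables.open_file(path, mode='w') as f:
    # shape=(0, 150, 2): axis 0 is the (single) extendable dimension,
    # matching the (200, 150, 2) uint8 dataset from the thread
    arr = f.create_earray(f.root, 'ComplexUCharArray',
                          atom=tables.UInt8Atom(), shape=(0, 150, 2))
    arr.append(np.zeros((200, 150, 2), dtype='uint8'))
    final_shape = arr.shape

print(final_shape)  # (200, 150, 2)
```

Appending grows only the extendable axis; the other axes of the appended block must match the fixed (150, 2) trailing shape exactly.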
From: Antonio V. <val...@co...> - 2005-01-28 08:20:34
|
On Thu, 27-01-2005 at 15:16 +0100, Francesc Altet wrote:
> Hi Antonio,

hi

> I've finally found some time to work on this issue. The problem was
> that native chunked datasets were not supported by PyTables. The good
> news is that I've been able to support them and map them to EArray
> objects :)

thanks a lot
I did try to fix it by myself but I had no luck. The best I could do was
fix the segmentation fault, but still an UnImplemented object was
created and then:

RuntimeError: maximum recursion depth exceeded

:((

May I ask what kind of tools you use to debug PyTables?
How do you debug Pyrex extensions?

> With the new version, you should be able to read chunked datasets
> without problems (at least I can with the sample you sent me).
> However, support for datasets with multiple extendable dimensions is
> limited, because PyTables does not support them. That means you can
> still extend these native HDF5 datasets, but only along the *first*
> (counting from left to right) extendable dimension found in the
> dataset.

Ok, this is enough for me. I don't have to extend arbitrary chunked
arrays. Anyway, I would like to have more control over the chunk size.
Do you plan to add full support for chunked datasets in the future?

> As we do not offer public access to the SVN repository yet, you should
> wait until tomorrow to get the snapshot with this patch, which will
> appear at:
>
> http://www.carabos.com/downloads/pytables/snapshots/

saved ;)

> Please tell me if this works for you.

Of course, I'll do it.

> Cheers,

Ciao
Antonio

P.S. excuse me for my bad English :)

> On Sunday 16 January 2005 16:19, Antonio Valentino wrote:
> > hi,
> > I'm not an expert user and I'm having some problems trying to open an
> > hdf5 file containing a chunked dataset.
> > [quoted message snipped] |
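On the chunk-size control Antonio asks about: the modern PyTables API grew an explicit `chunkshape` argument on `create_earray` (it was not available in the 0.9.1 of this thread, so this is a sketch of what that control later looked like, not something the participants could run):

```python
import os
import tempfile

import tables

path = os.path.join(tempfile.mkdtemp(), 'chunked.h5')
with tables.open_file(path, mode='w') as f:
    # Request the same (128, 128, 2) chunking as Antonio's data.h5,
    # instead of letting PyTables pick a chunk shape automatically
    arr = f.create_earray(f.root, 'data', atom=tables.UInt8Atom(),
                          shape=(0, 150, 2), chunkshape=(128, 128, 2))
    chunks = arr.chunkshape

print(chunks)  # (128, 128, 2)
```

Without `chunkshape`, PyTables derives a chunk shape from `expectedrows`; passing it explicitly pins the on-disk HDF5 chunk layout.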
From: Antonio V. <val...@co...> - 2005-01-31 10:23:00
|
On Fri, 28-01-2005 at 09:19 +0100, Antonio Valentino wrote:
> Ok, this is enough for me. I don't have to extend arbitrary chunked arrays.
  ^^^^^^^^^^^^^^^^^^^^^^^^^^

It seems I spoke too soon :P

I tested the PyTables snapshot on my data, and indeed it works well on
chunked datasets with at least one extendable dimension (as were the
ones I sent you). Then I ran a test on chunked data with *no*
extendable dimensions (unfortunately I have no control over this
aspect, because the data come from an external team) and again I got a
segmentation fault.

I fixed it by modifying hdf5Extension.pyx
($Id: hdf5Extension.pyx 520 2005-01-27 13:51:04Z faltet $) as follows:

2832,2833c2832,2834
< if (self.__class__.__name__ == "EArray" or
<     self.__class__.__name__ == "IndexArray"):
---
> #if (self.__class__.__name__ == "EArray" or
> #    self.__class__.__name__ == "IndexArray"):
> if self.extdim >= 0:

Of course, in this case the array is no longer mapped onto an EArray
object, and an UnImplemented object is created instead.

ciao
Antonio

> [rest of quoted message snipped] |
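The crash Antonio patched around hinges on whether a chunked dataset has any extendable dimension at all. A sketch (using h5py, which is not part of this thread) of how to probe that from Python: in HDF5 terms an extendable axis is one whose maximum extent is unlimited, which h5py reports as None in `maxshape`.

```python
import os
import tempfile

import h5py

path = os.path.join(tempfile.mkdtemp(), 'probe.h5')
with h5py.File(path, 'w') as f:
    # Chunked but fixed-size: the kind of dataset that still crashed the snapshot
    fixed = f.create_dataset('fixed', shape=(200, 150, 2), dtype='u1',
                             chunks=(128, 128, 2))
    # Chunked and extendable along axis 0 (unlimited max extent)
    grow = f.create_dataset('grow', shape=(200, 150, 2), dtype='u1',
                            chunks=(128, 128, 2), maxshape=(None, 150, 2))
    fixed_ext = [i for i, m in enumerate(fixed.maxshape) if m is None]
    grow_ext = [i for i, m in enumerate(grow.maxshape) if m is None]

print(fixed_ext, grow_ext)  # [] [0]
```

Antonio's `self.extdim >= 0` check is the PyTables-side analogue of this test: a dataset whose extendable-axis list is empty cannot be an EArray.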