From: Edward H. <ha...@co...> - 2003-05-16 08:40:02
On Friday, May 16, 2003, at 11:02 am, Francesc Alted wrote:

> On Thursday, 15 May 2003 19:52, Edward Hartley wrote:
>> Hi,
>> is there any limit in the HDF5 file layer on the size of table
>> entries?
>
> To my understanding, no (well, at least no practical limit). The number
> of rows in datasets (a table is a special type of dataset) in HDF5 is of
> type hsize_t, which is bound to 64-bit integers on both modern Unix and
> Windows. So your tables can have up to 2**63 rows, which is something
> like 1e19 (!).

Thanks, knowing this is useful. However, my question was directed at
finding the maximum size of a column entry in a row, not at the maximum
number of rows. I have found that the maximum size of an array entry in
a group is about 2**27 before the interpreter cannot get any more memory
on my machine with 128M of swap. This is without a flush operation.

Regards,
Ed

> Cheers,
>
> --
> Francesc Alted
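
[Sketch, not part of the original message: one way to probe the limit Ed
describes is to write progressively larger arrays into a group until
allocation fails. The file name, the array sizes, and the use of NumPy
with the current open_file/create_array API are assumptions; the 2003
thread would have been using numarray and the older openFile/createArray
names.]

import numpy as np   # the original thread predates NumPy
import tables        # PyTables; modern API shown (open_file/create_array)

with tables.open_file("probe_limit.h5", mode="w") as h5:
    exponent = 20
    while True:
        n = 2 ** exponent  # number of float64 elements to attempt
        try:
            data = np.zeros(n, dtype="float64")   # allocate in memory first
            h5.create_array(h5.root, "a%d" % exponent, data,
                            "test array of 2**%d elements" % exponent)
            h5.flush()                            # Ed's test skipped the flush
            print("wrote 2**%d elements OK" % exponent)
            h5.remove_node(h5.root, "a%d" % exponent)
            exponent += 1
        except MemoryError:
            print("ran out of memory at 2**%d elements" % exponent)
            break

On a machine with little swap, the failure point reflects the in-memory
allocation rather than any HDF5 file-layer limit, which matches Ed's
observation above.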