From: Edward H. <ha...@co...> - 2003-05-16 09:49:03
On Friday, May 16, 2003, at 12:17 pm, Francesc Alted wrote:

> On Friday, 16 May 2003 10:42, Edward Hartley wrote:
>> However, my question was directed at finding the maximum size of a
>> column entry in a row, not at the maximum number of rows.
>> I have found that the maximum size of an array entry in a group is
>> about 2**27 elements before the interpreter cannot get any more
>> memory on my machine with 128 MB of swap.
>
> I don't fully understand what you mean, but it seems like you are
> trying to put big column elements in a table. If so: because PyTables
> uses an I/O buffer where the different rows are placed before being
> written to disk, it has a current limitation of row sizes of less
> than 600 KB (mainly for processor cache efficiency reasons).

Many thanks; this was the limit I have been hitting. 600 KB seems
reasonable.

> Right now, I didn't realize that anybody would want more than that,
> because the normal situation is to break down your data into small
> pieces that would form the rows to be added.

I am dealing with images and image feature sets, so I was wanting to
create storage for these. I naively thought a table row would be
appropriate; clearly it isn't.

> So, if you are willing to deal with very large row sizes, I would
> recommend that you break down your row data into smaller chunks and
> then reconstruct the data during table reading if you want to.
>
> Besides, if you are trying to use arrays of 2**27 elements, this
> means 134 million elements, so, if your arrays are single-precision
> floats, that represents 134*4 = 536 MB (!), which is just too much
> for your machine.

I agree with your assessment of the problems of dealing with objects
of this size. This was an exercise I conducted to find where the
boundary was, after finding that there was a problem using table rows;
it does not represent a realistic case.
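The chunk-and-reconstruct approach Francesc suggests can be sketched with plain NumPy; the helper names `chunk_row_data` and `reconstruct` and the exact 600 KB budget are illustrative assumptions, not part of the PyTables API:

```python
import numpy as np

# Assumed row-size budget: keep each piece under the ~600 KB I/O buffer
# mentioned in the thread.
MAX_ROW_BYTES = 600 * 1024

def chunk_row_data(arr):
    """Split a large 1-D array into pieces small enough for one table row each."""
    elems_per_chunk = MAX_ROW_BYTES // arr.itemsize
    return [arr[i:i + elems_per_chunk]
            for i in range(0, arr.size, elems_per_chunk)]

def reconstruct(chunks):
    """Reassemble the original array from the stored row-sized pieces."""
    return np.concatenate(chunks)

# Example: a 1-million-element float32 feature vector (4 MB, well over 600 KB).
features = np.arange(1_000_000, dtype=np.float32)
chunks = chunk_row_data(features)
restored = reconstruct(chunks)
assert all(c.nbytes <= MAX_ROW_BYTES for c in chunks)
assert np.array_equal(features, restored)

# Francesc's arithmetic: a 2**27-element float32 array needs 2**27 * 4 bytes,
# i.e. roughly 536 MB -- far beyond a machine with 128 MB of swap.
assert (2 ** 27 * 4) // 10 ** 6 == 536
```

On write, each chunk would go into its own row (with an index column recording its position); on read, the rows are concatenated back in order.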
> However, you discovered a flaw in the PyTables design in that it
> doesn't check for row sizes bigger than the I/O buffer. I'll try to
> add that to the CVS version and issue a warning to the user in case
> that happens.

That's useful; many thanks again.

> Cheers,
>
> --
> Francesc Alted
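The check Francesc describes (warn when a declared row is larger than the I/O buffer) could look something like the sketch below; `check_rowsize` and `BUFFER_SIZE` are hypothetical names, not the actual code that went into the CVS tree:

```python
import warnings

# Assumed I/O buffer limit from the discussion above (~600 KB).
BUFFER_SIZE = 600 * 1024

def check_rowsize(rowsize):
    """Warn when a table's row size exceeds the I/O buffer.

    Illustrative sketch only; the real check added to PyTables may
    differ in both name and behaviour.
    """
    if rowsize > BUFFER_SIZE:
        warnings.warn(
            f"row size of {rowsize} bytes exceeds the {BUFFER_SIZE}-byte "
            "I/O buffer; break the row into smaller columns or chunks")
        return False
    return True

# A 1 MB row triggers the warning; a 4 KB row passes silently.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    ok_big = check_rowsize(1024 * 1024)
    ok_small = check_rowsize(4096)
assert not ok_big and ok_small and len(caught) == 1
```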