From: Francesc A. <fa...@op...> - 2003-05-16 09:17:23
On Friday 16 May 2003 10:42, Edward Hartley wrote:
> However question was directed at finding the maximum size of a Column
> entry in a row, not at the maximum number of rows.
> I have found that the maximum size of an array entry in a group is
> about 2**27 before the interpreter cannot get any more memory on my
> machine with 128M swap.

I don't fully understand what you mean, but it seems as if you are trying to put big column elements in a table. If so: because PyTables uses an I/O buffer where the different rows are placed before being written to disk, it currently limits row sizes to less than 600 KB (mainly for processor cache efficiency reasons). Up to now I didn't realize that anybody would want more than that, because the normal situation is to break your data down into small pieces that form the rows to be added.

So, if you need to deal with very large row sizes, I would recommend breaking your row data down into smaller chunks and then reconstructing the data while reading the table, if you want to (see the sketches at the end of this message).

Besides, if you are trying to use arrays of 2**27 elements, that means 134 million elements; so if your arrays are single-precision floats, that represents 134e6 * 4 bytes = 536 MB (!), which is just too much for your machine.

However, you have discovered a flaw in the PyTables design in that it doesn't check for row sizes bigger than the I/O buffer. I'll try to add that check to the CVS version and issue a warning to the user in case that happens.

Cheers,

--
Francesc Alted
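P.S. To illustrate the chunking idea, here is a minimal sketch written against the current PyTables API (tables.open_file / create_table / Float32Col; other releases may spell these differently, e.g. openFile / createTable, and the file name, node names and chunk size here are just made up for the example). Each row stays well under the ~600 KB buffer limit:

```python
import numpy as np
import tables

CHUNK = 2**16   # 65536 float32 values per row ~= 256 KB, under the buffer limit
N = 2**20       # total number of elements; assumed a multiple of CHUNK here

data = np.arange(N, dtype=np.float32)   # the big array we want to store

with tables.open_file("chunked.h5", "w") as f:
    # One table column holds one fixed-size slice of the big array.
    table = f.create_table("/", "big",
                           {"chunk": tables.Float32Col(shape=(CHUNK,))})
    row = table.row
    for start in range(0, N, CHUNK):     # write the array as fixed-size rows
        row["chunk"] = data[start:start + CHUNK]
        row.append()
    table.flush()

    # Reconstruct the original array by flattening the rows back together.
    restored = table.read()["chunk"].ravel()
    assert np.array_equal(restored, data)
```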
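And until that missing check is in place, a caller can guard against oversized rows by hand. A sketch, where 600 KB simply stands in for the buffer limit mentioned above:

```python
import numpy as np

MAX_ROWSIZE = 600 * 1024   # approximate I/O buffer limit discussed above

# Size in bytes of one table row, computed from the intended record layout.
row_dtype = np.dtype([("chunk", np.float32, (2**16,))])
if row_dtype.itemsize > MAX_ROWSIZE:
    raise ValueError("row of %d bytes exceeds the PyTables I/O buffer"
                     % row_dtype.itemsize)
```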