From: Anthony S. <sc...@gm...> - 2012-09-27 18:11:23
On Thu, Sep 27, 2012 at 11:02 AM, Luke Lee <dur...@gm...> wrote:

> Are there any performance issues with relatively large carrays? For
> example, say I have a carray with 300,000 float64s in it. Is there some
> threshold where I could expect performance to degrade or anything?

Hello Luke,

The breakdowns happen when you have too many chunks. However, you are well
away from this threshold, which is ~20,000 chunks. I believe that PyTables
will issue a warning or error when you reach this point anyway.

> I think I remember seeing there was a performance limit with tables > 255
> columns. I can't find a reference to that so it's possible I made it up.
> However, I was wondering if carrays had some limitation like that.

Tables are a different data structure. The issue with tables is that column
metadata (names, etc.) needs to fit in the attribute space, whose size is
statically limited to 64 KB. In my experience, this corresponds to a limit
in the thousands of columns (not hundreds). CArrays, on the other hand,
carry almost no column metadata and should scale to an arbitrary number of
columns without any issue.

Be Well
Anthony

> _______________________________________________
> Pytables-users mailing list
> Pyt...@li...
> https://lists.sourceforge.net/lists/listinfo/pytables-users
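[Editor's note] The chunk-count advice above is easy to check directly: PyTables exposes the chunkshape it picks, so you can compute how many chunks a 300,000-element CArray actually uses. A minimal sketch, assuming PyTables is installed; the file name `demo.h5` and array name `data` are arbitrary, and the in-memory CORE driver is used only so nothing is written to disk:

```python
import numpy as np
import tables as tb

# Open an in-memory HDF5 file (CORE driver, no backing store on disk).
with tb.open_file("demo.h5", "w",
                  driver="H5FD_CORE",
                  driver_core_backing_store=0) as f:
    # A CArray of 300,000 float64s, matching Luke's example.
    carr = f.create_carray(f.root, "data",
                           atom=tb.Float64Atom(),
                           shape=(300_000,))
    carr[:] = np.zeros(300_000)
    # PyTables chose a chunkshape automatically; the chunk count is
    # the array length divided by the chunk length, rounded up.
    n_chunks = -(-300_000 // carr.chunkshape[0])  # ceiling division
```

Whatever chunkshape PyTables selects, the resulting chunk count for an array of this size is orders of magnitude below the ~20,000-chunk region where slowdowns appear.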
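[Editor's note] Likewise, the claim that tables handle thousands of columns (not hundreds) can be exercised by building a wide table description. A minimal sketch, assuming PyTables is installed; the column names `c0`, `c1`, ... and the file/table names are arbitrary. Note that PyTables emits an advisory PerformanceWarning above 512 columns, which is silenced here because it is not a hard limit:

```python
import warnings
import tables as tb

# A description with 1,000 float64 columns -- well past the rumored
# 255-column limit, but still under the 64 KB metadata ceiling.
desc = {f"c{i}": tb.Float64Col(pos=i) for i in range(1000)}

with tb.open_file("wide.h5", "w",
                  driver="H5FD_CORE",
                  driver_core_backing_store=0) as f:
    with warnings.catch_warnings():
        # Above 512 columns PyTables raises a PerformanceWarning;
        # it is advisory only, so ignore it for this demonstration.
        warnings.simplefilter("ignore")
        tbl = f.create_table(f.root, "wide", desc)
        ncols = len(tbl.colnames)
```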