Problems when data file is larger than 512MB

cdominguez
2005-02-24
2014-01-19
  • cdominguez
    2005-02-24

    Hi,

    My database uses CACHED tables for large amounts of data. When the ".data" file
    grows larger than 512MB, the time taken by queries and insertions on CACHED tables
    increases by 3000% to 5000%. The maximum allocated memory of the JVM is -Xmx512m.

    Thank you in advance for your answer,
    CELIA

    • Don't quote me on this, but I believe it happens because the current nio file access mechanism shuts itself off once rows above the 512MB point in the cache file are accessed, since serving them would require growing the (currently single) memory-mapped nio buffer beyond 512MB.

      IIRC, to avoid this a more sophisticated (hence more complicated) nio mechanism would need to be devised, one that maintains a dynamic collection of nio buffers.

      This was discussed once in this forum, quite a while ago, with the tentative conclusion that the current nio API imposes some unavoidable restrictions while in the same breath failing to provide certain desirable guarantees for dealing with said restrictions.
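      To make the "dynamic collection of nio buffers" idea concrete, here is a minimal sketch of mapping a file in fixed-size segments and routing each read to the segment containing the requested offset. This is an illustration only: the class and method names are invented, the segment size is shrunk to 1MB for the demo, and this is not how HSQLDB itself is implemented.

      ```java
      import java.io.IOException;
      import java.io.RandomAccessFile;
      import java.nio.MappedByteBuffer;
      import java.nio.channels.FileChannel;
      import java.nio.file.Files;
      import java.nio.file.Path;

      // Hypothetical sketch: instead of one MappedByteBuffer covering the whole
      // .data file, the file is mapped in fixed-size segments and reads are
      // routed to the segment that contains the requested absolute offset.
      public class SegmentedFileMap {
          // 1 MB segments for the demo; a real cache would use much larger ones.
          static final long SEGMENT_SIZE = 1L << 20;

          private final MappedByteBuffer[] segments;

          public SegmentedFileMap(FileChannel channel) throws IOException {
              long size = channel.size();
              int count = (int) ((size + SEGMENT_SIZE - 1) / SEGMENT_SIZE);
              segments = new MappedByteBuffer[count];
              for (int i = 0; i < count; i++) {
                  long start = i * SEGMENT_SIZE;
                  long len = Math.min(SEGMENT_SIZE, size - start);
                  segments[i] = channel.map(FileChannel.MapMode.READ_ONLY, start, len);
              }
          }

          // Read one byte at an absolute file offset by picking the right
          // segment, then the offset relative to that segment's start.
          public byte get(long offset) {
              int seg = (int) (offset / SEGMENT_SIZE);
              int rel = (int) (offset % SEGMENT_SIZE);
              return segments[seg].get(rel);
          }

          public static void main(String[] args) throws IOException {
              // Build a small test file spanning several segments.
              Path tmp = Files.createTempFile("segmap", ".data");
              byte[] data = new byte[3 * (int) SEGMENT_SIZE + 123];
              for (int i = 0; i < data.length; i++) data[i] = (byte) (i % 251);
              Files.write(tmp, data);
              try (RandomAccessFile raf = new RandomAccessFile(tmp.toFile(), "r")) {
                  SegmentedFileMap map = new SegmentedFileMap(raf.getChannel());
                  // Probe an offset beyond the first segment boundary.
                  long probe = 2 * SEGMENT_SIZE + 7;
                  System.out.println(map.get(probe) == data[(int) probe]);
              }
              Files.delete(tmp);
          }
      }
      ```

      A real implementation would also need write support, segment eviction or remapping as the file grows, and care around unmapping (the JVM gives no direct way to release a mapped buffer), which is part of why the thread calls this approach complicated.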