InterBase MEMO/BLOB fields, data size > 1MB

  • pet

      Please help me with this problem (Perl & InterBase MEMO/BLOB fields, data with size > 1MB); I appreciate any help. In my Perl script I have a code part that fetches data from an InterBase database. Everything works fine except when the fetched data is larger than 1MB! The script(s) work just fine if the content of a MEMO or BLOB field in the database is smaller than 1MB; if the size of the data is over 1MB, then "ERROR: Blob exceeds maximum length" is written to the Apache server's "error.log" (and of course the data is not fetched):
        Script code:
    #     . . . . . - rest of the code
        use DBI;
    #     . . . . . - rest of the code
            # name & location of the database
        $dbpath = 'C:\Database\database_01.fdb';
        $dsn = "DBI:InterBase:database=$dbpath;host=123.456.78.9;ib_dialect=3";
            # connecting to the DB
        $dbh = DBI->connect($dsn, '', '', {AutoCommit => 0}) or die "$DBI::errstr";
            # maximum length fetched for long/BLOB fields (here 1KB)
        $dbh->{LongReadLen} = 1024;
            # NO matter if truncation is ON or OFF? Tried both cases!
        #$dbh->{LongTruncOk} = 0;
        $dbh->{LongTruncOk} = 1;

            # here, DATA is the MEMO field selected by $sql!
        $sth = $dbh->prepare($sql) or die "Preparing: ", $dbh->errstr;
        $sth->execute or die "Executing: ", $sth->errstr;
            # opening a FILE HANDLE for writing the content of the MEMO/BLOB field
        open(F, ">./database1.txt") or die "Opening: $!";
            # fetching the content
        while (@row = $sth->fetchrow_array()) {
            foreach (@row) {
                # saving into the file "database1.txt"!
                print F "$_";
            }
        }
        close F;
        $dbh->commit or warn $dbh->errstr;
        $dbh->disconnect or warn $dbh->errstr;
    #     . . . . . - rest of the code

      I have tried several workarounds: in the current code I've increased the buffer size, LongReadLen, even to 10MB; I also tried chunks & the blob_read method, to fetch small pieces of data and concatenate them again at the end, BUT got nothing, even with data only 10KB in size. Every attempt was unsuccessful! Maybe I should replace DBD-InterBase with some other Perl module?
      Installed environment: Perl 5.6.1, Firebird 1.5.0 (an InterBase database). I have two similar scripts (Win32 & Unix), and both work OK when the fetched data is smaller than 1MB. Installed Perl modules: DBI 1.37, DBD-InterBase 0.40.
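      For reference, the chunked approach mentioned above would look roughly like this with DBI's generic blob_read method. This is only a sketch: the table name MYTABLE, the 64KB chunk size, and the field index are made up, and DBD-InterBase 0.40 may simply not support block-wise BLOB reads at all, which would explain why it failed even on 10KB data:

```perl
use strict;
use DBI;

# Same DSN style as in the script above.
my $dbh = DBI->connect(
    'DBI:InterBase:database=C:\Database\database_01.fdb;host=123.456.78.9;ib_dialect=3',
    '', '', { AutoCommit => 0, RaiseError => 1 });

# Some drivers require LongReadLen = 0 so that fetch defers
# reading the BLOB and leaves it to blob_read.
$dbh->{LongReadLen} = 0;

# Select only the BLOB column; blob_read addresses fields by index
# (MYTABLE is hypothetical, and the index base may vary by driver).
my $sth = $dbh->prepare('SELECT DATA FROM MYTABLE');
$sth->execute;

open my $out, '>', './database1.txt' or die "open: $!";
binmode $out;

while ($sth->fetch) {
    my $offset = 0;
    # Read the BLOB in 64KB chunks until nothing is returned.
    while (defined(my $chunk = $sth->blob_read(0, $offset, 65536))) {
        last if length($chunk) == 0;
        print {$out} $chunk;
        $offset += length($chunk);
    }
}
close $out;
$dbh->disconnect;
```

      Whether this works depends entirely on driver support; the DBD-InterBase documentation quoted in the NOTICE suggests it does not in 0.40.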

        *NOTICE: I read this:
    "The DBI currently defines no way to insert or update LONG/LOB values piece-wise - piece by piece. That means you're limited to handling values that will fit into your available memory."
        as well as:
    "Read/Write BLOB fields block by block not (yet) supported. The maximum size of a BLOB read/write is hardcoded to about 1MB."
      - Does that mean that I can't fetch data larger than 1MB using DBI and DBD::InterBase at all, or can I change the limit (in DBD::InterBase or in DBI) from 1MB to 10MB?

      Please help me with any ideas, hints or solutions!

    • Edwin Pratomo

      The upper limit is hardcoded in dbdimp.h. Find MAX_SAFE_BLOB_LENGTH, adjust it to your safe maximum, and recompile.
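      Concretely, and assuming the constant still lives in dbdimp.h of DBD-InterBase 0.40 (the exact original value in your copy may differ), the change would be along these lines, followed by the usual `perl Makefile.PL && make && make install` rebuild:

```c
/* dbdimp.h (DBD-InterBase source tree) */

/* Raise the hardcoded BLOB ceiling from roughly 1MB to 10MB
   (replace the existing definition; the old value may differ): */
#define MAX_SAFE_BLOB_LENGTH (10 * 1024 * 1024)
```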

    • pet

        THANK YOU VERY MUCH, you saved my life! :-))

      I tried to find & fix it myself, but without success; btw, I tried to REACH YOU (I even wrote you a mail/letter) because I knew DBD::InterBase was your masterpiece! Sorry for any inconvenience; I appreciate all the work and the help given!

      Regards, - Pet.