#58 Out of memory errors during genxref

Milestone: v0.9.1
Status: closed-works-for-me
Owner: nobody
Labels: genxref (49)
Priority: 5
Updated: 2009-03-26
Created: 2002-05-22
Creator: Andy Baskett
Private: No

I'm not sure where the problem arises. I have spent
more than eight hours attempting to debug the code and
looked through CVS to see where previous leaks were
fixed, but to no avail.
My MySQL "lxr.files" table currently contains 4400 files.
My repository contains thousands of font files and
other non-indexable files, as well as hundreds of files
generated with old (9+ years) versions of RCS, which
result in "co" errors.

Using Devel::Leak, I believe the biggest leak occurs
when "co" fails: under these circumstances I think the
FileHandle is not undef but actually contains the
error message from "co". Perhaps parsing that output
causes a problem?
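
For anyone trying to reproduce this, a minimal way to
localise such a leak with Devel::Leak is to snapshot the
SV count around the handling of a single file. The "co"
invocation and the file name below are assumptions for
illustration, not LXR's actual code:

    use strict;
    use warnings;
    use Devel::Leak;

    # Snapshot all live SVs before handling one file.
    my $handle;
    my $before = Devel::Leak::NoteSV($handle);

    # Hypothetical checkout pipe: with "2>&1" the handle really does
    # deliver co's error text when the checkout fails.
    open(my $fh, '-|', "co -q -p1.1 'somefile,v' 2>&1")
        or die "cannot fork co: $!";
    my @lines = <$fh>;
    close($fh) or warn "co exited with status $?\n";

    # CheckSV dumps to stderr every SV created since NoteSV.
    my $after = Devel::Leak::CheckSV($handle);
    print STDERR "SVs still live that were created since the snapshot: ",
        $after - $before, "\n";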

In any case, Devel::Leak shows two leaks per file (even
when indexing succeeds). I think the %files hash in
lib/LXR/Index/Mysql.pl causes one of them, as entries
keep being appended to it - would it be possible to add
%files = undef to the "empty_cache" function?
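
As a side note, assigning undef to a hash is not the
idiomatic way to clear it in Perl (it warns and leaves a
one-element hash); resetting it to the empty list is.
A sketch of what such an "empty_cache" could look like -
the package layout here is a guess, not the real module:

    package LXR::Index::Mysql;

    use strict;
    use warnings;

    # Package-level cache mapping file paths to database ids; it grows
    # by at least one entry for every file that genxref indexes.
    our %files;

    sub empty_cache {
        # Release everything the cache references instead of letting it
        # accumulate across the whole run.
        %files = ();
    }

    1;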

Other than that I am at a loss for ideas.
I am running very old versions of Perl and MySQL,
which I hope to update and which may be part of the
cause, but I expect they are not the only reason for
the problem.

I am running:
LXR 0.9.1
mysql 3.22.30
ectags 5.0.1
HP-UX 10.20

Discussion

  • Logged In: NO

    Similar case here, also indexing a CVS tree. The "Out of
    memory!" error always occurs at the same point, just a
    few minutes in, after indexing a few hundred files.

    OpenBSD 3.0
    perl-5.6.1 (official obsd pkg)
    LXR 0.9.1
    mysql-3.23.37 (official obsd pkg, on remote host running
    OBSD 2.9)
    ectags 5.2.3
    swish-e 2.1-dev-25 (didn't get to the point of actually
    running it)

    $ ulimit -a
    time(cpu-seconds) unlimited
    file(blocks) unlimited
    coredump(blocks) unlimited
    data(kbytes) 65536
    stack(kbytes) 4096
    lockedmem(kbytes) 61366
    memory(kbytes) 184100
    nofiles(descriptors) 64
    processes 64

    The funny thing is that when run under the Perl debugger
    (perl -d genxref ...) it kept working for many hours and
    indexed thousands of files.
    `ulimit -a` returns the same values, no matter whether it
    is run inside the Perl debugger or not.

     
  • Malcolm Box
    2009-03-26

    Running against a very large repository works here, and several memory leaks have been fixed since the 0.9.1 release.

     
  • Malcolm Box
    2009-03-26

    • status: open --> closed-works-for-me