On Mon, Mar 13, 2006 at 04:23:48PM +0100, Nicolas Neuss wrote:
> since two weeks ago I have had access to an AMD64 machine with 16 GB of
> memory and I would like to solve some large problems there with SBCL.
> However, I have now run into errors several times when I really use a lot
> of memory (say, more than 4 GB).
By default you can use up to 8GB of memory. If you need more than that,
edit the definition of dynamic-space-end in src/compiler/x86-64/parms.lisp
and rebuild SBCL.
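A sketch of what that edit looks like. The constant-defining macro and the concrete addresses below are illustrative assumptions; the actual names and values depend on your SBCL version, so check your own copy of parms.lisp before editing:

```lisp
;;; Sketch of the relevant part of src/compiler/x86-64/parms.lisp.
;;; Addresses here are examples only -- consult your SBCL source tree.
(def!constant dynamic-space-start #x1000000000)
;; start + #x200000000 gives the default 8GB dynamic space:
(def!constant dynamic-space-end   #x1200000000)
;; Raising the end address enlarges the heap, e.g. for a ~15GB heap:
;; (def!constant dynamic-space-end #x13C0000000)
```

After changing the constant, SBCL has to be rebuilt (sh make.sh) for the new limit to take effect.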
> I have now done a test which simply allocates a list of 500 8MB arrays
> three times, and ran into a problem already with this simple test (see
> Argh! gc_find_freeish_pages failed (restart_page), nbytes=8000016.
This has a working set of 7.5GB (3.7GB for the old list in *test* that can't
be discarded yet, another 3.7GB for the new list that's being constructed),
which should, in theory, just barely fit into the default heap. I suspect
that the problem is that some of the elements of the first *test* have been
tenured into a very high generation, and no sufficiently full GC has yet
been triggered after they became garbage. If that's the case, adding a call
to (gc :full t) after each DEFPARAMETER evaluation should fix the problem.
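The original test case is not reproduced above; a minimal sketch of a test of the kind described, with the suggested workaround added, might look like the following. The array size and element type are assumptions chosen to match the reported nbytes=8000016 (1,000,000 double-floats is 8,000,000 bytes of data plus a small header):

```lisp
;; Each array holds 1,000,000 double-floats, i.e. roughly 8MB;
;; 500 of them make one ~3.7GB list.
(defun make-big-list ()
  (loop repeat 500
        collect (make-array 1000000 :element-type 'double-float)))

(defparameter *test* (make-big-list))
(sb-ext:gc :full t)  ; reclaim tenured garbage before the next big allocation

(defparameter *test* (make-big-list))
(sb-ext:gc :full t)

(defparameter *test* (make-big-list))
(sb-ext:gc :full t)
```

Each (sb-ext:gc :full t) collects all generations, so the previous list bound to *test* is reclaimed before the next 3.7GB list is built, keeping the peak working set near one list rather than two.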
This is of course not a practical general solution. But it's quite possible
that this simple example isn't actually representative of the problems
you're having with your real program, and just increasing the heap size will
fix it. At least there are others running with very large heaps, and I
haven't seen any similar bug reports from them.