[Gc] Problem with Blacklist

Francois Bronsard fbronsard@bigfoot.com
Mon, 3 Nov 2003 09:20:03 -0500


This may be a naive question, but why is a large part of the heap
blacklisted?  (And what does "blacklisted" mean, exactly?)  I'm asking
because we're not using anything that I would associate with
"uninterpretable data".  Specifically, there is no image or compressed
data.  There are many strings and pointers, though.  The application is a
static analysis program, and the heap grows so big because we keep in
memory multiple copies/variants of the program representation tree (i.e.,
the parse tree).  So I do expect the memory to grow a lot, but I don't
understand what is making the GC treat part of that memory as blacklisted.

Thanks in advance,
Francois


Boehm, Hans writes:
 > I suspect that too much of the heap is blacklisted.  To verify that,
 > call GC_dump() sometime shortly before the failure.  It will generate
 > lots of output with a 120MB heap.  But one piece of that will be a list
 > of the individual heap sections and some info on how much of each is blacklisted.
 > 
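 > One way to make sure the dump happens right before the failure is to
 > install an out-of-memory handler that dumps and exits.  A rough,
 > untested sketch (check gc.h in your tree for the exact GC_oom_fn
 > prototype):
 > 
 >   #include <gc.h>
 >   #include <stdio.h>
 >   #include <stdlib.h>
 > 
 >   /* Called by the collector when an allocation is about to fail. */
 >   static GC_PTR dump_on_oom(size_t bytes_requested)
 >   {
 >       fprintf(stderr, "allocation of %lu bytes failed\n",
 >               (unsigned long)bytes_requested);
 >       GC_dump();    /* heap sections, blacklist info, etc. */
 >       exit(1);
 >       return 0;     /* not reached */
 >   }
 > 
 >   /* during startup: */
 >   GC_oom_fn = dump_on_oom;
 > 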
 > Assuming this is the case, various partial solutions are possible to reduce
 > the number of blacklisted pages:
 > 
 > 1) Build the collector with -DLARGE_CONFIG.  This probably won't help much, but
 > it's definitely a good idea for heaps that size.
 > 
 > 2) Supply some type/object layout information to the collector, especially
 > if you are allocating objects containing "random" data, e.g. compressed
 > data.  It's usually easiest to use GC_MALLOC_ATOMIC where appropriate (see
 > the first sketch after this list).  This is probably the most important
 > step, and may eliminate the problem completely.
 > 
 > 3) Allocate large objects with the ...ignore_off_page primitives, and make
 > sure that you keep pointers to the beginnings of such objects (see the
 > second sketch after this list).  This will help only if it is still
 > possible to allocate smaller objects without running into blacklisted
 > blocks.
 > 
 > 4) If possible for your application, turn off GC_all_interior_pointers
 > (see the third sketch after this list).
 > 
 > 5) Check the GC_dump() output to make sure that graphics memory is not traced.
 > (There are some hooks in 6.3alpha2 to deal with that potential problem if you
 > run into it.)
 > 
 > 6) Switch to a 64-bit machine.  (Very effective, but not always feasible :-) .)
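 > 
 > For point 2, string data is the usual culprit in a parse tree.  A
 > minimal sketch of a pointer-free string copy (gc_strdup_atomic is just
 > a name I made up here):
 > 
 >   #include <gc.h>
 >   #include <string.h>
 > 
 >   /* Copy a C string into pointer-free ("atomic") memory.  The
 >      collector never scans these bytes, so random character data
 >      can't masquerade as pointers and blacklist other pages. */
 >   char *gc_strdup_atomic(const char *s)
 >   {
 >       size_t n = strlen(s) + 1;
 >       char *p = (char *)GC_MALLOC_ATOMIC(n);
 >       if (p != 0)
 >           memcpy(p, s, n);
 >       return p;
 >   }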
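 > 
 > For point 3, a sketch with a hypothetical million-entry table:
 > 
 >   #include <gc.h>
 > 
 >   void **make_table(void)
 >   {
 >       /* The table spans many pages, so allocate it with the
 >          ignore_off_page variant; the caller must keep the returned
 >          pointer (to the object's start) reachable.  Only the first
 >          page of the object then has to avoid the blacklist. */
 >       return (void **)
 >           GC_malloc_ignore_off_page(1000000 * sizeof(void *));
 >   }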
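 > 
 > For point 4, the switch is a global that has to be cleared before the
 > collector initializes, and it is only safe if your program never holds
 > just an interior pointer to an object:
 > 
 >   #include <gc.h>
 > 
 >   int main(void)
 >   {
 >       GC_all_interior_pointers = 0;  /* before GC_INIT() and before
 >                                         the first GC allocation */
 >       GC_INIT();
 >       /* ... rest of the program ... */
 >       return 0;
 >   }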
 > 
 > Hans
 > 
 > > -----Original Message-----
 > > From: gc-admin@napali.hpl.hp.com [mailto:gc-admin@napali.hpl.hp.com] On
 > > Behalf Of Francois Bronsard
 > > Sent: Thursday, October 30, 2003 1:01 PM
 > > To: gc@napali.hpl.hp.com
 > > Subject: [Gc] Problem with Blacklist
 > > 
 > > 
 > > Hello everyone,
 > > 
 > > I have a strange problem with the blacklist data structure within the
 > > GC.  So far I've allocated about 120MB of data (_heapsize is roughly
 > > 120MB), but now, as I try to allocate another 75K, the GC gets into a
 > > loop: it allocates itself more memory (about 8MB), then, as it tries
 > > to use that memory to satisfy the allocation request, it checks
 > > whether the memory is blacklisted, decides that the whole new block is
 > > blacklisted, and repeats the loop until all virtual memory has been
 > > exhausted.  Then the program stops with "no memory available".
 > > 
 > > I'm using GC version 6.2 on a PC running Windows XP Pro with 500+MB of
 > > RAM, compiled with Visual Studio C++ version 6.0.  I've set
 > > GC_free_space_divisor to 3.
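 > > 
 > > That is, at startup the code does roughly this:
 > > 
 > >   GC_free_space_divisor = 3;   /* larger values trade GC time
 > >                                   for a smaller heap */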
 > > 
 > > Any suggestions for either fixing the problem, or at least narrowing
 > > it down to its source?
 > > 
 > > Thanks in advance,
 > > Francois Bronsard
 > > 