[Gc] Silently fails to allocate memory?
bruce at hoult.org
Fri Apr 12 19:03:16 PDT 2013
The GC does actually work, and thousands of programs (and indeed products)
use it, so the most likely explanation is that you're doing something
wrong. Unfortunately we have no way of knowing without seeing your code.
You're aware that C++ destructors won't be called by the GC? They're not
needed if all they'd do is free other memory that is also GC allocated, but
they are needed if you have to close files or free non-GC memory, such as
memory allocated with malloc by library code (including the STL). In that
case you need to register finalizers for the affected objects.
On Sat, Apr 13, 2013 at 9:51 AM, The Devils Jester <
thedevilsjester at gmail.com> wrote:
> It is difficult to describe the issue I am having with libgc.
> I have a for loop that allocates thousands of (small) objects. Each
> iteration of the loop only allocates a few new objects, but when the loop
> is sufficiently large (usually around 3500 iterations), then for some
> reason it just...well gives up. It does not hang or segfault, or give any
> indication that something is wrong, but it (and this is difficult to
> determine) seems to silently stop allowing the objects to be created.
> If I disable libgc (or manually call delete, which is not a working long
> term solution), the loop executes correctly. This is on Ubuntu Linux using
> GCC 4.7.
> On Mac OS X, I cannot tell if it's working (which it appears to be) or if
> the limit to "give up" is much higher.
> Is there some trick to this? Am I using it wrong? How I am using libgc is
> by simply including "gc_cpp.h", calling "GC_INIT", and having my class(es)
> inherit from gc.
> Gc mailing list
> Gc at linux.hpl.hp.com