I have Bayesian networks (PNL CBNet objects) stored on disk as many XML files. I need to process a large number of them, but I am running into what looks like a memory leak. The code below is a minimal example: an infinite loop in which memory usage grows without bound. Any ideas on what to do?

#include <iostream>
#include <string>
#include "pnl_dll.hpp"  // PNL umbrella header; path may vary per installation

using std::cout;
using std::endl;
using std::string;

void Test::testMemoryLeak()
{
  const string model_file = "BNet_example.xml";

  while (true)
  {
    // Fresh persistence context each iteration; it goes out of scope
    // at the end of the loop body, so it should release what it allocated.
    pnl::CContextPersistence xml_context;
    if (!xml_context.LoadXML(model_file))
    {
      cout << "Error reading xml" << endl;
      return;
    }

    // Retrieve the networks and delete them before the next iteration.
    pnl::CBNet* net1 = dynamic_cast<pnl::CBNet*>(xml_context.Get("Net1"));
    pnl::CBNet* net2 = dynamic_cast<pnl::CBNet*>(xml_context.Get("Net2"));
    pnl::CBNet* net3 = dynamic_cast<pnl::CBNet*>(xml_context.Get("Net3"));

    delete net1;
    delete net2;
    delete net3;
  }
}

Actually, the xml_context.Get() calls (and the later deletes) are not needed to reproduce the leak; calling LoadXML alone is sufficient. The documentation is thin, and there seems to be no other method available for cleaning up(?)
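For reference, here is the stripped-down loop that still exhibits the growth. This is only a sketch of the reproduction described above; the header name is an assumption about the usual PNL include and may need adjusting for your installation:

```cpp
#include <string>
#include "pnl_dll.hpp"  // assumed PNL umbrella header; adjust to your install

// Stripped-down reproduction: no Get(), no delete -- just LoadXML.
// Memory usage still grows on every iteration.
void loadOnlyLeak()
{
  const std::string model_file = "BNet_example.xml";
  while (true)
  {
    pnl::CContextPersistence xml_context;  // destroyed each iteration
    xml_context.LoadXML(model_file);       // this call alone leaks
  }
}
```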