Hi all,
For a particular project, I have a plugin that reads data from an
external database using some JDBC queries.
For each row of my JDBC ResultSet, the plugin creates an XWiki document
and its associated objects, and fills in those objects.
Everything runs fine, except that I have around 50,000 objects to import :-D.
Then I get a Java Heap Space exception.
I have re-re-re-re-re-read my source code, and I don't keep any references
to the documents I create.
Every 200 document creations, I make the following calls:
context.getWiki().flushCache();
context.getWiki().getStore().cleanUp(context);
System.gc();
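To illustrate what I mean (a hypothetical sketch, not my actual plugin code: the `createDocument` helper and the byte array stand in for the real XWikiDocument creation from each ResultSet row): if some collection anywhere, for example a cache inside the store, still holds a reference to every created document, then calling System.gc() cannot reclaim them, no matter what my own code releases.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchImportSketch {
    static int imported = 0;

    // Stand-in for building one XWikiDocument + objects from a JDBC row.
    static Object createDocument(int row) {
        imported++;
        return new byte[1024]; // simulate per-document memory
    }

    public static void main(String[] args) {
        // The suspected leak: a collection that retains every document.
        List<Object> hiddenCache = new ArrayList<>();
        for (int row = 0; row < 50_000; row++) {
            hiddenCache.add(createDocument(row));
            if (imported % 200 == 0) {
                // Unless this clear() actually happens inside whatever
                // holds the references, gc() reclaims nothing.
                hiddenCache.clear();
                System.gc();
            }
        }
        System.out.println("imported=" + imported);
    }
}
```

So my question amounts to: does flushCache()/cleanUp() really empty every such internal collection, or does something survive the flush?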
Even so, the amount of memory used doesn't go down after flushing the caches,
and I still get the Heap Space exception.
Does anybody have an idea for me?
Does anyone think this could be an XWiki memory leak?
Thanks,
--
Arnaud Thimel
Société Code Lutin -
http://www.codelutin.com