Hi Arnaud,
I faced this problem before.
Importing large amounts of data requires tuning the Java heap parameters. I was
not able to import and create 300000 documents (without objects) until I set
the following parameters:
-Xms512m -Xmx512m (you can add or change them in startup.bat on Windows or
catalina.sh on Linux).
The right values depend on your machine; you can increase them (e.g. to 1024m)
or decrease them as needed.
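On Linux, for example, they could go through JAVA_OPTS in catalina.sh (a
sketch; the exact variable name and placement depend on your Tomcat version):

# Near the top of catalina.sh: give the JVM a 512 MB heap
JAVA_OPTS="$JAVA_OPTS -Xms512m -Xmx512m"
export JAVA_OPTS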
Hope this will solve your problem.
Cheers.
-----------
Youcef Bey
Quoting Arnaud Thimel <thimel(a)codelutin.com>:
Hi all,
For a particular project, I have a plugin that reads data from an
external database using some JDBC queries.
For each row of my JDBC ResultSet, the plugin creates an XWiki document and
the associated objects, and fills in these objects.
Everything runs fine, except that I have around 50.000 objects to import :-D.
Then I get a Java Heap Space exception.
I re-re-re-re-re-read my source code: I don't keep any reference to the
documents created.
Every 200 document creations, I make the following calls:
context.getWiki().flushCache();                // flush the XWiki document cache
context.getWiki().getStore().cleanUp(context); // clean up the storage layer
System.gc();                                   // ask the JVM for a collection
Anyway, the amount of memory used doesn't go down after flushing the caches.
And I still get my Heap Space exception.
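For the record, the loop looks roughly like this (a sketch with placeholder
query, space and class names; the real plugin code differs):

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;

import com.xpn.xwiki.XWikiContext;
import com.xpn.xwiki.XWikiException;
import com.xpn.xwiki.doc.XWikiDocument;
import com.xpn.xwiki.objects.BaseObject;

public class ExternalDbImporter {
    public void importRows(Connection connection, XWikiContext context)
            throws java.sql.SQLException, XWikiException {
        Statement stmt = connection.createStatement();
        // Placeholder query and columns
        ResultSet rs = stmt.executeQuery("SELECT id, name FROM external_table");
        int count = 0;
        while (rs.next()) {
            // One new document per row; no reference is kept after saving
            XWikiDocument doc = context.getWiki().getDocument(
                "Import.Row" + rs.getString("id"), context);
            doc.createNewObject("Import.RowClass", context);
            BaseObject obj = doc.getObject("Import.RowClass");
            obj.setStringValue("name", rs.getString("name"));
            context.getWiki().saveDocument(doc, context);
            // Every 200 documents, try to release cached memory
            if (++count % 200 == 0) {
                context.getWiki().flushCache();
                context.getWiki().getStore().cleanUp(context);
                System.gc();
            }
        }
        rs.close();
        stmt.close();
    }
}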
Does anybody have an idea for me?
Who thinks it could be an XWiki memory leak?
Thanks,
--
Arnaud Thimel
Société Code Lutin -
http://www.codelutin.com