On Apr 5, 2007, at 7:50 PM, Youcef.Bey@imag.fr wrote:
Hi Arnaud,
I have faced this problem before.
Importing huge amounts of data requires tuning the Java heap
parameters. I was not able to import and create 300,000 documents
(without objects) until I set the following parameters:
-Xms512m -Xmx512m (you can add or change them in startup.bat on
Windows or catalina.sh on Linux).
The right values depend on your machine; you can increase them
(for example to 1024m) or decrease them as needed.
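For example, on Linux this usually means editing catalina.sh; the
lines below are only a sketch (JAVA_OPTS is the conventional
variable, but check how your Tomcat setup builds its options):

    # in catalina.sh (Linux); the rough equivalent in startup.bat
    # on Windows is: set JAVA_OPTS=%JAVA_OPTS% -Xms512m -Xmx512m
    JAVA_OPTS="$JAVA_OPTS -Xms512m -Xmx512m"
    export JAVA_OPTS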
Hope this will solve your problem.
If anyone has the time, it would be nice to run a profiler to
understand why this is required...
Thanks
-Vincent
Quoting Arnaud Thimel <thimel@codelutin.com>:
Hi all,
For a particular project, I have a plugin that reads data from an
external database using some JDBC queries.
For each row of my JDBC ResultSet, the plugin creates an XWiki
document and the associated objects, and fills these objects.
Everything runs fine, except that I have around 50,000 objects to
import :-D. Then I get a Java Heap Space exception.
I have re-re-re-re-re-read my source code; I don't keep any
reference to the created documents.
Every 200 document creations, I make the following calls:
context.getWiki().flushCache();                // drop XWiki's caches
context.getWiki().getStore().cleanUp(context); // clean up the store session
System.gc();                                   // request a garbage collection
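For context, the overall shape of the loop is roughly this (a
sketch, not my actual code; the document naming, the "id" column
and the row-to-object mapping are made-up placeholders):

    // assumes rs (java.sql.ResultSet) and context (XWikiContext)
    // are already in scope, as in the plugin described above
    int count = 0;
    while (rs.next()) {
        XWikiDocument doc = context.getWiki()
            .getDocument("Import.Row" + rs.getLong("id"), context);
        // ... create the objects and fill them from the current row ...
        context.getWiki().saveDocument(doc, context);
        if (++count % 200 == 0) {
            context.getWiki().flushCache();
            context.getWiki().getStore().cleanUp(context);
            System.gc();
        }
    }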
Anyway, the amount of memory used doesn't go down after flushing
the caches.
And I still get my Heap Space exception.
Does anybody have an idea for me?
Who thinks that it could be an XWiki memory leak?
Thanks,
--
Arnaud Thimel
Société Code Lutin - http://www.codelutin.com