Sylvain Desbureaux wrote:
Hi all,
I tried to put a big file in my wiki (200 MB). Because of the size, I
split it into 8 pieces. When I uploaded the fourth one, the wiki kind of
crashed. So I stopped it and restarted it, but I kept getting error 500.
Then I gave the JVM more memory (I was using 1300 MB and raised it to
1800 MB), and now it works.
But I'm a bit worried about this memory consumption, because I have
something like 100 MB of file attachments and the server is already quite full :/
Is it because I'm using HSQLDB? If so, is it possible to migrate to
something else? What's the best choice?
HSQLDB is not recommended for large wikis; it is merely a
small-footprint database used in XWiki distributions to show how XWiki
works. In a real setup you really should use something else.
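Switching mostly comes down to pointing XWiki's Hibernate configuration
(hibernate.cfg.xml under WEB-INF in the webapp) at the new database and
dropping the matching JDBC driver jar into WEB-INF/lib. A minimal sketch
for MySQL, assuming a database named xwiki and an xwiki user (adjust the
URL, credentials and driver to your own setup):

    <property name="connection.url">jdbc:mysql://localhost/xwiki</property>
    <property name="connection.driver_class">com.mysql.jdbc.Driver</property>
    <property name="connection.username">xwiki</property>
    <property name="connection.password">xwiki</property>
    <property name="dialect">org.hibernate.dialect.MySQLDialect</property>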
HSQLDB is an in-memory database, so the whole database is kept in memory
at all times, and while you're working with the data it is stored at
least twice: once in the database engine and once in the XWiki platform.
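That also explains why raising the maximum heap helped. If you keep
HSQLDB for now, make sure the JVM heap is larger than the database plus
the attachments you handle; for instance, if you run under Tomcat you
could create bin/setenv.sh with something like this (1800m is just the
value from your mail, not a recommendation):

    export CATALINA_OPTS="-Xmx1800m"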
Still, the memory consumption is a bit too high; we should investigate
where all the memory goes. There are several reports of heap memory
errors when attaching files.
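If you can reproduce it, a class histogram from the JDK's jmap tool,
taken while an upload is in progress, would help narrow down where the
memory goes (replace <pid> with the XWiki JVM's process id):

    jmap -histo <pid> | head -n 25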
Sergiu