Hi,
I am currently trying to migrate an already populated XWiki instance from HSQLDB
to MySQL.
The first thing I noticed is that XWiki needs a lot of heap space just to attach the
backup.xar to the import page. The XAR is only about 20 MB and the wiki is not that
big, yet the Tomcat process consumes about 1 GB of resident memory (2 GB in total)
while attaching, which leads to massive swapping and slowdown. I wonder what will
happen when a wiki is really heavily populated and has lots of attachments.
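(For now I assume I can work around this by giving Tomcat's JVM more headroom, e.g.
something along the lines of -Xmx1024m in JAVA_OPTS, the exact value being pure
guesswork on my part, but that hardly feels like a real solution once attachments grow.)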
After the XAR is attached, I cannot select it because an exception occurs:
Wrapped Exception: Error number 2002 in 2: Error parsing xml
Wrapped Exception: Error on line 346 of document : Character reference
"" is an invalid XML character. Nested exception: Character reference
"" is an invalid XML character.
at com.xpn.xwiki.web.ImportAction.render(ImportAction.java:94)
at com.xpn.xwiki.web.XWikiAction.execute(XWikiAction.java:148)
at org.apache.struts.action.RequestProcessor.processActionPerform(RequestProcessor.java:431)
at org.apache.struts.action.RequestProcessor.process(RequestProcessor.java:236)
at org.apache.struts.action.ActionServlet.process(ActionServlet.java:1196)
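
My guess is that one of the pages contains a control character that HSQLDB stored
without complaint but that is not legal in XML 1.0, so it ends up in the exported
document XML as an invalid character reference. If nobody has a better idea, I suppose
I could unzip the XAR, strip the illegal characters from the offending page XML and
re-pack it before importing, roughly along these lines (an untested sketch; the class
name and the clean-up logic are my own guesswork, and numeric character references like
the one in the error message would need similar treatment):

import java.io.IOException;
import java.nio.charset.Charset;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Untested sketch: remove characters that are not legal in XML 1.0
// from one page .xml extracted from the XAR before re-packing it.
public class StripIllegalXmlChars {

    // Legal XML 1.0 chars: #x9, #xA, #xD, #x20-#xD7FF, #xE000-#xFFFD, #x10000-#x10FFFF
    static boolean isLegalXmlChar(int cp) {
        return cp == 0x9 || cp == 0xA || cp == 0xD
                || (cp >= 0x20 && cp <= 0xD7FF)
                || (cp >= 0xE000 && cp <= 0xFFFD)
                || (cp >= 0x10000 && cp <= 0x10FFFF);
    }

    public static void main(String[] args) throws IOException {
        Path file = Paths.get(args[0]); // one page .xml extracted from the XAR
        Charset utf8 = Charset.forName("UTF-8");
        String xml = new String(Files.readAllBytes(file), utf8);
        StringBuilder clean = new StringBuilder(xml.length());
        for (int i = 0; i < xml.length(); ) {
            int cp = xml.codePointAt(i);
            if (isLegalXmlChar(cp)) {
                clean.appendCodePoint(cp);
            }
            i += Character.charCount(cp);
        }
        Files.write(file, clean.toString().getBytes(utf8));
    }
}

If that really is the cause, though, I would rather fix the offending page in the
running wiki before exporting.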
Any hints about how to do the migration?
Cheers,
Hans-Peter