Hi,
To sum up, the main problem with the import feature, in my opinion, is that some pages
make the import process fail. If you remove the page(s) that cause the failure, the
import works.
The big problem is that you don't know which page(s) make the import fail, so I have to
split the .xar into smaller xars and then try to locate the page(s) with incorrect data.
This process is very slow: it would be very helpful to display the page on which the
import fails.
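The splitting can at least be organized as a binary search, so that locating one bad page among n pages takes about log2(n) trial imports instead of n. A minimal sketch under stated assumptions: exactly one bad page, and each trial is reproducible; `find_bad_page` and `import_ok` are hypothetical names, with `import_ok` standing in for one manual import attempt of a sub-xar built from that subset of pages:

```python
def find_bad_page(pages, import_ok):
    """Bisection over a list of page names.

    Assumes exactly one page in `pages` makes the import fail, and that
    import_ok(subset) reports whether importing just that subset succeeds.
    """
    lo, hi = 0, len(pages)
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if import_ok(pages[lo:mid]):
            lo = mid   # first half imports fine, bad page is in the second half
        else:
            hi = mid   # first half fails, bad page is in the first half
    return pages[lo]
```

For a xar with, say, 1000 pages this means about 10 trial imports rather than up to 1000, though each trial still requires building and importing a sub-xar by hand.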
Regards,
Jérémie
-----Original Message-----
From: Arnaud Bourree [mailto:arnaud.bourree@gemalto.com]
Sent: Thursday, 7 June 2007 13:59
To: xwiki-users@objectweb.org
Subject: Re: [xwiki-users] [1.1Milestone1] Importing a large .xar file
Hello,
May I suggest that you split your XAR?
Regards,
Arnaud.
BOUSQUET Jeremie wrote on 07/06/2007 12:32:
Hello,
I'm trying to migrate from XWiki 1.0 to a 1.1 milestone 1 engine using
the export/import feature. Both sides are configured to use a MySQL
database.
My exported .xar file is about 60 MB. I managed to work through the
following errors:
- XWiki's 10 MB limit on attachment size
- MySQL's 1 MB limit on maximum packet size
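For reference, the MySQL limit above corresponds to the `max_allowed_packet` server setting, whose default in MySQL versions of that era was 1 MB; a row carrying a large attachment must fit in a single packet. The XWiki side is the maximum upload size preference on the wiki (set in the wiki's preferences; the exact property name should be checked against the XWiki documentation). A sketch of the MySQL change, assuming access to my.cnf:

```ini
# my.cnf, [mysqld] section: raise the packet cap so a large
# attachment row fits in one packet (value chosen for a ~60 MB xar).
[mysqld]
max_allowed_packet = 128M
```

The server must be restarted (or the variable set dynamically with SET GLOBAL) for the change to take effect.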
Now I'm getting the following error while trying to attach the
Backup.xar file in my new wiki instance:
"java.sql.BatchUpdateException: Data truncation: Data truncated for
column 'XWA_CONTENT' at row 1"
Any ideas? The .xar file does not get attached correctly after this error.
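One plausible cause of this particular truncation, offered as an assumption rather than a confirmed diagnosis: the attachment content column was created with a BLOB type too small for a ~60 MB file (BLOB caps a value at 64 KB and MEDIUMBLOB at 16 MB, while LONGBLOB allows up to 4 GB). If that is the case, widening the column would look roughly like the following; the table name `xwikiattachment_content` is inferred from the `XWA_` prefix in the error and should be verified against the actual schema before running anything:

```sql
-- Assumption: XWA_CONTENT was created as BLOB or MEDIUMBLOB,
-- which is smaller than the ~60 MB attachment being stored.
ALTER TABLE xwikiattachment_content
  MODIFY XWA_CONTENT LONGBLOB;
```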
Best regards,
Jérémie