:-)
In fact, the XWA_CONTENT problem is due to the MEDIUMBLOB type used in the
xwikiattachment_content and xwikiattachment_archive tables, I think.
Anyway, I tried splitting the .xar, but I sometimes get strange errors (a premature EOF
from the XML parser, for example).
I also have a page containing 17 MB of file attachments, so it's impossible to get it
through: with MEDIUMBLOB I'm limited to 16 MB regardless of the other limitations (if
I'm right, of course).
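One workaround that might apply here (an untested sketch only; the exact table and column names should be verified against your actual XWiki schema before running anything) is to widen the blob columns from MEDIUMBLOB, which caps each value at 16 MB, to LONGBLOB, which allows up to 4 GB:

```sql
-- Hypothetical workaround: widen the attachment blob columns so rows larger
-- than 16 MB fit. Check the real column names in your schema first.
ALTER TABLE xwikiattachment_content MODIFY COLUMN XWA_CONTENT LONGBLOB;
ALTER TABLE xwikiattachment_archive MODIFY COLUMN XWA_ARCHIVE LONGBLOB;
```

Whether XWiki itself then handles such large attachments cleanly is a separate question, so take a backup before trying this.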
I wanted to do this migration properly using the .xar features rather than through a
brute-force SQL dump, but in the end it's not that easy.
Regards,
Jérémie
-----Original Message-----
From: Arnaud Bourree [mailto:arnaud.bourree@gemalto.com]
Sent: Thursday, 7 June 2007 13:59
To: xwiki-users(a)objectweb.org
Subject: Re: [xwiki-users] [1.1Milestone1] Importing a large .xar file
Hello,
May I suggest splitting your XAR?
Regards,
Arnaud.
BOUSQUET Jeremie wrote on 07/06/2007 12:32:
Hello,
I'm trying to migrate from an XWiki 1.0 engine to 1.1 milestone 1
using the export/import feature. I'm configured to use a MySQL
database on both sides.
My exported .xar file is about 60 MB. I managed to get past the
following errors:
- the 10 MB limit XWiki places on attachment size
- the 1 MB maximum packet size for MySQL
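For reference, the MySQL packet limit mentioned above can be raised in the server configuration (a sketch; the 64M value is an illustrative example, not a recommendation):

```ini
# In the [mysqld] section of my.cnf: raise max_allowed_packet above the
# 1 MB default so large attachment rows fit in a single packet.
[mysqld]
max_allowed_packet=64M
```

The server must be restarted for the change to take effect, and the client side may enforce its own packet limit as well.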
Now I'm getting the following error while trying to attach the
Backup.xar file in my new wiki instance:
"java.sql.BatchUpdateException: Data truncation: Data truncated for
column 'XWA_CONTENT' at row 1"
Any ideas? The .xar file does not get correctly attached after this.
Best regards,
Jérémie