Hi,
The XWiki SAS platform team backs up at the MySQL level, with live replication
plus periodic backups. It sounds like you are using the default HSQL/Jetty
installation, which has some scaling issues and lacks administrative tools, so
it is not recommended for production use.

Given what you're doing, I think your solution of backing up the files makes
sense. You might consider a differential transfer tool like rsync or, my
personal favorite, ZFS snapshots using zfsonlinux. Note that zfsonlinux still
has some issues, so it's not for mission-critical applications, but it powers
Livermore National Lab's equipment and my /home/ directory.
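
For example, here is a minimal sketch of the incremental idea with rsync,
assuming a Unix-like environment; the paths below are placeholders for your
XWiki Enterprise folder and backup drive, not real locations. Each run creates
a dated snapshot directory, and unchanged files are hard-linked against the
previous snapshot, so you don't end up storing ten full 850 MB copies:

import datetime
import os
import subprocess

# Placeholder paths -- point these at your actual XWiki folder and backup drive.
SOURCE = "/opt/xwiki-enterprise/"
BACKUP_ROOT = "/mnt/backup/xwiki"
LATEST = os.path.join(BACKUP_ROOT, "latest")

def snapshot():
    """Create a dated snapshot; unchanged files are hard-linked to the last one."""
    os.makedirs(BACKUP_ROOT, exist_ok=True)
    stamp = datetime.date.today().isoformat()
    dest = os.path.join(BACKUP_ROOT, stamp)
    cmd = ["rsync", "-a", "--delete", SOURCE, dest]
    if os.path.isdir(LATEST):
        # Reuse unchanged files from the previous snapshot instead of copying them.
        cmd.insert(2, "--link-dest=" + os.path.realpath(LATEST))
    subprocess.run(cmd, check=True)
    # Repoint "latest" at the snapshot we just made.
    if os.path.islink(LATEST):
        os.remove(LATEST)
    os.symlink(dest, LATEST)

if __name__ == "__main__":
    snapshot()

Run it from cron (or any scheduler) once a day; each dated directory looks like
a full copy, but only the changed files take new space. For a consistent copy
you would probably want to stop XWiki, or at least pause writes, while the
snapshot runs. ZFS snapshots give you the same effect at the filesystem level.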
Thanks,
Caleb
On 08/02/2012 10:03 AM, siewiki wrote:
I am using XWiki for knowledge management. Now I want to plan a backup
strategy. At first I thought about creating a *.rar archive of the whole
XWiki Enterprise folder, which is about 1.5 GB. In a batch script I copy
this RAR to another hard drive; the RAR archive is about 850 MB. With the
Windows scheduler I run a different batch script for each day.
But a better solution would be an incremental backup.
Is there any way to make an incremental backup? I don't want to end up with
about 10 x 850 MB files in my backup system.
Thank you very much in advance for the answers.