Stephanie,
I’m also very interested in anything you come up with, and in whether
anyone has additional thoughts on making an “offline” copy. I’m not sure
whether my situation is similar to yours, but I’ll explain a few things
I’ve tried so far in case it helps either of us.
Short version – has anyone successfully used Wget to mirror an XWiki
instance?
What I ultimately want is an offline archive of an XWiki instance that
doesn’t require a servlet container or database at all (not even the
standalone one).
I understand Arnaud’s suggestion to use a standalone XWiki instance to
create an offline backup, but unfortunately that is too complicated for
most of my users.
In my case, I have an XWiki instance that needs to “reset” at the
beginning of each year, and I then need to keep an archive of each full
year’s worth of contributions to the wiki. So I end up with a separate
XWiki database for each year. This is quickly becoming cumbersome, and
keeping all of them online eats up a lot of server resources.
I would like for an average user to be able to download an “archive” of a
particular year’s wiki instance so I no longer need to host it “live.”
One other consideration – almost all of the page content in each wiki
instance is stored in objects attached to each page, which are then
retrieved with Velocity and formatted with JavaScript. And most pages have
a large number of attachments.
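(To give a concrete idea, a typical page in my wikis does something
roughly like the following, where “MySpace.EntryClass” and its “content”
property are made-up names standing in for my real classes:

#set ($obj = $doc.getObject('MySpace.EntryClass'))
## grab the raw property value; JavaScript then formats it client-side
<div class="entry">$obj.getProperty('content').value</div>

The Velocity part renders server-side, so a crawler does see its output;
it’s the client-side JavaScript formatting, and anything the scripts fetch
after the page loads, that a static mirror can’t capture.)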
Things I have tried:
1) HTML/PDF export – Like Stephanie, I found this doesn’t work for me,
since it doesn’t preserve navigation or scripting. (The export URL I’ve
been using is shown after this list.)
2) Standalone XWiki instance – this has proved just too complicated for my
users. I’d prefer some type of archive in a flat file/HTML format if at all
possible.
3) Wget – This seems to be the most promising option so far, since it’s
supposed to make a totally offline recursive mirror of the site. My
attempts so far have been mixed – I can get some of the page content to
download, but I struggle to get a completely working copy; my best
invocation so far is shown below this list. I’ve also tried a few other
“offline archiver” type programs, but none have worked better than Wget.
If someone has successfully used Wget to mirror an XWiki site, I'd love to
hear about it.
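For completeness, the HTML export in (1) was just the standard export
action; if I recall correctly, the URL looks something like this (the
space and page names here are placeholders):

http://myserver/xwiki/bin/export/Main/WebHome?format=html

with format=pdf for the PDF flavor. Either way, the dynamic pieces are
lost.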
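And for anyone who wants to compare notes on (3), the Wget invocation that
has gotten me furthest is roughly this (hostname and start page are
placeholders for my real setup):

wget --mirror --convert-links --page-requisites --adjust-extension \
     --no-parent http://myserver/xwiki/bin/view/Main/WebHome

--convert-links and --page-requisites are what should make the copy
browsable offline. My working theory is that the server-rendered Velocity
output comes through fine, but whatever the JavaScript builds or fetches
at view time doesn’t, which would explain the partially working copies I
end up with.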
Any other ideas?
Thanks,
aaron
On Fri, Jan 25, 2013 at 8:53 AM, Arnaud bourree <arnaud.bourree(a)gmail.com> wrote:
Hello,
Off-line: XWiki is already off-line: you don't need the internet to run
XWiki, except for some extensions that need a connection.
More than off-line, we may want a portable XWiki instance you can put on a
USB pen-drive. IMO, the standalone XWiki is ready for that.
OK, you want to put a copy of your XWiki server on your pen-drive:
- make your standalone XWiki read-only so it doesn't synchronize changes
back to your server
- after dumping your server database, convert it to HSQLDB, or write an
event-listener extension to propagate page updates from the server to the
pen-drive. The database conversion looks easier to do.
Regards,
Arnaud.
2013/1/23 <lists(a)yhmail.de>:
Hello again!
I would like to make an XWiki instance accessible off-line. I tried to
export everything as HTML, with rather bad results, as the navigation and
the scripting aren't exported. So the idea was to somehow export XWiki to
some standalone version. Has anyone ever done something like this and
wouldn't mind sharing his/her insights?
Thanks for your help,
Stephanie
_______________________________________________
users mailing list
users(a)xwiki.org
http://lists.xwiki.org/mailman/listinfo/users