Hi Peter,
Of course, though you might want to think a bit about the approach.
1) If you want to do it in XWiki using the scheduler plugin, you have to
write some Groovy script [1] in the scheduled task that you create. The
script would use the filter module [2] (or the old packager module [3],
whichever you choose) to export the pages you are interested in and save
the result (on the filesystem or as an attachment of a page).
As inspiration, you could check out the source code (in Java) of the export
action [4], which is exactly what runs when you export a page from the UI.
2) A simpler choice, IMO, would be to schedule a cron [5] task on a Linux
machine (possibly the same one running the XWiki instance) and have that
task simply download (using something like cURL [6]) the result of a
manually crafted export URL (as detailed in our documentation [7])
containing the list of pages to export.
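To make option 2 concrete, here is a minimal sketch of such a script. The
host, credentials, page list, and output path are placeholders, and the
exact export URL parameters should be checked against the documentation in
[7] before relying on them:

```shell
#!/bin/sh
# Sketch of a scheduled XAR backup: cron runs this script, which asks the
# wiki's export action for a XAR of selected pages and saves it to disk.
# Everything below is an assumption to adapt to your instance.

WIKI_URL="http://localhost:8080/xwiki"   # assumed XWiki base URL
PAGES="Main.%25"                         # URL-encoded 'Main.%' = all pages in the Main space
OUT="/var/backups/xwiki-$(date +%F).xar"

EXPORT_URL="${WIKI_URL}/bin/export/Main/WebHome?format=xar&name=backup&pages=${PAGES}"

# Uncomment once the URL and credentials match your setup:
# curl -s -u Admin:admin -o "${OUT}" "${EXPORT_URL}"
echo "Would fetch ${EXPORT_URL} into ${OUT}"
```

A crontab entry like `0 3 * * * /usr/local/bin/xwiki-xar-backup.sh` (a
hypothetical path) would then run it nightly at 03:00.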
Hope this helps,
Eduard
----------
[1] http://platform.xwiki.org/xwiki/bin/view/DevGuide/Scripting
[2] http://extensions.xwiki.org/xwiki/bin/view/Extension/Filter+Module
[3] https://github.com/xwiki/xwiki-platform/blob/master/xwiki-platform-core/xwi…
[4] https://github.com/xwiki/xwiki-platform/blob/master/xwiki-platform-core/xwi…
[5] https://en.wikipedia.org/wiki/Cron
[6] https://en.wikipedia.org/wiki/CURL
[7] http://platform.xwiki.org/xwiki/bin/view/Features/Exports#HXARExport
On Thu, Sep 17, 2015 at 3:49 PM, Peter Huisman <p.huisman(a)ximm.nl> wrote:
  Hi,
 Is there a way to generate XAR’s using XWiki’s scheduling function? I’m
 not looking for a DB based backup but merely for a backup of spaces / pages.
 With kind regards,
 Peter
 _______________________________________________
 users mailing list
 users(a)xwiki.org
 
http://lists.xwiki.org/mailman/listinfo/users