Hi again,
Meanwhile we have made the robots.txt stricter and the crawler seems to respect it,
so we have no immediate trouble at the moment. Still, it is worrying to know that there
is probably a loophole through which guests can delete pages. What we find extremely
confusing is that we cannot replicate HOW the crawler deleted pages: when we try to do
the same manually as Guest, we are redirected to the login page, as expected. What also
caught our eye in the log files is that the crawler seems to work with a session ID,
while we do not seem to get one when not logged in.
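One way to pin down how the crawler triggers deletions is to pull every state-changing request out of the web server's access log and see what it carried. A minimal sketch, assuming an Apache-style combined log; the file name, log lines, and crawler name below are made-up illustrations, not real data from this incident:

```shell
# Create a tiny sample access log for illustration (a real file might be
# /var/log/apache2/access.log or Tomcat's localhost_access_log).
cat > sample_access.log <<'EOF'
1.2.3.4 - - [06/Apr/2012:01:12:03 +0200] "GET /xwiki/bin/view/Blog/SomePost HTTP/1.1" 200 5123 "-" "SomeCrawler/1.0"
1.2.3.4 - - [06/Apr/2012:01:12:05 +0200] "GET /xwiki/bin/delete/Blog/SomePost?confirm=1 HTTP/1.1" 302 0 "-" "SomeCrawler/1.0"
5.6.7.8 - - [06/Apr/2012:09:00:00 +0200] "GET /xwiki/bin/view/Main/ HTTP/1.1" 200 7801 "-" "Mozilla/5.0"
EOF

# Show only requests that hit state-changing XWiki actions.
grep -E '/xwiki/bin/(delete|save|objectremove)/' sample_access.log
```

If the deletions show up here as plain GET requests, that would explain how a link-following crawler could trigger them, and the user-agent and cookie columns would show which session they rode on.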
Any hints about what we might be missing greatly appreciated.
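For reference, a stricter robots.txt along these lines keeps well-behaved crawlers away from XWiki's action URLs. The paths are illustrative assumptions about a default XWiki URL layout, and of course only compliant bots honour robots.txt, so it is no substitute for correct rights:

```
User-agent: *
Disallow: /xwiki/bin/edit/
Disallow: /xwiki/bin/delete/
Disallow: /xwiki/bin/save/
Disallow: /xwiki/bin/export/
```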
Additionally, a related question: does the option to prevent unregistered users from
editing documents independently of page and space rights also prevent them from deleting
pages?
Cheers,
Olaf
________________________________
From: O Voss <richyfourtythree(a)yahoo.com>
To: Vincent Massol <vincent(a)massol.net>; XWiki Users <users(a)xwiki.org>
Sent: Friday, 6 April 2012, 9:09
Subject: Re: [xwiki-users] severe trouble with web crawlers
Hi Vincent,
Many thanks for your quick answer.
I will list the wiki-scope rights that we have set here. (I hope I get the translations
right; my UI is in German.)
Users:
unregistered users: allow view, deny edit, deny delete, deny admin, allow register
Groups:
Redakteure: allow edit, allow delete
XWikiAdminGroup: allow edit, allow admin
XWikiAllGroup: allow view, allow comment, allow register
The latest destruction was done on the Blog space. I am not sure exactly how the rights
were set before the trouble began, but I am absolutely sure that no right was explicitly
allowed to unregistered users (and 99% sure none was explicitly denied).
The option to prevent unregistered users from editing documents independently of page and
space rights had not been ticked when we had the trouble last night. I have switched it
on now.
We also have some trouble with XWiki.XWikiPreferences, which was not deleted entirely,
but much (if not all) of the data in it was. These are the rights on the XWiki
space:
Users:
unregistered users: allow view, deny edit, deny delete, deny admin
Groups:
XWikiAdminGroup: allow edit, allow delete, allow admin
XWikiAllGroup: allow view
Cheers,
Olaf
________________________________
From: Vincent Massol <vincent(a)massol.net>
To: XWiki Users <users(a)xwiki.org>
CC: O Voss <richyfourtythree(a)yahoo.com>
Sent: Friday, 6 April 2012, 8:33
Subject: Re: [xwiki-users] severe trouble with web crawlers
Hi Olaf,
On Apr 6, 2012, at 8:15 AM, O Voss wrote:
Hi,
A web crawler keeps deleting documents in our wiki. We now have a robots.txt, but
the crawler seems to ignore it.
Of course, you should never allow unregistered users to delete documents, but you know
that already… ;)
In the global rights, the unregistered user is
explicitly denied edit, delete, and admin rights. Nowhere are such rights set to allowed.
What are we missing?
Personally, I never use deny rights. I simply enable the rights for the users/groups that
should have them; everyone else is denied by default.
You should also check rights at 3 levels:
- global
- space level
- page level
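The three-level check can be illustrated with a toy model. This is a deliberate simplification for illustration, not XWiki's actual resolution algorithm: the most specific level that says anything wins, and with no matching rule the right is denied.

```python
# Toy model of layered access rights -- an illustration only, NOT
# XWiki's real implementation. Rule: walk from the most specific
# level (page) to the least specific (wiki); the first level that
# mentions the user/right pair decides; otherwise deny by default.

def effective(user, right, page, space, wiki):
    """Each level maps (user, right) -> "allow" or "deny"."""
    for level in (page, space, wiki):          # most specific first
        verdict = level.get((user, right))
        if verdict is not None:
            return verdict == "allow"
    return False                               # default: denied

# Rights roughly mirroring the setup described in the thread.
wiki  = {("Guest", "view"): "allow", ("Guest", "delete"): "deny"}
space = {("Redakteure", "delete"): "allow"}
page  = {}

print(effective("Guest", "view", page, space, wiki))         # True
print(effective("Guest", "delete", page, space, wiki))       # False
print(effective("Redakteure", "delete", page, space, wiki))  # True
print(effective("Guest", "edit", page, space, wiki))         # False (no rule)
```

The point of the model: an allow at a more specific level (e.g. a space) can open up something the global level never granted, which is why all three levels need checking.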
The deleted documents do not turn up in
XWiki/DeletedDocuments. Is there any way to restore the deleted documents?
Only from a backup.
This is an absolute disaster atm, any hints
greatly appreciated!
We'd need to see your full rights setup to help you.
You could check
http://platform.xwiki.org/xwiki/bin/view/AdminGuide/Access+Rights (the doc
isn't perfect) to understand rights a bit better.
It won't help you but FYI we're planning to redesign the rights UI to make it even
easier to use:
http://incubator.myxwiki.org/xwiki/bin/view/Improvements/Rights
Thanks
-Vincent
_______________________________________________
users mailing list
users(a)xwiki.org
http://lists.xwiki.org/mailman/listinfo/users