On Sun, Nov 16, 2008 at 8:23 PM, Gerritjan Koekkoek
<gerritjankoekkoek(a)gmail.com> wrote:
> The problem is probably to dynamically update the robots.txt file
> when content (read: a new page) is added.
The simplest way to do this would be to create the protected pages in
one or more dedicated spaces and disallow those spaces in robots.txt:
Disallow: /xwiki/bin/view/ProtectedSpace/
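
For reference, a complete minimal robots.txt also needs a User-agent
line; a sketch, assuming the wiki is served under the /xwiki context
path and the space name "ProtectedSpace" is a placeholder for your own:

User-agent: *
Disallow: /xwiki/bin/view/ProtectedSpace/

Since the rule matches by URL prefix, any page created later in that
space is excluded automatically, so the file never needs to be
regenerated when new content is added.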
> Maybe it helps: the content is stored in an object and displayed
> only when a certain boolean is set programmatically.
If the data is stored in an object, what you can do is add something
like this in the HTML meta information field (Administration >
Presentation):
#if ($doc.getObjectNumbers("Space.ClassToProtect") > 0)
  <meta name="robots" content="noindex" />
#end
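
A roughly equivalent variant, if you don't need the count: in the
XWiki API, $doc.getObject(classname) returns null when the document
carries no object of that class, and Velocity's #if treats null as
false (the class name "Space.ClassToProtect" is the same placeholder
as above):

## Emit noindex only when the document carries at least one object of
## the class; getObject() returns null otherwise, which #if treats as false.
#if ($doc.getObject("Space.ClassToProtect"))
  <meta name="robots" content="noindex" />
#end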
JV.