Devs,
In MediaWiki there are preprocessor limits
<https://en.wikipedia.org/wiki/Wikipedia:Template_limits> that track how
complex a page is, so that rendering can bail out gracefully when any of
these limits is exceeded and the server stays stable. MediaWiki includes
this information in a comment in the rendered page, like:
<!--
NewPP limit report
Parsed by mw1270
Cached time: 20170223033729
Cache expiry: 2592000
Dynamic content: false
CPU time usage: 0.124 seconds
Real time usage: 0.170 seconds
Preprocessor visited node count: 468/1000000
Preprocessor generated node count: 0/1500000
Post‐expand include size: 50512/2097152 bytes
Template argument size: 37/2097152 bytes
Highest expansion depth: 7/40
Expensive parser function count: 0/500
Lua time usage: 0.039/10.000 seconds
Lua memory usage: 1.66 MB/50 MB
-->
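(In case it is useful context, here is a rough sketch, purely illustrative
and nothing official, of how the integer used/limit pairs could be pulled
out of a comment shaped like the one above for monitoring. The class name
and field handling are made up.)

import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public final class NewPPLimitReport {

    // Captures report lines of the form "<name>: <used>/<limit>", e.g.
    // "Highest expansion depth: 7/40". Lines with decimal values (CPU/Lua
    // time, memory) are deliberately ignored.
    private static final Pattern LIMIT_LINE =
            Pattern.compile("^\\s*(.+?): (\\d+)/(\\d+)", Pattern.MULTILINE);

    /** Returns each limit as name -> fraction of the budget used. */
    public static Map<String, Double> parse(String limitReportComment) {
        Map<String, Double> usage = new LinkedHashMap<>();
        Matcher m = LIMIT_LINE.matcher(limitReportComment);
        while (m.find()) {
            double used = Double.parseDouble(m.group(2));
            double limit = Double.parseDouble(m.group(3));
            usage.put(m.group(1), limit > 0 ? used / limit : 0.0);
        }
        return usage;
    }
}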
We have cases where our users want to include pages as templates in other
pages. These inclusions can get deeply nested and can contain complex
content that is expensive and/or slow to render. We want to make sure that
a user cannot bring our servers down, accidentally or intentionally, by
using too many expensive or long-running inclusions on their pages.
Testing on our XWiki instance shows that XWiki keeps running until either
a) the page finally renders, which can take minutes in some cases, or
b) the server runs out of memory and falls over.
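To make the request concrete, this is roughly the kind of safeguard we
have in mind (purely a sketch; the class and method names are invented and
nothing here is an XWiki API): a per-render budget that tracks inclusion
depth and elapsed time and aborts rendering once either is exhausted.

import java.util.concurrent.TimeUnit;

// Hypothetical guard a renderer could consult while expanding inclusions.
public final class RenderBudget {
    private final int maxInclusionDepth;
    private final long deadlineNanos;
    private int depth;

    public RenderBudget(int maxInclusionDepth, long maxRenderTime, TimeUnit unit) {
        this.maxInclusionDepth = maxInclusionDepth;
        this.deadlineNanos = System.nanoTime() + unit.toNanos(maxRenderTime);
    }

    // Called when the renderer starts expanding an included page.
    public void enterInclusion() {
        if (++depth > maxInclusionDepth) {
            throw new IllegalStateException(
                "Inclusion depth limit exceeded: " + maxInclusionDepth);
        }
        checkDeadline();
    }

    // Called when the renderer finishes expanding an included page.
    public void exitInclusion() {
        depth--;
    }

    // Called periodically by the renderer; aborts once the time budget is spent.
    public void checkDeadline() {
        if (System.nanoTime() > deadlineNanos) {
            throw new IllegalStateException("Render time limit exceeded");
        }
    }
}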
I have been looking but have not been able to find whether XWiki has this
sort of feature to turn on and configure. I was also unable to find it in
the XWiki Jira or on this forum, so I wanted to ask the devs directly. Does
this sort of limiting exist in XWiki? If so, how can I turn it on?
Thanks!