On 06/03/2011 22:08, Paul Libbrecht wrote:
Sergiu,
Sure, caching can be dangerous, so a cache should always be invalidatable.
That's the reason browsers generally do not rely on their cache alone but
"validate" it, by making a request that adds an "If-Modified-Since" header.
What allows them to keep their cached result is the "304 Not Modified"
response. A very good explanation is Mark Nottingham's caching tutorial:
http://www.mnot.net/cache_docs/
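Concretely, the server side of that exchange could look something like this in a
Velocity script (just a sketch, nothing tested: it assumes the $request, $response
and $doc objects that XWiki puts in the scripting context):

## Answer a cache-validation request with "304 Not Modified" when the document
## has not changed since the copy the browser already holds.
#set($since = $request.getDateHeader('If-Modified-Since'))
#set($lastModified = $doc.date.time)
## round down to whole seconds, since HTTP dates have one-second granularity
#set($lastModified = $lastModified - ($lastModified % 1000))
#if($since != -1 && $since >= $lastModified)
## the browser keeps its cached copy, and everything it derived from it
$response.setStatus(304)
#stop
#end
## otherwise, send the date the browser should validate against next time,
## then render the page as usual
$response.setDateHeader('Last-Modified', $lastModified)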
The reason this is such a performance gain is that it's not only the HTML (or JS,
JPEG, or PNG) stream that is cached, but all the analysis based on it (for example
the computation of styles, the JS compilation, or the JPEG profile adjustments),
provided those don't change either.
I disagree with Ludovic when he says that it almost always changes: for a given
browser it mostly does not, for example when you navigate back.
I think what I'm saying is that we cannot do "cross-user" caching, as the result
of each template is almost always personalized. Of course we could have a
per-user cache, but I'm not sure it would be very memory-efficient.
I agree with Sergiu that application developers should not have a cache holding
back their updates, and that they should probably only enable a cache when they
are aware of it; but currently they can only invalidate the XWiki cache by saving
the given page again. That sure is too limited!
Ludovic, your "usage statistics of macros" is a tool lots of people would put
to good use; it would be great if you could publish it.
This was just a simple hack:
## Wrap $xwiki.parseTemplate to print how long each template takes to render.
#macro(template $tname)
## timestamps (in milliseconds) around the actual template rendering
#set($time1 = $xwiki.date.time)
$xwiki.parseTemplate($tname)
#set($time2 = $xwiki.date.time)
#set($dtime1 = $time2 - $time1)
Time $tname: $dtime1
#end
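You use it in place of a direct parseTemplate call, for example (the template
name here is just an illustration):

#template("rightpanels.vm")

and the rendered output then contains a line like "Time rightpanels.vm: <elapsed
milliseconds>" for each call.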
Now it will only work with XWiki 3.0, because it needs Velocity 1.7, which keeps
a stack for macro variables. With older versions I think the variables would
overwrite each other on nested (recursive) calls.
We could put this in the core, in parseTemplate, behind an option.
We also have the old monitor plugin, which implements a strategy for that kind
of thing. Since it's a plugin, we could expose it in Velocity:
http://svn.xwiki.org/svnroot/xwiki/platform/core/trunk/xwiki-core/src/main/…
At the time I used this plugin a lot to verify that storage was behaving
as I wanted.
Ludovic
paul
On 6 March 2011 at 10:41, Ludovic Dubost wrote:
So implementing caches on all this is a good way to
keep performance good.
I think this is not the right approach. Caching always
introduces
surprises. Imagine we cache the "recently viewed" panel. The user views
some documents, but that panel doesn't show them and insists on
displaying things from 5 minutes ago. Buggy feature...
Imagine we cache the homepage, and I go and create a new "product", and
go to the homepage and don't see it there. What do I do? Panic? Say it's
a bug and call the IT guy only to look like a fool later when I try to
show it? Report a bug to those developers only to have it closed as
"won't fix, duplicate of the other 30 issues reported this month"?
Of course caching needs to be used intelligently, and should only be used on
content whose cache you can invalidate.
But the numbers shown higher up show that a significant portion of the load time
is actually spent in certain scripts (panels or the home page) that are costly by
nature and almost the same from one run to the next.
It's not inherently problematic, but it is something an administrator needs to be
aware of.
It's well known that an entry point should be fast; for instance Google does not
use a dynamic page for its search box.
A cost of 2 seconds to display the boxes of our home page is a killer.
Another thing it shows is that indeed a lot of time goes into many little things:
37 templates, thousands of calls to the prefs and to the message API (you forgot
that one).
--
Ludovic Dubost
Blog: http://blog.ludovic.org/
XWiki: http://www.xwiki.com
Skype: ldubost GTalk: ldubost