Hi,
I've been testing New Relic, which has some interesting live JVM analysis.
I've been running its JVM profiler during a load test hitting various
XWiki pages (except the activity stream), including AWM, LiveTable
searches, content pages, Lucene search and resource loading.
Here is a result of the profiling:
25.0% - java.net.SocketInputStream.socketRead0
32 collapsed methods
25.0% - com.xpn.xwiki.store.hibernate.query.HqlQueryExecutor.execute
21.0% - org.xwiki.query.internal.DefaultQueryExecutorManager.execute
11.0% - org.xwiki.query.internal.DefaultQuery.execute
127 collapsed methods
11.0% - java.lang.Thread.run
9.9% - org.xwiki.query.internal.SecureQueryExecutorManager.execute
4.2% - com.xpn.xwiki.store.hibernate.query.DefaultQueryExecutor.execute
12.0% - org.apache.velocity.runtime.parser.node.SimpleNode.render
12.0% - org.xwiki.velocity.internal.DefaultVelocityEngine.evaluate
12.0% - org.xwiki.velocity.internal.DefaultVelocityEngine.evaluate
8.1% - org.xwiki.rendering.internal.macro.velocity.VelocityMacro.evaluateString
4.3% - com.xpn.xwiki.render.XWikiVelocityRenderer.evaluate
9.2% - org.xwiki.velocity.internal.DefaultVelocityEngine.evaluate
8.5% - com.xpn.xwiki.render.DefaultVelocityManager.getVelocityEngine
4.1% - org.apache.velocity.runtime.parser.node.ASTBlock.render
4.1% - com.xpn.xwiki.web.XWikiAction.execute
3.6% - org.xwiki.rendering.internal.transformation.macro.MacroTransformation.transformOnce
3.2% - org.apache.velocity.runtime.directive.Foreach.render
2.9% - org.xwiki.rendering.wikimodel.xwiki.xwiki21.javacc.XWikiScanner.docElements
2.8% - com.xpn.xwiki.plugin.lucene.SearchResults.getRelevantResults
2.6% - com.xpn.xwiki.render.XWikiVelocityRenderer.evaluate
2.4% - com.mysql.jdbc.PreparedStatement.executeQuery
2.4% - org.xwiki.uiextension.internal.WikiUIExtension.getParameters
2.3% - org.xwiki.rendering.wikimodel.xwiki.xwiki21.javacc.XWikiScanner.line
My understanding is that we spend 25% of the time reading from the DB (this
could be livetable related, for instance), around 50% in various Velocity
parts, around 10% in the XWiki rendering, 3% in Lucene search and 2.4% in
UI extensions (no rights checks visible, probably because the test was
running as Admin).
Velocity is clearly quite expensive. In particular I see:
8.5% - com.xpn.xwiki.render.DefaultVelocityManager.getVelocityEngine
Does this mean we are spending 8.5% of the time just getting the Velocity
engine to use?
That sounds like a lot.
In any case this is food for thought for future optimizations.
Ludovic
--
Ludovic Dubost
Founder and CEO
Blog: http://blog.ludovic.org/
XWiki: http://www.xwiki.com
Skype: ldubost GTalk: ldubost
Hi Devs!
Thought you might be interested
http://www.diagram.ly/
Roman Muntyanu | rmuntyan(a)softserveinc.com | Project Manager | SoftServe Inc. (http://www.softserveinc.com/) | +380-32-240-9090 x3738
Hi devs,
We have recently voted a rule where we said that we will do the following:
* Always deprecate APIs
* Always move them to Legacy modules
* And only when there's a technical issue with moving stuff to the Legacy module, send a VOTE to remove an API
(see http://markmail.org/message/tino4ngttflc5i3s).
This means that from now on (starting on 26th of April 2012) we're not allowed to put excludes in CLIRR.
However I've seen that we have added some CLIRR excludes after the vote was passed.
I believe that the main issue we have is for "young" APIs that are not considered stable.
Proposal 1: Internal package
=========
* Young APIs must be located in the "internal" package till they become stable. I propose "internal" in order to reuse an existing package that we already filter when checking for CLIRR violations. "internal" means that users should not use this API because it's considered unstable and can change at any time.
* When a young API is considered stable enough and we want to open it to public consumption, we move it from "internal" to its target package (that's easy to do with IDEs). From that point forward any change to it goes through the standard deprecation/legacy mechanism.
* If we want to add a new method to an existing public API then this should not be considered a "young" API. It's just an addition to an existing API and thus goes directly to the deprecation/legacy cycle.
* We need to be careful to isolate "young" APIs from public API so that users don't inadvertently use "young" unstable APIs by mistake. If not possible then directly go through the deprecation/legacy cycle.
The advantage of this proposal is that it doesn't change our current practices and is very easy to verify through CLIRR.
Proposal 2: Experimental package
=========
Another possibility I can think of is to introduce a new "experimental" package instead of reusing the "internal" one. It has the advantage of making it possible to track "young" APIs and ensure they don't stay in that state indefinitely, while still letting users who use them notice that they're experimental.
Proposal 3: Experimental Annotation
=========
Another idea is to just use an @Experimental javadoc tag for experimental code. It has the advantage of keeping the code in its target package, but it has drawbacks:
* It's impossible for users to notice that they're using an experimental API: when they import the class, nothing tells them it's a "young" API
* It's almost impossible to tell CLIRR to exclude those APIs from its checks. The only way to do this, AFAIK, is to modify the source code of the CLIRR plugin. We would thus need to exclude them manually using CLIRR excludes, and before each release go over the full list of CLIRR excludes to verify that the excludes listed are only for "young" APIs marked "experimental".
Note that I mentioned a javadoc tag and not an annotation because I believe we need to record when the experimental API was first introduced, so that we eventually promote it to a proper API by removing the tag. Maybe we would need a rule such as: keep the tag for at most 3 full minor releases (i.e. 6 months).
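For illustration, a "young" API carrying such a tag might look like this (hypothetical method and version numbers, applying the 3-minor-releases rule suggested above):

  /**
   * Resolves the wiki referenced by the given alias.
   *
   * @experimental since 4.1; this API may change or be removed until 4.4 at
   *               the latest (3 full minor releases), so don't rely on it
   *               from extensions yet
   */
  WikiReference resolveWiki(String wikiAlias);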
WDYT? Any other idea?
Thanks
-Vincent
Hi devs,
So far I've always created a new team for new repos on xwiki-contrib. This is a bit too high-maintenance, and in addition it doesn't help with cross-project collaboration.
Thus I'm going to remove them in the coming minutes and I'll keep only the following teams:
* currikiorg
* gsoc
* xclams*
* xwikiorg
Basically I'll fold all users currently on the other teams into xwikiorg.
Thanks
-Vincent
Hi devs,
Well, the subject pretty much sums it up. I am proposing to skip any
testing while building a release, since that is the job of the CI
infrastructure, and it has happened more than once that a release build
failed because of a flickering test. This is really annoying for a release
manager, who has to start the entire build over (since right now it's a
real pain to alter the scripts to continue the build).
Besides this, tests (especially the functional ones added in platform)
needlessly delay the build, and thus the release.
Note: We are already skipping tests for enterprise and manager. This vote
is actually about skipping tests for commons, rendering and platform as
well.
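For reference, skipping would simply mean running the Maven builds in the
release scripts with Surefire's standard skip flag, along the lines of:

  mvn clean install -DskipTests

(the exact invocation in our release scripts may differ).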
Here's my +1.
Thanks,
Eduard
Hi devs,
The last piece missing from the new localization framework for 4.x is the
ability to provide translations in a jar extension.
Here are some ideas:
1) Continue with ApplicationResources_*.properties files but load them from
everywhere instead of taking the first one we find, as we do now
2) Same as 1) but place the files somewhere a bit "cleaner", like
org/xwiki/localization/translation_*.properties
3) A completely different system. I looked around a bit for an existing,
simple de facto standard but nothing obvious came up.
Keep in mind that the localization framework allows plugging in any
translation source, so it's easy to move to or add a new way of getting
translations from jar extensions later without touching the API. So while
it would be better to do something good now, it's mostly an implementation
detail and whatever we choose we are not going to be stuck with it forever.
WDYT?
I would go for 2) (I'm +0 for 1)) with
org/xwiki/localization/translation_*.properties for 4.x and keep 3) for
later when we have some time to work on it a bit more.
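To make 2) more concrete, here is a minimal sketch of how bundles could be
collected from every jar on the classpath instead of just the first one
(hypothetical helper, not the actual implementation):

  import java.io.InputStream;
  import java.net.URL;
  import java.util.Enumeration;
  import java.util.Properties;

  public class ClasspathTranslationLoader
  {
      // Merge every translation_<locale>.properties found in any jar on the
      // classpath; getResources returns one URL per jar containing the file,
      // unlike getResource which stops at the first match.
      public Properties loadTranslations(ClassLoader loader, String locale) throws Exception
      {
          Properties merged = new Properties();
          Enumeration<URL> bundles = loader.getResources(
              "org/xwiki/localization/translation_" + locale + ".properties");
          while (bundles.hasMoreElements()) {
              InputStream in = bundles.nextElement().openStream();
              try {
                  Properties properties = new Properties();
                  properties.load(in);
                  merged.putAll(properties);
              } finally {
                  in.close();
              }
          }
          return merged;
      }
  }

Of course the real code would plug this into the localization framework as
a new translation source; the point is only the getResources lookup.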
--
Thomas Mortagne
20 threads blocked on:
at com.xpn.xwiki.render.DefaultVelocityManager.getVelocityEngine(DefaultVelocityManager.java:177)
- waiting to lock <0x637cbe38> (a com.xpn.xwiki.render.DefaultVelocityManager)
at com.xpn.xwiki.render.XWikiVelocityRenderer.evaluate(XWikiVelocityRenderer.java:105)
https://github.com/xwiki/xwiki-platform/blob/master/xwiki-platform-core/xwi…
The holder of this lock calls context.getWiki().getSkinFile("macros.vm", skin, context)
and context.getWiki().exists(skin, context), both of which potentially access the
database while holding a global lock.
Looks like some delicious low hanging fruit to me.
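For illustration, one shape the fix could take: resolve the skin/macros
information (the potentially DB-touching part) before taking any lock, and
only serialize the actual engine creation per cache key. A sketch with
hypothetical names, not the real DefaultVelocityManager code:

  import java.util.Map;
  import java.util.concurrent.Callable;
  import java.util.concurrent.ConcurrentHashMap;

  public class VelocityEngineCache<E>
  {
      private final Map<String, E> engines = new ConcurrentHashMap<String, E>();

      // The caller computes the cache key (skin lookup, macros.vm existence
      // check, etc.) WITHOUT holding this lock; only engine creation for a
      // given key is briefly serialized here.
      public E getEngine(String key, Callable<E> factory) throws Exception
      {
          E engine = engines.get(key);
          if (engine == null) {
              synchronized (this) {
                  // re-check: another thread may have created it meanwhile
                  engine = engines.get(key);
                  if (engine == null) {
                      engine = factory.call();
                      engines.put(key, engine);
                  }
              }
          }
          return engine;
      }
  }

That would keep threads that only need an already-created engine from ever
blocking behind a thread that is off doing database work.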
Thanks,
Caleb
Original message removed because it is too big for the list.
See it here:
https://ezcrypt.it/4H5n#8mjU1voDtZsANw9vsLa8WM8Y
Hi devs,
I would need a contrib repo for an extension I've developed for USH.
It is called Classification and makes it easy to create a classification
plan with categories and sub-categories. Apparently such a feature is often
needed on client projects, so I hope it can be useful to others.
Thanks,
Thomas