Hi,
I wanted to discuss the future of the mailsender plugin.
I've been working on a small tool to send a calendar invitation by email
from a Meeting Notes AppWithinMinutes application, and I found some
limitations in the mailsender plugin: namely, you cannot add
multipart/alternative email parts besides the text and HTML parts the
plugin already supports.
I was able to hack the mailsender plugin to add a vCalendar part, but it
doesn't really feel right to do it that way, since we should support parts
of any content type; that, however, is a bigger refactoring.
I was wondering what the future is for the mailsender plugin. Do we plan
to make it a component and keep the same functionality? Is there a plan
for an alternative component?
And what would be the approach for adding a vCalendar part to emails sent
by the current mailsender? This is needed to support sending invitation
emails, which would be a very powerful feature.
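To make the target concrete, independent of the mailsender API: an invitation email is a multipart/alternative message whose alternatives are text/plain, text/html and text/calendar. A minimal sketch of that MIME tree using Python's stdlib `email` package (not XWiki code; addresses and the event are made up):

```python
# Sketch of the MIME structure an invitation email needs:
# multipart/alternative containing plain text, HTML and a calendar part.
from email.message import EmailMessage

ICS = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Example//Meeting Notes//EN
BEGIN:VEVENT
UID:example-uid-1
DTSTART:20130101T100000Z
DTEND:20130101T110000Z
SUMMARY:Meeting
END:VEVENT
END:VCALENDAR
"""

msg = EmailMessage()
msg["Subject"] = "Meeting invitation"
msg["From"] = "alice@example.org"
msg["To"] = "bob@example.org"

# First alternative: plain text body.
msg.set_content("You are invited to a meeting.")
# Second alternative: HTML body.
msg.add_alternative("<p>You are invited to a meeting.</p>", subtype="html")
# Third alternative: the calendar invitation itself. Mail clients look at
# the method=REQUEST Content-Type parameter to offer accept/decline buttons.
msg.add_alternative(ICS, subtype="calendar", params={"method": "REQUEST"})

content_types = [part.get_content_type() for part in msg.iter_parts()]
print(msg.get_content_type())  # multipart/alternative
print(content_types)
```

A generic mailsender API would then only need to accept (content type, body) pairs and append them as alternatives, instead of hardcoding the text and HTML cases.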
Ludovic
--
Ludovic Dubost
Founder and CEO
Blog: http://blog.ludovic.org/
XWiki: http://www.xwiki.com
Skype: ldubost GTalk: ldubost
The XWiki development team is proud to announce the availability of XWiki 4.5.
This version brings improvements and stabilization to the Extension Manager and Distribution Wizards. It also adds the ability to internationalize applications created with the App Within Minutes tool, and continues the improvements of the new experimental Solr search integration. This is the last release of the XWiki 4.x cycle.
You can download it here: http://www.xwiki.org/xwiki/bin/view/Main/Download
Make sure to review the release notes:
http://www.xwiki.org/xwiki/bin/view/ReleaseNotes/ReleaseNotesXWiki45
Thanks
-The XWiki dev team
Hello fellow XWiki developers,
I was wondering if anyone had taken the time to have a joint install of XWiki and OpenFire, apparently a fairly scalable XMPP server with nice feature completeness.
Something should be done at the UI level, probably something like candy-chat.
My question is more about the server side: there seems to be a way to write an adapter that fetches user information from a database, so exposing XWiki's users might be simple.
Did anyone try this already?
Did it scale?
thanks in advance
paul
Hi devs,
We've moved more and more toward a UTF-8-only application, and XWiki
has only been tested with this configuration for several years.
I propose that we require UTF-8 for a valid, supported installation.
This means:
- JVM encoding (-Dfile.encoding=UTF8)
- Container default URL encoding (Tomcat has ISO-8859-1 by default)
- Database encoding (MySQL is still configured with latin1 on some distros)
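To make the failure mode concrete: when one layer produces UTF-8 bytes and another layer decodes them with an ISO-8859-1 default (as Tomcat does for URLs out of the box), every non-ASCII character turns into mojibake. A quick Python illustration of the mismatch:

```python
# A parameter containing "café" is sent as UTF-8 bytes...
raw = "café".encode("utf-8")

# ...but a container defaulting to ISO-8859-1 decodes each byte separately,
# so the two-byte UTF-8 sequence for "é" becomes two characters.
garbled = raw.decode("iso-8859-1")
print(garbled)  # cafÃ©

# The damage is only reversible if you know exactly where it happened:
fixed = garbled.encode("iso-8859-1").decode("utf-8")
print(fixed)  # café
```

Requiring UTF-8 at every layer (JVM, container, database) removes the guesswork entirely.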
There's one big site to update on our side: xwiki.org.
Here's my +1. This is a move toward a future web, since more and more
standards require (or at least assume as a default) UTF-8.
After thinking a bit more, it would make sense to require any valid
Unicode encoding, including UTF-16, which is preferable in countries
that don't use a Latin alphabet. However, XWiki doesn't currently work
under 16-bit encodings at all.
--
Sergiu Dumitriu
http://purl.org/net/sergiu
Hi,
I'd like to propose the following naming conventions for the extensions from commons/rendering/platform that are exposed on e.x.o:
- "API" suffix for API modules, e.g. "Text API", "Velocity API" instead of the current "Text Module", "Velocity Module", etc.
- "Application" suffix for UI modules, e.g. "Wiki Manager Application", "Logging Application", "IRC Bot Application", etc.
- "Macro" suffix for macros, e.g. "JIRA Macro", etc.
Technically this means using the following in pom.xml:
<properties>
  <!-- Name to display by the Extension Manager -->
  <xwiki.extension.name>Wiki Component API</xwiki.extension.name>
</properties>
WDYT?
Thanks
-Vincent
Hi,
I've been testing New Relic, which has some interesting live JVM analysis.
I ran its JVM profiling during a load test hitting various XWiki pages
(except the activity stream), including AWM, LiveTable searches, content
pages, Lucene search, and loading resources.
Here is a result of the profiling:
25.0% - java.net.SocketInputStream.socketRead0
  [32 collapsed methods]
25.0% - com.xpn.xwiki.store.hibernate.query.HqlQueryExecutor.execute
21.0% - org.xwiki.query.internal.DefaultQueryExecutorManager.execute
11.0% - org.xwiki.query.internal.DefaultQuery.execute
  [127 collapsed methods]
11.0% - java.lang.Thread.run
9.9% - org.xwiki.query.internal.SecureQueryExecutorManager.execute
4.2% - com.xpn.xwiki.store.hibernate.query.DefaultQueryExecutor.execute
12.0% - org.apache.velocity.runtime.parser.node.SimpleNode.render
12.0% - org.xwiki.velocity.internal.DefaultVelocityEngine.evaluate
12.0% - org.xwiki.velocity.internal.DefaultVelocityEngine.evaluate
8.1% - org.xwiki.rendering.internal.macro.velocity.VelocityMacro.evaluateString
4.3% - com.xpn.xwiki.render.XWikiVelocityRenderer.evaluate
9.2% - org.xwiki.velocity.internal.DefaultVelocityEngine.evaluate
8.5% - com.xpn.xwiki.render.DefaultVelocityManager.getVelocityEngine
4.1% - org.apache.velocity.runtime.parser.node.ASTBlock.render
4.1% - com.xpn.xwiki.web.XWikiAction.execute
3.6% - org.xwiki.rendering.internal.transformation.macro.MacroTransformation.transformOnce
3.2% - org.apache.velocity.runtime.directive.Foreach.render
2.9% - org.xwiki.rendering.wikimodel.xwiki.xwiki21.javacc.XWikiScanner.docElements
2.8% - com.xpn.xwiki.plugin.lucene.SearchResults.getRelevantResults
2.6% - com.xpn.xwiki.render.XWikiVelocityRenderer.evaluate
2.4% - com.mysql.jdbc.PreparedStatement.executeQuery
2.4% - org.xwiki.uiextension.internal.WikiUIExtension.getParameters
2.3% - org.xwiki.rendering.wikimodel.xwiki.xwiki21.javacc.XWikiScanner.line
My understanding is that we spend 25% of the time reading from the DB
(this could be livetable related, for instance), around 50% in various
Velocity parts, around 10% in XWiki rendering, 3% in Lucene search, and
2.4% in UI extensions (no visible rights checks, probably because the
test ran as Admin).
Velocity is clearly quite expensive. In particular I see:
8.5% - com.xpn.xwiki.render.DefaultVelocityManager.getVelocityEngine
Would this mean we are spending 8.5% of the time just getting the
Velocity engine to use? That sounds like a lot.
In any case, this is food for thought for future optimizations.
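If that 8.5% comes from repeatedly rebuilding or looking up the engine rather than from evaluating templates, the usual remedy is to construct it once per configuration and reuse it. A generic memoization sketch (Python, with hypothetical names; this is not the XWiki code path):

```python
import functools

BUILD_COUNT = 0

class VelocityEngine:
    """Stand-in for an expensive-to-construct template engine."""
    def __init__(self, config_key):
        global BUILD_COUNT
        BUILD_COUNT += 1  # track how often construction actually runs
        self.config_key = config_key

@functools.lru_cache(maxsize=None)
def get_velocity_engine(config_key):
    # Construction happens once per distinct configuration key;
    # subsequent calls return the cached instance.
    return VelocityEngine(config_key)

a = get_velocity_engine("default")
b = get_velocity_engine("default")
c = get_velocity_engine("skin-x")
print(a is b, BUILD_COUNT)  # True 2
```

In a real servlet environment the cache would of course need to be thread-safe and invalidated when the configuration changes; `lru_cache` here only stands in for that idea.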
Ludovic
--
Ludovic Dubost
Founder and CEO
Blog: http://blog.ludovic.org/
XWiki: http://www.xwiki.com
Skype: ldubost GTalk: ldubost
Hi Devs!
Thought you might be interested
http://www.diagram.ly/
Roman Muntyanu | rmuntyan@softserveinc.com | Project Manager | SoftServe Inc. (http://www.softserveinc.com/) | +380-32-240-9090 x3738
Hi devs,
We recently voted a rule saying that we will do the following:
* Always deprecate APIs
* Always move them to Legacy modules
* And when there's a technical issue to moving stuff to the Legacy module, only then, send a VOTE to remove an API
(see http://markmail.org/message/tino4ngttflc5i3s).
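XWiki's public APIs are Java, but the deprecate-first cycle above can be sketched in any language. A minimal Python illustration (all names hypothetical) of keeping an old entry point alive as a warning, forwarding shim instead of deleting it outright:

```python
import warnings

def render_document(doc):
    """The new, supported API."""
    return f"rendered:{doc}"

def renderDocument(doc):
    """Deprecated legacy name: warn callers, then forward to the replacement."""
    warnings.warn(
        "renderDocument() is deprecated; use render_document() instead",
        DeprecationWarning,
        stacklevel=2,
    )
    return render_document(doc)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    result = renderDocument("Sandbox.WebHome")
print(result)  # rendered:Sandbox.WebHome
```

The shim keeps old callers working (the "legacy module" role) while the warning gives them a migration signal, which is exactly what removing an API without deprecation denies them.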
This means that from now on (starting on the 26th of April 2012) we're not allowed to add CLIRR excludes.
However, I've seen that some CLIRR excludes have been added after the vote passed.
I believe that the main issue we have is for "young" APIs that are not considered stable.
Proposal 1: Internal package
=========
* Young APIs must be located in the "internal" package till they become stable. I propose reusing "internal" because it's an existing package that we already filter when checking with CLIRR. "internal" means that users should not use the API, because it's considered unstable and can change at any time.
* When a young API is considered stable enough and we want to open it to public consumption, we move it from "internal" to its target package (that's easy to do with IDEs). From that point forward, any changes to it go through the standard deprecation/legacy mechanism.
* If we want to add a new method to an existing public API then this should not be considered a "young" API. It's just an addition to an existing API and thus goes directly to the deprecation/legacy cycle.
* We need to be careful to isolate "young" APIs from public API so that users don't inadvertently use "young" unstable APIs by mistake. If not possible then directly go through the deprecation/legacy cycle.
The advantage of this proposal is that it doesn't change our current practices and is very easy to verify through CLIRR.
Proposal 2: Experimental package
=========
Another possibility I can think of is to introduce a new "experimental" package instead of reusing the "internal" one. It has the advantage of being able to follow "young" APIs and ensure they don't stay in that state indefinitely, while still allowing the user who uses it to notice it's experimental.
Proposal 3: Experimental Annotation
=========
Another idea is to just use an @Experimental javadoc tag for experimental code. It has the advantage of using the target package but it has drawbacks:
* It's impossible for users to notice that they're using experimental APIs, since when they import a class nothing tells them it's a "young" API
* It's almost impossible to tell CLIRR to exclude those APIs from its checks. The only way to do this is to modify the source code of the CLIRR plugin AFAIK. Thus we would need to exclude those manually using CLIRR excludes and thus before we release we would need to go over the full list of CLIRR excludes to ensure the excludes listed are only for "young" APIs marked "experimental".
Note that I mentioned a javadoc tag and not an annotation because I believe we need to record when the experimental API was first introduced, so that we eventually promote it to a proper API by removing the Experimental tag. Maybe we would need a rule such as: keep the tag for at most 3 full minor releases (i.e. 6 months).
WDYT? Any other idea?
Thanks
-Vincent
Hi devs,
So far I've always created a new team for each new repo on xwiki-contrib. This is too high-maintenance, and in addition it doesn't help with cross-project collaboration.
Thus I'm going to remove them in the coming minutes and I'll keep only the following teams:
* currikiorg
* gsoc
* xclams*
* xwikiorg
Basically I'll fold all users currently on the other teams into xwikiorg.
Thanks
-Vincent