Hi devs,
For the upcoming Application Within Minutes I need to enhance the
XWiki platform with the ability to generate the page name
automatically when creating a new wiki page. For some applications it
doesn't make sense to have two creation steps: (1) specify the wiki
page name (i.e. the location) and then (2) edit the new wiki page with
the specified name. Let me give you two examples:
* It would be cool to be able to create a new blog post in a single
step. The blog post name could be generated from the blog post title
specified in the edit form.
* An application that manages holiday requests doesn't need meaningful
page names (i.e. free text, like a blog post would have) but something
like Request_XYZ, where XYZ is a unique counter/identifier.
These applications should be able to create new wiki pages with
automatically generated names without writing their custom create
forms.
Since 3.2 RC1 was planned for today and these changes are on the 3.2
roadmap, here's a proposal that I think I can implement quickly and
safely:
(1) Introduce two new components:

// Used to generate a document name that doesn't have to be unique
// (e.g. by cleaning the document title).
DocumentReferenceGenerator#generate(DocumentModelBridge): DocumentReference

// Used to make a document name unique (by suffixing a unique
// counter/identifier).
DocumentReferenceDifferentiator#differentiate(DocumentReference): DocumentReference
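To make the contracts concrete, here's a minimal Java sketch (the imports and the @ComponentRole annotation are guesses on my side; only the two signatures come from the proposal above):

// A rough sketch; each interface would live in its own file.
import org.xwiki.bridge.DocumentModelBridge;
import org.xwiki.component.annotation.ComponentRole;
import org.xwiki.model.reference.DocumentReference;

@ComponentRole
public interface DocumentReferenceGenerator
{
    /** Generate a document reference that doesn't have to be unique. */
    DocumentReference generate(DocumentModelBridge document);
}

@ComponentRole
public interface DocumentReferenceDifferentiator
{
    /** Make the given reference unique, e.g. by suffixing a counter. */
    DocumentReference differentiate(DocumentReference reference);
}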
(2) Modify editinline.vm to store "documentReferenceGenerator" and
"documentReferenceDifferentiator" request parameters in two hidden
input fields so that they are passed to the save action. Obviously,
these are component hints.
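For illustration, the hidden fields in editinline.vm could look something like this (just a sketch; $escapetool and the exact emptiness checks are assumptions):

#if ("$!request.documentReferenceGenerator" != '')
  <input type="hidden" name="documentReferenceGenerator" value="$escapetool.xml($request.documentReferenceGenerator)" />
#end
#if ("$!request.documentReferenceDifferentiator" != '')
  <input type="hidden" name="documentReferenceDifferentiator" value="$escapetool.xml($request.documentReferenceDifferentiator)" />
#end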
(3) Modify editactions.vm to replace "Save & View" + "Save & Continue"
with "Create" when "documentReferenceGenerator" or
"documentReferenceDifferentiator" (or both) request parameters are
set.
(4) Modify SaveAction to take these two request parameters into account
(only when they are specified). Something along these lines:

generatedDocumentReference = generateDocumentReference(doc)
synchronized (lock) {
    doc.copyDocument(differentiate(generatedDocumentReference)).save();
}
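Spelled out a bit more, the SaveAction change could look roughly like this (a sketch only: the lock object, the helper name and the component lookups are assumptions, only the overall flow is the one proposed above):

// Hypothetical helper inside SaveAction.
private void saveWithGeneratedName(XWikiDocument doc, XWikiContext context) throws Exception
{
    String generatorHint = context.getRequest().getParameter("documentReferenceGenerator");
    String differentiatorHint = context.getRequest().getParameter("documentReferenceDifferentiator");

    DocumentReference reference = doc.getDocumentReference();
    if (generatorHint != null) {
        // Generate a (possibly non-unique) name, e.g. from the title.
        reference = Utils.getComponent(DocumentReferenceGenerator.class, generatorHint).generate(doc);
    }
    // Serialize differentiation + save so two concurrent requests can't
    // end up with the same generated name.
    synchronized (NAME_GENERATION_LOCK) {
        if (differentiatorHint != null) {
            reference = Utils.getComponent(DocumentReferenceDifferentiator.class, differentiatorHint)
                .differentiate(reference);
        }
        context.getWiki().saveDocument(doc.copyDocument(reference, context), context);
    }
}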
I'm not sure where to place the two components from (1) though. WDYT?
Thanks,
Marius
Hi devs,
I'm implementing the LinkChecker UI and I want to be able to add a Tab in AllDocs. Right now I've coded it with a hardcoded #if but I really hate this.
So here's my proposal:
* Create the following modules:
xwiki-platform-uiextension/
|_ xwiki-platform-uiextension-ui/
|_ xwiki-platform-uiextension-api/
where:
* xwiki-platform-uiextension-ui/: contains XWiki.UIExtensionClass page
* xwiki-platform-uiextension-api/: contains a ScriptService to get UIExtension data, plus an EventListener that refreshes the UI Extension cache when a UIExtensionClass object is modified (this is for performance reasons)
To start with I'm proposing to have the following fields for UIExtensionClass:
* type: String, represents the type of the extension. For example for the AllDocs needs, I'll use an "IndexTab" (or "AllDocsTab") type.
* id: String, the technical name of the extension, which can be used for example as a suffix for HTML classes or ids.
* name: String, the name of the extension, which can be used for display. For example for the AllDocs needs, it would be used as the name of the tab.
* content: Textarea, the content of the extension. For example for the AllDocs needs, it would be used as the content to display when clicking on a tab.
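To make the shape concrete, here's a hypothetical Java sketch of what the api module could expose for each extension (the interface and method names are assumptions; the four fields mirror the proposed class):

public interface UIExtension
{
    String getType();    // e.g. "IndexTab" for the AllDocs tabs
    String getId();      // technical name, e.g. a suffix for HTML classes/ids
    String getName();    // display name, e.g. the tab label
    String getContent(); // content to render, e.g. the tab panel content
}

The ScriptService could then expose something like $services.uiextension.getExtensions("IndexTab") for templates such as AllDocs to iterate on.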
I'd like to implement this ASAP and thus stop hardcoding UI Extensions from now on.
Here's my +1
Thanks
-Vincent
PS: If you find a better name than "uiextension" I'm all ears. A name without "extension" in it would be great, to avoid confusing it with our Extensions (and with xwiki-platform-extension).
Hello developers,
I have been building a nifty little cache monitor on www.curriki.org, which is still running XWiki 1.5.4.
It can show the entries and capacity of two XWiki caches that sound pretty central: pageExistCache and the document cache.
Not surprisingly, they're both full on our production server.
name            entries  capacity
pageExistCache    10000     10000
cache              3000      3000
The caches are using OSCache (through "OSCacheCache").
I've been trying to get real statistical counts but I have kind of failed thus far.
I can use the StatisticListenerImpl, which gives fairly high numbers:
> StatisticListenerImpl: Hit = 9583563 / 1320251374, stale hit = 0 / 538, miss = 8541 / 17175236, flush = 27348, entries (added, removed, updates) = 25457741, 0, 7678101
These counters are ints, and an int's 2-billion limit might actually be too small for them; they are also static (so they count across all the OSCache instances).
What I would like to get is the "eviction rate" but I do not know how to compute it.
That is, I wish to read the number of cache-entries which are thrown away because the cache is full and others come in.
Once I can have it, I can start tuning (cache size, less greedy algorithms, ...) and measure the effect of such tuning. We have plenty of RAM space to accommodate the cache but we should use this well.
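For what it's worth, a back-of-the-envelope estimate is possible from the counters above, assuming "removed = 0" means no explicit removals and ignoring flushes: once a cache is full, each new entry added must evict an older one.

// Rough eviction estimate from the StatisticListenerImpl counters above
// (an assumption-laden approximation, not an exact OSCache statistic).
long added = 25457741L;         // entries added (static, all OSCache instances)
long resident = 10000L + 3000L; // entries currently held (both caches are full)
long evictedApprox = added - resident; // roughly 25.4 million evictions so far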
The big annoyance with this process is that it can only run on our production server. So thus far I have not dared register a CacheListener in Groovy, fearing it would suddenly make things slow.
Has anyone used a different strategy or tool?
I'm happy to post my little monitor script.
thanks in advance.
paul
Hello fellow developers,
I am working on upgrading our core, based on XWiki 1.5.4, to XWiki 3.2. Quite a jump.
Among the smaller jumps, Velocity seems to have gone from version 1.5 to version 1.7.
Nonetheless, this causes a surprising effect: macros used to be able to redefine the value of parameters they were passed, and that change was honoured after the macro; they cannot anymore.
The following script:
> #macro(redefine $var)
> #set($var="redefined")
> #end
>
> #set($x="original")
>
> x is $x
>
> #redefine($x)
>
> x is $x
gives the following output in a new core:
> x is original
>
> x is original
>
and the following in our old core:
> x is original
>
> x is redefined
>
This has all sorts of consequences, including macros such as the navigation not receiving the result of normalizelink...
Did I miss an optional Velocity parameter that reverts to the old behaviour?
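In the meantime, a workaround that should survive the change is to pass a mutable holder instead of a plain string, since mutating a passed object still works even when rebinding the parameter does not (a sketch, assuming map literals, available since Velocity 1.6):

#macro(redefine $holder)
  ## put() mutates the map the caller sees; $discard just swallows the return value
  #set($discard = $holder.put("value", "redefined"))
#end

#set($x = {"value": "original"})
#redefine($x)
x is $x.value

With this, $x.value prints "redefined" after the macro call.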
paul
Hi devs,
Right now renaming a key is a pain if we don't want to lose the
existing translations associated with it. Basically it requires renaming
the key in all languages and reimporting all of them on l10n.xwiki.org.
We already have a section in the translation file to indicate which
translations are deprecated. The idea is to indicate the new
name of a deprecated translation key directly in the translation file,
so that l10n.xwiki.org can automatically copy the translation to the
new key when it finds a newly deprecated key while importing the default
translation file.
For that we need to decide on a syntax to indicate the new name of the key.
I propose doing something similar to Java and indicating it in a comment
like the following:
new.key.name=Default translation
#@deprecated new.key.name
old.key.name=Default translation
Here are some other alternatives to "deprecated":
* replacedBy
* new
others ?
Here is my +1 for "deprecated": it's more intuitive for Java developers and
it's clear that it marks a deprecated translation key.
Thanks,
--
Thomas Mortagne
Hi devs,
I'd like to propose forking wikimodel (http://code.google.com/p/wikimodel/) and move its source code into our code base in XWiki Rendering as a separate module for now.
Motivation
========
* Wikimodel has been inactive for years (since 2009). Actually that's not quite true: there's been one developer working on it regularly, Thomas Mortagne…
* We heavily depend on wikimodel in the XWiki Rendering module (for our syntax parsers) which is a key module for XWiki
* It's more difficult for us to contribute to the wikimodel project since it means:
** committing in a different project with different rules
** there's nobody doing releases on the wikimodel project, and we need its releases to be synced with our releases since otherwise we cannot release with a SNAPSHOT dependency
** there's no community there so it's not fun and doesn't help for quality control/reviews/etc
** since we push XWiki Commons and XWiki Rendering to Maven Central we also need the wikimodel releases to be pushed to Central which is not happening now
* The wikimodel project has a different scope than our needs. Mikhail (owner and admin of Wikimodel; not active since 2009 apart from some commits here and there) wanted it to remain only about wiki syntaxes. We added support for HTML parsing to it, but Mikhail never liked that and wanted us to move it to XWiki.
* We have an impedance mismatch between the wikimodel model and the XWiki Rendering model, which forces us into contortions in the code and leads to issues still being open in our JIRA (they've been open for a long time now)
* We believe wikimodel would benefit from a larger and active community within the XWiki ecosystem. Wikimodel has been stagnating for years and we'd like it to live on and evolve.
Action Plan
=========
Thus Thomas and I are proposing to do the following:
* Move the sources into a new rendering module as is and use it as a library (same as now, except we rename the module and release it under the XWiki umbrella).
* Modify all file headers to put our LGPL headers everywhere.
* Keep the attribution, as recommended by the ASL (see http://www.apache.org/foundation/license-faq.html#Distribute-changes), by adding a comment to all sources explaining where the source comes from, under which license it was published, who authored the initial code, and how XWiki committers have participated in the wikimodel project. We also put that information in the NOTICE file.
* Modify the source code slowly over time to integrate it cleanly within our code, remove the hacks we had to do, and bring improvements.
* Post a mail on the wikimodel mailing list explaining all this and inviting the current wikimodel committers to become committers on the XWiki Rendering module (provided they agree to follow our dev rules). We also explain how contributors can contribute (link to JIRA, link to GitHub for pull requests, etc.).
Related question (not part of the vote)
=============================
* We could decide to move XWiki Commons and XWiki Rendering under the ASL, since they're libraries and, for libraries, the ASL is the license that makes them easiest to use from code under all other licenses. Right now ASL code cannot use our Rendering module because we're LGPL.
Here's my +1 to this plan.
I'm also currently +1 to brainstorm about moving XWiki Commons and XWiki Rendering to the ASL.
Thanks
-Vincent
Hi devs,
I see that in ApplicationResources.properties we now have some wrongly named properties.
For example for the scheduler I find properties of the type xe.scheduler.* but the scheduler is now in the platform.
There are also resources named core.*
Thus I'd like to propose a simple rule:
<short top level project name>.<short module name>.<propertyName>
where:
<short top level project name> = the top level project name without the "xwiki-" prefix, for example: commons, rendering, platform, enterprise, manager, etc.
<short module name> = the name of the Maven module without the <short top level project name> prefix, for example: oldcore, scheduler, activitystream, etc.
<propertyName> = the name of the property using camel case, e.g. updateJobClassCommitComment
For example the property core.importer.package.version would be renamed to platform.web.packageVersion.
The only issue is that when we rename modules we need to rename the properties for those modules, but I don't see any way around this if we want expressive property names. At least it's an easy and mechanical change.
I also propose to introduce a separate resource property file holding deprecated properties, which we can put in the xwiki-platform-legacy module. We could call it DeprecatedApplicationResources*.properties.
Of course in the future each module should be able to contribute both resource property files (but they would use the same naming scheme!).
Even though this is not the topic of this mail this is how I'd implement this in the future:
* Implement a TranslationPropertiesConfigurationSource that is initialized by reading all property files found in the classloader under META-INF/translations.properties and META-INF/translations-deprecated.properties. This component would listen to observation events so that when a new JAR is installed by the extension manager it can check whether it provides some translations and add them.
* Have a Translation Manager component which is the main API to be used by other modules wishing to get translations. This manager would use the TranslationPropertiesConfigurationSource component.
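A minimal sketch of the resource-scanning part, assuming plain java.util.Properties and the META-INF locations above (the class name and the "last one wins" merge policy are assumptions):

import java.io.InputStream;
import java.net.URL;
import java.util.Enumeration;
import java.util.Properties;

public class TranslationBundleLoader
{
    public Properties loadAll(ClassLoader classLoader) throws Exception
    {
        Properties merged = new Properties();
        // Finds every bundle with this path, across all JARs on the classpath.
        Enumeration<URL> bundles = classLoader.getResources("META-INF/translations.properties");
        while (bundles.hasMoreElements()) {
            InputStream in = bundles.nextElement().openStream();
            try {
                merged.load(in); // later bundles override duplicate keys
            } finally {
                in.close();
            }
        }
        return merged;
    }
}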
Here's my +1
Thanks
-Vincent
On Tue, Jan 31, 2012 at 12:01 AM, <watchlist(a)xwiki.org> wrote:
> com.xpn.xwiki.XWikiException: Error number 4001 in 4: Error while parsing velocity page xwiki:XWiki.WatchListMessage
> Wrapped Exception: Failed to evaluate content with id [xwiki:XWiki.WatchListMessage]
>     at com.xpn.xwiki.render.XWikiVelocityRenderer.evaluate(XWikiVelocityRenderer.java:122)
>     at com.xpn.xwiki.plugin.mailsender.MailSenderPlugin.sendMailFromTemplate(MailSenderPlugin.java:813)
>     at com.xpn.xwiki.plugin.watchlist.WatchListNotifier.sendEmailNotification(WatchListNotifier.java:133)
>     at com.xpn.xwiki.plugin.watchlist.WatchListJob.executeJob(WatchListJob.java:270)
>     at com.xpn.xwiki.plugin.scheduler.AbstractJob.execute(AbstractJob.java:71)
>     at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
>     at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:525)
> Caused by: org.xwiki.velocity.XWikiVelocityException: Failed to evaluate content with id [xwiki:XWiki.WatchListMessage]
>     at org.xwiki.velocity.internal.DefaultVelocityEngine.evaluate(DefaultVelocityEngine.java:247)
>     at org.xwiki.velocity.internal.DefaultVelocityEngine.evaluate(DefaultVelocityEngine.java:184)
>     at com.xpn.xwiki.render.XWikiVelocityRenderer.evaluate(XWikiVelocityRenderer.java:117)
>     ... 6 more
> Caused by: org.apache.velocity.exception.MethodInvocationException: Invocation of method 'getHTMLDiff' in class com.xpn.xwiki.plugin.watchlist.WatchListEvent threw exception java.lang.NullPointerException at xwiki:XWiki.WatchListMessage[line 148, column 33]
>     at org.apache.velocity.runtime.parser.node.ASTMethod.handleInvocationException(ASTMethod.java:243)
>     at org.apache.velocity.runtime.parser.node.ASTMethod.execute(ASTMethod.java:187)
>     at org.apache.velocity.runtime.parser.node.ASTReference.execute(ASTReference.java:280)
>     at org.apache.velocity.context.ProxyVMContext.get(ProxyVMContext.java:219)
>     at org.apache.velocity.runtime.parser.node.ASTReference.getVariableValue(ASTReference.java:991)
>     at org.apache.velocity.runtime.parser.node.ASTReference.execute(ASTReference.java:240)
>     at org.apache.velocity.runtime.parser.node.ASTReference.value(ASTReference.java:567)
>     at org.apache.velocity.runtime.parser.node.ASTExpression.value(ASTExpression.java:71)
>     at org.apache.velocity.runtime.parser.node.ASTSetDirective.render(ASTSetDirective.java:142)
>     at org.apache.velocity.runtime.parser.node.ASTBlock.render(ASTBlock.java:72)
>     at org.apache.velocity.runtime.directive.VelocimacroProxy.render(VelocimacroProxy.java:216)
>     at org.apache.velocity.runtime.directive.RuntimeMacro.render(RuntimeMacro.java:311)
>     at org.apache.velocity.runtime.directive.RuntimeMacro.render(RuntimeMacro.java:230)
>     at org.apache.velocity.runtime.parser.node.ASTDirective.render(ASTDirective.java:207)
>     at org.apache.velocity.runtime.parser.node.ASTBlock.render(ASTBlock.java:72)
>     at org.apache.velocity.runtime.parser.node.ASTIfStatement.render(ASTIfStatement.java:87)
>     at org.apache.velocity.runtime.parser.node.ASTBlock.render(ASTBlock.java:72)
>     at org.apache.velocity.runtime.directive.Foreach.render(Foreach.java:420)
>     at org.apache.velocity.runtime.parser.node.ASTDirective.render(ASTDirective.java:207)
>     at org.apache.velocity.runtime.parser.node.SimpleNode.render(SimpleNode.java:342)
>     at org.xwiki.velocity.internal.DefaultVelocityEngine.evaluate(DefaultVelocityEngine.java:228)
>     ... 8 more
> Caused by: java.lang.NullPointerException
>     at com.xpn.xwiki.objects.BaseProperty.equals(BaseProperty.java:108)
>     at com.xpn.xwiki.objects.NumberProperty.equals(NumberProperty.java:57)
>     at com.xpn.xwiki.objects.BaseCollection.equals(BaseCollection.java:624)
>     at com.xpn.xwiki.objects.classes.BaseClass.getDiff(BaseClass.java:1211)
>     at com.xpn.xwiki.doc.XWikiDocument.getClassDiff(XWikiDocument.java:5600)
>     at com.xpn.xwiki.plugin.watchlist.WatchListEvent.getHTMLDiff(WatchListEvent.java:534)
>     at sun.reflect.GeneratedMethodAccessor706.invoke(Unknown Source)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>     at java.lang.reflect.Method.invoke(Method.java:597)
>     at org.apache.velocity.util.introspection.UberspectImpl$VelMethodImpl.doInvoke(UberspectImpl.java:395)
>     at org.apache.velocity.util.introspection.UberspectImpl$VelMethodImpl.invoke(UberspectImpl.java:384)
>     at org.apache.velocity.runtime.parser.node.ASTMethod.execute(ASTMethod.java:173)
>     ... 27 more
A pity we don't have the document and the compared version or at least
the field name in the log.
According to the error, some property class has a new number field
compared to 3.1.1. Does this sound expected to anyone?
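For context, the trace says BaseProperty.equals throws the NPE, i.e. a property value comparison that isn't null-safe. A hypothetical illustration of the pattern (not the actual BaseProperty code):

// The NPE pattern: comparing property values without a null check, which
// blows up when e.g. a number property value is null on one side of the diff.
static boolean sameValue(Object mine, Object theirs)
{
    // Null-safe variant: treat two nulls as equal instead of dereferencing one.
    return (mine == null) ? (theirs == null) : mine.equals(theirs);
}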
--
Thomas Mortagne
Hello fellow XWikiers,
I regularly work with my favorite IDE, IntelliJ IDEA, and it does a very good job for me when editing Velocity and Groovy. I also use it for the source of wiki pages, which I upload using a tiny upload script (it posts like a form, using curl and simple preemptive authentication, [1]).
The interest of storing page sources as files is that you get all the IDE services (it would work with most IDEs) such as auto-completion, code-usage tracking, or HTML and JavaScript validation.
The other interest is that my source files are versioned: I can commit them so that I and other developers know they are part of the build going forward, independently of my server uploads, and they remain readable to developers in the version control system.
Thus far I've been using wiki/src/main/pages/<spaceName>/<pageName>.vm (or .grv, .properties, ...).
There's a single issue I stumble on: for IntelliJ to do classpath resolution for me (e.g. recognize an import of the Context class), I need to change the Maven project type from xar to jar, and this is not so good.
My next step would thus be to create a new sub-project, xwikipages, containing these pages... but maybe everything is wrong here.
My questions:
• how much of this is good or best practice?
• what do others use to exploit an IDE for XWiki pages? (XEclipse and the Git xwiki-application?)
• is anyone else interested in sharing such a practice and improving it together?
thanks in advance
Paul
[1] upload-to-wiki can be found in https://github.com/xwiki-contrib/xwiki-clams-core/tree/master/tools/src/mai…
While trying to reduce the likelihood of duplicate ids for documents, and
extending my patch to provide a proper solution for objects, I stumbled on a
really unexpected issue: the type of the object identifiers is Integer,
while that of documents is Long. This is completely abnormal, since we
have several objects per document, and therefore we need more distinct ids
for objects than for documents.
I have therefore upgraded the ids of objects to use Long, and provided an
implementation that uses the lowest 64 bits of an MD5 key for objects, in the
same way I do for documents. This implementation introduces two new
serializers: UidStringEntityReferenceSerializer and
LocalUidStringEntityReferenceSerializer.
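For reference, a minimal sketch of deriving such a 64-bit id (the helper name is hypothetical, and whether "lowest 64 bits" means the first or the last 8 bytes of the digest is an assumption):

import java.nio.ByteBuffer;
import java.security.MessageDigest;

public final class EntityIdHasher
{
    /** 64-bit id from the first 8 bytes of the MD5 of a serialized reference. */
    public static long hash64(String serializedReference) throws Exception
    {
        MessageDigest md5 = MessageDigest.getInstance("MD5");
        byte[] digest = md5.digest(serializedReference.getBytes("UTF-8"));
        // MD5 yields 16 bytes; keep 8 of them to fit a Long id column.
        return ByteBuffer.wrap(digest, 0, 8).getLong();
    }
}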
I have also bridged the statistics that derive from objects. The new
implementation works perfectly on a new database but...
... I am unable to provide a proper migration procedure, since Hibernate
cannot manage changing the types of existing columns. It does not complain
during schema update; it simply does nothing about them. And later the data
migration breaks, since my hashes cannot fit in the database properly. After
thorough googling, I understand that Hibernate schema updates were not made
for production use and are really limited, contrary to the general idea we
had of them.
Since I am currently stuck, I propose to move ahead and use a new tool to
properly manage our database schema upgrades:
http://www.liquibase.org/
Liquibase is Apache licensed and provides a database-agnostic versioning
system for migrating databases. It does not work like a diff tool, but more
like a patch tool: you provide several XML descriptions of your wanted
changes, and it manages to apply or roll back these changes in an ordered
manner to upgrade the database to the latest schema.
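As an illustration, the column widening that Hibernate's schema update silently skips could be expressed as a Liquibase change set along these lines (table and column names are hypothetical):

<databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog">
  <changeSet id="widen-object-ids-to-long" author="xwiki">
    <!-- Hypothetical table/column names: widen the object id to 64 bits. -->
    <modifyDataType tableName="xwikiobjects" columnName="XWO_ID" newDataType="BIGINT"/>
  </changeSet>
</databaseChangeLog>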
There are several ways to apply these changes, and I would like to see whether
this could be integrated into our current migration procedure, or at least
run as an independent listener.
WDYT ?
--
Denis Gervalle
SOFTEC sa - CEO
eGuilde sarl - CTO