Hi all,
I would like to know how document rights are enforced in XWiki.
Furthermore, I would like to ask whether XWiki rights can follow Active
Directory rights.
E.g. only users in the Active Directory group abc can access the space abc.
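For the Active Directory part, XWiki's LDAP authentication can map AD groups to XWiki groups; the space can then be restricted to the mapped group via its rights page. A minimal sketch of the relevant xwiki.cfg properties, assuming the LDAP authenticator is available in your version (property names and all DN values below are illustrative and may differ between releases):

```properties
# Use the LDAP authenticator (class name may differ between XWiki versions)
xwiki.authentication.authclass=com.xpn.xwiki.user.impl.LDAP.LDAPAuthServiceImpl
xwiki.authentication.ldap=1
xwiki.authentication.ldap.server=ad.example.com
xwiki.authentication.ldap.port=389
xwiki.authentication.ldap.base_DN=dc=example,dc=com
# Map the AD group "abc" to an XWiki group; the space's rights page can
# then grant view access to XWiki.abcGroup only
xwiki.authentication.ldap.group_mapping=XWiki.abcGroup=cn=abc,ou=groups,dc=example,dc=com
```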
Regards,
Colin
Hi Sergiu/Evelina,
Just wanted to check whether you've been able to make progress on the UI
problem. I thought I had created a JIRA issue about it a week ago, but I
couldn't find it, so I've recreated one:
http://jira.xwiki.org/jira/browse/XE-166
We cannot release RC2 without this being fixed.
Thanks
-Vincent
Cristian and Marius are working on Curriki, especially the Space Manager
and Invitation Manager plugins, which will move to the core plugins.
We would like to add them as Curriki committers (which currently means
core SVN commit rights, but only allowed to commit in the Curriki zone).
I'm also proposing to promote Marius to XWiki committer, as he has
provided modules/patches, especially for the Statistics and now the
Invitation Manager.
Here is my +1 for both.
Ludovic
--
Ludovic Dubost
Blog: http://blog.ludovic.org/
XWiki: http://www.xwiki.com
Skype: ldubost GTalk: ldubost
Hi all,
One need for Codehaus (http://www.codehaus.org/) to eventually use XWiki
as its main web site engine would be for us to implement an HTML export
feature.
I have done some work on this, and I propose to commit to 1.3 (Sergiu just
created the 1.2 branch) a new ExportAction (like the PDF/RTF one) that
handles HTML/ZIP export (http://jira.xwiki.org/jira/browse/XWIKI-564):
- support a range of multiwiki pages in view mode without request parameters
- add skin dependencies in the package
- add attachments in the package
- modify links targeting skin, attachment and exported pages in
exported pages (using a custom URL factory)
- package all this in a zip file
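The packaging step above can be sketched with the standard java.util.zip API; this is only an illustration of the ZIP layout, not the actual ExportAction code (the class and method names are made up):

```java
import java.io.ByteArrayOutputStream;
import java.util.Map;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

// Illustrative sketch: write each rendered page as one HTML entry in a ZIP.
// Skin files and attachments would be added as further entries the same way.
public class HtmlZipExporter {
    public static byte[] export(Map<String, String> renderedPages) throws Exception {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        ZipOutputStream zip = new ZipOutputStream(buffer);
        for (Map.Entry<String, String> page : renderedPages.entrySet()) {
            // e.g. "Main/WebHome" becomes the entry "Main/WebHome.html"
            zip.putNextEntry(new ZipEntry(page.getKey() + ".html"));
            zip.write(page.getValue().getBytes("UTF-8"));
            zip.closeEntry();
        }
        zip.close();
        return buffer.toByteArray();
    }
}
```

The custom URL factory mentioned above would rewrite links inside the rendered HTML so they point at these relative entry paths instead of live wiki URLs.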
WDYT ?
--
Thomas Mortagne
Hi,
I too wish to check out the XWiki source code and build the application
from it.
Could anyone please tell me the SVN URL for the same?
I tried accessing the old links posted in this forum for the same, i.e.:
http://www.xwiki.org/xwiki/bin/view/Community/SourceRepository
http://www.xwiki.org/xwiki/bin/view/Community/Building
but since the xwiki.org site has been migrated to a new version, these
links are no longer reachable.
It would be great if someone could get these links back up, or tell us
what the new links to this content are.
Thanks
Sachin
SAi Kumar wrote:
>
> Hi,
> I am new to XWiki. I need the XWiki source code to build an XWiki application.
> Can you tell me where I can download the following modules:
> xwiki-platform-tools
> xwiki-platform-core
> xwiki-platform-web
> xwiki-platform-plugins
> xwiki-platform-applications
> xwiki-product-enterprise
> xwiki-product-enterprise-manager
> xwiki-product-curriki
>
> Thanks in Advance,
> Saikumar B
>
--
View this message in context: http://www.nabble.com/Where-to-get-the-Xwiki-source-code.--tp13080439p14370…
Sent from the XWiki- Dev mailing list archive at Nabble.com.
Hi Sergiu,
Shouldn't this have been a new migrator instead? Artem had a comment
about leaving the old XWD_ARCHIVE column in the document table. Lots of
users have already executed this migrator, so they won't execute it again
and won't get this change. So if it's important to have everyone in the
same state, I think a new migrator might be better. But then, maybe it's
not important?
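For reference, "a new migrator" here would mean a separate class with a higher version number, so that wikis which already ran R4359XWIKI1459Migrator still pick up the cleanup. A minimal sketch of the idea (the class name, version number, and simplified method signatures are assumptions, not an actual proposal):

```java
// Hypothetical follow-up migrator: because its version is higher than 4359,
// databases that already ran the old migrator would still execute this one.
public class R6377XWIKI1954Migrator {
    public String getName() {
        return "R6377XWIKI1954";
    }

    public int getVersion() {
        // Must be greater than 4359 so already-migrated databases run it
        return 6377;
    }

    public void migrate() {
        // Would issue the same cleanup as the patched code below:
        // update xwikidoc set XWD_ARCHIVE=null where XWD_ARCHIVE is not null
    }
}
```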
Thanks
-Vincent
On Dec 14, 2007, at 2:14 PM, sdumitriu (SVN) wrote:
> Author: sdumitriu
> Date: 2007-12-14 14:14:08 +0100 (Fri, 14 Dec 2007)
> New Revision: 6377
>
> Modified:
> xwiki-platform/core/trunk/xwiki-core/src/main/java/com/xpn/xwiki/
> store/migration/hibernate/R4359XWIKI1459Migrator.java
> Log:
> XWIKI-1954: When migrating the document archive format from 1.1 to
> 1.2, delete the contents of the old XWD_ARCHIVE field
> Fixed.
>
>
> Modified: xwiki-platform/core/trunk/xwiki-core/src/main/java/com/xpn/
> xwiki/store/migration/hibernate/R4359XWIKI1459Migrator.java
> ===================================================================
> --- xwiki-platform/core/trunk/xwiki-core/src/main/java/com/xpn/xwiki/
> store/migration/hibernate/R4359XWIKI1459Migrator.java 2007-12-14
> 13:08:02 UTC (rev 6376)
> +++ xwiki-platform/core/trunk/xwiki-core/src/main/java/com/xpn/xwiki/
> store/migration/hibernate/R4359XWIKI1459Migrator.java 2007-12-14
> 13:14:08 UTC (rev 6377)
> @@ -19,6 +19,7 @@
> */
> package com.xpn.xwiki.store.migration.hibernate;
>
> +import java.sql.PreparedStatement;
> import java.sql.ResultSet;
> import java.sql.SQLException;
> import java.sql.Statement;
> @@ -88,7 +89,7 @@
> Statement stmt =
> session.connection().createStatement();
> ResultSet rs;
> try {
> - rs = stmt.executeQuery("select XWD_ID,
> XWD_ARCHIVE, XWD_FULLNAME from xwikidoc");
> + rs = stmt.executeQuery("select XWD_ID,
> XWD_ARCHIVE, XWD_FULLNAME from xwikidoc where XWD_ARCHIVE is not
> null order by XWD_VERSION");
> } catch (SQLException e) {
> // most likely there is no XWD_ARCHIVE
> column, so migration is not needed
> // is there easier way to find what column
> is not exist?
> @@ -97,6 +98,7 @@
> Transaction originalTransaction =
> ((XWikiHibernateVersioningStore
> )context.getWiki().getVersioningStore()).getTransaction(context);
>
> ((XWikiHibernateVersioningStore
> )context.getWiki().getVersioningStore()).setSession(null, context);
>
> ((XWikiHibernateVersioningStore
> )context.getWiki().getVersioningStore()).setTransaction(null,
> context);
> + PreparedStatement deleleteStatement =
> session.connection().prepareStatement("update xwikidoc set
> XWD_ARCHIVE=null where XWD_ID=?");
>
> while (rs.next()) {
> if (logger.isInfoEnabled()) {
> @@ -104,13 +106,13 @@
> }
> long docId = Long.parseLong(rs.getString(1));
> String sArchive = rs.getString(2);
> - if (sArchive==null) {
> - continue;
> - }
> XWikiDocumentArchive docArchive = new
> XWikiDocumentArchive(docId);
> docArchive.setArchive(sArchive);
>
> context
> .getWiki().getVersioningStore().saveXWikiDocArchive(docArchive,
> true, context);
> + deleleteStatement.setLong(1, docId);
> + deleleteStatement.executeUpdate();
> }
> + deleleteStatement.close();
> stmt.close();
>
> ((XWikiHibernateVersioningStore
> )context.getWiki().getVersioningStore()).setSession(session, context);
>
> ((XWikiHibernateVersioningStore
> )context
> .getWiki().getVersioningStore()).setTransaction(originalTransaction,
> context);
>
> _______________________________________________
> notifications mailing list
> notifications(a)xwiki.org
> http://lists.xwiki.org/mailman/listinfo/notifications
Hi,
I spent most of today profiling XWiki, and here is a summary of my findings.
First, I tested the following three situations:
(1) Creating many documents with 500 versions each, using XWiki 1.1
(2) Updating those documents to XWiki 1.2
(3) Creating many documents with 500 versions each, using XWiki 1.2
(1) spent most of the time in Hibernate and RCS. And by most, I mean
almost all of it. For RCS, each new version required parsing the whole
history, adding the new version, and reserializing the history. The RCS
implementation we're using (org.suigeneris.jrcs) seems very inefficient,
with a lot of calls ending in charAt, substring, and split. And since each
update requires sending the whole history back, the rest of the time was
spent in Hibernate, sending large strings to MySQL. Creating one document
with all its 500 versions takes several minutes.
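The cost described in (1) can be illustrated with a toy model: if every save re-parses and re-serializes the entire archive, total work grows quadratically with the number of versions (the class and numbers below are illustrative, matching the 500-version scenario with small content):

```java
// Toy model of the 1.1 archive behavior: each save first re-parses the whole
// serialized history, then appends one version and re-serializes everything,
// so the total number of characters processed is quadratic in version count.
public class ArchiveCostModel {
    public static long charsProcessed(int versions, int charsPerVersion) {
        long processed = 0;
        long archiveLength = 0;
        for (int v = 0; v < versions; v++) {
            processed += archiveLength;       // parse the existing history
            archiveLength += charsPerVersion; // append the new version
            processed += archiveLength;       // re-serialize the whole archive
        }
        return processed;
    }
}
```

In this model, 500 versions of 200 characters each means processing 50 million characters instead of the 100 thousand actually written, and doubling the version count quadruples the work.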
(2) spent most of the time doing one thing: parsing the old RCS archive.
Still, far less time was spent in Hibernate, about 2% of the total
execution time, as opposed to 80% in (1). Updating one 500-version
archive took 6 seconds; updating a 300-version document takes 2 seconds,
while a document with one version is updated almost instantly.
(3) spends far less time in RCS, as expected (1% of the total running
time), so the new archive mechanism is a major performance boost.
Instead, most of the time is spent in saveBacklinks, which goes into
Radeox. This is even more serious given that the document content was
very small (200 random characters). I'd say a major flaw is that the
backlinks-gathering process uses the Radeox engine with all filters and
macros enabled, while all we need is the links filter. Here, creating a
document takes a few seconds (2-3).
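The suggested fix, extracting links without running the full engine, could look roughly like this (the [Target] and [label|Target] patterns are assumptions about the link syntax; this is a sketch, not the actual Radeox filter):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch of a dedicated backlink extractor: scan the content for link syntax
// only, instead of running every Radeox filter and macro over the document.
public class LinkExtractor {
    // Matches [Target] and [label|Target]; an assumed simplification.
    private static final Pattern LINK =
        Pattern.compile("\\[(?:[^\\]|>]*[|>])?([^\\]|>]+)\\]");

    public static List<String> extractLinks(String content) {
        List<String> targets = new ArrayList<String>();
        Matcher matcher = LINK.matcher(content);
        while (matcher.find()) {
            targets.add(matcher.group(1).trim());
        }
        return targets;
    }
}
```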
Of the remaining (little) time, a big part is consumed by document
cloning. See http://jira.xwiki.org/jira/browse/XWIKI-1950 for this. This
is true for both XWiki 1.1 and 1.2.
Regarding database performance, besides the fact that Hibernate is slow,
too much time is spent in saveOrUpdate. It seems that instead of checking
the document object to see whether it needs to be saved or updated,
Hibernate retrieves the document from the database and compares the
retrieved version with the provided object to decide whether it is a new
object that needs to be inserted or an existing one that needs updating.
As a total for (3): 90% of the time is spent in saveDocument, of which
38% in saveLinks (most of which is spent in radeox), 22% in
hibernate.saveOrUpdate, 16% in hibernate.endTransaction (actual saving
of the document to the database) and 13% in updateArchive.
Thus, gathering backlinks with only one filter enabled should cut
roughly 35% off the running time, and improving the calls to saveOrUpdate
should give another 10-20%.
Remarks:
- This covers only the performance of saving documents, so I will need
another day to test view performance. I'd guess that most of that time
will be spent in Radeox, parsing the wiki content.
- As is well known, observing a process modifies it. All the numbers
would probably be (a little) different under real usage, without a
profiler sniffing everything.
Sergiu
Hi all,
Some time ago I made tools, bundled in the Application Manager, to easily
manage wiki classes and wiki class objects. The problem is that these
tools have changed since their creation and now have rather misleading
names, so I would like to rename them.
There are two Java classes:
- SuperClass: manages a wiki class with its attached
document/sheet/template (it assumes the wiki class exists in the wiki we
are working on) and provides methods to search, etc., in documents
containing objects of this class
- SuperDocument: the default implementation extends Document and manages
one object of a particular wiki class in its document (so that it can be
saved or deleted); this class is generally subclassed to add some access
methods, like getDocuments() in the Application Java class.
These tools are also usable from Velocity.
For some examples you can view Application Manager plugin (Application and
ApplicationClass classes) and Wiki Manager plugin (XWikiServer and
XWikiServerClass classes).
I propose:
- SuperClass -> ClassManager
- SuperDocument -> ObjectDocument (or ObjectManager, but I don't like
that name because it does not convey that the object lives in a document)
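To make the proposed names concrete, here is a small illustrative sketch of the split of responsibilities; these types are hypothetical and only mirror the description above, not the actual plugin API:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the proposed naming, not the real plugin API.
// ObjectDocument (formerly SuperDocument): one object of a wiki class,
// living in a document that can be saved or deleted.
interface ObjectDocument {
    String getDocumentFullName();
}

// ClassManager (formerly SuperClass): manages a wiki class and offers
// search over the documents containing objects of that class.
class ClassManager {
    private final String className;
    private final List<ObjectDocument> documents = new ArrayList<ObjectDocument>();

    ClassManager(String className) {
        this.className = className;
    }

    String getClassName() {
        return className;
    }

    void register(ObjectDocument document) {
        documents.add(document);
    }

    // Analogous to the getDocuments() accessor mentioned above.
    List<ObjectDocument> getDocuments() {
        return documents;
    }
}
```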
--
Thomas Mortagne