Hi guys,
There are a couple of places in XWiki where we use trees to display
structured / hierarchical data. Are trees a good solution for data
visualization in those cases? Can those trees be improved? Are there
other places in XWiki where you would like to see a tree?
Let me review some of the trees we currently have:
1. Document Index Tree
wiki > space > page > [attachments | page]
This tree is also used in the WYSIWYG editor when you create a link to
a wiki page or to an attachment. It shows the spaces on the first
level and then, under each space, the parent-child hierarchy of
documents from that space. If a document has attachments, a
special child node is added. The tree can also display wikis on the
first level, with spaces on the second.
So it mixes two hierarchies: wiki > space > page > attachment and
parent > child. This can be confusing. For instance, Blog.WebHome has
Main.WebHome as its parent, but you don't see Blog.WebHome under the
Main.WebHome node in the tree because it is in a different space.
2. XAR Import
space > page
This is a very simple tree with only two levels. I don't have any
problem with it. It would be cool if it showed more information, like
attachments or objects, but it's a bit more complex to get that kind
of data from the XAR without reading the documents first.
3. Dynamic Hierarchy Macro (
http://extensions.xwiki.org/xwiki/bin/view/Extension/Dynamic+Hierarchy+Macro
)
parent page > child page
Unlike the Document Index Tree, this tree uses only the parent-child
hierarchy. You don't see the spaces but at least you get the full
parent-child hierarchy. This time Blog.WebHome is a child of the
Main.WebHome node.
I'm not sure it's better than the Document Index Tree, at least not on
the default XWiki documents, maybe because the document titles and the
way we set the parent in the default distribution are not very
consistent.
----------
WDYT about these trees?
As a developer, I would love to see a full XWiki Entity Tree:
wiki > space > page > [translations | attachments | objects |
classProperties] > object > objectProperties
As in http://imgur.com/q0br8xT .
As a user, I don't know. You tell me :)
Thanks,
Marius
Hi devs,
I just created http://jira.xwiki.org/browse/XWIKI-11108.
You have all the details in the issue, but basically there is a
difference in behavior between the old and new localization systems
with document-based translation bundles that can lead to translations
being broken after an upgrade, in the following use case:
* the default version of the document contains a French override of some standard key
* the document has an "en" translation that is empty
When the current locale is "en":
* before 4.3 you get the default translation from
ApplicationResources.properties
* after 4.3 you get the French override from the document, because it
is the default translation and documents have priority over resources.
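This is not XWiki's actual resolver code, but a minimal self-contained sketch of the priority-ordered lookup described above (the Bundle type, bundle names, and priority values are all made up for illustration): bundles are tried in priority order and the first one defining the key wins, which is why, after 4.3, the document's default (French) translation shadows ApplicationResources.properties.

```java
import java.util.Comparator;
import java.util.List;
import java.util.Map;
import java.util.Objects;

public class BundleLookupSketch {
    // A translation bundle: a priority plus a key -> value map (hypothetical, not XWiki's API)
    record Bundle(String name, int priority, Map<String, String> translations) {}

    // Return the translation from the highest-priority bundle that defines the key
    static String resolve(List<Bundle> bundles, String key) {
        return bundles.stream()
            .sorted(Comparator.comparingInt(Bundle::priority)) // lower value = higher priority
            .map(b -> b.translations().get(key))
            .filter(Objects::nonNull)
            .findFirst()
            .orElse(null);
    }

    public static void main(String[] args) {
        // Document bundle: its default version carries a French override of a standard key
        Bundle documentBundle = new Bundle("SomeSpace.SomeTranslations", 100,
            Map.of("core.someKey", "Valeur en francais"));
        // Resource bundle: the standard value from ApplicationResources.properties
        Bundle resourceBundle = new Bundle("ApplicationResources", 200,
            Map.of("core.someKey", "English value"));

        // Documents have priority over resources, so the French override wins
        System.out.println(resolve(List.of(documentBundle, resourceBundle), "core.someKey"));
    }
}
```

With the pre-4.3 behavior, the empty "en" document translation would effectively fall through to the resource bundle instead; with the post-4.3 one, the document's default version wins as above.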
Do we need to do something about it?
Overriding a standard translation with a totally inconsistent document
translation was not very clean in the first place IMO, and I'm not sure
it's really worth supporting (doing so in the current architecture would
probably mean breaking some API).
--
Thomas Mortagne
Hello,
I'm trying to import Wikipedia XML (English dumps without history) into
XWiki 6.0.1 running with PostgreSQL as the database. I'm using the
mediawiki/1.0 syntax to ease the job on my side, especially since the
task is just to test whether XWiki can hold this amount of data and
nothing more.
So far, the most critical issues I've found are:
1) Wikipedia's links are a bit longer than expected. I'm afraid this
is often a whole citation going into the link, so after installing
XWiki and letting Hibernate initialize, I needed to stop it and alter
the PostgreSQL table with:
alter table xwikilinks alter column xwl_link type varchar(4096);
This ensures that many more pages can be imported.
2) While importing, I hit a duplicate-key issue on the xwikircs_pkey key.
It shows up as:
STATEMENT: insert into xwikircs (XWR_DATE, XWR_COMMENT, XWR_AUTHOR,
XWR_DOCID, XWR_VERSION1, XWR_VERSION2) values ($1, $2, $3, $4, $5, $6)
ERROR: duplicate key value violates unique constraint "xwikircs_pkey"
DETAIL: Key (xwr_docid, xwr_version1,
xwr_version2)=(3170339397610733377, 1, 1) already exists.
in PostgreSQL console and as:
2014-09-22 00:53:51,601
[http://localhost:8080/xwiki/rest/wikis/xwiki/spaces/Wikipedia/pages/Brecon_…]
WARN o.h.u.JDBCExceptionReporter - SQL Error: 0, SQLState: 23505
2014-09-22 00:53:51,601
[http://localhost:8080/xwiki/rest/wikis/xwiki/spaces/Wikipedia/pages/Brecon_…]
ERROR o.h.u.JDBCExceptionReporter - Batch entry 0 insert into
xwikircs (XWR_DATE, XWR_COMMENT, XWR_AUTHOR, XWR_DOCID, XWR_VERSION1,
XWR_VERSION2) values ('2014-09-22 00:53:51.000000 +02:00:00', '',
'XWiki.Admin', 3170339397610733377, 1, 1) was aborted. Call
getNextException to see the cause.
2014-09-22 00:53:51,601
[http://localhost:8080/xwiki/rest/wikis/xwiki/spaces/Wikipedia/pages/Brecon_…]
WARN o.h.u.JDBCExceptionReporter - SQL Error: 0, SQLState: 23505
2014-09-22 00:53:51,601
[http://localhost:8080/xwiki/rest/wikis/xwiki/spaces/Wikipedia/pages/Brecon_…]
ERROR o.h.u.JDBCExceptionReporter - ERROR: duplicate key value
violates unique constraint "xwikircs_pkey"
Detail: Key (xwr_docid, xwr_version1,
xwr_version2)=(3170339397610733377, 1, 1) already exists.
in xwiki/tomcat console.
I haven't been able to solve this issue so far: it looks like the key
value itself is generated by XWiki, probably from some other data, and
I haven't yet found the related code.
Also, if this is some kind of hash function, the question is whether I
broke it with my link-length hack in (1).
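For what it's worth, if the document ID is something like a 64-bit truncation of a hash of the document reference (a guess at the mechanism, not a confirmed reading of XWiki's source), then the xwl_link change in (1) would not affect it, since it would hash the document name and not the link column; but two distinct page names could in principle collide on the same 64-bit ID. A self-contained sketch of such an ID function:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class DocIdSketch {
    // Hypothetical: derive a 64-bit id from the first 8 bytes of the MD5 of a
    // document key. This is an illustration, not XWiki's actual implementation.
    static long documentId(String localKey) {
        try {
            byte[] md5 = MessageDigest.getInstance("MD5")
                .digest(localKey.getBytes(StandardCharsets.UTF_8));
            long id = 0;
            for (int i = 0; i < 8; i++) {
                id = (id << 8) | (md5[i] & 0xFF); // keep only 64 of the 128 hash bits
            }
            return id;
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        // Deterministic: the same key always maps to the same id...
        System.out.println(documentId("Wikipedia.Brecon") == documentId("Wikipedia.Brecon"));
        // ...but since only 64 bits survive, distinct keys can collide in principle.
        System.out.println(documentId("Wikipedia.Brecon") != documentId("Wikipedia.Breton"));
    }
}
```

If the real mechanism is anything like this, a duplicate xwikircs_pkey could come either from such an ID collision or from re-importing a page whose (docid, version1, version2) row already exists.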
Any comment on (1) and its correctness, or any idea on how to fix (2),
is highly appreciated.
Thanks!
Karel
Hi devs,
XWiki 6.3 is the first of the 2 stabilization releases leading to the end of the 6.x cycle. As such, it’s important to polish and fix bugs. It’s also going to be short so that we can finish 6.4 by the end of the year and start the new XWiki cycle (7.x) at the beginning of next year.
After brainstorming offline with several XWiki committers (devs from XWiki SAS mostly) here’s a proposal for the 6.3 roadmap.
Content
=======
* Guillaume with help from Caty: Continue polishing Flamingo. Some specific items:
- Ensure it works fine on mobile devices
- Fix known bugs. Among them:
- Make Livetables responsive with screen size XWIKI-10727
- Make Tables responsive XWIKI-10736
- Polishing
- Fix the xwiki-enterprise-test-webstandards build!
* Thomas: Continue improving page loading times and performance in general as much as possible in the time frame, with the goal of having better performance than we had in 5.4.5.
* Marius: Finish File Manager + javascript framework evaluation/proposal (propose angularJS with LT in angular + treeview, etc)
- Would also be good to fix the Tree view in AllDocs using the new jstree-based tree used for FM (and support drag and drop reorganization)
* Edy: Implement the JIRAs from the list below:
- Extension Manager add extension search should suggest only compatible versions XWIKI-9920
- Welcome block is not easily editable XE-1389
- Delete space should list all pages that are going to be deleted XWIKI-8320 (Clemens started this one. Clemens, do you want to finish it for 6.3?)
- Wrong rendering of annotations for documents with sheets XWIKI-6328
- The default value of a date field should be empty or today, not the date when the class has been created XWIKI-10296
- User Directory should be configurable globally XWIKI-9170
* Caty: Collaborative apps investigations and specific investigation on some Collaborative apps + help the work on polishing Flamingo and make it work on mobile devices.
* Lyes: Work on the Repository Application/Extension Manager (with Thomas) to be able to surface the “best” extensions both inside XWiki’s EM UI and on e.x.o + work on Collaborative apps (especially on File Manager to take over from Marius)
Anyone else wishing to commit to something for 6.3? Denis, signed scripts? ;)
Dates
=====
This is a short stabilization release (1 month roughly):
- 6.3M1: 13 October (3 weeks)
- 6.3M2: 27 October (2 weeks)
- 6.3RC1: 3 November (1 week)
- 6.3Final: 10 November (1 week)
WDYT?
Thanks
-Vincent
Hi devs,
Right now there’s a problem when XWiki is shut down. This is what currently happens:
2014-09-23 13:54:05,563 [Thread-1] DEBUG o.x.shutdown - Starting XWiki shutdown...
2014-09-23 13:54:05,563 [Thread-1] DEBUG o.x.shutdown - Stopping Database Connection Pool...
2014-09-23 13:54:05,735 [Thread-1] DEBUG o.x.shutdown - Database Connection Pool has been stopped
2014-09-23 13:54:10,513 [Thread-1] DEBUG o.x.shutdown - Stopping component [org.xwiki.localization.wiki.internal.DocumentTranslationBundleFactory]…
2014-09-23 13:54:10,513 [Thread-1] DEBUG o.x.shutdown - Component [org.xwiki.localization.wiki.internal.DocumentTranslationBundleFactory] has been stopped
…
2014-09-23 13:54:10,514 [Thread-1] DEBUG o.x.shutdown - Stopping component [org.xwiki.search.solr.internal.DefaultSolrIndexer]...
2014-09-23 13:54:10,514 [Thread-1] DEBUG o.x.shutdown - Component [org.xwiki.search.solr.internal.DefaultSolrIndexer] has been stopped
…
2014-09-23 13:54:10,539 [Thread-1] INFO o.x.shutdown - XWiki shutdown has been completed.
As you can see, the problem is that the DB is stopped first. So any code accessing the DB will get an error once the connection pool is stopped. This is why we get a lot of stack traces when you ctrl-c XWiki quickly after it starts (i.e. when the init is not finished or when SOLR is still indexing). You can also get stack traces if a watchlist notification is being sent or some scheduler job is executing, for example.
There are 2 ways to fix this:
- Solution 1: all code that accesses the DB could check whether an XWiki shutdown is in progress and not generate stack traces in that case
- Solution 2: stop the DB last, after everything else has stopped
Obviously solution 2 is much better.
I’ve started implementing it by doing this:
- Remove the current HibernateShutdownEventListener, since it is executed first during shutdown, before component disposal
- Instead, have XWikiHibernateStore implement Disposable and move the content of HibernateShutdownEventListener into its dispose() method
This already works better, but it’s not enough: ECM.dispose() works out the component dependencies, but only the explicit ones; it doesn’t handle a component that uses the ComponentManager to dynamically find implementations (as is the case, for example, with DefaultSolrIndexer).
Thus, in order to help the component disposal process, I propose to introduce a new annotation: @DisposePriority(int).
Note that an alternative would be to introduce a new Disposable interface with a getPriority() method in it (we cannot modify the current Disposable since that would break backward compatibility).
The idea is then to use this information in ECM.dispose() to first sort all components by priority, and only then order them by their dependencies.
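A minimal sketch of what the proposal could look like (everything beyond the @DisposePriority name itself is made up for illustration: the sample components, the priority values, and the ECM is reduced to a plain list). Lower values are disposed first here, which is an assumption about the ordering direction:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.util.Comparator;
import java.util.List;

public class DisposePrioritySketch {
    // The proposed annotation (assumption: lower value = disposed earlier)
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.TYPE)
    @interface DisposePriority {
        int value() default 1000; // default priority for unannotated components
    }

    interface Disposable { void dispose(); }

    // Hypothetical components: the indexer must stop before the connection pool
    @DisposePriority(100)
    static class SolrIndexer implements Disposable {
        public void dispose() { System.out.println("SolrIndexer stopped"); }
    }

    @DisposePriority(10000)
    static class HibernateConnectionPool implements Disposable {
        public void dispose() { System.out.println("Connection pool stopped"); }
    }

    // Read the declared priority, falling back to the default for unannotated components
    static int priorityOf(Disposable component) {
        DisposePriority p = component.getClass().getAnnotation(DisposePriority.class);
        return p != null ? p.value() : 1000;
    }

    // First pass of a dispose(): sort by declared priority
    // (the dependency-based ordering would then apply within equal priorities)
    static void disposeAll(List<Disposable> components) {
        components.stream()
            .sorted(Comparator.comparingInt(DisposePrioritySketch::priorityOf))
            .forEach(Disposable::dispose);
    }

    public static void main(String[] args) {
        // SolrIndexer is disposed before the pool regardless of registration order
        disposeAll(List.of(new HibernateConnectionPool(), new SolrIndexer()));
    }
}
```

The annotation-based variant has the advantage that existing Disposable implementations keep working unchanged, with the default priority.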
WDYT?
Thanks
-Vincent
Hi devs,
We’ve released XWiki 6.2 on the 18th of September, i.e. almost a week ago.
In the meantime we’ve found and fixed some bugs:
http://jira.xwiki.org/secure/Dashboard.jspa?selectPageId=12597
16 issues fixed, 10 of them bugs (so far), including some pretty important ones.
Thus I’d like to propose to release 6.2.1 on the 29th (next Monday).
And then I’d like to plan a 6.2.2 around the 15th of October.
Thanks
-Vincent
Hello,
I'm attempting to import a Wikipedia dump into XWiki using the
mediawiki/1.0 syntax. Since this process is quite slow and deals with a
lot of data, I'm trying to strip everything I don't need for now.
While connecting VisualVM to the XWiki JVM, I've noticed that the
majority of both CPU and memory resources is consumed by the Solr index
thread and the Lucene index thread, with Solr leading. To switch this
off, I've completely disabled search suggestions in Home -> Administer
Wiki -> Search Suggest, and I've also switched the search system to be
based on the database instead of Solr. I restarted XWiki and, to my
surprise, the console log shows Solr starting again, and while profiling
with VisualVM I again see those two index threads running and consuming
the majority of resources.
The platform is JDK 8, Solaris 11.1, XWiki 6.0.1 on top of PostgreSQL.
Is there any hint on how to switch those index/search engines off completely?
Thanks!
Karel
Hi,
I would like a repository on https://github.com/xwiki-contrib/ for
publishing the code behind the design.xwiki.org application.
It's an application intended to collect feature proposals (requirements,
design mockups, implementation details, etc.).
Name: application-proposal
Description: Manages feature proposals from idea state to completion
User: evalica
Thanks,
Caty
The XWiki development team is proud to announce the availability of XWiki
6.2.
This release is mainly focused on the Flamingo skin, which is now used
by default, but also features improvements to applications such as AWM
and Blog, and various performance improvements.
Developers can benefit from new APIs such as the new Mail Sender API and
the new Blame API but also from improved APIs such as the wiki module API
and JS widgets.
You can download it here: http://www.xwiki.org/xwiki/bin/view/Main/Download
Make sure to review the release notes:
http://www.xwiki.org/xwiki/bin/view/ReleaseNotes/ReleaseNotesXWiki62
Thanks
-The XWiki dev team
Dear XWiki developers,
again and again I see code in our Curriki (quite some of which has been written by XWiki experts) that employs LIKE HQL queries to evaluate whether an XWikiRights object applies to a given group or user.
It has happened often that such queries kill our production application server; in the meantime we've managed to remove some of them. However, there are many more, and it really seems inefficient to me.
Is the "new xwiki rights model" that has been talked about here and there an enhancement in this respect?
(It is in XWiki 6 or so, if I remember well; we are still at 3.5.)
Simply splitting the addresses in the XWikiRights object in the DB would let these queries effectively leverage an index, which the LIKE queries fail to do.
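For context, the pattern I mean looks roughly like this (a simplified sketch, not the exact Curriki queries; the HQL string and property names are modeled on the usual approach of matching a group inside the comma-separated 'groups' property of a rights object): the leading '%' in the LIKE pattern is what prevents a B-tree index on the value column from being used, forcing a scan.

```java
public class RightsQuerySketch {
    // Hypothetical HQL illustrating the LIKE-based rights lookup pattern
    static String likeRightsQuery() {
        return "select doc.fullName from XWikiDocument doc, BaseObject obj, LargeStringProperty prop "
            + "where obj.name = doc.fullName and obj.className = 'XWiki.XWikiRights' "
            + "and prop.id.id = obj.id and prop.id.name = 'groups' "
            + "and prop.value like :groupPattern";
    }

    // The leading '%' wildcard is what defeats a B-tree index on prop.value
    static String groupPattern(String group) {
        return "%" + group + "%";
    }

    public static void main(String[] args) {
        System.out.println(likeRightsQuery());
        System.out.println(groupPattern("XWiki.TeacherGroup"));
    }
}
```

With the addresses stored one per row instead of as a comma-separated string, the same lookup would become an equality match (prop.value = :group), which an ordinary index serves directly.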
Thanks in advance.
Paul