Hi again,
I wanted to create a space where I can keep notes for myself about
administering XWiki.
As admin, I created the new space and wanted to add a link to XWikiPreferences.
I clicked Edit, to edit the page.
I clicked Link, to add a link to another page.
I expanded the XWiki space, but did not find XWikiPreferences.
I typed in XWikiPreferences (see screenshot), and it seemed to work fine -
it even found its child pages.
I clicked Select, and then I saw what you see in the screenshot - a
"DocumentDoesNotExist" message, returned and squashed into a small box.
What happened?
thanks
Paul
Hi!
I would kindly ask for help with a few unclear topics:
I. Where can I find information about dependencies such as these:
- How much server RAM is required per 1 GB of attachments?
- The same for CPU.
How can I estimate this and size the hardware? What are the main principles?
II. Is it possible to customise the WYSIWYG editor separately for each space within one sub-wiki?
III. Is there any way to manage anchors from the link dialog in the WYSIWYG editor?
The logic would be:
- select space
- select page
- select anchor on this page
- put the link
For now, even if I type XWiki.WebHome#anchor into the link field manually, the #anchor part gets cut out.
The only way is to add it manually in the source editor; then it works fine. I found a more or less workable way to grab anchors with the Firefox add-on https://addons.mozilla.org/ru/firefox/addon/416/
so it is very easy to get an anchor, but not so easy to insert one. For unqualified users this makes XWiki "one-handed".
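For clarity, what I would like to end up with is a link like the one below, written in XWiki 2.0 wiki syntax (the page and anchor names are just examples):

  [[Go to the section>>XWiki.WebHome#HMySection]]

This is what I get today by editing the source by hand; I would like the WYSIWYG link dialog to be able to produce the same thing.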
IV. Is there any way to make the TOC macro build a table of contents for several pages and put it on one page?
For example:
toc Page1, Page2, Page3 ...
This is very useful when you can group all the project highlights together in one TOC.
I used to use the Trac wiki, where this works excellently. I am suffering from its absence now :-)
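In case it clarifies what I am after, below is roughly the page I would build by hand today as a workaround, in XWiki 2.0 syntax. I am assuming (but have not verified) that headings from included pages are picked up by the toc macro:

  {{toc/}}

  {{include document="Project.Page1"/}}
  {{include document="Project.Page2"/}}
  {{include document="Project.Page3"/}}

What I would prefer is a single macro call that takes the list of pages directly, as in the Trac example above.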
Thank you
Dmitry Bakbardin
Hi,
I have an existing XWiki Enterprise 1.3.1 running with a MySQL backend
and storage version 7351. I would like to upgrade to XWiki Enterprise
2.6, but I am running into problems.
- A XAR export eats all RAM, and the max heap is already at the maximum
for my 32-bit Linux (-Xmx2600m); when I increase it further the JVM fails to
start. The backup.xar stops at around 1.1 GB, and then I get
OutOfMemory exceptions all over the place.
- When I install 2.6 using the same MySQL backend, the storage
migration throws a massive amount of errors, mostly due to "Exception
while saving object XWiki.XWikiUsers". This is migration
R15428XWIKI2977.
Is there a way to perform an offline migration, so I can debug where
needed, or a way to perform an export and import into a clean database
offline?
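For reference, this is the kind of offline procedure I have in mind (only a sketch; the database names, credentials and the Tomcat-style variable are from my own setup):

  # copy the production database into a scratch schema
  mysqldump -u xwiki -p xwiki > xwiki-1.3.1.sql
  mysql -u xwiki -p -e "CREATE DATABASE xwikitest"
  mysql -u xwiki -p xwikitest < xwiki-1.3.1.sql

  # point the 2.6 instance's WEB-INF/hibernate.cfg.xml at xwikitest,
  # raise the heap and start it so the migrations run against the copy
  export CATALINA_OPTS="-Xmx2048m -XX:MaxPermSize=196m"

Would that be a supported way to run the storage migration, or does it have to run against the live database?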
Regards,
Leen
I see in the administration documentation:
Encrypt cookies using IP address
Even if the password cannot be extracted from the cookie, the cookies might
be stolen (see: XSS) and used as they are.
By setting the xwiki.cfg parameter xwiki.authentication.useip to true you
can block the cookies from being used except by the same ip address which
got them.
But when I look in xwiki.cfg, there is no mention of useip. Is this option
still recommended for use?
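For reference, this is the line I would expect to have to add to xwiki.cfg based on the documentation quoted above (I am guessing at the exact value format):

  xwiki.authentication.useip=true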
thanks
Paul
Hi all,
I see in the "Access Rights" documentation that there are 3 ways it can be
set up:
Open Wiki
Public Wiki
Public Wiki with confirmed registration
All of those options allow the user to register without forcing the admin to
confirm the registration.
I don't want users to be able to register themselves. I have a small set of
special users and I want to be able to register them manually, or at least
have to confirm their registration before an account is created for them.
Any normal visitor should not be able to modify anything on the website, and
that includes registering themselves.
Is this possible?
thanks
Paul
Hi
Our XWiki is multilingual.
When we translate a document into other languages, how are these translated documents stored?
The issue I have is that when we update the original, it is not easy to remove the translations that are now incorrect.
How do I remove only a translation without removing the default language version (or all the translations at once, without the original)?
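Is the intended way something like calling the delete action with a language parameter, for example (the URL syntax is only my guess, and "de" stands for the translation I want to drop):

  http://myserver/xwiki/bin/delete/MySpace/MyPage?language=de

or does that also touch the default language document?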
Gerritjan
Hi,
today I noticed that something bad has happened to some of the attachments in my XWiki; here is a
screenshot from one of the affected pages:
http://i.imgur.com/p6Xs7.png
Take a look: a couple of attachments had been uploaded, but only one is displayed in the attachments tab.
The person who uploaded them claims that yesterday they were fine, but today they have somehow disappeared.
It is odd that there is no trace of any operation on them after the upload.
I'm using XWiki Enterprise 2.5.32127 with a MySQL database (server version 5.1.47).
To add more context, over the last few days my users have started adding more attachments to their pages.
The database dump is currently around 200 MB.
I also looked at the logs and found several interesting fragments (all of the log snippets are from the time
this was noticed):
2010-11-18 09:03:09,355
[http://apps.man.poznan.pl:28181/xwiki/bin/download/Documents/Proposals/2009…]
ERROR web.XWikiAction - Connection aborted
Found a TextHeaderAtom not followed by a TextBytesAtom or TextCharsAtom: Followed by 3999
Found a TextHeaderAtom not followed by a TextBytesAtom or TextCharsAtom: Followed by 3999
Found a TextHeaderAtom not followed by a TextBytesAtom or TextCharsAtom: Followed by 3999
Found a TextHeaderAtom not followed by a TextBytesAtom or TextCharsAtom: Followed by 3999
2010-11-18 13:23:53,118 [http://localhost:28181/xwiki/bin/view/Projects/Opinion+Mining] WARN
xwiki.MyPersistentLoginManager - Login cookie validation hash mismatch! Cookies have been tampered with
2010-11-18 13:23:53,119 [http://localhost:28181/xwiki/bin/view/Projects/Opinion+Mining] WARN
xwiki.MyPersistentLoginManager - Login cookie validation hash mismatch! Cookies have been tampered with
Found a TextHeaderAtom not followed by a TextBytesAtom or TextCharsAtom: Followed by 3999
Found a TextHeaderAtom not followed by a TextBytesAtom or TextCharsAtom: Followed by 3999
Found a TextHeaderAtom not followed by a TextBytesAtom or TextCharsAtom: Followed by 3999
Found a TextHeaderAtom not followed by a TextBytesAtom or TextCharsAtom: Followed by 3999
2010-11-18 13:57:55,471 [Lucene Index Updater] WARN lucene.AttachmentData - error getting content
of attachment [2009BEinGRIDwow2greenCONTEXTREVIEW.PPT] for document [xwiki:Documents.Presentations]
org.apache.tika.exception.TikaException: TIKA-198: Illegal IOException from
org.apache.tika.parser.microsoft.OfficeParser@72be25d1
at org.apache.tika.parser.CompositeParser.parse(CompositeParser.java:138)
at org.apache.tika.parser.AutoDetectParser.parse(AutoDetectParser.java:99)
at org.apache.tika.Tika.parseToString(Tika.java:267)
at com.xpn.xwiki.plugin.lucene.AttachmentData.getContentAsText(AttachmentData.java:161)
at com.xpn.xwiki.plugin.lucene.AttachmentData.getFullText(AttachmentData.java:136)
at com.xpn.xwiki.plugin.lucene.IndexData.getFullText(IndexData.java:190)
at com.xpn.xwiki.plugin.lucene.IndexData.addDataToLuceneDocument(IndexData.java:146)
at com.xpn.xwiki.plugin.lucene.AttachmentData.addDataToLuceneDocument(AttachmentData.java:65)
at com.xpn.xwiki.plugin.lucene.IndexUpdater.addToIndex(IndexUpdater.java:296)
at com.xpn.xwiki.plugin.lucene.IndexUpdater.updateIndex(IndexUpdater.java:237)
at com.xpn.xwiki.plugin.lucene.IndexUpdater.runMainLoop(IndexUpdater.java:171)
at com.xpn.xwiki.plugin.lucene.IndexUpdater.runInternal(IndexUpdater.java:153)
at com.xpn.xwiki.util.AbstractXWikiRunnable.run(AbstractXWikiRunnable.java:99)
at java.lang.Thread.run(Thread.java:662)
Caused by: java.io.IOException: Cannot remove block[ 4209 ]; out of range[ 0 - 3804 ]
at org.apache.poi.poifs.storage.BlockListImpl.remove(BlockListImpl.java:98)
at org.apache.poi.poifs.storage.RawDataBlockList.remove(RawDataBlockList.java:32)
at org.apache.poi.poifs.storage.BlockAllocationTableReader.<init>(BlockAllocationTableReader.java:99)
at org.apache.poi.poifs.filesystem.POIFSFileSystem.<init>(POIFSFileSystem.java:164)
at org.apache.tika.parser.microsoft.OfficeParser.parse(OfficeParser.java:74)
at org.apache.tika.parser.CompositeParser.parse(CompositeParser.java:132)
... 13 more
Found a TextHeaderAtom not followed by a TextBytesAtom or TextCharsAtom: Followed by 3999
Found a TextHeaderAtom not followed by a TextBytesAtom or TextCharsAtom: Followed by 3999
Found a TextHeaderAtom not followed by a TextBytesAtom or TextCharsAtom: Followed by 3999
Found a TextHeaderAtom not followed by a TextBytesAtom or TextCharsAtom: Followed by 3999
Found a TextHeaderAtom not followed by a TextBytesAtom or TextCharsAtom: Followed by 4006
Found a TextHeaderAtom not followed by a TextBytesAtom or TextCharsAtom: Followed by 4006
2010-11-18 15:05:10,412
[http://apps.man.poznan.pl:28181/xwiki/bin/download/Documents/Presentations/…]
ERROR web.XWikiAction - Connection aborted
Unfortunately, today the situation has repeated itself with another group of users, following the same
scenario: after the attachments are submitted and the page is edited a few times, they are gone. A snippet
from the log from that period (there are a lot of these warnings):
2010-11-19 10:43:37,199 [Lucene Index Updater] WARN util.PDFStreamEngine - java.io.IOException:
Error: expected hex character and not :32
java.io.IOException: Error: expected hex character and not :32
at org.apache.fontbox.cmap.CMapParser.parseNextToken(CMapParser.java:316)
at org.apache.fontbox.cmap.CMapParser.parse(CMapParser.java:138)
at org.apache.pdfbox.pdmodel.font.PDFont.parseCmap(PDFont.java:549)
at org.apache.pdfbox.pdmodel.font.PDFont.encode(PDFont.java:383)
at org.apache.pdfbox.util.PDFStreamEngine.processEncodedText(PDFStreamEngine.java:372)
at org.apache.pdfbox.util.operator.ShowText.process(ShowText.java:45)
at org.apache.pdfbox.util.PDFStreamEngine.processOperator(PDFStreamEngine.java:552)
at org.apache.pdfbox.util.PDFStreamEngine.processSubStream(PDFStreamEngine.java:248)
at org.apache.pdfbox.util.operator.Invoke.process(Invoke.java:74)
at org.apache.pdfbox.util.PDFStreamEngine.processOperator(PDFStreamEngine.java:552)
at org.apache.pdfbox.util.PDFStreamEngine.processSubStream(PDFStreamEngine.java:248)
at org.apache.pdfbox.util.PDFStreamEngine.processStream(PDFStreamEngine.java:207)
at org.apache.pdfbox.util.PDFTextStripper.processPage(PDFTextStripper.java:367)
at org.apache.pdfbox.util.PDFTextStripper.processPages(PDFTextStripper.java:291)
at org.apache.pdfbox.util.PDFTextStripper.writeText(PDFTextStripper.java:247)
at org.apache.pdfbox.util.PDFTextStripper.getText(PDFTextStripper.java:180)
at org.apache.tika.parser.pdf.PDF2XHTML.process(PDF2XHTML.java:56)
at org.apache.tika.parser.pdf.PDFParser.parse(PDFParser.java:79)
at org.apache.tika.parser.CompositeParser.parse(CompositeParser.java:132)
at org.apache.tika.parser.AutoDetectParser.parse(AutoDetectParser.java:99)
at org.apache.tika.Tika.parseToString(Tika.java:267)
at com.xpn.xwiki.plugin.lucene.AttachmentData.getContentAsText(AttachmentData.java:161)
at com.xpn.xwiki.plugin.lucene.AttachmentData.getFullText(AttachmentData.java:136)
at com.xpn.xwiki.plugin.lucene.IndexData.getFullText(IndexData.java:190)
at com.xpn.xwiki.plugin.lucene.IndexData.addDataToLuceneDocument(IndexData.java:146)
at com.xpn.xwiki.plugin.lucene.AttachmentData.addDataToLuceneDocument(AttachmentData.java:65)
at com.xpn.xwiki.plugin.lucene.IndexUpdater.addToIndex(IndexUpdater.java:296)
at com.xpn.xwiki.plugin.lucene.IndexUpdater.updateIndex(IndexUpdater.java:237)
at com.xpn.xwiki.plugin.lucene.IndexUpdater.runMainLoop(IndexUpdater.java:171)
at com.xpn.xwiki.plugin.lucene.IndexUpdater.runInternal(IndexUpdater.java:153)
at com.xpn.xwiki.util.AbstractXWikiRunnable.run(AbstractXWikiRunnable.java:99)
at java.lang.Thread.run(Thread.java:662)
One more from another user:
2010-11-19 10:43:37,464 [Lucene Index Updater] WARN util.PDFStreamEngine - java.io.IOException:
Error: expected hex character and not :32
java.io.IOException: Error: expected hex character and not :32
at org.apache.fontbox.cmap.CMapParser.parseNextToken(CMapParser.java:316)
at org.apache.fontbox.cmap.CMapParser.parse(CMapParser.java:138)
at org.apache.pdfbox.pdmodel.font.PDFont.parseCmap(PDFont.java:549)
at org.apache.pdfbox.pdmodel.font.PDFont.encode(PDFont.java:383)
at org.apache.pdfbox.util.PDFStreamEngine.processEncodedText(PDFStreamEngine.java:372)
at org.apache.pdfbox.util.operator.ShowTextGlyph.process(ShowTextGlyph.java:61)
at org.apache.pdfbox.util.PDFStreamEngine.processOperator(PDFStreamEngine.java:552)
at org.apache.pdfbox.util.PDFStreamEngine.processSubStream(PDFStreamEngine.java:248)
at org.apache.pdfbox.util.operator.Invoke.process(Invoke.java:74)
at org.apache.pdfbox.util.PDFStreamEngine.processOperator(PDFStreamEngine.java:552)
at org.apache.pdfbox.util.PDFStreamEngine.processSubStream(PDFStreamEngine.java:248)
at org.apache.pdfbox.util.PDFStreamEngine.processStream(PDFStreamEngine.java:207)
at org.apache.pdfbox.util.PDFTextStripper.processPage(PDFTextStripper.java:367)
at org.apache.pdfbox.util.PDFTextStripper.processPages(PDFTextStripper.java:291)
at org.apache.pdfbox.util.PDFTextStripper.writeText(PDFTextStripper.java:247)
at org.apache.pdfbox.util.PDFTextStripper.getText(PDFTextStripper.java:180)
at org.apache.tika.parser.pdf.PDF2XHTML.process(PDF2XHTML.java:56)
at org.apache.tika.parser.pdf.PDFParser.parse(PDFParser.java:79)
at org.apache.tika.parser.CompositeParser.parse(CompositeParser.java:132)
at org.apache.tika.parser.AutoDetectParser.parse(AutoDetectParser.java:99)
at org.apache.tika.Tika.parseToString(Tika.java:267)
at com.xpn.xwiki.plugin.lucene.AttachmentData.getContentAsText(AttachmentData.java:161)
at com.xpn.xwiki.plugin.lucene.AttachmentData.getFullText(AttachmentData.java:142)
at com.xpn.xwiki.plugin.lucene.IndexData.getFullText(IndexData.java:190)
at com.xpn.xwiki.plugin.lucene.IndexData.addDataToLuceneDocument(IndexData.java:146)
at com.xpn.xwiki.plugin.lucene.AttachmentData.addDataToLuceneDocument(AttachmentData.java:65)
at com.xpn.xwiki.plugin.lucene.IndexUpdater.addToIndex(IndexUpdater.java:296)
at com.xpn.xwiki.plugin.lucene.IndexUpdater.updateIndex(IndexUpdater.java:237)
at com.xpn.xwiki.plugin.lucene.IndexUpdater.runMainLoop(IndexUpdater.java:171)
at com.xpn.xwiki.plugin.lucene.IndexUpdater.runInternal(IndexUpdater.java:153)
at com.xpn.xwiki.util.AbstractXWikiRunnable.run(AbstractXWikiRunnable.java:99)
at java.lang.Thread.run(Thread.java:662)
2010-11-19 11:32:39,900 [Lucene Index Updater] WARN lucene.AttachmentData - error getting content
of attachment [2008BEinGRIDdesignconceptdiagramdoneinVisio.vsd] for document [xwiki:Documents.Diagrams]
org.apache.tika.exception.TikaException: Unexpected RuntimeException from
org.apache.tika.parser.microsoft.OfficeParser@54ad9fa4
at org.apache.tika.parser.CompositeParser.parse(CompositeParser.java:134)
at org.apache.tika.parser.AutoDetectParser.parse(AutoDetectParser.java:99)
at org.apache.tika.Tika.parseToString(Tika.java:267)
at com.xpn.xwiki.plugin.lucene.AttachmentData.getContentAsText(AttachmentData.java:161)
at com.xpn.xwiki.plugin.lucene.AttachmentData.getFullText(AttachmentData.java:136)
at com.xpn.xwiki.plugin.lucene.IndexData.getFullText(IndexData.java:190)
at com.xpn.xwiki.plugin.lucene.IndexData.addDataToLuceneDocument(IndexData.java:146)
at com.xpn.xwiki.plugin.lucene.AttachmentData.addDataToLuceneDocument(AttachmentData.java:65)
at com.xpn.xwiki.plugin.lucene.IndexUpdater.addToIndex(IndexUpdater.java:296)
at com.xpn.xwiki.plugin.lucene.IndexUpdater.updateIndex(IndexUpdater.java:237)
at com.xpn.xwiki.plugin.lucene.IndexUpdater.runMainLoop(IndexUpdater.java:171)
at com.xpn.xwiki.plugin.lucene.IndexUpdater.runInternal(IndexUpdater.java:153)
at com.xpn.xwiki.util.AbstractXWikiRunnable.run(AbstractXWikiRunnable.java:99)
at java.lang.Thread.run(Thread.java:662)
Caused by: java.lang.IllegalArgumentException: Found a chunk with a negative length, which isn't allowed
at org.apache.poi.hdgf.chunks.ChunkFactory.createChunk(ChunkFactory.java:120)
at org.apache.poi.hdgf.streams.ChunkStream.findChunks(ChunkStream.java:59)
at org.apache.poi.hdgf.streams.PointerContainingStream.findChildren(PointerContainingStream.java:93)
at org.apache.poi.hdgf.streams.PointerContainingStream.findChildren(PointerContainingStream.java:100)
at org.apache.poi.hdgf.streams.PointerContainingStream.findChildren(PointerContainingStream.java:100)
at org.apache.poi.hdgf.HDGFDiagram.<init>(HDGFDiagram.java:95)
at org.apache.poi.hdgf.extractor.VisioTextExtractor.<init>(VisioTextExtractor.java:52)
at org.apache.poi.hdgf.extractor.VisioTextExtractor.<init>(VisioTextExtractor.java:49)
at org.apache.tika.parser.microsoft.OfficeParser.parse(OfficeParser.java:127)
at org.apache.tika.parser.CompositeParser.parse(CompositeParser.java:132)
... 13 more
I'm counting on your help, since I don't know whether this is an XWiki issue or whether I have misconfigured something.
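If it helps to narrow things down, I can run queries directly against the MySQL database. Would something along these lines show whether the attachment rows have lost their content (the table and column names below are only my guess at the schema, please correct me):

  SELECT a.XWA_FILENAME
  FROM xwikiattachment a
  LEFT JOIN xwikiattachment_content c ON c.XWA_ID = a.XWA_ID
  WHERE c.XWA_ID IS NULL;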
Regards,
Piotr
hi -
I'm one of the stranded former users of the free hosted service wik.is,
which MindTouch precipitously cancelled, forcing all users either to move
to one of their paid plans (which don't match my usage) or to leave.
I'm going to host it myself this time, and have been looking through
the alternatives. I'm attracted to xwiki for several reasons:
full ACL support; ldap authentication; good open source license without
a bunch of proprietary features; Balsamiq and Word integrations;
wysiwyg editing; and strong REST api.
From my first glance I do still have some reservations:
1. One concern is convenient maintenance of sorted children.
I see that in xwiki's own documentation wiki, this isn't done:
http://platform.xwiki.org/xwiki/bin/view/AdminGuide/
is just maintained manually as an index page.
As far as I can tell, by default children are just sorted by creation order.
With the http://code.xwiki.org/xwiki/bin/view/Plugins/DocumentTreePlugin
they can instead be sorted by name.
Lastly, there is http://code.xwiki.org/xwiki/bin/view/Plugins/SortedDocumentTreePlugin
But I really don't understand the instructions for "importing" a class to get
an additional sortable attribute.
Also I don't know if these plugins support sort order of spaces too, or not.
Ideally, however it is done, when a page is created, the form would have not
only the page title and parent title, but a place for an optional sort value.
2. I'd like to be able to export an entire space as a big PDF.
I can't tell if this plugin will do that:
http://code.xwiki.org/xwiki/bin/view/Applications/PDFExportPanelApplication
For example, suppose I wanted the whole xwiki AdminGuide as a single PDF,
what would I do?
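Or is the idea to use the export action with a pages wildcard, something like the URL below (the parameter names are my guess from skimming the docs)?

  http://server/xwiki/bin/export/AdminGuide/WebHome?format=pdf&pages=AdminGuide.%25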
3. I'd like a real bare bones look -- even slimmer than Confluence or MediaWiki,
and both of those are a little less cluttered than the Toucan skin.
I'm not finding any example skins that are like that.
4. I like the idea of supporting office app clients.
But it seems there is some clumsiness with XOffice specifying a parent:
http://jira.xwiki.org/jira/browse/XOFFICE-243
And I can't find any documentation on editing/creating content from OpenOffice
and/or via any webdav client. Things like URL conventions, etc.
This page http://platform.xwiki.org/xwiki/bin/view/DevGuide/Architecture
mentions XOO (Xwiki OpenOffice) but there is nothing else about it:
http://www.xwiki.org/xwiki/bin/view/Main/Search?text=XOO&wikinames=&space=
5. I might be blind but I can't find any documentation for the blogging support.
6. The page http://code.xwiki.org/xwiki/bin/view/Plugins/
says "Components" are preferred over plugins but the link to "Components" goes nowhere.
In general the documentation wiki is pretty confusing. The Enterprise docs link to basically
all the same places as the platform docs. There seems to be a mixture of information that is
years out of date and information about ideas not yet implemented. Sadly it is an example
of the dangers of using a wiki :(.
-mda
Hi, I created a page with macros and then included it on another page
(#includeMacros("Sprava.Macros"), or {{include
document="SpaceOf.DocumentToInclude"/}}). The problem is that when the page is rendered,
the macros are sometimes processed, but sometimes, after I change a page that uses the
macros, I only get the name of the macro instead of its rendered output (e.g.
#getType($type)). The page containing the macros has not changed. It looks like a bug to me.
Another problem: when I leave an empty line in the macro page, I sometimes get this text in
the rendered page:
<div class="wikimodel-emptyline"></div><div
class="wikimodel-emptyline"></div><div class="wikimodel-emptyline"></div>
Some empty lines do not affect the result, and some do.
Any idea?
Hi!
I'm using the following:
$xwiki.jsfx.use("js/xwiki/table/tablefilterNsort.js")
$xwiki.ssfx.use("js/xwiki/table/table.css")
When I define rules="all" and export the table to PDF, it doesn't come out
with the grid lines it is supposed to have... The only thing that works is
to define border="1".
How can I get all the lines when exporting my table?
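To show concretely what I mean by defining border="1", the only variant that keeps the grid lines in the PDF is writing the table like this (XWiki 2.0 syntax, the table content is just an example):

  (% border="1" %)
  |= Name |= Value
  | foo | 1
  | bar | 2

Is there a way to get the same PDF result from the CSS-based approach above?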
--
Atenciosamente,
Erica Usui.
What are the recommended specs for a server to run XWiki? I understand this
is quite a vague question as there are many variables to consider. I would
like to run this on a VPS so a general spec would be nice.
Thanks
- Eric
Hello xwiki users,
I just installed XWiki on a Windows server. I unpacked the standalone
version (Stable: xwiki-enterprise-jetty-hsqldb-2.6.zip) into a directory
called XWikiHome, set up a MySQL database, and created a new user. When I am
on the server itself, I can log in. But when I try to log in from another
computer, I cannot, and XWiki logs this message:
WARN xwiki.MyPersistentLoginManager - Login cookie validation hash mismatch!
Cookies have been tampered with
I have seen a few other emails regarding the same issue, but I don't see a
solution. One suggestion was to change the cookie version
to cookie.setVersion(1); but I am not sure where this change needs to be
made. My XWikiHome folder contains the following folders and files:
XWikiHome:
database (folder)
jetty (folder)
META-INF (folder)
webapps (folder)
start_xwiki
start_xwiki
start_xwiki_debug
start_xwiki_debug
stop_xwiki
stop_xwiki
xwiki
Within the folders, I don't see any configuration file or any file like
"..LoginManager..". Did I not install this correctly? I can actually
open the pages from Internet Explorer when I am on the server, but from
another computer I get the login page and cannot log in. As a side note, I
tried version 2.5 last week and did not have this problem. Then I
wanted to change the location of the wiki, so I deleted that directory and
reinstalled the 2.6 version on another drive.
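While searching I also ran across mentions of the xwiki.authentication.validationKey and xwiki.authentication.encryptionKey settings, which are supposed to live in webapps/xwiki/WEB-INF/xwiki.cfg; could they be related to this hash mismatch? Something like (the values below are obviously just placeholders I made up):

  xwiki.authentication.validationKey=someRandomStringOfMyChoice
  xwiki.authentication.encryptionKey=anotherRandomStringOfMyChoice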
I appreciate your suggestions. Thanks, - Nevzat
Hi,
lately a user reported to me another issue regarding sections.
Scenario: there are two users and a document like this:
== section 1 ==
sect content
== section 2 ==
sect content
== section 3 ==
sect content
1. UserA opens section 2 for editing.
2. UserB opens the whole document for editing (there is no other way but to force the edit).
3. UserB adds section 1.5, so that the document becomes:
== section 1 ==
sect content
== section 1.5 ==
sect content
== section 2 ==
sect content
== section 3 ==
sect content
4. UserB saves the document.
5. UserA changes the content of section 2 and saves the document.
The result is that UserA's save overwrites 'section 1.5' instead of 'section 2'.
I can see that you index sections by number and pass this as a parameter to the edit link, and that is
probably why the section change gets shifted. However, I've just checked the MediaWiki sandbox and the same
scenario causes no errors there - they also add a 'section' parameter, but apparently some extra identification
is performed as well.
Is there any chance to get this working correctly? Maybe some configuration switch?
Thanks,
Piotr
Hey guys, just subscribed to this list.
I'm currently stuck choosing between two wikis: XWiki and Foswiki.
What I'm looking for is:
Speed/Performance
Access control (Guests can only read and not modify, certain users can
modify)
Good theming/Easy theming.
An overall good wiki with user-friendly UI.
Foswiki is flat-file based and XWiki is SQL based. I hear that SQL is generally
better; is that the case? Foswiki seems to be pretty fast even though it is
flat-file based.
Thanks :)
- Eric
I have recently started to fill my wiki with lots of content. Currently
I have 4772 XWiki documents and 2260 XWiki objects. Not that much, I think,
but it looks like I'm reaching a limit.
This is the first time I have looked into the SQL internals of XWiki.
Goddamn! There were no indices except the primary ones! OK, I have lowered the
load time from 15 s to 9 s by creating indices (e.g. on XWD_FULLNAME
in the documents table).
That is still too much.
Maybe I could improve it further, but some queries are written in a way that
leaves no obvious index to create. For instance, I can see
"where hidden <> 1 or hidden is null" in the queries. It would be nice to
see "where hidden = 0" instead (with null not being allowed).
Xen:~ OCTAGRAM$ time GET http://blog.toom.su/
[...snip...]
</div></div></body>
</html>
real 0m8.395s
user 0m0.158s
sys 0m0.052s
Xen:~ OCTAGRAM$ time GET http://pascal.toom.su/Index
[...snip...]
</div></div></body>
</html>
real 0m5.537s
user 0m0.157s
sys 0m0.043s
Xen:~ OCTAGRAM$ time GET http://www.xwiki.org/xwiki/bin/view/Main/News
[...snip...]
</div></div></body>
</html>
real 0m1.981s
user 0m0.152s
sys 0m0.034s
Two seconds is also too much, but how do you keep the response time that low?
Is it just because you scatter documents across different
databases, or maybe there was a huge improvement after 2.4M1?
http://pascal.toom.su/Index is not even a dynamic page; it just has lots
of wiki links.
--
If you want to get to the top, you have to start at the bottom
Hello XWiki Community,
Christmas is coming, and I have a gift for you. I hope you will enjoy it.
As you may know, a Brazilian advertising agency recently produced fake
vintage ads for Facebook, Twitter, YouTube and Skype. I thought those
posters were really amazing. You will find the story here:
http://www.dailybloggr.com/2010/08/cool-vintage-ads-of-facebook-twitter-you…
More recently, I read the book "From Airline Reservations to Sonic the
Hedgehog: A History of the Software Industry" by Martin Campbell-Kelly
(2003). Each chapter is illustrated with an official old ad for a piece of
software, and I was charmed by the AUTOFLOW poster, showing the
face of a woman with beautiful eyes.
http://u1.ipernity.com/17/25/71/9552571.c1e1c1ef.jpg
This gave me an idea: to make a fake XWiki ad, copying the AUTOFLOW
one. Of course, XWiki is such a great wiki that it doesn't need
advertising to spread around the world, but maybe it can help... It is the
result of this idea that I am glad to give you today.
I took care to produce this work from public domain material, so I
think I can publish the poster in the public domain (even though I'm far
from being a licensing specialist). So I encourage you to print it on
your t-shirt or your mug, to modify it, to share it, and to redistribute
it.
I've tried to preserve the message of the ad, just replacing some
words, or at least to retain the spirit of the original sentences. But
I'm sure you can find better arguments and it's easy to edit with
Gimp.
This page shows the fake XWiki ad at a small size, which makes it
easier to compare with the original work:
http://www.ipernity.com/doc/sinclair/9552524
The full work is available in a public google doc folder (you don't
need to have an account)
https://docs.google.com/leaf?id=0B2p8AuezuBCcZDc2MWY0M2EtNjBiNC00MWRmLWEzZT…
- copyrights.txt describes the origins of the photo and fonts used
- XWikiForChristmas.xcf is the Gimp file of the poster
- XWikiForChristmas.jpg is the full resolution image (exported from Gimp)
For your information, AUTOFLOW was a software development tool that
automatically produced the flowchart of a computer program. In 1965,
AUTOFLOW became the first commercial software to be sold as an
independent product (not tied to a hardware device). It was a big
success for ADR, its publisher, with more than 10,000 units sold.
To conclude, I hope this encourages you, the XWiki community, to
develop smart features, to maintain a high level of quality and to
kindly support stupid users.
Long live XWiki and merry Christmas to you and your family.
Maxime Sinclair
Hi,
I'm testing XE 2.6 with XEM. My plugins/applications are
- xwiki-application-application-manager-1.16.xar
- xwiki-application-wiki-manager-1.22.xar
- xwiki-enterprise-manager-application-xem-2.6.xar
(the corresponding .jar files are in WEB-INF/lib)
Our environment is a JBoss 4.2.1 web server and an Oracle 10g database.
There is a problem with the Activity Stream on the homepages of the farm
members.
The "Activity Stream" stays empty and there are a number of identical lines in
the JBoss console log:
15:50:59,267 ERROR [DefaultVelocityEngine] Left side ($events.size()) of '>'
operation has null value at unknown namespace[line 648, column 21]
After disabling the "Right column" of the Dashboard page the error no longer
appears.
The error does not happen on the master wiki of the farm
(http://localhost/xwiki/bin/view/Main/Dashboard), only on the member wikis.
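My guess is that the activity stream panel does something along the lines of the sketch below and $events ends up null on the member wikis (this is only my mental model of the code, not the actual template):

  #if ($events.size() > 0)  ## the reported error would come from here when $events is null
    #foreach ($event in $events)
      ...
    #end
  #end

A null-safe check such as #if ("$!events" != "" && $events.size() > 0) would probably hide the error, but the real question is why $events is null on the member wikis in the first place.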
Do you have a hint where to look further?
Thanks,
Rudi
I tried to upload a number of files, with these results:
1 - file type .jpg, size 2 MB --> OK
2 - file type .mp3, size 11 MB --> failed (?)
2a - when I used the "Attach" button (at the bottom of the page, in the
attachments section) I received no feedback, and no attachment had been uploaded
2b - when I used the "Link" command in edit mode I received an "error"
message, but the file had been correctly attached (and is still there and
can be downloaded correctly)
3 - file type .mp3, size 70 MB --> failed (+ crash?)
I tried to upload this extra-large file in edit mode. Everything appeared to be
fine, but after 6 minutes (equivalent to 15-18 MB of upload at 0.5 Mb/s)
the LAN traffic suddenly stopped, while the upload popup remained
active and "busy". There was no error information at all. After 2 hours I
closed the popup and could continue editing the page normally. I saved. No
attachments were found.
Questions:
Are there limits on file size?
Are there limits on the total number of files?
Is it possible to show the limits in the popup (just as a reminder)?
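For what it's worth, I have read (but not verified) that the per-file limit comes from the upload_maxsize field on the XWiki.XWikiPreferences page, a value in bytes, e.g.:

  upload_maxsize = 33554432   (i.e. 32 MB)

Is that the right knob, and if so, could the upload popup display it so users know the limit up front?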
Many thanks
Giuseppe Fortini
Hello,
I've installed a wiki farm and observed a minor problem:
in the main wiki, when I click in the standard search textfield, the
text ("search...") vanishes. In my virtual wikis it doesn't.
This is the tag (absolutely identical):
<input class="globalsearchinput withTip" id="headerglobalsearchinput"
name="text" value="search..." size="15" type="text">
The text field from the panel is working fine (everywhere).
Best wishes
Werner
Dear all,
I have recently installed Xwiki Enterprise 2.6 on our server (Ubuntu 10), deployed in Tomcat6. For this server we use two domains: domain1.example.com and domain2.example.com. By default Tomcat (and the XWiki) can be accessed via http://domain1.example.com:8080/xwiki/, but for security and easy-access we need the xwiki to be accessed from http://domain2.example.com/xwiki.
This is why Tomcat is fronted by Apache2 using the AJP connector via a proxy pass which is configured in Apache sites-enabled:
<VirtualHost *:80>
ServerAdmin root@example.com
DocumentRoot /var/www
ServerName domain2.example.com
<Proxy *>
AddDefaultCharset Off
Order deny,allow
Allow from all
</Proxy>
ProxyPass /xwiki ajp://localhost:8009/xwiki/
ProxyPassReverse /xwiki ajp://localhost:8009/xwiki/
</VirtualHost>
This configuration works and the wiki can be accessed from both domain1 and domain2, however there are some problems. The first thing I noticed is that the same wiki has a different skin on each domain (no XAR loaded, no user logged in). Second, whereas I can access the login page at http://domain1.example.com:8080/xwiki/bin/login/XWiki/XWikiLogin, trying to access the login page at http://domain2.example.com/xwiki/bin/login/XWiki/XWikiLogin fails with an infinite HTTP redirect (it keeps redirecting to the same login page, HTTP/1.1 302 Moved Temporarily).
My questions are: is this type of configuration supported? If so, why does XWiki redirect the login page on domain2 back to itself, while it does not on domain1? Thank you for the help.
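One thing I have not tried yet is telling Apache to preserve the original Host header, roughly like this (untested, same vhost as above):

<VirtualHost *:80>
    ServerName domain2.example.com
    ProxyPreserveHost On
    ProxyPass /xwiki ajp://localhost:8009/xwiki/
    ProxyPassReverse /xwiki ajp://localhost:8009/xwiki/
</VirtualHost>

Could the redirect loop be caused by XWiki seeing the backend host name instead of domain2 and building the login URL from that?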
Best,
Mark
Hello
I have installed XWord without any problem on my Windows XP & Word 2007
system; however, I'm encountering a strange problem. If I create a page in an
existing space from XWord, it gets the name SpaceName.PageName as normal,
but the page seems to have no parent, almost as if it were another
WebHome. Rather than the breadcrumbs being SpaceName >>
PageName, they just show the PageName (again, much like a normal WebHome page).
This is a real problem, because my menu system automatically lists the pages
within each space using a "#foreach($item in $xwiki.searchDocuments($sql))"
loop, and the pages created in XWord don't appear in it at all.
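To make this concrete, the menu code is roughly along these lines (simplified; the space name is just an example):

  #set ($sql = "where doc.parent = 'MySpace.WebHome' order by doc.name")
  #foreach ($item in $xwiki.searchDocuments($sql))
    * $item
  #end

so my guess is that pages saved from XWord never show up because their parent field stays empty.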
Also, a page originally created in XWiki suffers the same problem if I
edit it and publish it from XWord. If I go to the Information tab
that lists all child pages, only pages created in XWiki appear; any that
have been edited or created in XWord do not. The only way I can find them is
through a Lucene search.
Is it supposed to function like this? Any help on this would be really
appreciated, I hope I can get this functioning correctly.
Thanks
Lockie
Hi Rudi, Hi Vincent,
Many thanks!
I removed xwiki-plugin-collection-1.1-xwiki20.jar, and now I can work!
Best regards
Boris
> Message: 6
> Date: Thu, 2 Dec 2010 10:12:15 +0100
> From: Vincent Massol
> Subject: Re: [xwiki-users] (no subject)
> To: XWiki Users
>
> Hi Rudi,
>
> On Dec 1, 2010, at 3:57 PM, Tronicek wrote:
>
> >
> > Hi Boris, Hi Vincent,
> >
> > i realized a problem in MultiPageExportApplication. It requires the
> > xwiki-collection plugin to be installed.
> > The problem is that xwiki-plugin-collection-1.1-xwiki20.jar has a reference
> > to the Class org.xwiki.rendering.listener.Attachment
> >
> > After removing the plugin from xwiki.cfg the ClassNotFoundException was
> > gone. So remove:
> > com.xpn.xwiki.plugin.collection.CollectionPlugin, \
> >
> > I've been missing a JIRA category for this. Vincent, can you report this
> > error, please?
>
> Well right now this collection plugin is not developed nor supported by the xwiki dev team. It's not part of any release and doesn't have any jira project. It's been contributed by an individual (Ludovic Dubost). It's not even documented on code.xwiki.org...
>
> It's only available here:
> http://svn.xwiki.org/svnroot/xwiki/contrib/sandbox/xwiki-plugin-collection/
>
> Since it's used by the multipage export app it should be at least:
> 1) promoted to contrib/projects
> 2) documented on code.xwiki.org
>
> Any taker for that?
>
> Then we need to review it and decide if that's something we want in the platform (in which case we'll need to rewrite it as a component IMO) and if so take ownership of it.
>
> So in the meantime, you could request access to the contrib sandbox and fix it, see:
> http://contrib.xwiki.org/xwiki/bin/view/Main/WebHome
> (note: we have a skin issue I've just noticed on contrib ;)).
>
> Thanks
> -Vincent
>