On 9/26/08, Paul Libbrecht <paul(a)activemath.org> wrote:
On 26 Sept 2008 at 15:28, Bey Youcef wrote:
Note that in the administration settings you can also define XWiki
documents that act as resource bundles; they will override the strings below.
Somewhere close to these settings you can also define the supported
languages (and whether the wiki is multilingual!).
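(If I remember correctly, these live as fields on the XWiki.XWikiPreferences
object; roughly, with the field names quoted from memory, so double-check
them in the administration UI:

  multilingual = 1
  languages = en,fr,ar
  default_language = en
  documentBundles = XWiki.SomeTranslations

where documentBundles would list the wiki documents to use as extra resource
bundles, and XWiki.SomeTranslations is just a placeholder name.)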
Thanks so much.
Translating all the strings is time-consuming, so the quickest way to get an
idea of the Arabic localization is MT (machine translation). MT can be used
for experimentation.
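To make this concrete, here is a minimal sketch of the kind of script I have
in mind. The translate() stub stands for whatever MT service one plugs in (it
is hypothetical); the file names follow XWiki's ApplicationResources naming,
and the script is simplified (it ignores multi-line values):

  # Sketch: machine-translate an XWiki resource bundle to get a first draft.
  def translate(text, target="ar"):
      """Stand-in for a real MT call (e.g. an MT web service); hypothetical."""
      return text  # replace with an actual MT client

  def mt_draft(src_path, dst_path):
      with open(src_path, encoding="latin-1") as src, \
           open(dst_path, "w", encoding="ascii") as dst:
          for line in src:
              line = line.rstrip("\n")
              # Pass comments and blank lines through untouched.
              if not line or line.lstrip().startswith("#") or "=" not in line:
                  dst.write(line + "\n")
                  continue
              key, value = line.split("=", 1)
              draft = translate(value.strip())
              # .properties files are Latin-1, so escape Arabic as \uXXXX.
              escaped = "".join(c if ord(c) < 128 else "\\u%04x" % ord(c)
                                for c in draft)
              dst.write(key.strip() + "=" + escaped + "\n")

  mt_draft("ApplicationResources.properties", "ApplicationResources_ar.properties")

The output is only a draft, of course; a human editing pass remains necessary.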
Honestly, it's donkey work, but if you type fast it should not take you
more than 3-4 hours for XWiki, I think.
The Curriki translation took our translators a few days, and it is several
orders of magnitude bigger.
As we are starting to collaborate, I don't appreciate the word *donkey*!
No harm.
But it seems that you're not aware of the advantages of MT nowadays. It
can save more than 50% of the effort in some fields. Personally, I have
translated web sites into Arabic using Google MT. Let me say that it helped
save more than 50% of the translation time (especially for editing), and the
quality was quite acceptable! Statistical MT systems are becoming even more
powerful.
As a first step, I want to expose the technical problems that can appear
when shifting to Arabic, especially those related to RTL, CSS, and so on; a
small illustration follows. The translation as well as the revision can be
done efficiently by occasional translators (e.g. me and other community
members). I'm not worried about this issue.
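One sketch of what I mean by RTL/CSS problems: every direction-sensitive CSS
rule in the skin (float: left, margin-left, text-align: left, ...) has to be
mirrored for Arabic. Tools exist for this kind of flipping (CSSJanus is one
real example); here is a deliberately naive illustration of the idea:

  import re

  SWAP = {"left": "right", "right": "left"}

  def flip_css(css):
      # Naively swap left/right in direction-sensitive rules; a real
      # flipper must also handle shorthands, background positions, etc.
      return re.sub(r"\b(left|right)\b", lambda m: SWAP[m.group(1)], css)

  print(flip_css("float: left; margin-left: 8px; text-align: left;"))
  # -> float: right; margin-right: 8px; text-align: right;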
How about the translation of the installation wiki pages? They are not in
the resources! For example, where can we find the strings of the main pages
when XWiki is installed for the first time?
The web pages are content; they are wiki-editable.
Once the wiki is multilingual, it always offers you the possibility to
translate a page at editing time; that's what you want to use.
Does this mean that the end-users have to translate these pages? I don't
think so!
Not all of them are translators or willing to translate.
All users? Just one... the translator...
The translation must be done in translation environments rather than in the
application itself.
Even though XWiki offers open editing, it is not a computer-aided translation
tool. Translating a wiki page from English into another language in the wiki
editor is not a good method: the source and the target document must be
displayed in the same interface. This idea goes back to Alan Melby, who
proposed the concept of translation memory (TM) around 1982, and it is well
known to professional translators.
Strings must be stored in separate resource files, which can then be
translated in dedicated environments like OmegaT, Pootle, poedit, etc.
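For example, XWiki's .properties bundles could be exported to gettext .po
with a small script like this one (a simplified sketch: it ignores multi-line
values, and the file names are just examples):

  # Sketch: convert a Java .properties bundle to a gettext .po file so it
  # can be translated in poedit, Pootle or OmegaT.
  def properties_to_po(src_path, po_path):
      with open(src_path, encoding="latin-1") as src, \
           open(po_path, "w", encoding="utf-8") as po:
          # Minimal PO header declaring the charset.
          po.write('msgid ""\nmsgstr "Content-Type: text/plain; charset=UTF-8\\n"\n\n')
          for line in src:
              line = line.strip()
              if not line or line.startswith("#") or "=" not in line:
                  continue
              key, value = (s.strip() for s in line.split("=", 1))
              if not value:
                  continue
              # Decode the \uXXXX escapes used in .properties files.
              value = value.encode("latin-1").decode("unicode_escape")
              value = value.replace('"', '\\"')  # naive PO quoting
              po.write('msgctxt "' + key + '"\n')   # keep the bundle key
              po.write('msgid "' + value + '"\n')   # English source string
              po.write('msgstr ""\n\n')             # filled by the translator

  properties_to_po("ApplicationResources.properties", "xwiki.po")

The msgctxt line keeps entries distinct even when two keys happen to share
the same English text.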
Mozilla is translated into 70 languages with high quality. The translation is
never done in Mozilla itself, especially for the documents. All the strings
are kept in ".po" files and similar formats. This is why the translation is
efficient!
Localization, once undertaken, must cover all the text in both the
documentation and the GUI.
It is well known that translating documentation takes a long time. It
generally gets delayed ;-))
Not exactly. It depends on the size of the documents. The translation of the
XWiki pages (not the documentation) can be done in an acceptable time. I'm
talking about the wiki pages, like the main page and some of the help pages.
The scenario is simple: the strings are stored in separate files; translators
retrieve these files and translate them in computer-aided translation
environments (e.g. OmegaT); once the translation is done, the files are
integrated back into XWiki. The last step is QA! (A sketch of these last two
steps follows.)
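The integration and QA steps could themselves be scripted; here is a sketch
using the polib library, assuming the .po was produced by the converter above
(the file names are examples, and the QA pass checks only one thing, that
Java {0}-style placeholders survive the translation):

  import re
  import polib  # a small PO-parsing library

  def esc(s):
      # Non-ASCII must be \uXXXX-escaped in .properties files.
      return "".join(c if ord(c) < 128 else "\\u%04x" % ord(c) for c in s)

  def po_to_properties(po_path, dst_path):
      # Merge translated entries back into an Arabic bundle; untranslated
      # entries are skipped so that XWiki falls back to the English strings.
      with open(dst_path, "w", encoding="ascii") as dst:
          for entry in polib.pofile(po_path):
              if entry.msgstr:
                  dst.write(entry.msgctxt + "=" + esc(entry.msgstr) + "\n")

  def qa_placeholders(po_path):
      # QA: MessageFormat placeholders like {0} must appear on both sides.
      pat = re.compile(r"\{\d+\}")
      for entry in polib.pofile(po_path):
          if entry.msgstr and sorted(pat.findall(entry.msgid)) != \
                              sorted(pat.findall(entry.msgstr)):
              print("placeholder mismatch in", entry.msgctxt)

  po_to_properties("xwiki_ar.po", "ApplicationResources_ar.properties")
  qa_placeholders("xwiki_ar.po")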
XWiki was not designed with localization in mind; that is why the resource
properties are used for the conversion. In fact, these resource files cannot
manage the format of a document, and they do not support metadata... This is
one among other reasons for which OASIS proposed the XLIFF standard.
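To show what I mean by metadata, here is a made-up XLIFF 1.2 fragment (the
id and the texts are invented): unlike a .properties line, each unit carries
the source language, the target language, a review state, and a note for the
translator:

  <xliff version="1.2" xmlns="urn:oasis:names:tc:xliff:document:1.2">
    <file original="ApplicationResources.properties"
          source-language="en" target-language="ar" datatype="plaintext">
      <body>
        <trans-unit id="core.welcome" approved="no">
          <source>Welcome to your wiki</source>
          <target state="needs-review-translation">مرحبا بكم في الويكي</target>
          <note>Shown on the front page.</note>
        </trans-unit>
      </body>
    </file>
  </xliff>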
Anyway, thanks for your help!
Youcef