Hi devs,
Since I started working on XWiki, I have tried to make it as
standards-compliant as possible, but I haven't seen the same tendency
from the rest of the team.
The question is, do we want XWiki to be an "almost web 2.0 wannabe"
project, or "the most 2.0" web application ever?
At the dawn of the web, people cared only about the end result, how the
page looked viewed in Netscape or Internet Explorer (sometimes "and"
instead of "or"). But as the web gathered more attention, and people
started to see it as the next big thing, a platform with a lot of
potential, other concerns were raised, such as how the underlying and
generated code looks, how efficient the page is, how "zen" the
interface and the code are... All these are part of Web 2.0. Most
people tend to think of it as "the social web", but others also think
of it as "the semantic web", "the web of trust", and even "the web of
accessibility" (the ubiquitous web). This, however, comes at a price:
that of high quality.
Just as there are software design patterns that make the difference
between ugly code and good code, there are also Web design patterns,
and Web bad practices. While most sites are still ugly, even though
they provide ultra-cool features and are very popular (like MySpace),
there is an increasing number of sites that are much appreciated by
developers, like CSS Zen Garden. I never use a poorly written site, no
matter how "cool" it is, unless I have to. Sites like A List Apart,
PPK, WASP and others keep propagating good practices that every site
should follow: separating the content from style and behavior; using
semantic markup for the content, not meaningless divs and spans (or
worse, tables); dropping deprecated JS practices like browser
detection, document.all and document.write; keeping CSS hacks to a
minimum; having a site that looks good and works well without CSS or
javascript; and many others. The final goal is to have sites that are
efficient, clean, accessible and browser-independent, but that in a
full-featured browsing mode provide the same (or better) browsing
experience as any "only looks matter" page.
I can give many examples of sites which are so bad that I get angry
whenever I have to use them (most of them are sites which people must
use, like SNCF, online banking, e-government). They are all colorful and
flashy and shiny and rounded, and I guess mediocre IE users are really
happy with them, but if I don't use IE and I keep my hand on the
keyboard rather than on the mouse, they stop working as expected. The
web is a large platform, with different browsers (agents, to be more
generic) and different users, and each combination is different. This is
why a site must work in any environment, and it must not make
assumptions about the user ("the user is doing what I am doing, so if it
works for me, it works for anybody"). Doing otherwise is
discrimination, as such a site denies access to many categories of users.
In platform/XE we're following some of these guidelines, but that's not
the case with other products.
Goals: an XWiki product should do its best to ensure that:
- works without javascript
- works without css
- works without images
- works with any input method (I don't usually click the submit button,
but press Enter, so doing something only on "onclick" is bad; see the
sketch after this list)
- changing the text size doesn't break the layout
- a crawler does not change the content (GET does not alter the
database, except maybe statistics)
- works from a text browser (links)
- works well when read by a voice browser or on a braille display
- looks good on different displays, ranging from old 640x480 desktops to
large 4000x3000 displays, 600x800 handhelds and smaller phone displays,
1024x1280 portrait displays, and also when printed
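For the input method goal, here is a sketch of what I mean; the
'searchForm' and 'searchInput' ids are made up, and it assumes the
Prototype library we already ship:

  // Attach to the form's "submit" event, not the button's "onclick",
  // so pressing Enter in a text field runs the same code path as
  // clicking the button.
  Event.observe(window, 'load', function() {
    var form = $('searchForm');
    if (!form) return;
    Event.observe(form, 'submit', function(event) {
      // don't submit empty queries
      if ($F('searchInput').strip() == '') {
        Event.stop(event);
      }
    });
  });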
An old and false assumption is that a site that respects these goals
must look like a plain HTML file, just good enough to display some text,
without any interactivity or images... No, the idea is to ensure that the
site looks good enough without any browser feature, but when such a
feature is available (like support for javascript or images) these
features enhance the content already available with a new look, or a new
behavior. Of course, it is impossible to ensure that everything that is
available with javascript/flash is also available without it, but this
is limited to very graphical/interactive features, like drawing, games,
or animated content, which is usually done using flash.
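As a sketch of such an enhancement (the 'recentChanges' and
'panelContent' ids are made up, and again this assumes Prototype):

  // Without javascript the link is simply followed; with it, the same
  // resource is fetched over GET and displayed in place.
  Event.observe(window, 'load', function() {
    var link = $('recentChanges');
    if (!link) return;
    Event.observe(link, 'click', function(event) {
      new Ajax.Updater('panelContent', link.href, {method: 'get'});
      // cancel the navigation only when the Ajax enhancement is active
      Event.stop(event);
    });
  });

The page stays fully functional for crawlers, text browsers and
keyboard users, and the script only adds comfort on top.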
BAD practices: each developer should try to avoid doing any of these:
- using style attributes
- putting CSS code inside a page
- putting javascript inside a page (both in a <script> tag and in
onclick attributes; see the example after this list)
- putting a lot of divs and spans just for formatting
- using meaningless classnames or ids on HTML elements
- using invalid markup or css
- copy/pasting some snippet from a web page without checking/cleaning
the code; most of these snippets were made in the previous millennium,
and use really bad and deprecated practices, and they don't work (well
enough) with modern browsers or when combined with other snippets and
new frameworks
- writing convoluted css selectors
While it is acceptable to do some of these locally, in the first phase
of writing a feature, that is just for the PoC; the code should be
cleaned up before committing.
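As an example of the cleanup I'm asking for (the href, class name and
handler are made up):

  <!-- before: style and behavior mixed into the content -->
  <a href="#" style="color: red"
     onclick="confirmDelete(); return false;">Delete</a>

  <!-- after: plain, valid markup; the "delete" class is styled from
       the skin's stylesheet, and the click handler is attached from an
       external script on load -->
  <a href="confirmDelete.html" class="delete">Delete</a>

Note that the second link also keeps working without javascript, since
it points to a real page.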
GOOD practices: we should try to have this kind of code in the future:
- putting the behavior in an external javascript, and attaching it to
the target elements on onload or using XBL
- using "alt" for images
- keeping the content markup simple and enhancing the content with
needed divs/spans from javascript or using XBL
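For the last point, something like this keeps the stored markup simple
while still giving the skin the hooks it needs; 'xwikicontent' stands
for the id of the content area (adjust to whatever the skin actually
uses), and the 'imageFrame' class is made up:

  // Add a purely presentational wrapper around content images from
  // script, instead of storing extra spans in the wiki documents.
  Event.observe(window, 'load', function() {
    $$('#xwikicontent img').each(function(img) {
      var wrapper = document.createElement('span');
      wrapper.className = 'imageFrame';
      img.parentNode.replaceChild(wrapper, img);
      wrapper.appendChild(img);
    });
  });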
So, what does this have to do with the title of this message? Well, I
already started simplifying the HTML generated by the wiki markup, as
proposed in another mail.
I don't like any of the current skins, as they were all done quickly, on
top of the previous one, either preserving the flaws or covering them by
adding more rules and making the skin heavier. Unfortunately, the last
one, Toucan, although it looks very nice and classy, is a nightmare to
modify (and sometimes even to view, in Firefox for Ubuntu users, for
instance, where it freezes the browser). I think everybody would be
happier with a cleaner skin, not a more convoluted one. The way it is
now, it is very fragile: lots of bugs have been discovered since the
first version, most of them caused by improper CSS rules. Fixing
something sometimes breaks something else, as happens in the current
WYSIWYG editor, and we all agree that the editor is bad. And instead of
reducing the size of the stylesheets, so that it would be easier for
other people to change the skin, we now have a huge file where users
easily get lost. Also, we had to disable the CSS validation because of
some invalid CSS rules that cannot be removed, since they are at the
core of the skin's IE compatibility. That could have been ensured in a
different way, as Albatross shows. Relying on browser hacks is very bad.
So, what can I say about XE:
- The markup is mostly OK, with some minor problems
- The JS needs to be cleaned, as we're using some old snippets that are
very poorly written; all the code should be upgraded to use prototype
- The CSS used to be of medium quality in Albatross, but now is of poor
quality in Toucan
- It works pretty well without javascript, css or images (some problems
with white text on a white background when images are off) and works
well from a text browser
I know that it is time consuming to ensure the quality of the features,
and clients need features more than quality. Still, are we willing to
have products of poor quality? Vincent has been trying to improve the
quality of the java code, and it annoys him when people do nothing to
improve this quality: not committing tests, breaking the checkstyle, or
introducing regressions. Well, I feel the same when people commit
invalid markup, forms with "onclick" submission, invalid CSS,
deprecated javascript... Unfortunately we don't have a checkstyle tool
for the interface that would automatically break the build when people
make bad commits. We have the validation checks (and one had to be
disabled), and that's the best we can do for the moment. Maybe later
we'll be able to write a test that checks whether there is any inline
javascript or css, or the complexity of the HTML markup (like a maximum
depth, forbidding too many nested divs).
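Such a test could be as simple as walking the rendered DOM; a rough
sketch (the limits and messages are arbitrary, and it could run from a
functional test that loads each page in a real browser):

  function checkMarkupQuality(root, maxDepth) {
    var problems = [];
    function walk(node, depth) {
      if (node.nodeType != 1) return; // elements only
      if (depth > maxDepth) {
        problems.push('too deeply nested: ' + node.tagName);
      }
      if (node.getAttribute('style')) {
        problems.push('inline style on ' + node.tagName);
      }
      if ((node.tagName == 'SCRIPT' && !node.getAttribute('src'))
          || node.getAttribute('onclick')) {
        problems.push('inline script on ' + node.tagName);
      }
      for (var i = 0; i < node.childNodes.length; ++i) {
        walk(node.childNodes[i], depth + 1);
      }
    }
    walk(root, 0);
    return problems; // a non-empty list would fail the build
  }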
I'm sorry if this message hurts somebody; I'm not trying to blame
people, just sounding the alarm because we're going in the wrong
direction, and we're going deeper and deeper. We should hire people to
work on QA for the web part, but this kind of skill is hard to find,
and with so much manpower needed for the core and for adding new
features, we might not have resources to invest in Web QA.
It is hard to introduce this quality requirement into existing large
projects, but it would be better to start new projects with a clean web
part in mind.
So, what else can we do to improve the quality of the web part? Any ideas?
--
Sergiu Dumitriu
http://purl.org/net/sergiu/