W3C members' sites put to the test.
February 25, 2003 11:16 AM
State of Validation 2003. Of the 430 W3C members, only 28 (6.5%) have sites that validate with the W3C validator as either HTML or XHTML! This represents a 75.7% increase in standards compliance over last year's tests. [via the big orange Z]
Compliance is an all-or-nothing thing too... my site is mostly compliant, but the validator doesn't like the fact that I have JavaScript in the body of my documents, even though the JS works fine on every browser released since 1996. So because of one* deviation, I'm not "compliant". Whatever.
*Except on my index page - on that there's a whole slew of complaints because I've linked to a .swf hosted on another site.
posted by eustacescrubb at 11:47 AM on February 25, 2003
At work, my Netscape 4 userbase is high (about 4%) and once it gets below 2% I no longer have to support it. My main site validates, but looks like crap in Netscape 4, so I'm in the middle of a project to add a dual purpose page to be both the Netscape 4 and Print versions of the site.
It sucks, and it's really the only thing that kept me from ditching tables when we were in the design phase of our relaunch. But as long as we receive a few thousand hits per month from those older browsers, I have to support them.
posted by mkelley at 11:47 AM on February 25, 2003
Rule number one of website development on the enterprise level:
Make it accessible to as many users as possible.
Rule number two:
Make it as standards compliant as possible.
It's a good sign, I guess, but anyone trying to make a site validate is in for a world of hurt.
I dunno though, there have been a lot of issues in terms of using the current XHTML specs from a usability perspective. It's made validation virtually a joke unless they redo a lot of the XHTML 2 recommendation.
posted by bitdamaged at 11:48 AM on February 25, 2003
There is no excuse for not bothering to include a doctype on a web page. That has nothing to do with Netscape 4 compliance; it's sheer laziness.
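For reference, a doctype is just a single line at the top of the document. Either of these era-appropriate declarations would satisfy that minimum (a sketch, not an endorsement of one over the other — pick the DTD your markup actually conforms to):

```html
<!-- HTML 4.01 Transitional -->
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
    "http://www.w3.org/TR/html4/loose.dtd">

<!-- or XHTML 1.0 Strict -->
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
```

Without one of these, the validator can't even pick which rules to check against.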
It *is* difficult to create a compliant web site, but I doubt many of the W3C members have much interest in standards. They are buying the ability to influence technologies and standards endorsed by the W3C, likely for the benefit of their own company.
posted by xyzzy at 11:58 AM on February 25, 2003
Speaking from minimal web design experience (mostly run-of-the-mill HTML, some PHP, a little JavaScript), I'm curious as to why all browsers interpret code differently. I mean, I understand that different browsers are unique in many ways. But when it comes to displaying code, shouldn't they all be the same? Why should designers have to double or triple the amount of work involved just to cater to these different browsers, when in my opinion they should all display a simple table exactly the same? To me, the only things that should differentiate one browser from the next are the features and the interface.
posted by Witty at 1:03 PM on February 25, 2003
You've got things a bit backwards: the code (HTML, CSS, JavaScript, etc.) is not (really) there before the browsers; the browser developers have created the code to add functionality to the browser.
This is especially true of JavaScript. While it was initially developed by a Netscape/Sun venture to enhance the Netscape browser, Microsoft glommed onto it and developed it simultaneously with Netscape. Since JavaScript didn't exist per se when they first built it into the browser, it was up to each browser developer to decide how to implement it.
Ergo the need for standards before integration.
posted by bitdamaged at 1:14 PM on February 25, 2003
Part of the problem is that different browser makers interpret standards differently. Hence browsers with broken box-model implementations and so on. Also, the standards are not always super specific as to how something *should* appear, down to pixels or text styles. Also, there is no reason to display the same table the same way on your PDA, your cell-phone, your text-to-speech browser for the blind, your TV, and so on. Display *should* be manipulated for different media and audiences.
But display is only *part* of a bigger problem. A site may validate as XHTML 1.1 and have perfect CSS, but are still not properly marked up. For instance, using paragraphs to display lists of links (as opposed to unordered lists.) I am guilty of this sort of thing on my own site.
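To illustrate the lists-of-links point — both of these fragments validate, but only the second says what the content actually is (the URLs are made up):

```html
<!-- presentational: valid markup, but carries no structure -->
<p><a href="/one">One</a><br>
<a href="/two">Two</a></p>

<!-- structural: an actual list of links -->
<ul>
  <li><a href="/one">One</a></li>
  <li><a href="/two">Two</a></li>
</ul>
```

A screen reader or text browser can announce and skip the second as a list; the first is just a paragraph with line breaks.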
I am not a standards freak, really, but it does irritate me that so many companies who influence web standards won't even make a half-assed attempt to implement them.
posted by xyzzy at 1:15 PM on February 25, 2003
one problem is backwards compatibility and robustness - that means implementing functionality not in the standards, which may imply compromises in the compliant parts. another problem is that the source specifies content, but not precise layout. optimising layout isn't exact (and depends on silly things like details of fonts), so different browsers that both perfectly implement the specs might still produce different results. finally, i suspect the specs can be ambiguous at times - i bet it's not specified whether, when a div is rendered on the screen, it must physically include all components that lie within the bracketing elements (this was a significant problem with my home pages until last weekend, when i finally ran tidy on them to get xhtml and then used xsl to rearrange the structure into something ie and mozilla would render in the same way...).
[preview - and what xyzzy said]
posted by andrew cooke at 1:20 PM on February 25, 2003
Speaking as someone occasionally involved in browser development, these are all reasons that HTML isn't perfectly portable.
As andrew_cooke and xyzzy say: the standards are often ambiguous about how a certain construct will behave. Sometimes this is intentional; HTML was originally intended as a semantic markup language, but lots of people try to use it to get pixel-perfect rendering of a design they have. So they rely on one browser's particular choices, and often rely on that browser's bugs in order to get the display they want. From then on each new browser has a bunch of undocumented bugs it has to emulate in order to be perceived as "compatible". Standards efforts are supposed to mitigate this, but when it comes down to it, users care more about whether ESPN displays correctly than whether the browser is standards-compliant. This drives browser authors nuts.
bitdamaged: Doing an implementation before writing the standards can lead to divergent implementations. But writing the standards before anyone implements them tends to lead to awful, bloated standards that have to be broken in order to do anything useful. The best systems come about through an iterative process of specification, implementation, respecification, reimplementation, repeat until done. But that's a lot of time-consuming work that nobody was willing to do during the dot-com boom. (And usually isn't willing to do, ever.)
posted by hattifattener at 1:36 PM on February 25, 2003
In my experience, automatic validators and accessibility checkers such as Bobby are not very useful - they follow dumb rules and don't really have any way of gauging how a page will be presented in a screen reader or text-only browser. Bobby even raises questions about its own home page.
And too many web authors put web validation icons on their pages without actually checking them. I've tried to make my pages as accessible as possible and have found Dive Into Accessibility to be very useful, unlike most of the alternatives.
posted by jamespake at 1:37 PM on February 25, 2003
Being standards compliant is really easy for most sites... if you're willing to cast aside years of bodges, start again with simple, valid mark-up, and focus on styling it with CSS. Netscape 4 support is no excuse; those people can just get a plainer look applied to exactly the same pages.
The Web Standards hype may seem over the top at times, but I know from experience it's well worth putting in the effort; you soon find yourself churning out less mark-up yet producing better sites.
posted by malevolent at 1:44 PM on February 25, 2003
I see, I see (thanks for not making me feel stupid... plenty of other threads for that). Like I said, I'm coming from a very elementary view on this situation. I just found it very frustrating and deflating to see anything I designed while using, say {ehem} IE6, look soooo different (and often horrible) in Opera or Mozilla. I always asked myself, "why can't my table border of one damn pixel look the same in ALL of these browsers?"
I can't imagine trying to be compatible for PDAs, browsers for the blind, etc. But y'all go ahead and carry on about your fancy web design talk. I uhhh... need to go find something about Kevin Bacon er sumthin'... Thanks though! :)
posted by Witty at 1:46 PM on February 25, 2003
Supporting Netscape 4 does not automatically mean you're stuck with invalid code. It does mean extra work that, in my opinion, is not worth it considering most sites' small percentage of Netscape 4 visitors, but it can be done. Real World Style is a good reference for using standards with Netscape 4, including 2 column and 3 column CSS layouts.
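The most common trick of this sort (a sketch — the filenames here are hypothetical) exploits the fact that Netscape 4 ignores @import: you link a plain stylesheet it can handle, and hide the layout rules from it behind @import.

```html
<!-- Netscape 4 applies this basic stylesheet -->
<link rel="stylesheet" type="text/css" href="basic.css">

<style type="text/css">
  /* Netscape 4 doesn't understand @import, so it never
     fetches layout.css; CSS-capable browsers get the full design */
  @import url(layout.css);
</style>
```

Same valid markup for everyone; Netscape 4 users just get the plainer presentation.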
posted by kirkaracha at 2:08 PM on February 25, 2003
This represents an increase in standards compliance
Um, what "standards" are we talking about? The W3C is not a standards body; they don't produce standards, they produce recommendations.
They are a vendor consortium, with most members interested in steering technology in a profitable direction. I mean, look at W3C XML Schema. You can't convince me that it wasn't created solely so that vendors with big budgets can create and sell the complex tools required to work with it.
There are some smart, dedicated folks in the W3C, but not everything that comes out deserves to be followed.
(In fact, I think the reason they don't produce standards is because somebody, early on, maybe TimBL, realized that the recs should be tested out in real-life for a few years to see if they hold up. Then ISO or somebody can declare them standards.)
posted by Ayn Marx at 2:13 PM on February 25, 2003
One example of how the specs aren't quite as detailed about how things should work as they could be: Eric Meyer admits he's not quite sure how his own page should look.
Why? The CSS spec simply doesn't address the particular situation he's created with his markup, and so the browser vendors have interpreted the information differently.
posted by Su at 2:23 PM on February 25, 2003
I've been trying to get the numerous sites I manage to use XHTML Strict. It saves lots of trouble when somebody complains about how some page looks in Browser X. I just tell people to go get a reasonably current browser. Hey, they're free!
Now, to be honest, I don't exploit every cool-boss-wow feature of XHTML and CSS, but it isn't all that hard to create W3C-compliant sites.
posted by Ayn Marx at 2:33 PM on February 25, 2003
Seriously, this is a bit unfair. If you have to support Netscape 4 (which, honestly, many of these businesses have to -- or had to -- do), and you want standards compliance, you have to create two different sets of site templates.
That's not true. If your markup is created in conformance with what we call web standards (yes, they're not formally standards, but recommendations) your site will render just fine in Netscape 4. In fact, it's almost only non-standard extensions to HTML that render improperly, even in that mess of a browser.
posted by anildash at 3:43 PM on February 25, 2003
To go even further, anildash, I would say that at MOST, a site designer might have to use more than one stylesheet (to serve to Netscape 4.x users) in order to tame an XHTML 1.1-compliant site into a reasonable display in Netscape 4.x. The markup can be exactly the same for all browsers.
I personally don't even look at my sites in Netscape 4.
posted by xyzzy at 4:02 PM on February 25, 2003
Seriously, this is a bit unfair. If you have to support Netscape 4 (which, honestly, many of these businesses have to -- or had to -- do), and you want standards compliance, you have to create two different sets of site templates.
Everyone knows that Netscape 4 is doomed, but you're exaggerating the effort required to support standards and Netscape 4 at the same time. Using tables for layout is bad practice, but it may not break standards (as the previous W3C homepage design demonstrated). As Netscape 4 doesn't understand @import, it's incredibly easy not to break N4.
It's easy - even popular - now to criticise Netscape 4 for its lack of CSS layouts. CSS layouts (at the time, the new name for CSS-P) were finalised with CSS2 in May 1998, and Netscape 4 was out half a year before that. Like IE4, Netscape 4 implemented CSS from standards in progress, leading to other bugs such as font-size: xx-small. Netscape 4 was a browser of the era, and IE4 was only a smidgen smarter. The only fault of Netscape 4 is that it's still hanging around, but blame admins and users - not the software.
It seems unfair that with this survey people are concentrating on the minority of sites that are non-compliant because of proprietary extensions, and not the majority of non-compliance caused by programming mistakes and slackness.
I've yet to see any W3C member sites fretting over whether Netscape 4 can render 2 column layouts. Most of them are barely one column.
(yes yes, they're not "standards", and it is actually "GNU/Linux", I'll try to do better next time)
(I don't even give Netscape 4 the time of day. I kick it in the nuts and set it on fire. I hate like you hate, love me!)
posted by holloway at 4:40 PM on February 25, 2003
I figured I'd chime in, even though most of my position has already been touched upon.
First off, I have never personally shamed anyone for their site not validating, nor praised anyone for their site passing the validator. I also don't remember running any page through the validator that wasn't mine or that of a friend who was asking me for help. This isn't for the reasons jamespake mentioned, because I do find the validators extremely useful and accurate (once you learn to read their output). Instead, it is because I find validation only one small part of building a website that follows the intent of W3C standards and results in a client-independent & accessible site.
Validation is a check to make sure that a given document can be parsed by a 3rd-party application, based on the document's conformance to a given DTD (or other rules, in the case of CSS). And in my mind that is all it is.
Using parseable markup is only a piece of the pie.
The other piece of the puzzle is using your markup language of choice (HTML 3.2, 4.01, XHTML, XML) to create semantically meaningful documents.
One can write a document based on an 8x-nested table structure that carries no meaning but passes the validator and presents itself well in browser X. One can write a "table-less CSS layout" document with nothing but DIV and IMG elements that carries no meaning but passes through a validator and presents itself well in browser X. One can also write a well-structured document that carries very much meaning but doesn't pass the validator because of a need to look good in browser X. Lastly, it is possible to write a well-structured document that passes through a validator, and it's only in accomplishing this feat that I would consider the page as adhering to the W3C recommendations. It's also possible to adhere to the W3C Recs and still look good in browser X. That is what we should all be striving for.
The difficulty in drawing any conclusion from a survey like the one Marko has done here is that, as riffola so accurately stated in his FPP, it is purely a state of validation amongst W3C member organizations. It will not represent the state of compliance with the Recs without further analysis of each site.
posted by 10sball at 5:20 PM on February 25, 2003
If designers would stop bending over backwards to support N4, maybe users would stop using the antediluvian piece of poop. Ya reckon?
posted by stavrosthewonderchicken at 1:52 AM on February 26, 2003
anildash:If your markup is created in conformance with what we call web standards (yes, they're not formally standards, but recommendations) your site will render just fine in Netscape 4
I totally agree, although Anil, you've got 85 errors on your own page that you might want to check out at some point :o)
posted by ralawrence at 1:57 AM on February 26, 2003
incidentally, those errors look like they're all caused by the & sign in urls, which, when they appear inside an xml document, need to be written as &amp;. if you change each url from http://foo.com?a=b&c=d to http://foo.com?a=b&amp;c=d you will be ok (or at least much better). this came up in another discussion here (on meta?) a while back. it's a nuisance because it means you can't just cut + paste urls (with parameters) from the browser location bar...
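Concretely (with a placeholder URL), the bare ampersand in the attribute value has to become a character entity in the page source:

```html
<!-- invalid in HTML/XHTML source: bare ampersand in an attribute value -->
<a href="http://foo.com/?a=b&c=d">link</a>

<!-- valid: the ampersand escaped as a character entity;
     the browser still requests ?a=b&c=d -->
<a href="http://foo.com/?a=b&amp;c=d">link</a>
```

The browser decodes the entity before making the request, so the link behaves identically either way.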
writing this post would have been much easier if there was an initial "post" button as well as a "preview", or if the system preserved character entities on preview
posted by andrew cooke at 2:08 AM on February 26, 2003
10sball, thanks for your commentary on meaningful as well as valid markup. I swear I make that speak at least once a week to some content contributor to the site I manage. IMO, meaningful markup is the most important piece of the puzzle. Tho getting them to follow vaguely valid markup is also a never-ending chore.
posted by Fezboy! at 9:51 AM on February 26, 2003
errrr make that speech. What's the use of a Preview button if I only treat it as an extra step to posting? Someday I'll learn.....
posted by Fezboy! at 9:53 AM on February 26, 2003
I know exactly what the problem is, but thanks for checking. I'd have to escape an ampersand in the URL of nearly every link I post on my sidebar, which is usually over a hundred a week. If I took the time to do all that, I wouldn't have time to find all the links, unfortunately.
What I plan to do is have software escape the ampersands for me. But I am lazy. Anyone want to write the Movable Type template for me?
posted by anildash at 10:22 AM on February 26, 2003
Seriously, this is a bit unfair. If you have to support Netscape 4 (which, honestly, many of these businesses have to -- or had to -- do), and you want standards compliance, you have to create two different sets of site templates.
Or, you could create one set that works on all browsers, but isn't necessarily compliant with HTML 4 or XHTML. Can't really blame them for doing that.
But, signs are good, eventually they'll get to standard compliance -- and, I'm betting, just in time for the next set of standards to be released :).
posted by malphigian at 11:25 AM on February 25, 2003