Standards fail.
October 16, 2008 7:26 AM
Only 4.3% of the web validates. Opera have finished a scan and validation check of the net using their new MAMA spider and have got an extremely interesting dataset. Did you check your website today?
MetaFilter: "look at my kittens" sites
posted by DU at 7:36 AM on October 16, 2008 [2 favorites]
I'd like to add a major reason to his list of reasons so few pages validate: Confusion.
The HTML/XML/CSS standards have gotten big. Worse, compliance is no guarantee of correct behavior in actual browsers. When I've made pages in the past, I've worked hard to get compliant. But many times I've just thrown up my hands and eventually done Something That Seems To Work.
I'd like to see some stats on what the compliance rate is among the subset of standards that ARE implemented correctly across all browsers.
posted by DU at 7:40 AM on October 16, 2008 [3 favorites]
Simply adding any Google Analytics or sitemaps code to an otherwise valid website will make it fail validation, depending on your stated doctype. I haven't had time to read the complete report... Do they make any sort of allowance for this?
posted by Thorzdad at 7:41 AM on October 16, 2008
I wonder how many this little bugger accounts for: &
posted by mandal at 7:44 AM on October 16, 2008 [4 favorites]
obligatory
posted by East Manitoba Regional Junior Kabaddi Champion '94 at 7:56 AM on October 16, 2008 [2 favorites]
Either the whole point of having a "standard" is so that there is a baseline of functionality across the browsers or the whole point is to follow the rules Because They Are The Rules. Most of the people who build websites professionally can't pitch the second option to clients. There has to be a benefit to choosing the option that requires more work and has more restrictions on the final product -- so I find it's easy to pitch CSS-based layout/design, and most of the practices advocated by Zeldman and the like, but my clients don't care if a site validates -- they care that their customers can all see and use the site; they care about Google ranking; they care about functionality. If validation takes more time and effort and restricts what can be done design-or-function-wise but offers no benefit to the client's bottom line, then the client won't pay for it.
So, these results aren't surprising at all.
posted by eustacescrubb at 7:58 AM on October 16, 2008 [2 favorites]
One of my sites (PoliticalFilter.com) validates fine until links start getting added to it, then validation fails because the URLs constructed by other websites, which I have zero control over, aren't made to a certain standard. Yet the links work fine in that people can just click on the link and go to it. Who's wrong and why should I care if I can't control it and everything works fine anyway?
posted by Brandon Blatcher at 7:59 AM on October 16, 2008 [1 favorite]
Thanks, EMRJKC94. I shall take a pro-active stance, express my dissatisfaction, and no longer interact with any site that fail vali
posted by davemee at 8:05 AM on October 16, 2008 [2 favorites]
Dammit East Manitoba Regional Junior Kabaddi Champion '94, now there's recursive failure too.
posted by mandal at 8:07 AM on October 16, 2008
the URLs constructed by other websites, which I have zero control over, aren't made to a certain standard. Yet the links work fine in that people can just click on the link and go to it. Who's wrong and why should I care if I can't control it and everything works fine anyway?
You can control it, if you want to: it's not that hard to fix the broken URL:s on your side, using the same rules as web browsers use to parse broken URL:s. If you're not a good coder yourself, there are plenty of HTML normalization libraries that can do that for you.
Whether you should care or not is another issue.
posted by effbot at 8:11 AM on October 16, 2008
My web 2.1 (yes, I upgraded) site, shuttercal.com, is so terribly invalidated, I'm surprised that it works. I use lots of custom attributes to make the lightbox take in all sorts of metadata. Plus, I'm lazy. If it works in FF3 and IE7, why even bother trying to get validated? I have to do some amazing hacks for transparent PNGs in IE, and I'm completely lost trying to get z-index to work in IE to make a pop-up menu without more slow javascript. Front end web coding is a losing battle, I don't know how people do that as full time work. It PAINS me whenever I am forced to do it.
posted by Mach5 at 8:17 AM on October 16, 2008
*resolves to use BDO tag in future post*
posted by lukemeister at 8:21 AM on October 16, 2008
It's very tough to make a site that works on Internet Explorer, Firefox, and Safari. Validation is like ensuring your site works on a 4th browser that no one uses because it doesn't exist.
posted by specialfriend at 8:22 AM on October 16, 2008 [15 favorites]
Metafilter: 29 Errors, 57 warning(s)
posted by lukemeister at 8:23 AM on October 16, 2008 [1 favorite]
I don't understand all this "what does validation buy me" or "clients don't pay for standards" talk. Maybe the web design world doesn't work like the programming world, but here's why *I* use standards: So I can re-use components without worrying that it only worked in a particular context.
For instance, a few years ago I made a site that I used (briefly) to document some science projects. My format was pictures on the left and explanation on the right. It was easy to make that work on an entry-by-entry basis, but harder to make it work for all pictures and explanations. But eventually I got it and I could make a template. I spend a little (of my own) time up front and I save (the client's) time on the back end.
posted by DU at 8:24 AM on October 16, 2008
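For reference, a minimal sketch of that kind of pictures-left, explanation-right entry template (class names and file names hypothetical; floats were the standard approach at the time):
<div class="entry">
  <div class="pictures"><img src="apparatus.jpg" alt="The apparatus"></div>
  <div class="explanation"><p>What the pictures show, and why.</p></div>
</div>
/* entry.css */
.entry { overflow: hidden; }                  /* make the box contain its floats */
.entry .pictures { float: left; width: 40%; }
.entry .explanation { float: right; width: 58%; }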
there are plenty of HTML normalization libraries that can do that for you.
Links please? If it's a matter of plugging in code to rewrite links automatically, that at least seems viable. Is this something that's often done with websites? Is there a big server penalty in terms of cycles and processing?
posted by Brandon Blatcher at 8:26 AM on October 16, 2008
Uh, I will weigh in on this entire topic at greater length, but I'll throw in a small bit for the derail: some URLs cannot be fixed.
I have had situations in which I have properly escaped said URL only for that URL to no longer work. I first ran into this around 2003. Sometimes it is an ancient server on the other end. Sometimes it is some poorly-considered script on the other end which is thrown into confusion by a properly escaped URL. Either way, you cannot blanket fix these situations and must live with invalidating URLs if you want the actual part where someone clicks your link and gets results.
posted by adipocere at 8:36 AM on October 16, 2008
So I can re-use components without worrying that it only worked in a particular context.
Validation doesn't ensure that.
posted by smackfu at 8:37 AM on October 16, 2008 [3 favorites]
I still use URL Cleaner by Dan Benjamin over at hivelogic. I'm not sure if he is still offering it or not, I couldn't find it on his site.
posted by -t at 8:39 AM on October 16, 2008
Validation doesn't ensure that.
You are absolutely right. However:
1) I'm explaining why validation against a standard is a good idea in general. It's because having a standard means you can re-use things.
2) If my page doesn't work but it does validate, at least I know where the problem lies. Either a design error or a browser rendering error, not syntax.
posted by DU at 8:44 AM on October 16, 2008
I've never even tried to create a "professional" website; all of the pages I've written have been much less complicated than that. And yet, it still seems like there's always at least one element that isn't properly supported in some popular browser, forcing me to give up on the idea or use a clumsy hack or... give up.
And that's pages that do validate.
Until being standards compliant means that my pages probably won't break, there's not much incentive to make it a priority.
/n00b perspective
posted by Kutsuwamushi at 8:53 AM on October 16, 2008 [2 favorites]
Validation is way under "making it work" and "getting paid" I'm afraid.
posted by Artw at 9:00 AM on October 16, 2008
That said, I’m getting increasingly weirded out by the number of SEO buggers pushing me to do things that I’ve been pushing for years anyway.
posted by Artw at 9:02 AM on October 16, 2008
Only 4.3% validate? We are "shocked, shocked" as Captain Renault would say.
posted by Robert Angelo at 9:04 AM on October 16, 2008
If the leading browsers can't get together on a common set of standards...
OK yes web standards are good and important, and we follow them as much as possible, including accessibility, but as professionals we have a job to do.
eustacescrubb > [clients] care that their customers can all see and use the site; they care about Google ranking; they care about functionality. If validation takes more time and effort and restricts what can be done design-or-function-wise but offers no benefit to the client's bottom line, then the client won't pay for it.
Exactly.
Even so, it sometimes seems that we spend up to a third of development time trying to accommodate browser differences, or sort out browser-specific bugs.
posted by Artful Codger at 9:06 AM on October 16, 2008
You can (and should) use conditional comments to fraction off IE-specific markup.
Among other things, it allows you to write a global CSS document that every DOM-compatible browser (everything except MSIE) understands, and then write an addendum CSS document that only IE 5 (or 6 (or 7 (or combinations of them))) will see, load, and interpret. No more CSS hacks, no more invalid or bizarre markup that all browsers and validators have to endure for the sake of MSIE. There is a similar provision for Javascript.
These are two of the biggest favors Microsoft ever gave web developers, and hardly anybody knows about them.
posted by ardgedee at 9:12 AM on October 16, 2008 [2 favorites]
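For reference, the pattern ardgedee describes looks roughly like this (file names hypothetical): every browser loads the first stylesheet, and only IE 7 and below parse the comment and load the second.
<link rel="stylesheet" type="text/css" href="global.css">
<!--[if lte IE 7]>
<link rel="stylesheet" type="text/css" href="ie-fixes.css">
<![endif]-->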
effbot your English doesn't validate. The checker says:
URL:s wtf
posted by bonaldi at 9:13 AM on October 16, 2008
Sorry, the Javascript link is wrong. Go to Conditional Compilation of Javascript instead.
posted by ardgedee at 9:14 AM on October 16, 2008
You know who else never, ever wrote valid HTML, never put correct MIME types, never used smart-quotes properly, and NEVER WROTE bug-free javascript?
That's right. Hitler. Draw your own conclusions.
posted by blue_beetle at 9:27 AM on October 16, 2008 [2 favorites]
Validation is way under "making it work" and "getting paid" I'm afraid.
Yeah, this is unfortunately where I stand as well. Standards have done very good things for the Web as a whole, I think, if for no other reason than making browser designers start thinking about moving away from platform-specific implementations of features. I think that simply having standards out there will go a long way towards preventing another browser war a la Netscape/IE. That being said, I don't really care if my work is valid or standards-compliant per se as long as it works on FF2/FF3, IE, and Safari on both PC and Mac. I don't think I've ever put out a site that validates properly, but if it works for my audience I'm not going to lose any sleep.
posted by OverlappingElvis at 9:27 AM on October 16, 2008
Krinklyfig nails it in one.
Very few people care, and of those that do, many are generally just circle-jerking their own self-controlled websites. It's very easy to make anything and everything "just so" when you only need to satisfy an audience, QA department, account rep, client, and review board that's made up of you, and only you.
And, of course, this survey also doesn't factor in the ever-present "I am aware that this sub-part does not validate by that reference, and here is why we made that decision, and here is how it performs in the browsers this project cared about." criteria.
Validation and standards are not inherently bad. They're just overhyped and too often oversimplified down to a true/false measure... like this survey.
posted by rokusan at 9:40 AM on October 16, 2008 [1 favorite]
Maybe the web design world doesn't work like the programming world, but here's why *I* use standards
Standards work better in the programming world because they are almost always enforced better than they are in HTML. When I write code that has a minor syntax error, my compiler's parser throws a handy set of errors and/or warnings that tells me specifically what I'm doing wrong. A web browser, on the other hand, is specifically designed to always try to display a page, no matter how horribly mangled the content is. Even in the programming world, if standards aren't enforced in real life they tend to not be followed. SQL, for example, has a nice standard that database vendors ignore large parts of and add extra features on top of, which results in a lot of SQL code that doesn't follow the official standard.
I think HTML pages would be a lot more standards compliant if the standards and browsers had been created today instead of more than a decade ago. For a long time the mantra around implementing protocols and standards was Postel's Law: be conservative in what you do, be liberal in what you accept from others. The problem is that using a more "robust" parser will often result in very wrong output with little or no warning, and will not do much to deter people from giving it bad data, and people have recognized the impacts of these problems in established standards like HTML. The more recent XML standard is a case where the overall programming community has explicitly called for less robust parsers by writing parsers that only accept valid XML, in order to promote proper implementations. Obviously being able to handle errors gracefully is still important, but there is a lot less tolerance in the community as a whole for non-compliance to standards.
posted by burnmp3s at 9:47 AM on October 16, 2008 [1 favorite]
then validation fails because the URLs constructed by other websites, which I have zero control over, aren't made to a certain standard.
$link=str_replace('&','&amp;',$link); // escape bare ampersands
or
$link=htmlspecialchars($link); // escapes &, <, > and quotes
Or the equivalent in your language.
Ampersands in links (and any other attributes and most pieces of text) should always be escaped in this way, and this will not break the links in any browser I know of.
posted by cillit bang at 9:55 AM on October 16, 2008
My sites generally validate almost automatically because I write standards-based code as part of the production process. It's not an extra step, and it doesn't take extra time. I like having valid code because I feel that it helps reduce cross-browser differences. I don't usually validate every page as part of launching the site, but when I spot-check the sites usually validate.
posted by kirkaracha at 10:02 AM on October 16, 2008 [1 favorite]
Hang on, since politicalfilter accepts HTML from users rather than just links, you need something like HTML Purifier, which will process the HTML and fix the dodgy links, as well as filtering various scripting attacks and doing other code tidying.
posted by cillit bang at 10:02 AM on October 16, 2008
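For the curious, HTML Purifier's basic PHP usage is roughly this (path and variable names hypothetical):
require_once 'HTMLPurifier.auto.php';
$config = HTMLPurifier_Config::createDefault();
$purifier = new HTMLPurifier($config);
$clean_html = $purifier->purify($dirty_html); // fixes malformed markup, strips scripting attacks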
Also nowadays with everyone using javascript libraries, all your dependencies have to be valid too.
And all the ads you're including better be valid too.
posted by smackfu at 10:04 AM on October 16, 2008 [1 favorite]
I'm unconvinced that messing about with ampersands in links benefits anyone whatsoever.
posted by Artw at 10:08 AM on October 16, 2008
(It's exactly that kind of thing that makes working in XML or XSLT a pain in the ass)
posted by Artw at 10:28 AM on October 16, 2008
My blog, which has a valid template, fails in part due to a Flickr generated badge in my sidebar and an embedded YouTube video.
Standards compliance is going to come from the top down. YouTube belongs to Google. Flickr belongs to Yahoo. If the content they're putting out isn't valid, why are all the hobbyists and smaller companies (like Last.fm and AllConsuming, whose badges in my sidebar are also not valid for my transitional doctype) supposed to bother with compliance, again?
posted by Dreama at 10:30 AM on October 16, 2008
My personal sites validate. My job-related sites, however, don't—because my bosses want it to look just so to match the piss-poor graphic design job they paid some hack to work up for them.
posted by sonic meat machine at 10:52 AM on October 16, 2008
I'm unconvinced that messing about with ampersands in links benefits anyone whatsoever.
It's about ambiguity. The code:
<a href="http://example.com/query?q=2&quot234">
Should always mean the URL http://example.com/query?q=2&quot234, but since most HTML coders don't bother to escape their ampersands, the browser has to guess whether you might have meant http://example.com/query?q=2"234 instead*. And the logic to do that guesswork has to be incorporated into every single HTML interpreter.
Alternatively, everyone could follow the bloody standard in the first place and we wouldn't get into this mess. The horse has long since bolted for HTML, but XML has very strict rules that forbid any parser from doing any kind of guesswork, for exactly this reason.
(* this example might sound contrived or rare, but the way RSS fails to address it is a big part of the reason Atom exists and has been widely adopted)
posted by cillit bang at 11:00 AM on October 16, 2008
specialfriend wrote "It's very tough to make a site that works on Internet Explorer, Firefox, and Safari."
No, it isn't. It's extremely easy to make a site that works on IE (any flavor), Firefox, Netscape, Lynx, Amaya, Webkit, what have you. It is very tough to make a site that works well across browsers if you wish to have any semblance of control over the visual display of the content, or want to add basically anything except images. Plain old HTML is dead simple. The problem starts when you try to make a consistent look and feel across browsers.
A major part of this problem is the plain fact that up until fairly recently, some of the major browser vendors (*ahem Microsoft cough cough*) didn't give two shits about standards, and expected web authors to find hacky fixes. Things are beginning to get better.
You have three choices, as I see it.
1) Work towards validation only: Accept that you really have no control over the display of content, and use nothing more than plain unstyled text and perhaps static images, letting the browser display them as it will.
2) Aim for consistency at the expense of validation: Work damn hard trying to make things look the same in as many places as possible, using whatever measures necessary.
3) Do your best to generate valid code that shows as much cross-browser consistency as possible, starting with the code itself and building upwards (rather than thinking about graceful degradation, go the opposite direction: content first, presentation as supported), while at the same time pushing as hard as possible to get browser vendors to agree on at least basic standards.
1 is giving up and going backwards. 2 is a pain in the ass, now and forever, with a large possibility that each new iteration of a browser will require a huge time investment to test and patch against. 3 is perhaps harder now, but ought to be the easiest long-term solution.
But what do I know. I'm just a guy who writes web pages because I can, and reads A List Apart on occasion to see just how much of an amateur I really am.
posted by caution live frogs at 11:03 AM on October 16, 2008
In related news, 99.7% of the web works pretty well in all major browsers.
posted by Nelson at 11:07 AM on October 16, 2008 [3 favorites]
Yeah, that's exactly the kind of utterly contrived problem you get by treating your HTML as XML. Putting an encoded character in there is ugly and a pain in the ass, and then you get into problems when you're encoding and unencoding things and end up with ridiculous shit like &amp;amp; turning up in code.
I guess really the URL should be CDATA, but obviously that's not going to fly.
posted by Artw at 11:16 AM on October 16, 2008
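The encode/unencode mess Artw describes is easy to reproduce in PHP; escape a string that is already escaped and the ampersands multiply:
$url = 'http://example.com/?a=1&b=2';
$once = htmlspecialchars($url);   // ...?a=1&amp;b=2   (correct)
$twice = htmlspecialchars($once); // ...?a=1&amp;amp;b=2   (the mangled form)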
A major part of this problem is the plain fact that up until fairly recently, some of the major browser vendors (*ahem Microsoft cough cough*) didn't give two shits about standards, and expected web authors to find hacky fixes.
Netscape 4, how quickly you are forgotten.
posted by Artw at 11:17 AM on October 16, 2008 [1 favorite]
Frameworks should handle validation as shipped, by default, including user-provided data, links, etc. That there are some that don't do this is why we have XSRF and XSS attacks.
posted by Skorgu at 11:20 AM on October 16, 2008
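In PHP terms, the default behaviour Skorgu wants from frameworks amounts to passing every piece of user-supplied data through something like this on output (variable name hypothetical):
echo htmlspecialchars($user_input, ENT_QUOTES, 'UTF-8'); // neutralizes <script> injection and attribute breakouts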
XHTML deal-breakers:
- All content has to be dished out as application/xhtml+xml -- or application/xml for older versions of IE
- I have to encode all my ampersands. EVERY SINGLE SOLITARY AMPERSAND. Like this &amp; this &amp; this? That's 8 extra bytes. And you better do it, otherwise your browser will choke and your house will catch fire and your family will die.
- &lt; &gt; &amp; &quot; and &apos; are the only guaranteed entities. Bye-bye &hearts; &times; and a whoooole lot of others.
- Any javascript on the page? Better make sure to wrap it in a <![CDATA[ block or suffer the browser's wrath when it hits its first AND condition.
- Oh yeah, and document.write doesn't work. No Google Adsense for you (without tricks).
- <input>, <br>, and a bunch of other tags aren't closed! So any plain <br> or <input ...> will be invalid. Hope you're not using any server-generated code (no Java framework currently gets this right, for example, and I'd bet hard money no .NET platform can do it right, either).
- Bye-bye <blockquote>, <i>, <b>, <u>, <s>.
Yeah, that's exactly the kind of utterly contrived problem you get by treating your HTML as XML.
The problem is that treating HTML as "HTML" requires putting complex algorithms into the interpreter that try to guess at exactly which combinations of ampersands, characters and semi-colons should be counted as entities and which should be left as is. That's your encoding problem right there. Whereas if everyone was standards compliant there'd be exactly one rule for everyone to follow.
(I don't actually support the arbitrary standards compliance for HTML that Opera et al are advocating, since the problems caused by lack of validation or lack of well-formedness are a tiny, unimportant subset)
posted by cillit bang at 11:39 AM on October 16, 2008
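For reference, the CDATA wrapping mentioned above usually takes this commented form, so that both XML parsers and older HTML parsers are happy (the variables are hypothetical):
<script type="text/javascript">
//<![CDATA[
var a = 1, b = 2;
if (a < b && a > 0) { alert("ok"); } // raw < and && are fine inside CDATA
//]]>
</script>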
Visual Studio will autocorrect your <br> tags to </br>.
And yes, all of this is very annoying when trying to work to standards, and you get the general impression that the people creating the standards are far more into some weird fetishism of the form than actually doing anything that’s useful, which contributes in part to my fuck you attitude to validation.
posted by Artw at 11:45 AM on October 16, 2008
Geez, C_D, that's grim. (I used to know all that shit when I first started out in the business, back when dinosaurs ruled the earth. But I gave up trying to keep up years ago and now I just make stuff that works. I'm a really bad person, I guess....)
posted by lodurr at 11:48 AM on October 16, 2008
Also what fucking idiot decided that the widths of elements would not include padding and borders? That opens up a whole heap of problems, and completely fucks up anything percentage-based without a lot of pissing around. I’d like to find whoever made that decision and beat them over the head with a book of CSS hacks.
posted by Artw at 11:58 AM on October 16, 2008 [3 favorites]
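For anyone who hasn't been bitten yet: under the W3C box model, padding and borders are added outside the declared width, so these two boxes render at different overall widths:
div.a { width: 200px; }                                   /* 200px overall */
div.b { width: 200px; padding: 10px; border: 5px solid; } /* 230px overall */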
You might want to talk to every client I've ever had about that.
posted by Artw at 12:23 PM on October 16, 2008
wow. wow. seriously? seriously???
you'd think that a whole gang of web developers would get it, but apparently not...(seriously? really?)
the internet isn't about conforming to any standards...the internet is about AND AND AND AND AND. AND HTML4.0 works, AND CSS works, AND tag clouds work, AND embedded video works, AND, hell, even HTML 1.0 still works and that expired YEEEARs ago. Validated? validated by who? who's that? never heard of him, why should i care?
if only 4.3% of sites pass validation, then it sure don't look like validation is very valid. i'd suggest that it should be ejected from the web, but that's not really necessary, even if it's useless.
posted by sexyrobot at 12:37 PM on October 16, 2008
My site failed because when you compose a basic HTML page in notepad, it doesn't specify a UTF-8 encoding. Sorry, man. When I learned HTML back in 1993, we knew nothing of this "UTF-8". It was just HEAD, TITLE, BODY, done.
posted by Eideteker at 12:52 PM on October 16, 2008 [1 favorite]
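For reference, the validator just wants the encoding declared; a single line in the head does it:
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">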
We've talked about this before.
I'm surprised that the percentage of valid websites is as high as 5% - as mentioned, it could be even higher, were it not for generated content breaking a number of pages.
But really, where's the shock? Most people are barely competent, in almost every field. (There are exceptions in the higher-level professions, where results really matter, but on the whole the larger the field the more niches exist for people who barely know what they are doing.) This is especially true in web development, where everyone considers themselves a creator, or at least a contributor. It would be made somewhat better, at least in theory, by content management and design tools, if most of those tools weren't half-assed to begin with (Myspace and DreamWeaver, I'm looking in your direction).
Very few people taken from the street would consider themselves prepared to write, illustrate and bind a book. But everyone believes they can make a web page. Don't get me wrong: I'm all for open access. But standards have a purpose. They might be arcane, they might be fussy, they might be a pain to implement, but they serve a very logical and ultimately very useful purpose.
And, turning to my peers for a moment, the industry has not been served well by the teaching profession. I teach fulltime and continuing education courses at a polytechnic, and have to fight instructors in other departments who still teach their students to use the font tag, with tables for presentation.
In development, the credo has been "make it work in IE, whatever it takes". Which is like bashing a square peg to fit into a round hole when the hole itself is wrong.
Doing it right the first time is not difficult. And, slowly but surely, developers are getting the idea: I was stunned to learn that the MSN.com index page validated as strict. Once you do start to build valid code, everything else - CSS, DOM JavaScript - makes a lot more sense, and actually works. Yes, IE6 continues to be a massive spear in the side of suffering web developers. Conditional comments work wonders; longer term, the slow, gathering, steady pressure to drop IE6 support next year may have some effect.
posted by Bora Horza Gobuchul at 1:18 PM on October 16, 2008 [1 favorite]
Conditional comments in IE are an example of the underlying browser problem: solving a kludge with another kludge.
BTW there are other (harder) ways to put in conditional actions depending on browser. They also happen to be standards-compliant... just sayin'. Also possible to handle server-side.
> These are two of the biggest favors Microsoft ever gave web developers, and hardly anybody knows about them
Favours? (head asplodes)
If you're really into self-abuse, try outputting HTML from MS Word... gaaaahhhh.
posted by Artful Codger at 1:47 PM on October 16, 2008
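One standards-compliant example of what Artful Codger means is plain object detection in JavaScript, branching on what the browser supports rather than on which browser it claims to be:
if (window.XMLHttpRequest) {
    var req = new XMLHttpRequest();                   // Firefox, Safari, Opera, IE7+
} else if (window.ActiveXObject) {
    var req = new ActiveXObject('Microsoft.XMLHTTP'); // IE5/IE6 fallback
}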
Funny thing is nobody gives a shit about any of this anymore.
posted by tkchrist at 2:03 PM on October 16, 2008
Rational and competent developers write valid HTML and CSS first, then tweak as necessary. In unusual cases the tweaks unavoidably include invalid elements, attributes (hasLayout, tabindex=-1), and declarations.
It’s oft-repeated in fora like this one that I Just Have to Get This Fucker Working, but the way you do that is to start with correct code. You should never be surprised that this fucker isn’t working when it uses invalid code. And apart from maybe acronym/abbr in IE6 and longdesc, I fail to see how using valid “tags” will cause things not to work in modern browsers.
Nonetheless, we are at the point where unescaped ampersands and the EMBED tag (sic) mean nothing.
posted by joeclark at 2:29 PM on October 16, 2008
This validator is invalid:
Quod erat demonstrandum.
posted by TheOnlyCoolTim at 2:52 PM on October 16, 2008 [1 favorite]
Oh no Metafilter! "64 Errors, 123 warning(s)"
posted by turgid dahlia at 2:53 PM on October 16, 2008
now I just make stuff that works. I'm a really bad person, I guess...
Nothing wrong with the pragmatic approach.
They might be arcane, they might be fussy, they might be a pain to implement, but they serve a very logical and ultimately very useful purpose.
Really? Then can you please explain to me the logical and ultimately very useful reason why this is invalid?
<form> <input type="text" /> </form>
A <form> and an <input> field... what could be simpler? Simple and invalid.
And how about the target attribute? The W3C in their infinite wisdom decided that the handy-dandy target attribute of a link is deprecated. So how do you open links up in new windows/tabs? Answer: Javascript. Any way to do it without javascript? Answer: nope. Next question?
posted by Civil_Disobedient at 3:53 PM on October 16, 2008
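The JavaScript workaround Civil_Disobedient alludes to is typically a one-liner like this (URL hypothetical); with scripting off it degrades to opening in the same window:
<a href="http://example.com/" onclick="window.open(this.href); return false;">example</a>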
Screw validation. My website visitors can pay for their own damn parking.
posted by srboisvert at 3:56 PM on October 16, 2008 [2 favorites]
Validation would be fine if the actual standards weren't written by a bunch of hose-heads who have never, ever been paid to design and construct a commercial website.
Yeah, "fluid design" websites are great, and I try to do them when I can, just as I try to make my sites validate. Alas, most clients, do, in fact, want "pixel perfect" designs. Yeah, that's just evil and blah blah blah. Too fucking bad, that isn't going to change.
Jakob whatever his name is can bite my unvalidated sack.
posted by maxwelton at 4:14 PM on October 16, 2008
So how do you open links up in new windows/tabs? Answer: Javascript. Any way to do it without javascript? Answer: nope. Next question?
JavaScript is teh evils you terrible accessibility breaker you. Clearly, like aligning things vertically, it's a thing you don't need to do and shouldn't do.
posted by Artw at 4:34 PM on October 16, 2008 [1 favorite]
And 99.9999% lack properly formed OWL semantic web tags! Impudent charlatans, I say, all of them!
posted by RobotVoodooPower at 5:41 PM on October 16, 2008
I only care about validation when something stops working and I can't figure out why. Then, I validate, and about 50% of the time fixing the validation error fixes the bug.
posted by signal at 6:14 PM on October 16, 2008 [1 favorite]
When I write html from scratch it's always 100% valid. When I inherit sites, if the code is invalid I don't lose any sleep over it as long as it renders in IE/FFox.
posted by furtive at 6:34 PM on October 16, 2008
Really? Then can you please explain to me the logical and ultimately very useful reason why this is invalid?
<form> <input type="text" /> </form>
Sure. Just off the top of my head:
- Your input text box doesn't have a label. The person filling out the form has literally no idea what you're asking for.
- You're not using an action attribute for the form, and don't have a submit button, so you're not using PHP, Perl, or some other server-side language. Instead, the assumption is that you're validating the form via JavaScript. Fair enough. But there's no id on the input box itself for the JavaScript to hook into.
- An id would also work with the missing label to help make the form accessible, which is important - kind of like making a wheelchair ramp to the front door of a business or public institution.
posted by Bora Horza Gobuchul at 7:33 PM on October 16, 2008
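Putting those points together, a version that passes HTML 4.01 Strict might look like this (action URL and field names hypothetical); note the block-level <p> inside the form and the absence of XHTML-style self-closing slashes:
<form action="/search" method="get">
<p>
<label for="q">Search:</label>
<input type="text" name="q" id="q">
<input type="submit" value="Go">
</p>
</form>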
I work in a shop where no one validates their code (except me). What that means is that every person writing a website has their own crazy style of code that is nearly impossible to interpret when another designer picks it up. '20 nested divs inside an a tag? Great idea!'. Validation isn't perfect but it does help weed out really bad code that is there for the sake of being bad (or a lack of understanding).
Now, that being said, when I'm done, my pages don't always validate. But that's because I've made a conscious decision to deviate from the standards for some reason or another. I just used lightbox++ on a site, and had to give up a perfectly valid page for the attributes to make flash work. But that's okay, the end result was more important.
The nice thing about page validation too is that it ensures longevity of a page. Not just in browser rendering but in search engine placement in regards to using the right code in the right places. Of course there are exceptions where things that were valid end up being deprecated, but on the whole, you're safer future-proofing a site by making sure it's using valid markup.
Also, if clients won't pay for the time to do it right, that falls squarely on the shoulders of said developer for not making the business case for doing it right. Or, you know, of course why should you have to explain it if you're doing it right the first time?
posted by [insert clever name here] at 7:47 PM on October 16, 2008
Yeah, "fluid design" websites are great,
No, they suck, as the line length is either too big or too small, depending on the size of the browser window.
The web is about typography, not code.
posted by signal at 7:49 PM on October 16, 2008
The web is most definitely not about typography. If it is anything (and I don't find this definition particularly appealing either) the web is about the semantic markup of data.
However, I would give you the fact that support for typography on the web has been middling to poor until recently. Fortunately, the emergence and slow adoption of CSS 3 standards, particularly in regards to web fonts, is very promising (albeit long, long overdue).
posted by Bora Horza Gobuchul at 8:11 PM on October 16, 2008 [1 favorite]
Really? Then can you please explain to me the logical and ultimately very useful reason why this is invalid?
Because it doesn't have an action attribute. Sure, most browsers will default to behaving the same way as if you'd inserted an empty action attribute, but since that behaviour isn't mentioned in the spec, you have no business relying on this happening, unless you're happy that your site will only work in whatever subset of browsers you know to behave this way. Which I guess you are.
posted by cillit bang at 1:59 AM on October 17, 2008
Bora Horza Gobuchul: Typography != choosing fonts. Rather it's presenting textual and graphic information in a legible, useful, meaningful and attractive way. This is the main function/problem of the web, and fluid web sites mostly suck at it, present company excluded, of course.
I think the limited font choices on the web are a feature, making it much more readable. The A List Apart article you link to shows how unlimited font choices can lead to some fairly awful designs in the hands of the typographically uninformed, more of a threat than an opportunity.
Better 'support' for typography would include having decent justifying and hyphenation in browsers, being able to use true caps, consistent indentation, etc.
posted by signal at 7:38 AM on October 17, 2008
The web is about content, and web standards should be about delivering that content in the best way. When they become about pedantic code smuggery then they fail utterly.
posted by Artw at 8:25 AM on October 17, 2008
Signal, I assume you meant “being able to use true small caps.” I don’t see how indention is not “consistent” on the Web.
You can control the problem of marathon line lengths with max-width (requiring a JS fix for IE6).
Artw, for “pedantic code smuggery” I will read “correct semantics.”
posted by joeclark at 10:18 AM on October 17, 2008
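The max-width fix is a one-liner in CSS (selector hypothetical):
#content { max-width: 36em; } /* caps line length; IE6 ignores it without a script or expression() hack */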
Hey, I'm all for nicely structured code and accessibility and all that, but encoding mid-URL ampersands? That's bullshit.
Making a flap over people using I instead of EM is another one that’s always seemed damn silly; no one in the real world cares if you made something italic using I or EM, and TBH I couldn’t really tell you the difference between something that has been semantically emphasized and something that is semantically strong.
posted by Artw at 10:38 AM on October 17, 2008
Hope this helps!
No, you moron, you completely missed the point.
Because it doesn't have an action attribute.
Argh, idiots everywhere!
The big, large, flaming tree of a point that the awfully fucking literal folks around here seem to have missed is that the reason the code above is invalid is because you cannot enclose an input tag (an inline element) inside of a form tag. The code above should fail in HTML 4.01 (Strict). You must use a block level element for the form to be valid. (What's that...? An ACTION attribute, you say? NO KIDDING!?)
"No problem! Just add a <fieldset> around it," you exclaim! What? I'm sorry I can't hear you because I'm laughing so loud. The fieldset? One of the most inconsistently half-assed HTML tags!? Oooh yeah, let's add a <legend> tag to the mix as well, then slit our wrists playing with positions and borders for the next two weeks. Can we please?
Clearly, like aligning things vertically
ARGH! Exactly! To the W3C, even the concept of height scares them. It's a page of infinite goddamned length! "Uh, well, no your Majesty, actually it's a PDA with a resolution of 600x400... or it's a computer LCD with a resolution of 1280x1024... or it's a piece of paper with a resolution of 8.5" x 11"..." The W3C has lost all notion of sanity at this point. Just code what works.
posted by Civil_Disobedient at 4:00 AM on October 21, 2008
This thread has been archived and is closed to new comments