When Validation Becomes Unimportant
Post from June 16, 2008 (↻ November 1, 2022), filed under Web Development (feed).
This and many other posts are also available as a pretty, well-behaved ebook: On Web Development.
Validation becomes unimportant only once you’re ahead of the game. Even then, once you have truly mastered HTML and CSS, it’s best to stick with valid markup and styling. Improving latency might be the only exception, if there is one at all.
The motivation to point this out stems from several discussions I had in which validation was “deprioritized” in order to save bits and bytes. Alas, the context includes use of font elements, layout tables, and the like, and as you may relate, talking about invalid markup in order to save load time in such an environment is kind of silly. Therefore I repeat the statement above: Consider “invalidating” your documents and style sheets only for the sake of load time, and only once you have proven to be a master.
Everything else is counterproductive. Validation fulfills several important purposes, one of them being that it drives learning. Let me tell you a short story: Through validation I probably learned the most about writing markup and style sheets, even though at the beginning I still tortured the documents I wrote. Coming from that experience, I absolutely advise against using invalid HTML before reaching a point of mastery.
There are many other things to address before dropping code that isn’t strictly necessary but still required by the spec. Using markup according to its semantics can save bytes. Careful consideration of IDs and classes (including sparing use of microformats) can save bytes. Questioning elements, as with “divitis” and certain meta elements, can save bytes. Proper use of HTML (instead of XHTML) can save bytes. And the list goes on, including many additional performance measures unrelated to HTML.
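To make the idea concrete, here is a sketch of what valid, byte-light markup can look like under HTML5 parsing rules (an assumption on my part; the post predates HTML5 but was revised in 2022). Every omission below is permitted by the spec:

```html
<!DOCTYPE html>
<meta charset=utf-8>
<title>Byte-light, still valid</title>
<p class=intro>The html, head, and body tags are optional in HTML,
and attribute values without spaces need no quotation marks.
```

This document validates as-is: none of the savings here require going against the standard.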
Once you’re there you might say, “Okay, now let’s help the user once more and improve performance by pushing things a little bit further.” Not a second earlier. Don’t. And please don’t put the idea into a beginner’s head that invalid HTML is good for performance. That would be a bogus claim, a harmful claim, and our industry would never recover from the resulting increase in incompetence. (Uh.)
I’m Jens, and I’m an engineering lead and author. I’ve worked as a technical lead for Google, I’m close to W3C and WHATWG, and I write and review books for O’Reilly. I love trying things, sometimes including philosophy, art, and adventure. Here on meiert.com I share some of my views and experiences.
If you have a question or suggestion about what I write, please leave a comment (if available) or a message. Thank you!
On June 16, 2008, 17:19 CEST, Duluoz said:
Great article, Jens. I hope there are no professionals out there who have the idea that once you “master” something, it’s acceptable to be lackadaisical. For example, I know that my doctor is a master of medicine, but I would not want him to “deprioritize” anything during surgery! 😉
On June 16, 2008, 17:41 CEST, Travis said:
This reminds me of a post I read a few days ago about how badly SitePoint’s website fails validation. As a web solutions provider and tutor, they should be ashamed of themselves for delivering such laughable code.
On June 16, 2008, 18:48 CEST, Coen Jacobs said:
When we’re talking about saving a few bits here and a few there, aren’t we looking past the true problems of poorly loading web pages?
Images are the real pain for lots of users, when pages seem to take ages to load a few (background) images.
But besides that, the article is great. It shows that we keep looking at rules, rules, and more rules, but we’re forgetting what it is really about.
Hmm, maybe I’ll write a follow-up.
On June 16, 2008, 20:25 CEST, Devon Young said:
I think one of the coolest “secrets” about HTML is that you don’t need the HTML, HEAD, or BODY elements for a document to be valid (Strict or otherwise). Yet in XHTML, they’re all required. That’s one of the best legitimate ways to save a few bytes when one wants to.
On June 16, 2008, 20:33 CEST, Peter said:
Another point of discussion is whether invalidating a document for file-size purposes harms accessibility and/or the integration of external, web-based apps with one’s website.
I would and will always advise people to write accessible, valid, and semantic markup. Slow-loading pages aren’t caused by valid markup; it’s the content where things go wrong. A website should be minimalistic and content-centered (of course there are exceptions). Practicing goal- or user-centered design should give you the basis to avoid discussions like these.
On June 16, 2008, 20:36 CEST, Peter said:
I forgot to say: nice article, Jens. I would like it even more if it really emphasized the importance of valid and structured markup, even if one is given the title “master” 😉!
On June 16, 2008, 22:12 CEST, Meint Post said:
Why contemplate such a move when you have easier (and, in my opinion, better) measures like HTTP compression and/or output caching that will save those extra bytes at nearly no overhead, without any of the detrimental effects of willfully going against the standard?
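The compression point can be sketched in code (a hypothetical comparison, not from the original thread): because gzip thrives on repetition, it typically recovers far more bytes than hand-minifying markup ever could, and the verbose and minified versions of a page end up close in size on the wire.

```python
import gzip

# A verbose but valid HTML fragment, repeated to simulate a real page.
verbose = ("<ul>\n" + '  <li class="item">An entry</li>\n' * 200 + "</ul>\n").encode()

# The same content with quotes and indentation stripped by hand.
minified = ("<ul>" + "<li class=item>An entry</li>" * 200 + "</ul>").encode()

# Compare raw sizes against gzipped sizes for both variants.
for label, body in (("verbose", verbose), ("minified", minified)):
    print(label, len(body), "bytes ->", len(gzip.compress(body)), "bytes gzipped")
```

The raw sizes differ by hundreds of bytes, but the gzipped sizes land within a few dozen bytes of each other: the compressor, not the hand-minification, does the heavy lifting.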
On June 17, 2008, 21:58 CEST, Maniquí said:
I agree that validation is neither a goal nor a panacea, and I also agree it is a good learning tool for newcomers.
But I don’t think I agree with invalidating code for the sake of performance when there are, probably, other places where performance could be tweaked (whether by saving bytes or by oiling mechanisms).
And please never put the idea into a beginner’s head that invalid HTML is good for performance
But what happens if a beginner is already reading your article? Aren’t you stating that invalid HTML improves performance?
Also, sadly, there are many beginners who feel they are already masters.
A similar article could be written: “When Indentation Becomes Unimportant.”
I’ve read comments on sites and CMS forums about the indentation “problem”: OCD people complaining about un-indented output code.
On June 18, 2008, 14:44 CEST, John Faulds said:
OCD people complaining about the un-indented output code
Reminds me of a forum poster having trouble with spacing between a horizontal list of navigation items (due to the line breaks in the HTML): they ended up writing a bunch of extra CSS for IE just so they could keep their HTML nice and “pretty” and not have to run all the list items together on a single line in the source.
On June 18, 2008, 19:28 CEST, Duluoz said:
But what happens if a beginner is already reading your article? Aren’t you stating that invalid HTML improves performance? -Maniquí
The performance of how markup is parsed is dictated by the code written by the browser vendor, is it not? Who is to blame? Would you not discourage a beginner from writing “invalid” markup for the sake of performance, rather than hope browser vendors keep making the same ill-advised choices about code-efficiency priorities? (I hope that makes sense!) 😉
Do what’s right - regardless of the marginal performance increases.
On June 25, 2008, 17:26 CEST, Jason Marsh said:
Very interesting theory; seems risky though.
On June 26, 2008, 14:37 CEST, Adult Ühler said:
This sounds very strange to me. Personally I think that the ways you have mentioned to save code are something to avoid.
On July 4, 2008, 13:10 CEST, Niels Matthijs said:
Well, I fail to see the real gain in this, especially when working like this forces parsers to keep up their error recovery (which I assume slows things down).
Anyway, I’ve seen some more people mentioning speed gains and page loads, which surprises me a bit. I dedicated an article to it myself to clear up my point of view and express my worries better than this comment box allows me to 😊
On July 5, 2008, 13:08 CEST, Niels Matthijs said:
What I’m trying to find out is why you would “need” this kind of speed/performance gain.
I’ve worked on quite a few projects already but never felt that the templates/sites I made were too heavy or slow. The only reason I can think of is development for phones, but even then, invalidating sounds like a bad idea.
Could you give some reasons or situations where, for example, you would invalidate your code, and the speed/performance gain you’re looking at? We’ve done a few preliminary tests ourselves and the results weren’t really noticeable at all.
On July 9, 2008, 13:19 CEST, Richard Morton said:
I think that it is really interesting to challenge accepted convention in this way. Working with standards and accepting them is fine, but that doesn’t mean that we can’t push the envelope and change things. For example, whilst dropping quotation marks around attributes might lead to code-maintenance difficulties, that is really just because it is a change in convention; not easy to manage, but not impossible either.
I would have to strongly disagree with the first commenter Duluoz about deprioritising anything during surgery. It happens all the time and is necessary because complications occur, or something is discovered that wasn’t anticipated, but a good surgeon will be able to prioritise effectively.
On July 9, 2008, 15:48 CEST, Richard said:
I’d rather have valid markup myself. At the end of the day, are a few bytes really going to help in the age of broadband? Slow scripts and large Flash/image files, maybe, but skimping on markup is not really worth it.
People these days spend too much time worrying about every single person who is running on a 56k modem or mobile device. The only time you should do this is if you’re expecting a large portion of your customers to be on their mobiles. I shouldn’t have to skimp on the functionality and beauty of my website just because you don’t want to upgrade (and to make matters worse, they’re probably running IE6).
Even mobile devices are starting to catch up now (slowly). Design for the future, not the past. Otherwise you may find you will just have to keep redesigning the same project over the years to come.
On August 20, 2008, 4:01 CEST, Sean Fraser said:
CSS validation becomes unimportant when one uses vendor-specific extensions. HTML should only rarely be allowed to fail validation.
On January 1, 2009, 22:44 CET, Skylar Saveland said:
Well, I am certainly still learning. I thought that www.google.com would be a good place to find standards. I was wrong. Still, I am modeling my little sandbox on the big one. Perhaps I can work backwards and make a site that behaves like Google’s but is cleanly and validly executed. Either way, it’s a learning experience. This site is really nice, btw.
On January 31, 2009, 4:18 CET, dani said:
As a volume-based Internet user, I fully agree with saving bits and bytes.
I guess the problem here is the size of web pages, which affects loading time.
I always choose the lightweight web pages as described by Google on its search engine result pages, especially at the end of the month.
On February 3, 2009, 19:54 CET, jura said:
“And please never put the idea into a beginner’s head that invalid HTML is good for performance.”
I’m currently taking web design in school (I’m actually a senior), and it’s funny that you mention this, because our professors are actually teaching us that when you validate your HTML, it’s okay for it not to validate, and precisely for the reason you give above: “invalidating” your documents and style sheets for the sake of load time. Unfortunately, I am far from a master, but interestingly enough, I am being taught not to understand what is breaking my code. Now, I certainly love learning and finding the root of a problem, but it can become quite cumbersome once you correct an outstanding error that unleashes a slew of new ones. Still learning the old and trying to keep up with the new. I use the W3C validator all the time.
On February 8, 2009, 12:49 CET, Hagen von Eitzen said:
Should even a master remove quotes from IDs to save a few bytes in trying to improve page download times? How much could that save? If you are lucky, you can reduce <a id="X">Y</a> to <a id=X>Y</a>, saving 2 out of 15 bytes (and once you run out of one-character IDs, or have IDs on multi-letter tags, or tags with longer attributes, or tags without IDs, or in fact provide any useful text content, the rate quickly drops enormously).
And after all, does it matter if this finally squeezes a 10,000-character page into 9,500 characters (which is optimistic)? Do the math: if I’m not mistaken, both require 7 Ethernet packets, so the loading time saved is zero.
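The packet arithmetic above checks out; as a quick verification (assuming a typical TCP maximum segment size of 1,460 bytes, i.e. a 1,500-byte Ethernet frame minus 40 bytes of TCP/IP headers):

```python
import math

MSS = 1460  # typical TCP payload bytes per 1500-byte Ethernet frame


def packets(size_in_bytes: int) -> int:
    """Number of TCP segments needed to carry a payload of the given size."""
    return math.ceil(size_in_bytes / MSS)


# A 10,000-character page and its hand-minified 9,500-character variant
# need the same number of segments, so no transfer time is saved.
print(packets(10_000), packets(9_500))  # both are 7
```

Only savings that cross a segment boundary can shorten the transfer at all, which is why sub-kilobyte minification gains so often vanish in practice.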
Moreover, one can usually save much more by removing unnecessary white space (as used, e.g., for structured indentation) without breaking the standards, and yet keeping indentation may save much more development time and prevent hard-to-spot errors.
On January 25, 2010, 15:17 CET, Ibiza said:
I think we always have to validate our documents, not just our tags and CSS. I write in Spanish, and not using the proper HTML produces very unpleasant results with accented characters, for example. If you don’t validate your CSS, then when you change your style sheet, the results are unpredictable. And when you have to delegate your work to a partner, the task is multiplied.
On February 10, 2010, 17:31 CET, Stephen said:
Even though this article is about a year old, I was sent here by my CSS professor to read some of your blog and came across this particular article. We are supposed to find an article that we can discuss with the class in an open format. While I am struggling to understand everything you are saying, I hope that many of my classmates can help me better understand validation and how it is best implemented. Thanks!