
November 18, 2005

Comments

Nice Paul

A well-coded XHTML page with one CSS file takes far less code than the same design built using font tags and nested tables on each page.

A badly coded XHTML page (defining a separate class for each element, for example) can be a larger file size, but that's down to developer incompetence rather than the coding language used.
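
To make that concrete, compare the two approaches on a made-up snippet (class name and values invented for illustration):

    <!-- font-tag way: repeated in every cell on every page -->
    <td><font face="Arial" size="2" color="#333333">Hello</font></td>

    <!-- XHTML + CSS way: one rule in a shared stylesheet -->
    <td class="body">Hello</td>

    /* in site.css, downloaded once and then cached */
    td.body { font-family: Arial, sans-serif; font-size: 80%; color: #333; }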

michael

I wish folks at the company I work at could all read this. I think I'll pass this around. Thanks for the info. I find this extremely useful!

unknown comic

He's partly right on XHTML:

http://www.hixie.ch/advocacy/xhtml

Also use hardware load balancing if you can; products by Foundry Networks are good.

unimpressed

"do not use quotes on attributes unless necessary"

"Don't use XHTML"

utter, utter bollocks.

I bet you're the sort of person who still uses MS FrontPage.

Ed

RE: b100dian

I'd take issue with a lot of what *you* have said...

Let's get down to fundamentals:
Kilobits are not the same as kilobytes, so 35 KB of data would not take 1 second to download on a 35 kbps connection.
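
The arithmetic, assuming 8 bits per byte and ignoring protocol overhead:

    35 KB x 8 = 280 kilobits
    280 kilobits / 35 kbps = 8 seconds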

I think this article and the comments point to a common problem among HTML and web developers: no one wants to agree on standards. "I want to use uppercase tags", "I want to use HTML", "I want to use XHTML with uppercase tags". It's worth bearing in mind that if you develop to standards, your site will be accessible to more people, for example when reading a web page on a PSP or a mobile phone.

hahaha

Don't use XHTML. Don't use spaces and tabs. Ha ha ha.

Ha ha ha.

Ha!

anon

utter tripe

Zach

Speaking as a web professional who agrees with all of the criticism of this article (the critics are absolutely right), I think the author should do the responsible thing and either take this post down or, better yet, add a foreword or afterword acknowledging the article's mistakes.

The author is doing a major disservice to the industry, and to himself or herself, by leaving the post as-is.

John Moo

No one should ever use anything other than XHTML on newer pages!!!! We have to agree on a standard, or web "development" will always be a pain in the a (and we're where we are because of ALL THE SEMI-PROs who think standards are for wumpuses - and because of companies like Microsoft).

Btw, uppercase tags and the like are forbidden in XHTML.

The best validator (not as broken as the one from the W3C):

http://www.validome.org

Carl Hubbers

Don't use XHTML/standards?

IMHO any speed benefit gained from using noncompliant code will only reach the lucky users whose browsers happen to interpret your non-standard pages the way you hope.

Good work getting this debate going :)

Roy

I think the overall idea behind putting a list like this together is to get the developer into an optimizing mindset. These days people take bandwidth for granted, but you have to remember that over 70% of Internet users are still using dialup.

I agree with some of the tips listed and I disagree with others; still, I think the author is taking an important first step in emphasizing the importance of optimization. Just as when you code a standalone application, yes, you can skip the optimization step and everything will work, but you'll end up with a bloated and slower application in the long run. Just because we have the bandwidth doesn't mean we shouldn't optimize.

Also remember that your web page is not the only one a user will be looking at at any one time. I normally have 4 or 5 browser tabs open and 2 or 3 other applications running in the background, and I'm sure that's light compared to other people. With how Windows is, optimization is key. Maybe not everything mentioned here, but like I said, this is a good first step in the right direction.

Matthias

I'm pretty much on the side of Sarah, and I would think most professional web developers would agree.
As stated above, some of the points seem badly researched, especially 11 and 12:
One of the reasons to use different subdomains for images and so forth is to get around the HTTP connection limit. Also, one of the ideas of CSS is that the CSS files are cached, which reverses the problem: less data is loaded with each page request. Take away as much whitespace as you want; I'm pretty sure that an external CSS structure is, in most cases, much more effective.
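
For illustration, here is roughly what the subdomain trick looks like (hostnames are made up; browsers open only a couple of connections per hostname, so splitting assets across hosts allows more parallel downloads):

    <link rel="stylesheet" type="text/css" href="http://css.example.com/site.css" />
    <script type="text/javascript" src="http://js.example.com/site.js"></script>
    <img src="http://img1.example.com/logo.gif" alt="logo" />
    <img src="http://img2.example.com/photo.jpg" alt="photo" />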

As I'm not really sure: is this to be taken as a serious article, or was it more a case of "let's rile up the developers"?

Adrian

Hmmmm. It would seem that you don't really know anything about professional web development.

Use XHTML. Use well-written JavaScript with meaningful variable and function names (in .js files, not in-line). Use CSS (and use it well -- learn how to use float, margins etc -- again in external .css files). Do not use tables for layout -- not ever. Use Photoshop's Save For Web option to create your GIF/JPEG/PNG files.
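
As a rough sketch of what that looks like in practice (file names are placeholders):

    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
        "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml">
    <head>
      <title>Example</title>
      <link rel="stylesheet" type="text/css" href="screen.css" />
      <script type="text/javascript" src="behaviour.js"></script>
    </head>
    <body>
      <!-- #nav is floated via "#nav { float: left; }" in screen.css; no layout tables -->
      <div id="nav">...</div>
      <div id="content">...</div>
    </body>
    </html>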

We aren't living in the '90s anymore.

Marcelo Calbucci

IN RESPONSE TO ALL THE FLAME:
=============================

I'm sticking to what I said above. This is my opinion and I'll recommend it to anyone.

Specifics:
- HTML vs XHTML: For those who don't know, HTML *is* a standard.

- If you think stripping spaces (HTML, CSS, JS) is a maintenance nightmare, you are right. However, who said you have to do it by hand? Write a filter to strip out the spaces in the Release version (see the sketch after this list).

- These are not tips for your 10-page website, nor for your 1,000-page website. These are tips for people building massive websites (a la Amazon, Google, MSN, eBay).
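
Here is a throwaway sketch of such a filter, in JavaScript for the sake of example (the regexes are deliberately naive; a real filter must leave <pre>, <textarea> and inline scripts alone):

    // Naive release-build filter: strip comments and collapse whitespace.
    function stripWhitespace(html) {
        return html
            .replace(/<!--[^\[][\s\S]*?-->/g, "") // drop comments, keep IE conditional ones
            .replace(/>\s+</g, "><")              // kill whitespace between tags
            .replace(/[ \t]{2,}/g, " ");          // collapse runs of spaces and tabs
    }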

All the tips above worked very well for me, and I learned them by trial and error. You should do your own research and find what works for you.


Cheers,
Marcelo

Matthew Price

I must disagree with your JavaScript comment. If you can do things such as form validation and sorting on the client side and save a trip to the server, your application will perform much faster and better, and your server will thank you for distributing the workload.
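
For example, a minimal sketch (form and field names are made up; the server should still re-validate):

    <form action="/signup" method="post" onsubmit="return validateForm(this);">
      <input type="text" name="email" />
      <input type="submit" value="Sign up" />
    </form>

    <script type="text/javascript">
    // Catch the obvious mistakes in the browser and skip a server round trip.
    function validateForm(form) {
        if (form.email.value.indexOf("@") == -1) {
            alert("Please enter a valid email address.");
            return false; // cancels the submit
        }
        return true;
    }
    </script>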

Son Nguyen

I know this blog software is based on TypePad and you might not have much control over the generated code. Nevertheless, viewing the source shows that the code does not follow your own tips/guidelines. Some optimization is good, but it should not be taken too far; there are always trade-offs.

Sherwin Techico

Funny comments--makes this gloomy day in SF a good one =)

Happy New Year!

Adam

So you're trying to tell me (a comp sci grad) that it's hard for a computer to parse a well-known, structured XML document, and that HTML tag soup is faster?

Simon Willison

Adam: parsing HTML/tag soup almost certainly /is/ faster. The browser doesn't have to check each element for well-formedness, and HTML parsing has been around for much longer and hence is much better optimised than XHTML parsing (hardly anyone serves valid XHTML so there hasn't been nearly as much effort put in to optimising its performance). I've also heard that Firefox in XHTML mode won't start rendering the document until the entire XHTML page has been loaded, but I'm not 100% sure if this is true.

There are a lot of very ill-informed comments attached to this entry. HTML 4.01 is just as much of a "web standard" as XHTML 1.0 - and it will be supported by browsers for decades to come. Think about it: if a browser dropped HTML 4 support it wouldn't be able to render 99% of the web, and would be utterly useless.
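
For the record, the two doctypes sit side by side as W3C recommendations:

    <!-- HTML 4.01 Strict (W3C Recommendation, 1999) -->
    <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
        "http://www.w3.org/TR/html4/strict.dtd">

    <!-- XHTML 1.0 Strict (W3C Recommendation, 2000) -->
    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
        "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">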

I have no idea where the idea that "Ajax only works with XHTML" comes from, but it's not even remotely true.

John Koetsier

Umm ... to add to all the criticism ... it appears that your own company site (which does not use TypePad and therefore theoretically should be totally under your control) does not follow your own recommendations.

http://www.sampa.com/

Most of those recommendations, as many of the other posters have said, are not great ideas at all.

Just one that I'll point out: embedding your CSS in each page makes the download bigger, not smaller, since every single page must now contain all the CSS ... whereas if it's linked to, it only needs to be fetched once and is then cached locally in the browser.
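
To illustrate (stylesheet path invented):

    <!-- Embedded: these bytes travel with every single page -->
    <style type="text/css">
      body { margin: 0; }
      /* ...hundreds more lines, repeated on every page... */
    </style>

    <!-- Linked: fetched once, then served from the browser cache -->
    <link rel="stylesheet" type="text/css" href="/styles/site.css" />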

I notice that your company's site must not have been done by you but by a smarter web developer .... it calls an external CSS file.

;-)

miscblogger

Great list, though I do not agree with the XHTML point. XHTML makes code more readable and clean. As a programmer, I am willing to sacrifice a couple of bytes for that.

nick

One thing that no one has mentioned is to question the whole premise of this post: that transmission and rendering time are the big problems... in many cases they aren't!

In many cases the delay between request and render is due to server load. If you look at CNN's or Google's markup (apparently the site scale this post is aimed at), they don't strip whitespace, they use external CSS, they use different domains, etc.

The magic Google and these sites work is in using clustered databases, load-balanced server farms, Akamai to reduce network latency, etc. etc. etc.

My thoughts on faster-loading pages:

1) Use the XHTML/external CSS/external JS approach
2) Use gzip compression (see the sketch after this list)
3) Make sure any dynamic server-side code is decently fast
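
For point 2, roughly what that looks like on Apache 2 with mod_deflate (assuming the module is compiled in):

    # httpd.conf sketch: compress text responses on the fly
    <IfModule mod_deflate.c>
      AddOutputFilterByType DEFLATE text/html text/plain text/css application/x-javascript
    </IfModule>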

anything else is marginal marginal marginal

Simon Willison

There's absolutely nothing to stop you from writing HTML 4.01 code that is just as readable and clean as the equivalent XHTML.

Jerry

"parsing HTML/tag soup almost certainly /is/ faster."

No, it absolutely is not. Valid HTML could be parsed with an SGML parser, which is slower than the XML parser that would be used on XHTML. But no browsers do this, because there is so little valid HTML out there; they have to use a slow, complicated parsing engine to decipher the mess that most "HTML" pages are. XML parsing is WAY faster, like 10x faster.

"I've also heard that Firefox in XHTML mode won't start rendering the document until the entire XHTML page has been loaded, but I'm not 100% sure if this is true."

Of course it's true, and not just for Firefox. You can't tell whether an XML document is well-formed until you have the whole thing, so all browsers rendering XHTML properly (with the right MIME type set) will behave like this.

koenkai, you have no clue what you are talking about. First you say that using XHTML is good, then you say you like using all-caps tags; XHTML requires all tags to be lower case. XHTML has nothing to do with Ajax, whose underlying techniques predate XHTML, never mind the newly coined acronym. And Ajax doesn't eliminate redundant calls to the server; it just requests less data with each call. 35 KB of data will take NINE SECONDS to download on a 33.6 kbps modem connection. That alone is enough time for people to give up on your bloated page and go somewhere else.

Marcelo, you are even more clueless. Your advice is not for sites like Google and Amazon; they aren't stupid enough to do this crap. You have obviously never worked on a large site, but if you at least look at the source of such sites, you will see they don't follow your terrible advice.

Your points contradict each other (if you are using output compression on the web server, then stripping whitespace, comments and such gains you nothing), or are completely wrong (JavaScript and CSS belong in external files, so they are only requested on the first page view, then cached browser-side), or show your lack of understanding (using multiple subdomains is about spreading the requests for different kinds of content across different servers). At least you got a couple of things right (things everyone else has known since 1994).

Ralesk, yes, you need to put in image dimensions. The browser doesn't know how big an image is until it starts receiving it, but by then it has already started rendering the page. So it lays out the page without knowing how much space to leave for your images, and then, as each image arrives, it keeps re-rendering the page with the right size.
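
In markup terms (dimensions made up):

    <!-- With explicit width/height the browser reserves the right box
         immediately and never has to re-flow when the image arrives -->
    <img src="/images/header.jpg" width="600" height="120" alt="Header" />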

And to all the completely clueless people shouting nonsense like "use standards/XHTML": HTML is a standard. You bandwagon-jumping doofuses who insist on pushing XHTML, but use tag soup with an XHTML doctype on your own pages, are really annoying. If you have no clue about something, DON'T ADVOCATE IT. All you do when you spew nonsense and show that you don't even understand XHTML yourself is make people think XHTML is empty hype with no value.

Simon Willison

"Marcelo, you are even more clueless. [...] You have obviously never worked on a large site"

You don't count MSN as a large site?

I stand by my statement: modern browsers probably render XHTML slower than HTML, so switching to XHTML because it is faster for browsers to parse is currently a waste of time.

Anyone know of any benchmarks that support or disprove this point?
