How to make faster websites and enhance your site user experience – Part 1

In all my posts so far, I have concentrated on how to get started with web development and related introductory topics. In the next few posts (a series of three), I will share some of my learnings in the field of “How to make your website faster and enhance your site user experience”.

So let's assume I've made a site which rocks, with all those Web 2.0 features in it. But my users still complain:

  1. Your site hangs my browser while loading.
  2. The page takes too long to load.
  3. Blah Blah Blah…… 🙁

So how do you go about making sure you have done your best to make things rock? Here are a few tips on things you should concentrate on, which will definitely help your pages load faster:

  1. Gzip your site's static content: This is one of the most important things you can do, and it helps a great deal in enhancing your site's download speed. Let me explain the process:
    • When I type www.yahoo.com in my browser and press Enter, an interaction starts between my browser and the www.yahoo.com servers. My browser asks the Yahoo servers for the yahoo.com homepage. All Grade-A browsers send a few useful things along with this homepage request: they tell the server, “I support gzip encoding, and if you serve me the content in gzipped format I will happily accept it, unzip it at my end and render it for the user.”
    • The Yahoo server sees these headers sent by the browser and, depending upon its configuration, gzips the content before serving it back to the browser. However, if Yahoo's servers are not configured for gzip encoding, they will serve the content in its raw form. Further, if Yahoo's servers are configured for gzip encoding but my browser isn't a Grade-A browser, the servers are smart enough to return the content in the form which suits my browser.
    • For instance, here are the request headers sent by my browser to the Yahoo servers when I type http://yahoo.com into my browser:
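      Here is a representative sketch of that exchange; the values are illustrative rather than the actual Yahoo capture:

        Request headers sent by the browser:
        GET / HTTP/1.1
        Host: www.yahoo.com
        Accept-Encoding: gzip,deflate

        Response headers sent back by the server:
        HTTP/1.1 200 OK
        Content-Type: text/html; charset=utf-8
        Content-Encoding: gzip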

      In the request above, we can very well see that my browser sends a header saying Accept-Encoding: gzip,deflate. On seeing this header, the Yahoo server responds as expected, i.e. it serves back the gzipped content to the browser.

      In the response headers sent by the Yahoo server, we can see the server telling the browser that it has served the content with Content-Encoding: gzip, so please unzip it at your end before rendering it for the user.

      So how does the end user's experience get enhanced?

      The Yahoo homepage, which is actually 125.1 KB, comes in at only 31.9 KB. The JavaScript files get compressed as well, which means faster data transfer through the channel and thus a faster experience for your site's users.

      Though gzipping the site content increases your server-side CPU load, that cost is compensated by the fact that users see your pages load faster.

    • What content should I gzip? It's always better to gzip JavaScript files, CSS files, HTML files and plain-text files (in some cases XMLs too, but I have had a few problems with gzipped XMLs). We generally don't compress images, PDFs etc., as they are already in a well-compressed format and compressing them further won't give us any smaller file sizes. A server-side configuration sketch follows below.
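      If you run Apache, as I do, a minimal .htaccess sketch using mod_deflate could look like the following; the module names and MIME types are assumptions about your setup, so adjust them to whatever your server actually serves:

        # Gzip text-based responses when mod_deflate is available
        <IfModule mod_deflate.c>
            AddOutputFilterByType DEFLATE text/html text/plain text/css
            AddOutputFilterByType DEFLATE text/javascript application/x-javascript
        </IfModule>
        # Tell proxies that the compressed response varies by Accept-Encoding
        <IfModule mod_headers.c>
            Header append Vary Accept-Encoding
        </IfModule>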
  2. Setting an ETag for your site's content: This is another factor which helps pages load faster for your users. With an ETag, the server sends along a short token identifying the exact version of each component it serves. So when I type http://yahoo.com in my browser and press Enter, my browser requests content from the Yahoo servers, quoting the ETags it received earlier for the components it already has in its cache. If the server replies that the ETag hasn't changed for a requested component, the browser simply reuses that component from its cache. This saves the data-transfer part, as the browser is able to cache plenty of static data from a single domain. (A representative exchange is sketched below.)

    In the captured response headers, Yahoo's servers don't reply with ETags, possibly for two reasons: advertisements, and the Expires headers which we will see in the next point. As Yahoo has advertisements on all its pages, it simply doesn't want that content to be cached by our browsers, so that every time we open yahoo.com we see a new advertisement. (Experts, kindly correct me if I am wrong in my explanation here.)
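    For components that do carry ETags, the revalidation exchange looks roughly like this (the file name and token below are illustrative):

      First response, as the component enters the browser cache:
      HTTP/1.1 200 OK
      ETag: "abc123"

      Next request for the same component:
      GET /lib/common.js HTTP/1.1
      If-None-Match: "abc123"

      Server's reply when nothing has changed (no body is transferred):
      HTTP/1.1 304 Not Modified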

  3. Setting up the Expires headers: This one even saves the browser the time spent pinging the server.

    In the captured headers we can see the Yahoo servers setting an Expires header, telling the browser that this component is not going to change before the given expiry date. Hence, the next time you type Yahoo.com into your browser and press Enter, the browser sees content which hasn't expired yet and picks it up from the browser cache. Thus it doesn't even ping the servers, saving your server's bandwidth and improving your users' page load time.

    What if I have set Expires headers two months from today and I want to change the content before that?
    Simple enough: next time, http://l.yimg.com/us.js.yimg.com/lib/bc/bc_2.0.4.js will be changed to something like http://l.yimg.com/us.js.yimg.com/lib/bc/bc_2.0.5.js, i.e. the version of the JavaScript file being served is bumped. The new URL forces the browser to download the fresh JavaScript file from the servers rather than picking the stale one from its cache.
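    On Apache, a minimal .htaccess sketch using mod_expires could look like this; the expiry windows below are assumptions that you should tune per content type:

      <IfModule mod_expires.c>
          ExpiresActive On
          # Versioned static assets rarely change: far-future expiry
          ExpiresByType application/x-javascript "access plus 2 months"
          ExpiresByType text/css "access plus 2 months"
          ExpiresByType image/png "access plus 2 months"
          # HTML changes often: keep its expiry short
          ExpiresByType text/html "access plus 1 hour"
      </IfModule>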

  4. Use a sub-domain or a different domain for serving static content: We can see in the earlier captures that Yahoo serves all its JavaScript files from a server called http://l.yimg.com, which is presumably one of its static servers. But why the hell do I need to do this? Because currently all Grade-A browsers allow only 2 parallel connections per host, i.e. while loading the page yahoo.com, my browser can download only two files at a time from the yahoo.com server. All other content has to queue up and wait for the in-flight downloads to finish before it gets downloaded.

    Hence, for the same reason, we generally serve our static content from one or more separate sub-domains or different domains altogether. In the future, Internet Explorer 8 and Firefox are planning to increase this limit of 2 requests per domain to 4-8 requests per domain.
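    In markup, the split is as simple as pointing asset URLs at the static host; static.example.com below is a hypothetical stand-in for something like l.yimg.com:

      <!-- The page HTML comes from www.example.com, but the heavy static
           assets come from a separate host, so their downloads don't
           compete for the same two connections -->
      <link rel="stylesheet" href="http://static.example.com/css/site.css" />
      <script src="http://static.example.com/js/lib_1.0.3.js"></script>
      <img src="http://static.example.com/img/logo.png" alt="logo" />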

  5. Minify your JavaScript and CSS files: What is minify? It is a technique popularized by Douglas Crockford. Minifying your site's JavaScript and CSS files serves two purposes at the same time: it decreases the size of the files, and it also obscures your JavaScript calls (makes them tougher for others to parse). A minified JavaScript file will look like this:
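    Here is a small, hypothetical before/after (the function is made up for illustration):

      // Before minification: readable, but full of bytes the browser doesn't need
      function addTotals(prices) {
          var total = 0;
          for (var i = 0; i < prices.length; i++) {
              total += prices[i];
          }
          return total;
      }

      // After minification: same behaviour, roughly half the size
      function addTotals(a){var b=0;for(var c=0;c<a.length;c++){b+=a[c]}return b}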

    We can see it shrinks the JavaScript file by eliminating unwanted white space and blank lines. Plus, it makes it more difficult for a hacker to tweak the JavaScript file.

  6. Put CSS at the top and JS at the bottom; avoid inline JavaScript: Putting CSS at the top ensures that the user's browser can render the page content as the HTML arrives. We want to put JavaScript at the bottom and avoid inline scripts because, while the page loads, we do not want the browser to stop and execute JavaScript, which can lock out other content from downloading until the script finishes. (A skeleton is sketched below.)
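    A minimal skeleton of this layout (the file names are placeholders):

      <html>
        <head>
          <!-- CSS first, so the browser can style content as it streams in -->
          <link rel="stylesheet" href="site.css" />
        </head>
        <body>
          <p>Visible content renders without waiting for any script.</p>
          <!-- Scripts last, so they don't block the downloads above -->
          <script src="site.js"></script>
        </body>
      </html>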

That's it! There are a lot of other things you may want to do to make your site's pages load faster, but the six points described above dominate page load time. I am no expert and am still learning the tid-bits of scalable websites, so if you find any misleading content, kindly leave a comment. I will be more than happy to edit the post.

PS: These are a few of the things we can do on the frontend to make our sites faster. In my next post of this series I will try to bring in the key backend things which can help webpages load faster.

Enjoy making cooler, faster webpages for your users 🙂

  • Varun

    Nice post; add a little extra about AJAX calls.
    You should also post some cool tutorials about Wireshark, dude. I use Fiddler, but I guess Wireshark does a better job for the same…

  • Hi Varun,

    No doubt AJAX provides faster accessibility within the site once it has loaded. However, what I have described is faster loading of your site in the first place. So the two things, though linked, are different.

    Further, AJAX and other fancy stuff are enhancements to your website; they are never the key features of your website. For instance, search is the key feature, and if you can provide auto-suggestion using AJAX, that's an enhancement. Thus, while developing your page, we should always ask: what if the user has JavaScript disabled?

    From my learning I will say: never build a site that kills itself if JavaScript is not enabled. Else, in the longer run, it's to your own harm (obviously, in cases like maps and a few others you simply can't ignore JavaScript).

  • manish

    If people are using Firebug, then they can use YSlow to check the performance of their web page.
    Also, Yahoo has published some guidelines:

    http://developer.yahoo.com/performance/rules.html

    Very correctly said. I use the following tools to analyze my site's speed and optimize it for the best:

    1. Firebug
    http://getfirebug.com/
    Mainly used by frontend developers for analyzing the DOM and changing it on the fly, but its Net panel also gives you page profiling.

    2. YSlow
    http://developer.yahoo.com/yslow/
    A tool developed at Yahoo under the guidance of Steve Souders (now a Googler). It grades your page against 14 basic rules on how to optimize your website.

    3. Cuzillion
    http://stevesouders.com/cuzillion/
    You can see the impact of positioning various things at various places using Cuzillion. You will be amazed to see how placing your JavaScript at the end makes your page load faster.

    Here is the link to his famous Even Faster Websites Google I/O video.

  • I think most of these points are mentioned in High Performance Web Sites (http://oreilly.com/catalog/9780596529307/).
    I think if you have taken some of your ideas from there, you need to credit the book/author.

  • Hi Shams,

    No, I haven't read that book, and I haven't taken anything from any book. What I have written is from my own understanding and learning.

    Further, I have been inspired by Steve Souders' talk, which I have posted above. 🙂

  • Ah, OK. Sorry if I have offended you.
    In fact, that book was written by Steve Souders 🙂 You should read it if you get the chance; it's a really good book. But I guess you already know most of what it says 🙂

  • Yes, in his talk he has covered almost all the important topics he discovered while he was at Yahoo. He also talks about his focus for the coming years, so his talk most probably covers more than what he has in the book. 🙂

  • Sowmya

    Very well documented! 🙂

  • varun

    Here is how what Abhi said can be used in ASP.NET…
    using System;
    using System.Collections;
    using System.Collections.Generic;
    using System.Configuration;
    using System.Globalization;
    using System.IO;
    using System.IO.Compression;
    using System.Reflection;
    using System.Web;

    public class ScriptHandler : IHttpHandler
    {
        private static readonly string _versionNo;
        private static readonly bool _compress;
        private static readonly int _cacheDurationInDays;

        private static readonly List<string> _files = new List<string>();

        public static string VersionNo
        {
            get { return _versionNo; }
        }

        public bool IsReusable
        {
            get { return true; }
        }

        static ScriptHandler()
        {
            var settings = ConfigurationManager.GetSection("scriptSettings") as Hashtable;

            if (settings != null)
            {
                _versionNo = settings["versionNo"].ToString();
                _compress = Convert.ToBoolean(settings["compress"], CultureInfo.InvariantCulture);
                _cacheDurationInDays = Convert.ToInt32(settings["cacheDurationInDays"], CultureInfo.InvariantCulture);

                var fileList = settings["files"].ToString();

                if (!string.IsNullOrEmpty(fileList))
                {
                    var files = fileList.Split(new[] { ';', ',' });

                    if (files.Length > 0)
                    {
                        var context = HttpContext.Current;

                        if (context != null)
                        {
                            foreach (var file in files)
                            {
                                _files.Add(context.Server.MapPath(file));
                            }
                        }
                    }
                }
            }
        }

        public void ProcessRequest(HttpContext context)
        {
            var response = context.Response;

            if (_files.Count == 0)
            {
                response.StatusCode = 500;
                response.StatusDescription = "Unable to find any script file.";
                response.End();
                return;
            }

            response.ContentType = "application/x-javascript";
            var output = response.OutputStream;

            // Compress: honour the browser's Accept-Encoding header
            if (_compress)
            {
                var acceptEncoding = context.Request.Headers["Accept-Encoding"];

                if (!string.IsNullOrEmpty(acceptEncoding))
                {
                    acceptEncoding = acceptEncoding.ToUpperInvariant();

                    if (acceptEncoding.Contains("GZIP"))
                    {
                        response.AddHeader("Content-encoding", "gzip");
                        output = new GZipStream(output, CompressionMode.Compress);
                    }
                    else if (acceptEncoding.Contains("DEFLATE"))
                    {
                        response.AddHeader("Content-encoding", "deflate");
                        output = new DeflateStream(output, CompressionMode.Compress);
                    }
                }
            }

            // Combine: write every configured script file into one response
            using (var sw = new StreamWriter(output))
            {
                foreach (var file in _files)
                {
                    var content = File.ReadAllText(file);
                    sw.WriteLine(content);
                }

                sw.Write("if(typeof(Sys)!='undefined'){Sys.Application.notifyScriptLoaded();}");
            }

            // Cache: set far-future expiry headers on the combined response
            if (_cacheDurationInDays > 0)
            {
                var duration = TimeSpan.FromDays(_cacheDurationInDays);

                var cache = response.Cache;

                cache.SetCacheability(HttpCacheability.Public);
                cache.SetExpires(DateTime.Now.Add(duration));
                cache.SetMaxAge(duration);
                cache.AppendCacheExtension("must-revalidate, proxy-revalidate");

                // Force the max-age value via reflection on the private _maxAge field
                var maxAgeField = cache.GetType().GetField("_maxAge", BindingFlags.Instance | BindingFlags.NonPublic);
                maxAgeField.SetValue(cache, duration);
            }
        }
    }

    This just increased performance in my current GIS project.

    Thanx abhi for the insight 🙂

  • Hi Varun, though I have little knowledge of ASP, from your code I can see that you are checking the request headers from the browser, and if the browser says “Yes, I accept gzip encoding”, you encode the data before sending.

    Is that right? If it is, then I must say one can control all this using .htaccess files and hence skip the coding part.

    I use an Apache module for gzipping and .htaccess for setting the ETags. You can set the Expires headers using the same too 🙂

  • varun

    Well, this is done without tampering with the server configuration… plus a generic handler assists a lot, especially in serving images quickly. I had to optimize this in a given app which was already configured on the backend for high performance… however, adding the gzip stuff and the headers did a lot more… I guess with a few more tweaks I cut response time down by 16%…

  • Yes, I guess this is the approach if you don't have access to the server config files, possibly in the case of free or shared hosting.

  • Thanks for the tips Abhi 🙂

  • Note that setting ETags incorrectly can be worse than leaving them alone.

  • Yeah, of course they can be, but if set correctly they work like a charm for your website's users 🙂

  • I would also recommend this free online website performance testing tool: http://Site-Perf.com/

    It measures the loading speed of a page and its requisites (images/JS/CSS) the way browsers do, and shows a nice detailed chart, so you can easily spot bottlenecks.
    It's very detailed and accurate, and supports a lot of features like Keep-Alive and HTTP compression.

    Another very useful thing is that this tool is able to verify the network quality of your server (packet-loss level and ping delays).

  • Nice advertisement for your blog, but it's really cool if your site can achieve all this.

    Will try it out in my free time.

  • Great post! I would like to add two free tools we make available to identify website speed over distance:
    1. Website Speed Test
    2. Latency Simulator – an open-source Internet Explorer add-on that simulates the effects of network latency.

  • Hello abhinav,

    I have designed all my websites in ASP. How do I enable the gzip option in IIS 5.1?

    Is there any other option to make my ASP websites download faster? Can we use .htaccess?

    Looking forward to hearing back from you.

    Thanks,
    Rahul.

  • Akshat Mishra

    Hello Abhinav

    This is nice information; it was helpful. Don't you think too many requests to the servers are going to slow down server performance anyway?

    • Hi Akshat,

      What amount of traffic are you referring to?

    • Akshat Mishra

      Well, I am talking about peak traffic.

    • Actually, that still doesn't answer my question; I meant figures in requests/sec. I have worked with PHP apps talking to memcached/MySQL and a couple of internal APIs per request, and still been able to serve 2500+ req/sec.

    • Akshat Mishra

      That's pretty much the load I was talking about. Sounds good.

  • Nice article Abhi