Page Speed Analysis: Search Engine Roundtable

I am planning to do a series of posts analyzing popular websites I visit frequently. By now I am so obsessed with optimizing for speed that I've run out of sites of my own to optimize, so I'm doing it for other sites. To start with, I'll analyze Search Engine Roundtable. Before I begin, I want to make it absolutely clear that I am not looking to make the site owners or webmasters look bad. Barry Schwartz, editor of SERoundtable.com, tells me "The site is old and is being completely revamped from bottom up."

About SERoundtable.com:

Traffic Rank: 5,738 (Alexa), 16,804 (Compete)
Browsers: Assuming fewer than average Internet Explorer users, given the nature of the target demographic. My assumption is that the site targets regular (loyal) readers, so I place more emphasis on repeat views and caching.

The Problem

The website is slow; I access it regularly and lose productivity while waiting for the pages to load. Page load times (using WebPageTest.org - here and here):
Page                           Start Render   Load Time
Home Page - First View         1.894s         9.733s
Home Page - Repeat View        0.914s         4.619s
Individual Post - First View   1.539s         11.855s
Individual Post - Repeat View  0.858s         5.171s
Web speed rankings (using Show Slow):
Page              YSlow Grade   Page Speed Score
Homepage          D (65)        C (74)
Individual Post   D (64)        C (75)
Visual latency (tested from Dulles, VA using IE7):

Homepage    2.5 seconds
Post Page   2.6 seconds

(The timing I've used here is the time at which the page becomes somewhat usable, which by my definition is when the content title and body are shown on the screen. This is the point at which the user can start engaging with the site while the rest loads.) So for a whole 2.5 seconds the user is doing nothing but staring at an almost blank screen. We have a problem here: unoptimized websites open far slower from across the globe, and here in Thailand especially so when using crappy ISPs. So what's the solution?

The Solution

Firstly, kudos to SERoundtable.com for using Source Ordered Content (SOC). This ensures that the title and text of the post are the first things to render on the page, making it less susceptible to frontend SPOF. The reason for SOC here may or may not be SEO, but in terms of page speed it is a valuable optimization. Also, try to move the ad slot SER-HOME-468x60-1 to below the content. Now, the things that need to be changed (in order of importance, as I see it):
  1. Minimize the number of requests : The time taken to load the page is roughly proportional to the number of requests. Most people these days use high-speed Internet access, which gives them more bandwidth but not lower latency. The following steps need to be implemented to reduce the number of requests.
    1. Disable ETags and add far-future cache headers for static objects (Easy/Urgent/Obvious): 49 of the 50 requests made to www.seroundtable.com are for infrequently changing static files (CSS, images, etc.). Currently the browser sends an If-Modified-Since request for each of them to check whether it has changed. My advice is to send Expires headers set at least 7 days (more than 30 days preferred) into the future; this would make repeat views amazingly fast, and cache control is quite easy to implement in Apache. ETags don't serve much purpose here: they cost a tiny bit of CPU per request for what is, IMHO, useless information.
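    A minimal .htaccess sketch of both changes (assuming mod_expires and mod_headers are available; the file types and the 30-day window are illustrative, not the site's actual policy):

        # Stop generating and sending ETags
        FileETag None
        Header unset ETag

        # Far-future Expires headers for static assets
        <IfModule mod_expires.c>
            ExpiresActive On
            ExpiresByType image/gif  "access plus 30 days"
            ExpiresByType image/png  "access plus 30 days"
            ExpiresByType image/jpeg "access plus 30 days"
            ExpiresByType text/css   "access plus 30 days"
            ExpiresByType application/javascript "access plus 30 days"
        </IfModule>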
    2. Use CSS sprites (Easy - Intermediate): Requests for the following files can easily be sprited (after converting everything to either GIF or PNG - PNG preferred): nav_previous.gif, nav_next.gif, h2.gif, 3dots.gif, pen.png, sphinn-favicon.png, Twitter-16x16.png, digg-favicon.jpg, barry.png, entry-footer.png, comforminput.gif, comformtextarea.gif, sbauthors.gif, sbarchives.gif, sbmoreinfo.gif, mbg.jpg, sbhome.gif, sbourforums.gif, delicious-favicon.png, google-favicon-16.png, sbtheforums.gif, arrow_yellow.gif, rss.gif, foo.jpg, credits.jpg. That's a reduction of 24 requests!
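    A minimal sketch of the pattern (sprites.png and the pixel offsets are hypothetical; real offsets depend on how the icons are laid out in the combined image):

        /* One combined image replaces many individual icon requests */
        .icon           { background: url(sprites.png) no-repeat; width: 16px; height: 16px; }
        /* Each class picks its icon out of the sheet by offset */
        .icon-twitter   { background-position: 0 0; }
        .icon-digg      { background-position: -16px 0; }
        .icon-delicious { background-position: -32px 0; }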
    3. Use data URIs for the tiny background files (Advanced/Optional): Embed the tiny background images directly inside the CSS using the data URI scheme. One drawback is that IE7 and below don't support this; the workaround is to use the star hack to send them the current background settings. Another drawback is that it increases the size of the CSS file, but the total size after compression (see below) would be about the same as the current total. This optimization removes the need for the following files in all non-IE browsers and in IE8: sbbg.gif, formbg.gif, mbg.jpg, sbhbg.gif, sboxbg.gif, tbbg.gif, rcredbg.gif, rcbg.gif, threadlibg.gif, bg.gif, hbgleft.jpg, msep.gif, fbg.gif. That's an elimination of 13 requests! Another advantage is that the CSS file starts downloading well before rendering starts, so by the time rendering begins these backgrounds are already available, and the background graphics appear almost instantly without additional blocking of the rest of the page.
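    A sketch of the technique (the selector is hypothetical, the base64 payload is truncated for illustration, and sbbg.gif is one of the files listed above):

        .sidebar {
          /* Non-IE browsers and IE8 use the embedded image: no extra request */
          background-image: url(data:image/gif;base64,R0lGODlhAQABAIAAAMzMzAAAACH5...);
          /* Star hack: only IE7 and below apply this line and keep fetching the file */
          *background-image: url(sbbg.gif);
        }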
    So out of the 50 requests made to www.seroundtable.com, 37 can be completely eliminated for the first view; for repeat views, almost everything but the base page should come from cache.
  2. Enable Keep-Alive (Easy/Obvious): An uncached page load makes about 50 requests to www.seroundtable.com, and for each of them the client must open a new connection to the server. Enabling Keep-Alive gets the job done over far fewer connections (how many depends on the browser's parallel-connection policy), saving dozens of round trips. Keep-Alive does increase memory utilization on the server, but the benefits are greater. See this chart: the brown portion of the load time can be removed for ~90% of the requests. Here are some instructions for Apache.
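  For reference, the relevant httpd.conf directives (the values here are illustrative, not tuned for this server):

      # Reuse each TCP connection for multiple requests
      KeepAlive On
      # Plenty of requests per connection before it is closed
      MaxKeepAliveRequests 100
      # Close idle connections quickly to limit the memory cost
      KeepAliveTimeout 5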
  3. Optimize each request : Reduce the size of each request/response using the following methods:
    1. Gzip text assets (Easy/Obvious): About 75% of the data transfer for the base page and the CSS file can be eliminated by simply enabling gzip on Apache.
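    A minimal mod_deflate sketch (assumes mod_deflate is loaded):

        <IfModule mod_deflate.c>
            # Compress HTML, CSS and JS; images are already compressed
            AddOutputFilterByType DEFLATE text/html text/css application/javascript text/javascript
        </IfModule>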
    2. Compress images (Easy): My favorite tool for this is Smush.it by Yahoo!. Similar results can be achieved with, say, GIMP or Photoshop.
    3. Use a cookieless domain for static content (Easy/Important): Serve static files (images/CSS/etc.) from a separate domain. Currently the site sets cookies, which the browser automatically sends with every subsequent request to the server, including requests for static files. This cookie information is absolutely useless to the server and inflates the request size. The workaround is to host these files on a dedicated cookieless domain, say www.serassets.com or something. This tip may not sound important (in fact I used to ignore it), but once implemented the effects are HUGE.
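    Once the assets are mirrored, the change in the markup is trivial (the path is illustrative, and www.serassets.com is just the hypothetical name from above):

        <!-- Before: the browser attaches www.seroundtable.com cookies to this request -->
        <img src="http://www.seroundtable.com/images/barry.png" alt="Barry">

        <!-- After: no cookies exist on the assets domain, so none are sent -->
        <img src="http://www.serassets.com/images/barry.png" alt="Barry">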
    4. Minify code (Easy/Optional): Minify the HTML, and run the inline JavaScript through Google's Closure Compiler for faster execution. Smaller files also consume fewer resources in the browser.
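    Assuming the inline code has been moved to an external file (see point 4.2), a sketch of the Closure Compiler invocation (file names are placeholders):

        # SIMPLE_OPTIMIZATIONS compacts the code without breaking external callers
        java -jar compiler.jar --compilation_level SIMPLE_OPTIMIZATIONS --js site.js --js_output_file site.min.js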
    5. Use a CDN (Easy): A CDN (Content Delivery Network) maintains POPs (Points of Presence) around the globe and can serve the static files (images, JavaScript, CSS, etc.) from a server closer to the visitor. The charges are not much. If SERoundtable.com used, say, Internap, I would be fetching the static files from a server in Singapore instead of the US; with Akamai, I'd be fetching them from a server within my ISP's network! Choose any from this list of CDNs.
  4. Onpage/rendering optimizations:
    1. Choose <link> over @import (Easy): Yahoo! explains it better than I can; in IE, @import delays rendering and hurts usability.
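    The two forms side by side:

        <!-- Avoid: IE defers downloading @import-ed stylesheets, delaying rendering -->
        <style type="text/css">@import url("styles.css");</style>

        <!-- Prefer: downloaded immediately, without blocking other downloads -->
        <link rel="stylesheet" type="text/css" href="styles.css">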
    2. Move the inline JavaScript into an external file (Easy): This makes the page load faster for repeat visitors, since the base page becomes lighter and the JavaScript file will already be cached in the client's browser (or at an upstream proxy). Make sure to minify the JavaScript (see point 3.4). A sketch follows the next point.
    3. Move JavaScript to the end (Intermediate): JavaScript files/code that are not needed during page rendering should be moved to the bottom of the page, just before </body>, or to just before the point where they are needed.
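    A sketch combining this point with the previous one (site.min.js is a hypothetical name for the externalized, minified inline code):

        <body>
          <!-- Content renders first -->
          <div id="content">...</div>

          <!-- Non-critical script: external (cacheable) and last (non-blocking) -->
          <script type="text/javascript" src="http://www.serassets.com/js/site.min.js"></script>
        </body>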
    4. Use the async Google Analytics code (Easy): The new method really doesn't block the rest of the page, whereas the current document.write method in the <head> section blocks rendering in many browsers.
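    For reference, Google's asynchronous snippet (UA-XXXXX-X is a placeholder for the site's own account ID):

        <script type="text/javascript">
          var _gaq = _gaq || [];
          _gaq.push(['_setAccount', 'UA-XXXXX-X']);
          _gaq.push(['_trackPageview']);

          (function() {
            // ga.js is injected with the async attribute, so it never blocks rendering
            var ga = document.createElement('script');
            ga.type = 'text/javascript'; ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0];
            s.parentNode.insertBefore(ga, s);
          })();
        </script>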
    5. Move the DFP code lower (Intermediate): Right now the JavaScript code for DFP sits in the <head> section, which creates 3 serial blocking requests that must be downloaded and executed before the rest of the elements can download and render, blocking the page for 500 to 1000 ms. My suggestion is to move it to just above the first ad block (the GA_googleFillSlot() calls) in the HTML. Combined with the current Source Ordered Content architecture, this will ensure that users feel minimal impact even if DFP has problems.
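    Roughly what the move looks like. GA_googleFillSlot() and the slot name SER-HOME-468x60-1 appear on the page itself; the surrounding bootstrap calls are my sketch of the legacy Google Ad Manager tags, and the publisher ID is a placeholder:

        <!-- Post title and body sit above this point, so they render first (SOC) -->

        <!-- DFP bootstrap, moved out of <head> to just above the first ad block -->
        <script type="text/javascript" src="http://partner.googleadservices.com/gampad/google_service.js"></script>
        <script type="text/javascript">
          GS_googleAddAdSenseService("ca-pub-XXXXXXX");
          GS_googleEnableAllServices();
          GA_googleAddSlot("ca-pub-XXXXXXX", "SER-HOME-468x60-1");
          GA_googleFetchAds();
        </script>

        <!-- First ad block -->
        <script type="text/javascript">GA_googleFillSlot("SER-HOME-468x60-1");</script>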
    6. Try using DFP's iframe tagging (Easy but Experimental/Risky): The major drawback is that it may break expandable ads, which I don't think SERoundtable.com uses. In the video above, starting from about the 4-second mark, you can see the 6 ad blocks load one by one. With iframe tagging they would load independently, blocking neither each other nor the rest of the page, which is major protection against frontend SPOF caused by 3rd-party ad networks. Personally, I am impressed with the results of this method.
    7. Domain sharding (Advanced): Split requests across several hostnames, such as static1.seroundtable.com, static2.seroundtable.com, static3.seroundtable.com, etc., to increase parallel downloads. The downside is the extra DNS lookups this causes; fortunately, there is a workaround to this issue.
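    A sketch of sharded asset URLs (the hostnames follow the examples above; the paths are illustrative). As a general rule of thumb, two or three shards capture most of the parallelism while keeping the DNS overhead low:

        <!-- Each hostname gets its own pool of parallel connections -->
        <link rel="stylesheet" href="http://static1.seroundtable.com/css/main.css">
        <script type="text/javascript" src="http://static2.seroundtable.com/js/site.min.js"></script>
        <img src="http://static1.seroundtable.com/images/barry.png" alt="Barry">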

Conclusions

Apart from the tips above, make sure to read the best practices for faster websites. Most of what I've suggested is simply a restatement of those best practices, tailored to this particular website. Lots more can be done, but compared to what I've mentioned above, the effect of additional optimizations may not be worth the time. Just a reminder that I've been told SERoundtable.com is revamping the entire site and it "is really a full revamp from the CMS to the html on the frontend." Implementing most of the above tips would make at least one reader of SERoundtable.com very happy :). Total time spent: >= 5 hours. I'd appreciate any criticism of my analysis via the comments below. A word of caution to other optimizers: don't get over-obsessed with YSlow, Google Page Speed, or page load times; it's the visitor's experience that matters most. Optimize for visual latency more than for bragging rights!

Shameless Plug

I am available for consulting on web speed issues; contact details are in the right sidebar.
Tags: analysis pagespeed reviews site performance
Categories: Webmaster Things