I am planning to do a series of posts analyzing popular websites I visit frequently. By now I am so obsessed with optimizing for speed that I've run out of sites of my own to optimize, so I'm doing it for other sites.
To start with I’ll analyze Search Engine Roundtable.
Before I start, I want to make it absolutely clear that I am not looking to make the site owners or webmasters look bad. Barry Schwartz, editor of SERoundtable.com, tells me "The site is old and is being completely revamped from bottom up."
About SERoundtable.com :-
Traffic Rank: 5,738 (Alexa), 16,804 (compete)
Browsers : Assuming fewer than average Internet Explorer users, given the nature of the target demographic.
My assumption is that the site targets regular (loyal) readers, so I place more emphasis on repeat views and caching.
The website is slow; I access it regularly and lose productivity while waiting for pages to load.
| Page | Start Render | Load Time |
| --- | --- | --- |
| Home Page – First View | 1.894s | 9.733s |
| Home Page – Repeat View | 0.914s | 4.619s |
| Individual Post – First View | 1.539s | 11.855s |
| Individual Post – Repeat View | 0.858s | 5.171s |
Web Speed Rankings (Using Show Slow):-
| Page | YSlow grade | Page Speed score |
| --- | --- | --- |
| Homepage | D (65) | C (74) |
| Individual Post | D (64) | C (75) |
Visual Latency (tested from Dulles, VA using IE7):
Homepage : 2.5 seconds
Post Page : 2.6 seconds
(The timing I've used here is the time at which the page becomes somewhat usable, which by my definition is the moment the content title and body are shown on the screen. This is when the user can start engaging with the site while the rest loads.)
So for a whole 2.5 seconds the user is doing nothing but staring at an almost blank screen.
So we have a problem here: unoptimized websites load dramatically slower from across the globe, for example from Thailand, especially on poor ISPs. So what's the solution?
Firstly, kudos to SERoundtable.com for using Source Ordered Content (SOC). This ensures that the title and text of the post are the first things to render on the page, making it less susceptible to Frontend SPOF. SOC here may or may not have been chosen for SEO purposes, but in terms of page speed it is a valuable optimization. Also, try to move the ad slot SER-HOME-468×60-1 to below the content.
Now the things that need to be changed (in order of importance as I see it) :-
- Minimize the number of requests : The time taken to load the page is roughly proportional to the number of requests. Most people these days have high-speed internet access, which gives them more bandwidth but not lower latency. The following steps would reduce the number of requests.
- Disable ETags and add far-future cache headers for static objects (Easy/Urgent/Obvious): 49 of the 50 requests made to www.seroundtable.com are for infrequently changing static files (CSS, images, etc). Currently the browser sends an If-Modified-Since request for each of them to check whether it has changed. My advice is to send Expires headers at least 7 days (> 30 days preferred) in the future, which would make repeat views amazingly fast; cache control is quite easy to implement in Apache. ETags don't serve much purpose here: they cost a tiny bit of CPU on each request for what is, IMHO, useless information.
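A sketch of what this could look like in an Apache .htaccess or vhost config, assuming mod_expires and mod_headers are available (the 30-day window is just my suggestion):

```apache
<IfModule mod_expires.c>
    ExpiresActive On
    # Far-future expiry for the infrequently changing static files
    ExpiresByType image/gif  "access plus 30 days"
    ExpiresByType image/png  "access plus 30 days"
    ExpiresByType image/jpeg "access plus 30 days"
    ExpiresByType text/css   "access plus 30 days"
</IfModule>

# Drop ETags entirely -- the Expires headers above make them redundant
FileETag None
Header unset ETag
```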
- Use CSS Sprites (Easy – Intermediate): Requests for the following files can easily be sprited (after converting everything to either GIF or PNG, PNG preferred): nav_previous.gif, nav_next.gif, h2.gif, 3dots.gif, pen.png, sphinn-favicon.png, Twitter-16×16.png, digg-favicon.jpg, barry.png, entry-footer.png, comforminput.gif, comformtextarea.gif, sbauthors.gif, sbarchives.gif, sbmoreinfo.gif, mbg.jpg, sbhome.gif, sbourforums.gif, delicious-favicon.png, google-favicon-16.png, sbtheforums.gif, arrow_yellow.gif, rss.gif, foo.jpg, credits.jpg. That's a reduction of 24 requests!
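A hypothetical sprite setup for the social icons (the sprites.png name, the vertical-strip layout, and the class names are mine, not from the actual stylesheet):

```css
/* One combined image replaces many tiny icon requests.
   Assumes the 16x16 icons are stacked vertically, 16px apart. */
.icon           { background: url(/images/sprites.png) no-repeat; width: 16px; height: 16px; }
.icon-twitter   { background-position: 0 0; }
.icon-digg      { background-position: 0 -16px; }
.icon-delicious { background-position: 0 -32px; }
```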
- Use Data-URI for the tiny background files (Advanced/Optional): Embed the tiny background images directly inside the CSS using the Data URI Scheme. The drawback is that IE7 and below don't support this; the workaround is to use the star hack to send them the current background settings. Another drawback is that this increases the size of the CSS file, but the total size after compression (see below) would be about the same as the current total. This optimization removes the need for the following files in all non-IE browsers and IE8: sbbg.gif, formbg.gif, mbg.jpg, sbhbg.gif, sboxbg.gif, tbbg.gif, rcredbg.gif, rcbg.gif, threadlibg.gif, bg.gif, hbgleft.jpg, msep.gif, fbg.gif. That's an elimination of 13 requests! Another advantage is that the CSS file starts downloading well before rendering begins, so by the time rendering starts these files are already available, making the background graphics appear almost instantly without additional blocking of the rest of the page.
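Roughly what the CSS would look like (the selector and the base64 payload here are placeholders, not the site's real rules):

```css
.sidebar {
    /* Modern browsers and IE8 read the image straight from the CSS */
    background-image: url(data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACH5BAEAAAAALAAAAAABAAEAAAICRAEAOw==);
    /* Star hack: IE7 and below ignore data URIs,
       so they keep fetching the plain file */
    *background-image: url(/images/sbbg.gif);
}
```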
So out of the 50 requests made to www.seroundtable.com, 37 can be completely eliminated on first view; on repeat views, everything but the base page should come straight from cache.
- Enable Keep-Alive (Easy/Obvious): An uncached page load makes about 50 requests to www.seroundtable.com, and for each of these the client must open a new connection to the server. Enabling Keep-Alive gets the job done over far fewer connections (how many depends on the browser's parallel-connection policy), saving dozens of round-trips. Keep-Alive does increase memory utilization on the server, but the benefits are greater. See this chart: the brown portion of the load time can be removed for ~90% of the requests. Here are some instructions for Apache.
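For reference, the relevant httpd.conf directives might look like this (the numbers are starting points that should be tuned to the server's memory budget):

```apache
KeepAlive On
# How many requests may reuse a single connection
MaxKeepAliveRequests 100
# Close idle connections quickly to keep memory use in check
KeepAliveTimeout 5
```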
- Optimize each request : Reduce the size of each request/response using the following methods
- Gzip text assets (Easy/Obvious): About 75% of the data transfer for the base page and the CSS file can be eliminated by simply enabling gzip on Apache.
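With mod_deflate on Apache 2.x this can be as simple as the following (the MIME type list is my guess at what the site serves; images are skipped since they are already compressed):

```apache
<IfModule mod_deflate.c>
    # Compress the text assets only
    AddOutputFilterByType DEFLATE text/html text/css application/x-javascript
</IfModule>
```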
- Compress images (Easy): My favorite tool for this is Smush.it by Yahoo! Similar results can be achieved with, say, GIMP or Photoshop.
- Use a cookie-less domain for static content (Easy/Important): Serve static files (images/CSS/etc.) from a separate domain. Currently the site sets cookies, which the browser automatically sends with every subsequent request to the server, even when requesting static files. This cookie information is absolutely useless to the server and bloats every request. The workaround is to host these files on a dedicated cookieless domain, say www.serassets.com or something. This tip may not sound important (in fact I used to ignore it), but once implemented the effects are HUGE.
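In practice the markup would simply reference the new domain (www.serassets.com being the made-up name from above, and the file paths placeholders):

```html
<!-- Static assets served from a domain that never sets cookies,
     so requests for them carry no cookie headers -->
<link rel="stylesheet" type="text/css" href="http://www.serassets.com/css/style.css">
<img src="http://www.serassets.com/images/barry.png" alt="Barry Schwartz">
```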
- Onpage/rendering optimizations:
- Choose <link> over @import (Easy): Yahoo! explains it better than I can. @import hurts usability in IE because it delays stylesheet download and thus rendering.
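To illustrate the difference (style.css is a placeholder path):

```html
<!-- Prefer this: the browser can start downloading the CSS immediately -->
<link rel="stylesheet" type="text/css" href="/css/style.css">

<!-- Avoid this: IE defers the download to the bottom of the page,
     delaying rendering -->
<style type="text/css">
  @import url("/css/style.css");
</style>
```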