Your website could be a visually stunning conversion machine, but its appearance and functionality won't matter much if it takes too long to load. That's because web users are increasingly impatient, and their impatience is only likely to grow as tablet and mobile web usage skyrockets.

Unfortunately, the list of things that can cause users to flee a website is long, and in many instances, any one of them can be enough to turn a new customer into a lost opportunity.

Here are 20 of the worst items on that list.

1. Social sharing buttons

You want your users to 'Like' you on Facebook, but all those 'Like' and social sharing buttons that rely on external JavaScript files could be slowing your pages down significantly, particularly if you're not loading them asynchronously. They can also make entire sites effectively useless, as some retailers learned the hard way earlier this year.
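The usual fix is to make the embed non-blocking. An illustrative sketch (the markup is simplified; check your provider's documentation for its official asynchronous embed code):

```html
<!-- Blocking: parsing stops until the widget script downloads and executes -->
<script src="https://connect.facebook.net/en_US/all.js"></script>

<!-- Non-blocking: the async attribute lets the page keep rendering -->
<script async src="https://connect.facebook.net/en_US/all.js"></script>
```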

2. Ad network code

Ad revenue may be a boon for your bottom line, but ad network code, most of which is JavaScript-based, can make your pages slower. Some networks have implemented asynchronous delivery methods that keep their scripts from blocking other page resources, but many publishers still don't use them.

3. Analytics tags

Analytics is immensely valuable, but as with social sharing buttons and ad network code, JavaScript-based analytics tags can be a detriment to page load times. The good news: many analytics providers offer asynchronous delivery. The bad news: many publishers use multiple analytics providers unnecessarily, adding inefficiency that's entirely avoidable.

4. JavaScript-based functionality

JavaScript may be a necessary evil, particularly when it comes to ads and analytics. But plenty of sites employ JavaScript to provide functionality that is more efficiently provided server-side. One of the best examples: many sites use Disqus and Facebook for commenting, which is arguably the least efficient way of providing this basic functionality.

5. Unoptimized images

In many markets, fast internet connections may be the rule rather than the exception, but that doesn't mean you shouldn't optimize your images as thoroughly as possible. On an image-heavy page, even a modest 5-10% reduction in file size per image can have a noticeable impact.
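Command-line optimizers can automate most of that work. A sketch, assuming the open-source tools jpegoptim and optipng are installed and your images live in an images/ directory:

```shell
# Recompress JPEGs to a maximum quality of 85 (lossy, but usually imperceptible)
jpegoptim --max=85 images/*.jpg

# Losslessly recompress PNGs
optipng -o2 images/*.png
```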

6. Bloated HTML

Getting your website to look right in all of the major browsers can be difficult; getting it to look right with efficient HTML markup can be even more difficult. Bloated HTML not only increases the amount of data that must be transferred to your users, it can have a significant impact on JavaScript performance when you're manipulating the DOM.
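One way to keep those DOM-manipulation costs down, whatever the state of your markup: build the markup as a string and write it to the document once, rather than appending node by node. A minimal sketch (renderList and the element ID are illustrative, not from any particular library):

```javascript
// Build the full markup in one pass, then write it to the DOM in a single
// assignment -- one reflow instead of one per inserted element.
function renderList(items) {
  return '<ul>' + items.map(function (item) {
    return '<li>' + item + '</li>';
  }).join('') + '</ul>';
}

// In the browser (illustrative):
//   document.getElementById('nav').innerHTML = renderList(['Home', 'About']);
```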

7. CSS (crappy stylesheets)

Serving up efficient CSS is just as important as serving up efficient HTML. Unfortunately, crappy CSS is easily found on many sites. From the use of insanely expensive descendant selectors to cutting-edge CSS3 selectors that are as performance-harming as they are useful, there are plenty of ways your CSS can be slowing your pages down.
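For example (the selectors are hypothetical): browsers match selectors right to left, so a long descendant chain forces a walk up the ancestor tree for every candidate element, while a single class is a cheap lookup.

```css
/* Expensive: every <a> on the page triggers a walk up its ancestor chain */
#nav ul li div a { color: #333; }

/* Cheap: one class lookup per element */
.nav-link { color: #333; }
```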

8. Amateur jQuery

jQuery is the most popular JavaScript library and for good reason: it makes building sophisticated JavaScript functionality a snap. But that doesn't mean that everyone working with jQuery uses it efficiently. From event handlers to DOM manipulation, there are plenty of areas of jQuery where it's easy for less experienced developers to wreak havoc.
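Two of the most common offenders, sketched below (the selectors and the onRowClick handler are hypothetical): re-querying the DOM repeatedly, and binding one handler per element instead of delegating a single handler to a parent.

```javascript
// Inefficient: queries the DOM, then attaches a separate handler to every row
$('#orders tr').each(function () {
  $(this).click(onRowClick);
});

// Better: cache the lookup once and delegate one handler to the table,
// which also covers rows added later
var $orders = $('#orders');
$orders.on('click', 'tr', onRowClick);
```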

9. Subpar DNS

Many hosting companies and domain name registrars offer free DNS, but you often get what you pay for. While it's easy to take DNS for granted, you shouldn't: in some instances DNS lookups may account for a significant portion of load time.

10. Too many domains

Hosting page components across a few domains will enable parallel downloads -- a good thing -- but if components of your page are hosted across too many domains, expect the time required to resolve DNS to have a negative impact on page loading times.

11. Header fail

If you don't set a proper Expires or Cache-Control header, you're making it harder for users' browsers to cache content locally. That means more requests for page components than necessary.
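The idea, sketched in JavaScript (cacheHeaders is an illustrative helper, not a standard API): tell the browser explicitly how long a component can be reused, with a far-future max-age for static assets whose filenames are versioned.

```javascript
// Illustrative sketch: far-future caching headers for static assets.
// A one-year TTL is a common choice when asset filenames are fingerprinted.
function cacheHeaders(maxAgeSeconds) {
  return {
    'Cache-Control': 'public, max-age=' + maxAgeSeconds,
    'Expires': new Date(Date.now() + maxAgeSeconds * 1000).toUTCString()
  };
}
```

With headers like these in place, a returning visitor's browser can reuse the component from its local cache without making a single request.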

12. Forgetting to Gzip it up

If your server isn't configured to apply Gzip compression to HTML, CSS and JavaScript, you're probably transferring a meaningfully larger amount of data than you have to.

13. Poorly-written server-side code

You can do everything right when it comes to serving a rendered web page to a browser, but if it takes forever to generate that page because the web application behind your website isn't performant, all of your efforts on the front end are for naught.

14. SQL queries from hell

If your web application isn't blazing fast, there's a decent chance the problem is a SQL query. From inefficient joins to selects that scan large numbers of rows without an index, there are plenty of ways to implement a SQL query poorly.
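For example (the table and column names are hypothetical; the EXPLAIN syntax shown is MySQL's):

```sql
-- Without an index, this scans every row of a large table to find one customer
SELECT * FROM orders WHERE customer_email = 'user@example.com';

-- Inspect the plan, then add an index on the filtered column
EXPLAIN SELECT * FROM orders WHERE customer_email = 'user@example.com';
CREATE INDEX idx_orders_customer_email ON orders (customer_email);
```

Selecting only the columns you actually need, rather than *, helps as well.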

15. Old versions of server-side software

From PHP to Java, popular server-side software on which many websites run has historically improved over time, so if you're stuck on older versions, there's a good chance you're not taking advantage of performance gains that could be realized with little effort beyond an upgrade.

16. The wrong web server

Web servers such as Apache and IIS may be mature, viable options for hosting a website. But when it comes to scaling and speed, many of the world's biggest sites turn to speed demons like Nginx and lighttpd -- web servers your hosting company or sysadmin is probably less likely to use by default.

17. Flash and Java

While the number of sites requiring the use of third-party browser plug-ins like Flash and Java has decreased significantly over the years, Flash and Java aren't dead and they still deserve a place on this list.

18. Shared web hosts

Thanks to the ever-decreasing cost of hardware, purchasing or leasing powerful servers no longer has to be a bank-breaker. But even so, many companies continue to host their websites in shared environments where somebody else's poorly-performing application turns your application into a poorly-performing application. Worth noting: this also includes cloud environments, where disk IO in particular has been an area of concern.

19. Public networks for private data transfer

One of the first things many publishers do when they need to scale is split their web and database servers. But many don't connect the two over a private network, creating a huge bottleneck and point of failure that can easily hurt website performance.

20. Inefficient server-side caching

Thanks to open source tools like Memcached and Redis, many publishers employ server-side caching, one of the most effective ways of improving web application performance. But not all cache implementations are created equal, and a poorly thought out cache invalidation strategy can render caching entirely useless.
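The invalidation half is the part that most often goes wrong. A minimal in-process sketch (a Map standing in for Memcached or Redis; createCache and its method names are illustrative):

```javascript
// Sketch of a cache with explicit invalidation.
function createCache() {
  var store = new Map();
  return {
    get: function (key) { return store.get(key); },
    set: function (key, value) { store.set(key, value); },
    // Call this whenever the underlying data changes; stale entries that
    // never expire can be worse than no cache at all.
    invalidate: function (key) { store.delete(key); }
  };
}

var cache = createCache();
cache.set('user:42', { name: 'Alice' });
cache.invalidate('user:42'); // e.g. after the user record is updated
```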

Published 7 August, 2012 by Patricio Robles

Patricio Robles is a tech reporter at Econsultancy. Follow him on Twitter.


Comments (12)

Stuart McMillan

Stuart McMillan, Deputy Head of Ecommerce at Schuh

In large part, I would agree with this, but surely 'too many requests (from one domain)' should come in at number one? By employing techniques like using CSS sprites and base64 encoding images, dramatic improvements can be seen, much more than you would typically see from making DNS improvements.

about 6 years ago



In the past for me this has been where I have put too much graphical content / JPEGs into the coding. After all, it does overload the page.

about 6 years ago





David Rankin

Fortunately, Google's Webmaster Tools gives us information about page loading speeds. Point 10 is just weird - I can't imagine anyone ever deciding to "host page components across a few domains to enable parallel downloads". This leaves you vulnerable several times over to slow loading or loading failure.

about 6 years ago



Good list. You could sum much of it up by simply saying "JavaScript".

As you note, finding a good server that allows full control of gzip and other variables is worth the money, especially when hosting multiple sites.

about 6 years ago

Stuart McMillan

Stuart McMillan, Deputy Head of Ecommerce at Schuh

@David Rankin, it's a common practice to use multiple domains or sub-domains, for example, using Akamai or Edgecast CDNs, or indeed just an images. subdomain of your existing site. Most web browsers will download 6 items in parallel, only starting the 7th when the first of the 6 has finished. Being able to download another 6 by using another domain can make a massive improvement. Yes, DNS has to be watched, but these reputable CDNs will typically have world-class DNS. Also, their routing to the end user (fewer hops, for example) will typically be much better than your average ASP's. However, all that aside, the fastest request is the one you never make in the first place.

about 6 years ago


Deri Jones, CEO at SciVisum Ltd

All good tips Patricio.

But instead of diving into the list, you'll get better ROI and better engagement with your tech teams if you arrange for measurement first, and then only ask for time and effort where you have genuine issues that the tech team can agree on too.

I.e. you'll get the best ROI by focusing your resources.

Simple page speed measuring tools like Google's or Firebug et al are a great, easy way to get started.

But what is most important to measure is speed for the multi-page Journeys through your site that are your money-makers: eg "Find a Product and Add to basket by navigating". Or 'Checkout' etc - using virtual user mystery-shopper style monitoring approaches.

Like I said on another thread here yesterday - using an evidence based approach is best - and in terms of your site's User Experience, the speed of a page here and page there may have no bearing on the overall speed of what users actually do, their multi-page Journeys.

Thinking mobile and tablet - I saw some interesting figures last month where there were significant differences between a main site and its mobile version, which were quickly fixed once spotted: so do measure your mobile / flavoured RWD sites too.

They had sadly focused on optimising the search page, but ended up with the click after that being very variable in performance on mobile, depending on which product on the results list you clicked and how long it happened to be since someone else had last searched for it!

To paraphrase the carpenter's adage: "Measure twice and optimise once"!

about 6 years ago


Geoff Paddock

Hands up anyone reading this who was thinking "all this is a little bit technical and is there perhaps an easier way to make my pages faster?" I suggest non-technical web managers might like to think about using an automated external service which can help by pinpointing the actual cause of slow-serving pages.

With websites today featuring hundreds or even thousands of pages, manual testing of pages is an expensive option in terms of man-hours, and you will need a large web team if you plan on doing it in-house. Typically it can take 4-6 minutes to check a single page.

We know that a response time above 0.75 seconds to load a page can have a negative effect on user experience. Yet in a recent survey of the top UK retail companies, Sitemorse still found a number of big high street names with web pages that took well over a minute to load.

about 6 years ago


Mystery Shopping

My site loads very slowly after I added the social buttons, so I decided to add the social media share buttons only on my blog. And thanks for the other useful tips mentioned above.

about 6 years ago


Kathy Heslop

And e-commerce merchants, watch out for synchronous JavaScript capture code, which will also slow your load time down, let alone could bring your site down! You need asynchronous code. How to check: for a script to load asynchronously, its script tag must carry the async attribute, so look for the word 'async' in the embed code. If it's not there, the code your third-party provider is using is synchronous.

almost 6 years ago


Linda Waller, Retired at Ms.

I hate when a web page requires JavaScript in order to be viewed, then takes forever to load or constantly reloads without my being able to see the content I want to see. Lots of times the only thing JavaScript is needed for on these pages is to place ads on the page. Sometimes it is used to show pictures as well. There are better & faster ways to show pictures. This is unacceptable and a waste of my time. I want to be able to get to the information I am looking for and be done with it. I can't sit and wait several minutes or longer to see the information I'm looking for. When this happens I go to other webpages that load the information I'm looking for faster. This to me is disrespectful to me & my time.

about 2 years ago


kevin hansen, SEO Content Writer at logic web media

This was very helpful for some research I'm doing. Thanks for writing it!

over 1 year ago
