Measuring a government website is very different from measuring a commercial one, since a government website’s job isn’t to make money for the Government (well, not directly).

This lack of a bottom line often makes it difficult to justify changes to the website, because there is no direct profit for the owner. One way to overcome this is to ensure you have a robust measurement process for the website, so you can monitor changes over time and attribute them correctly.

Measuring the success of government websites is hard. To prove how hard it is, you only need to look as far as the UK Government’s web standards and guidelines, published by the Central Office of Information (COI).

However, you could argue that the COI guidelines are merely a way of allowing the Government to create a comparative view of all its websites and to ensure that departments are doing something to measure them. They are also a very useful way of getting government departments (and of course the UK public) to think about whether government websites are performing or not.

The American Government looks like it might have cracked the problem, with a system that works by looking at the differences between outputs and outcomes.

Outputs online are the direct effects of the website: the things that people do on the site, how long it takes them to do them and how much of them they do. Outcomes online are the effects of using the website: what users go and do with it afterwards. The metrics below follow quite a robust method, and they group quite nicely into outputs and outcomes (possibly accidentally).


On a very basic level, one of the things that you have to get people to do is actually come to the website. That is why visits is one of the most important metrics. You could argue at this point that unique visitors would be a better metric; however, unique visitors aren’t accurate and probably aren’t very precise either. Moreover, the trouble with unique visitors as a web metric is the tendency to think that unique visitors are the same as people, which we know not to be true.

Moving up a step, you want to make visitors who have found the website come back and use other parts of it. This means there needs to be a metric of repeat visits.

How you measure this metric is open to debate. If you use a percentage of your total visits and aim to increase it, there is a suggestion that you aren’t looking for new audiences, whereas if you use an absolute value you can end up ignoring large increases in actual visits. Usually the best option is to report both, to give context.
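To make that concrete, here is a minimal sketch (with made-up monthly figures, not figures from the article) of reporting repeat visits both as an absolute number and as a percentage of total visits:

```python
# Hypothetical monthly figures from an analytics export. Note that the
# absolute number of repeat visits rises month on month while the
# percentage falls, because total visits grew faster -- which is why
# both numbers are worth reporting together.
months = {
    "Jan": {"total_visits": 100_000, "repeat_visits": 28_000},
    "Feb": {"total_visits": 140_000, "repeat_visits": 35_000},
}

for name, m in months.items():
    rate = m["repeat_visits"] / m["total_visits"] * 100
    print(f"{name}: {m['repeat_visits']:,} repeat visits ({rate:.1f}% of total)")
```

In this invented data, February’s absolute repeats are up on January’s but its repeat rate is down; either figure alone would tell a different story.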

The next thing you want to do is make sure that the people looking at your website are doing something. Given that quite a large portion of the site is written content, you possibly don’t want to be looking at bounce rates (although ignoring them is a sin too). Users who find this content through Google are more likely to bounce (as they’ve found what they were looking for) than those coming in via the home page (for example).
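As a sketch of why segmenting matters before judging bounce rate (all session data below is invented), the metric can be computed per traffic source rather than site-wide:

```python
from collections import Counter

# Invented sessions: search visitors landing on content pages often
# bounce after finding their answer, so a high bounce rate from search
# may be a sign of success rather than failure.
sessions = [
    {"source": "google", "bounced": True},
    {"source": "google", "bounced": True},
    {"source": "google", "bounced": False},
    {"source": "homepage", "bounced": False},
    {"source": "homepage", "bounced": True},
]

totals, bounces = Counter(), Counter()
for s in sessions:
    totals[s["source"]] += 1
    bounces[s["source"]] += s["bounced"]  # True counts as 1

for source in totals:
    print(f"{source}: {bounces[source] / totals[source]:.0%} bounce rate")
```

A site-wide bounce rate would blur these two very different behaviours into one misleading number.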

Therefore a different way of looking at it is the definitive actions that users can take. In the case of Businesslink, you need to look at how many of the interactive tools have been used successfully (research has shown that people who use them save more money than those who don’t) and at the number of people who register on the site.


This is where it gets a bit more interesting. This is where you really have to start thinking about what the point of your website is and what your users do after using it.

In the case of commercial websites, this last point almost isn’t important. When you’ve bought your new Dell laptop, Dell don’t really care whether you spend your life working on it, playing games or writing guest posts for Econsultancy (unless, of course, they think that they can sell you something else).

For government websites, it is the entire point of them. They are only there so that the users will go and do different things.

These can be very difficult to measure, but one option is to go to your target audience and ask them. Obviously you can’t ask them all, but surveying a large enough sample that the results are statistically sound is an option. It also enables you to ask your target audience other things, like whether they used the interactive tools, etc.

So what do we ask our prospective audience? Well, the first step is to ask them if they’ve used the website, so that you get a market penetration metric. This metric cost me a lot of time whilst I worked there, explaining why it is different from visits and why both are important. It turns out that many people who use the website aren’t in the target audience. It also means that it can be calculated in a different way, by looking at the percentage of businesses (not people, or even visits) who take advantage of the site’s content.
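A back-of-the-envelope sketch of the difference (every figure below is invented): penetration comes from asking the target audience, not from counting visits.

```python
# Invented survey numbers: market penetration is the share of the
# *target audience* using the site, which visit counts alone cannot
# tell you -- many visitors are outside the target audience entirely.
target_businesses_surveyed = 2_000  # businesses in the target audience asked
reported_using_site = 560           # of those, how many said they use the site

penetration = reported_using_site / target_businesses_surveyed * 100
print(f"Market penetration: {penetration:.1f}% of target businesses")  # 28.0%
```

Visits could double without this number moving at all, if the extra traffic came from outside the target audience, which is why both metrics matter.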

So the point of the website is that businesses (or the people in them, as the internal mantra goes) are meant to save time and money when doing business (with the Government, with each other and with customers). So they were asked whether they saved time or money, and how much of it. It’s a very difficult question to answer, so you have to make sure that your calculations are sound, not least so that when you do comparisons over time, you can see the increased benefits.

The other point of the website, in quite the opposite way to the previous one, is that it is meant to give businesses ideas and methods for increasing sales and profits. This metric, along with the one immediately above, gives the website a clear monetary value. This means that if you make changes to the website you can quantify (over time) a monetary return on investment for the website in terms of money for users.
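As a sketch of how those two survey metrics could combine into a monetary value and, after a change, into an ROI figure (every number and variable below is hypothetical, not from the article):

```python
# Hypothetical survey results, scaled up by the penetration estimate.
avg_saving_per_business = 120.0       # average pounds saved per business per year
avg_extra_profit_per_business = 80.0  # average extra profit per business per year
businesses_benefiting = 50_000        # estimated from the penetration survey

annual_value = (avg_saving_per_business + avg_extra_profit_per_business) * businesses_benefiting

# Suppose a redesign cost 400k and was followed by a measured 5% uplift
# in that annual value (attributing the uplift correctly is the hard
# part, as the article notes).
cost_of_changes = 400_000.0
measured_uplift = 0.05

roi = (annual_value * measured_uplift - cost_of_changes) / cost_of_changes * 100
print(f"Annual value to users: £{annual_value:,.0f}")  # £10,000,000
print(f"ROI of the changes: {roi:.1f}%")               # 25.0%
```

The point of the sketch is the shape of the calculation, not the numbers: once the value to users is in money, a change to the site can be judged like any commercial investment.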

Finally, it would be a bit remiss if users weren’t asked whether they were happy with the website. If they aren’t happy with it, then they could possibly have saved, or made, more money. Therefore customer satisfaction tends to be quite key.

If you read the site’s annual review, you’ll find some additional metrics: how much content has been taken from government departments and how satisfied those departments are. I’d argue that these aren’t metrics relating to the website, but more to the organisation overseeing it.

Published 18 February, 2011 by Alec Cochrane

Alec Cochrane is Business Consultant at Adversitement and a contributor to Econsultancy.

Comments (7)

Guy Courtney

With the massive public sector cuts, "value for money" in terms of digital is key. In my experience not all commissioners within Government understand what they are looking for from a website, or indeed why they are commissioning one in the first place. The start of a learning curve - benchmarks as outlined in your post are a good place to start, and vital.

over 7 years ago

Alec Cochrane

Alec Cochrane, Head of Optimisation at Blue Latitude

Exactly Guy!

My next post (coming soon!) about DirectGov suggests that Government see the web as a way of reducing costs, not as a way of giving better public services. Whilst it can be the former, it has to be the latter.

Funnily enough, Neil Mason also wrote an interesting blog post on outputs and outcomes on Clickz recently.


over 7 years ago

John Jones

John Jones, Requirements analyst and user experience designer at Required Experience Ltd

Great piece, Alec, with which I fundamentally agree.

I’d just add that I think that any rounded measurement of whether a government website is a success or not has to include an assessment of whether the site helped users fulfil the tasks that they wanted to complete.

One of the problems that I’ve found in government (both central and local) over the last 15 years is that so little emphasis is placed on facilitating user tasks, usually because too much attention is placed on the vanity publishing of online content which is irrelevant to the user.

Tactically located user satisfaction micro-surveys (usually at the end of an online top-task user journey) can yield insightful quantitative and qualitative data in this area.

over 7 years ago


David Pullinger

Great to see Alec's piece and discussion. As the civil servant who has been helping lead this work across government, I much appreciate others appreciating that it ain't that simple!

The Martha Lane Fox review was very clear that we should be measuring public service and how it is for the citizen as individual or business. Can they complete what they set out to do easily and fully online? We're really going to drive this better way.

We're also shortly issuing a guide to a structured way of evaluating and optimising government digital media using the output/out-take/outcome model. This model is going to be used for all media evaluation across government and goes a long way towards focusing developers and managers' attention to what really matters. As well as taking digital mainstream in aligning it with other possible channels.

Always pleased to hear new ideas and ways of improving things.... David

over 7 years ago


Amish Patel

Great article Alec. I agree with most of your points but have to say that John has made a very good point about user tasks.

Working with local and central government, GovDelivery have found that an average member of the public will probably visit their council website once every six months, and will visit for a specific reason. The individual will want to address/find specific info on whatever they are looking for and disappear. There is a whole host of great content on council websites which is of use to the public, but people don’t know about it and don’t care to know until they need it.

By offering granular alerts via email/SMS/social media on topics of interest users are continually engaged and make the most out of published information.

The key message here is that it’s all great having a fancy website with lots of content, but government need to look at why they have a website in the first place and how efficiently and effectively the content is being consumed by its users. People knowing that a site exists and being proactively engaged with its content is a key measure of success.


over 7 years ago

Alec Cochrane

Alec Cochrane, Head of Optimisation at Blue Latitude

John, Amish - thanks for your comments. I tend to agree with you about the user tasks. There is a bit of a lack of focus on how to get users to complete the things that they are looking to do. In theory this should be borne out in the time/money saved KPI, but this isn't always the case.

Exit surveys can be a great way of finding out users' frustrations at the point of contact - however, this should be carefully integrated into the analytics to ensure a fuller picture. Sometimes exit surveys can be a little skewed!

David - glad that I could be of some help. I do like the outcome/output model of measurement - especially in Government where the idea of the website is to have some sort of outcome for the user (something they do because of using the website) as opposed to a direct output.

Talking of the Martha Lane Fox review, my follow-up article looks at DirectGov.


over 7 years ago


Alex Heaton

Good article. I've given this area some thought and, as far as outputs and outcomes are concerned, I couldn't agree more. I also agree that measuring them can be tricky and needs an organisational, if not multi-organisational, approach to tracking. If we take young people's journeys through educational decision-making, for example, a normal journey might consist of multiple websites and chats with a qualified advisor, teachers, friends and parents before an outcome is achieved. In this context we would need to use quant and qual research after the outcome to establish what effect the individual parts of that journey had on the overall outcome.

The third sector have led the way in this field for quite some time and have many different tools and techniques for helping organisations map out and record success where social outcomes are desired. Much can be learnt from the more innovative organisations in this sector.

over 7 years ago
