There are currently some brilliant tools for carrying out user testing on your website. The basic premise is to invite a number of people onto your website to complete a specific task.
You then gather feedback about their experience, which will highlight any usability issues.
I've been using UserTesting.com recently and will base this article on its features and how to get the most from the tool.
The main purpose of user testing for me is to gather ideas for tests that I could run. Visitors will talk candidly about areas of your site they don’t like or are confused about. It also becomes apparent when visitors are missing key elements or using the website in a way you hadn’t envisaged.
This is valuable information for understanding what tests are likely to produce results (either good or bad – but at least have an impact).
To begin with, you must set a task for your users to complete. I like to keep the task quite vague to try and simulate a real-life situation and not influence the user too much.
In my experience, this is the way to get the most credible results. If you make your task too specific, then users will just follow instructions, which is not realistic.
Here is an example of a task I would normally set:
You are looking for a book or two to take on holiday. You might even be tempted by an e-reader. You simply want something to entertain you on the beach and flights.
Task 1: Browse the site for ideas. What appeals to you?
Task 2: Narrow down some options and then use any available information to make a decision on which items to buy. Is there any information missing/you would like to see?
Task 3: Please go through the process of buying the item/s but stop at the billing section. Was the process easy, and was it what you would expect?
The scenario tries to mimic the initial Attention stage (see the AIDA model), where a user first notices the type of product they may become interested in. A typical person going away will know they want something to read, and may have heard about e-readers.
Next, the actual task is in three parts (to simulate Interest, Desire and Action – AIDA). There is enough direction to ensure the user provides me with good information, but vague enough to allow them freedom to have a natural experience.
There is an option to specify the starting URL, and this is a good opportunity to simulate the experience a user may get if you are using specific landing pages for your digital marketing.
I would recommend starting users on a commonly used landing page; otherwise, use the home page, as most visitors will likely navigate there at some point.
You can use an analytics package to explore what your top landing pages and top visited pages are to decide the best starting place for your testing.
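As a rough sketch of that step, you could tally landing-page sessions from an exported analytics report. The page paths, session counts and export shape below are assumptions for illustration, not output from any particular analytics tool:

```python
from collections import Counter

# Hypothetical rows from an analytics export: (landing page, sessions).
# Real exports vary by tool; these values are illustrative only.
rows = [
    ("/", 5200),
    ("/books/bestsellers", 1800),
    ("/e-readers", 950),
    ("/offers/summer-reading", 640),
    ("/", 300),  # the same page can appear again, e.g. from another segment
]

sessions_by_page = Counter()
for page, sessions in rows:
    sessions_by_page[page] += sessions

# The top entries are good candidates for starting URLs in your tests.
for page, sessions in sessions_by_page.most_common(3):
    print(f"{page}: {sessions}")
```

In practice you would read the rows from your analytics tool's CSV export rather than hard-coding them.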
A number of demographic options are available, including country, which means you can specify UK users if that is where you target. There is the option to specify age and gender, which is useful in some cases.
You can replicate the test multiple times, specifying different age groups and genders to ensure you get a good mix. I would be careful not to make your target too narrow, though, as the time you must wait for results will increase.
Each user will submit a video of their session on your website with audio commentary on top. You can watch these videos online or download them. The user will also leave some written comments after finishing their video to answer any post session questions you had.
In order to get a good result, it is likely you will have a large number of videos to watch. However, I would not recommend sharing the job of watching all the videos between colleagues, as it is essential that you get to grips with the larger picture of how visitors use your site. This was the approach taken by Appliances Online when improving its product pages.
I recommend that you watch the videos through once without stopping to get a full understanding of the user and how they moved through the site.
If you do this, it will be easier to spot common patterns of how visitors respond to your navigation and key calls to action. You can then document the areas that are not working as expected, any key focus points that might be improved, and any call to action which is unclear/ignored.
Part of your testing program should address areas like this to both improve user experience and improve the persuasiveness of your website in driving conversion.
The next step is to watch the videos a second time and take notes. You can stop the video whenever a relevant comment, idea or frustration is expressed and make a note. After doing this for all the videos, you can compile a list of comments for each page.
At this point you will have a list of genuine issues/areas for improvement that you can build tests around. This takes a lot of the guesswork out of the process, and combined with some in-page and visitor analytics (such as ClickTale and Google Analytics) you can ensure you are running important tests that have a good chance of making an impact.
It is important not to believe everything your users tell you in the videos. One person’s opinion is not necessarily the best thing for your website and visitors as a whole. It is a good idea to try and read between the lines and spot areas for improvement.
Also, don’t forget that users under test conditions are not going to behave exactly as a genuine visitor would, so combine their behaviour with genuine behaviour as reported in analytics.
If every user tells you the “About Us” page is poor, but none of your genuine users ever go there, then a change is not going to have any impact on your performance.
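One way to apply that principle is to weight each reported issue by the real traffic its page receives. The figures and the scoring formula below are purely illustrative assumptions, a sketch of the idea rather than a recommended model:

```python
# Hypothetical data: how often an issue was raised in testing, and how much
# real traffic (from your analytics) the affected page receives per month.
issues = [
    {"page": "/about-us", "mentions": 5, "monthly_visits": 120},
    {"page": "/checkout", "mentions": 3, "monthly_visits": 45000},
    {"page": "/product/123", "mentions": 4, "monthly_visits": 18000},
]

# A crude priority score: issues on high-traffic pages matter more, even if
# they were mentioned slightly less often under test conditions.
ranked = sorted(
    issues,
    key=lambda i: i["mentions"] * i["monthly_visits"],
    reverse=True,
)

for issue in ranked:
    print(issue["page"], issue["mentions"] * issue["monthly_visits"])
```

Note how the heavily criticised but rarely visited "About Us" page drops to the bottom, exactly as the paragraph above suggests it should.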
Image credit: Provenmodels.com