The Student Room Group is one of the UK’s largest sites for students, and has experienced plenty of success with a continuous CRO (conversion rate optimisation) strategy.
I spoke to director of optimisation Pete Taylor to find out more about the company’s conversion optimisation strategies, what’s proved to be effective, and which tools are the most useful…
For the uninitiated, can you describe The Student Room Group and what it does?
The Student Room Group is a Brighton-based company focused on providing students with web resources to help them be the best version of themselves.
We have three websites, TheStudentRoom.co.uk, GetRevising.co.uk and MarkedByTeachers.com.
The Student Room is our main site, and is quite simply the place where students connect. We are very proud that the site is by far the largest student community in the world, with over 1.3m members discussing everything from which university to apply to and what tech gadgetry to buy, to how to deal with life’s ups and downs.
If it’s important to students, it’s discussed on The Student Room.
Get Revising offers a range of social learning tools to help students prepare for exams, plan their time and share great information. We have tools to build everything from mind maps and flashcards to full-term study timetables.
Couple that with a library of over 125,000 resources created by other students using our tools (with hundreds of new resources created every day) and you begin to see why this is such a great social learning tool.
Marked by Teachers tackles something that all students are faced with – how to write great essays and achieve top marks. The site offers example essays that have been formatively marked by our team of highly experienced teachers and examiners.
Our team takes students through example essay structures, feeding back on where and why marks were gained and lost using a combination of annotations and highlighted text.
How does the site make money?
We have two main revenue models: The Student Room is primarily advertising-based, while Get Revising and Marked by Teachers use subscription-based models.
Can you give us an idea of traffic, turnover etc?
The Student Room Group serves 6.4m unique visitors a month. This breaks down as 10m visits and 29.5m page impressions, helping us to be rated by comScore as the number one UK education site.
When did you first look into optimising the site?
Our very first foray into optimising our sites, all the way back in 2009, was a steep and not very successful learning curve.
We were a very small company back then, with limited resources and time. Add to that some pretty unstable A/B split-testing technology and some less-than-helpful advice, and we ended up deciding to concentrate on building the sites and their resources instead!
We didn’t return to optimisation work until late 2012. This led to appointing and working with a CRO agency, picked because of their ‘shared learning’ approach to client work.
We started working together in early 2013, initially with a four month ‘test of concept’ engagement on Marked by Teachers. Their approach, level of detail and access to a unique team of leading industry experts helped produce some really quite staggering results in that initial four months – and the best part of this was we were learning the processes at the same time!
This led to an ongoing relationship, spreading focus across all three of our sites throughout 2013 (and continuing into 2014).
Did you identify any problems before you decided to bring a CRO agency in? What challenges were you faced with?
We were aware of problems on our sites, generally centred on user experience, but – and this is something I hear time and time again – we were really unsure where to start.
I think we were faced with the same challenges that a lot of companies looking at CRO are faced with:
- How to identify the quick wins.
- Getting the test order right.
- Ensuring development time is freed up (not only to help build test concepts, but also to quickly put winning variations live to give us the right platform to move on with testing).
- Knowing when a test is ‘cooked’ (has it reached statistical validity or should we wait for two traffic cycles to pass?).
- Using the right tools.
- What areas should we be testing and when is it right to test or ‘just do it’?
This list could almost be endless, but you get the picture.
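To make the “when is a test cooked?” question concrete, here is a minimal sketch of the kind of check split-testing tools run under the hood: a two-proportion z-test comparing the control and variation conversion rates. The visitor and conversion numbers are invented for illustration, and real tools layer more sophistication (sequential testing, multiple-comparison corrections) on top of this.

```python
import math

def ab_test_significance(visitors_a, conversions_a, visitors_b, conversions_b):
    """Two-proportion z-test: is variation B's conversion rate
    significantly different from control A's?"""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis of no difference
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Example: control converts at 1.5%, variation at 2.5%
z, p = ab_test_significance(10_000, 150, 10_000, 250)
print(f"z = {z:.2f}, p = {p:.4g}")  # p well below 0.05: statistically significant
```

With high-traffic sites, p-values drop below 0.05 quickly – which is exactly why, as discussed below, significance alone isn’t enough to call a test.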
How did you identify the issues and potential solutions?
There isn’t one discipline really; it’s a combination of techniques based around a basic concept: Measure. Test. Learn.
We identify issues and areas that will offer high return for our efforts by looking at as many data sources as we can.
The picture is built up using a combination of analytics data (both top level stats and deep dive expert analysis), remote un-moderated user testing sessions, moderated user testing sessions (we are extremely lucky to have a user testing lab – complete with one way glass – in our office), site surveys, questionnaires (4Q, Qualaroo etc), expert opinion and in-house experience and hypothesis.
I think the key here is that ideas are listened to from top to bottom; it doesn’t matter who you are – if you have an opinion, it’s heard. If we can then back that up with the stats and what we’ve learnt from our data gathering, the idea makes its way into our hypothesis log.
We then look at our highest-ranking tests, consider what external factors come into play (development expense, design constraints, isolated site experiences – no cross-over of test influence – time in the academic year, etc.) and decide on a test order.
The solutions to problems are more often than not highlighted through the process of identifying the issues. However, if not it really highlights the beauty of testing. Take what you believe is the right answer and test it.
You soon discover whether you were right!
Which tools and techniques have been the most valuable for you?
Tools-wise, it’s difficult to ignore Optimizely. It’s the split-testing tool we have used for the last year; it’s easy to use and flexible (our dev team have used it in some pretty innovative ways).
I really couldn’t see us moving away from using it in any hurry.
I think the single most valuable technique we use is knowing when to call a test – when is it ‘cooked’? We are quite lucky in terms of traffic numbers: we can always guarantee high visit and conversion numbers, enabling us to reach statistical validity.
However, because we can hit this level quite quickly it can sometimes be a bit of a red herring. We have also learnt to allow usage cycles to pass. On our sites there is quite a pronounced site usage cycle over a week – different points in that cycle produce different user behaviours and requirements.
Call a test too quickly and you run the risk of having ‘statistical validity’ but biasing a particular user behaviour in your ‘cycle’. Getting the call right between these two areas makes sure you are allowing a test time to ‘cook’.
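The trade-off described above – statistical validity versus full usage cycles – can be captured in a tiny decision helper. This is a hypothetical sketch of the rule, not how any particular tool implements it: only call a test once it is both significant and has run through complete weekly cycles.

```python
from datetime import date

def ready_to_call(started, today, p_value, min_full_weeks=2, alpha=0.05):
    """Call a test only when it is statistically significant AND has
    run for at least `min_full_weeks` complete weekly usage cycles,
    so no point in the weekly behaviour cycle is over-represented."""
    full_weeks = (today - started).days // 7
    return p_value < alpha and full_weeks >= min_full_weeks

# Significant after 9 days, but only one full weekly cycle has passed:
print(ready_to_call(date(2014, 1, 1), date(2014, 1, 10), p_value=0.01))  # False
# Same significance after two full weeks: safe to call.
print(ready_to_call(date(2014, 1, 1), date(2014, 1, 15), p_value=0.01))  # True
```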
What have been some of the biggest successes your business has seen in CRO? Can you provide some examples?
Over the past year or so we have seen some amazing results; in our initial four-month engagement with our agency we measured a 67% increase in subscription funnel conversion rate (going from 0.15% to 0.25%).
However I think one of our best designed and most successful tests was on our website Get Revising. This started life as a free-to-use site on which we implemented a subscription model.
However, this model didn’t really get traction and we found out through user testing and polling that most people didn’t see the value of the subscription package as they got what they wanted free of charge.
We quickly devised a ‘what you get and don’t get’ tick chart at the point where users choose between a free account and a premium subscription. We then simply varied the combinations of what was included in the package and what was free across several variations.
We made absolutely no change to the back end of the site (that would have hugely increased the development time for the test and made it really complicated), as all we wanted to find out was which combination caused the ‘tipping point’ from free to paid users. We were able to get this test from concept to live in less than 24 hours.
We then discovered what combination worked best, and the upshot was a huge increase in premium subscribers. We were then able to go away and develop the back end to incorporate what we had discovered during testing, confident that this was development time well spent as we had proven the case.
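A front-end-only test like this boils down to deterministically bucketing each visitor into one feature-combination variant, so the same user always sees the same tick chart across sessions. A minimal sketch of that idea – the variant names and hashing scheme here are illustrative assumptions, not how Optimizely or Get Revising actually implement it:

```python
import hashlib

# Hypothetical feature-combination variants for the tick chart
VARIANTS = ["all_tools_free", "timetable_paid", "resources_paid"]

def assign_variant(user_id: str, variants=VARIANTS) -> str:
    """Deterministically bucket a user: hashing the id means the same
    visitor lands in the same variant on every page view, with roughly
    even traffic across variants."""
    digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]

print(assign_variant("user-1234"))  # same variant every time for this user
```

Because the bucketing is sticky and purely client-side, the conversion of each bucket can be compared without touching the back end – which is what made the 24-hour turnaround possible.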
This subscription model continues to go from strength to strength (although we do continue to optimise it).
This test went on to win a silver prize in Which Test Won’s 2014 Testing Awards.
Do you see optimisation as a continuous strategy?
Our company culture has changed from zero optimisation to running 10-15 tests a month over the last year.
I am now Director of Optimisation (a role and department that didn’t exist 12 months ago) and, thanks to unique levels of support from our CEO and senior management team, we have ingrained a tight culture of measure, test and learn.
Is optimisation a continuous strategy at The Student Room Group? I’d say it’s more part of our DNA!