John Straw is CEO of InfluenceFinder, which has launched a tool that enables search marketers to analyse backlink data and build a list of influential websites providing valuable links.

InfluenceFinder has been using Econsultancy as a test subject, and the tool was soft-launched at SMX London recently.

We’ve been speaking to John about InfluenceFinder, recent changes to Google’s algorithm, and why he thinks that SEO needs to become more like PR… 

Can you tell us about your background before launching InfluenceFinder?

I have been in digital marketing for the last 15 years, starting out in California with Interse, which specialised in analytics software, and was picked up by Microsoft in 1997. 

After this, I worked on an email marketing startup in Alabama (Revnet), which was acquired by MessageMedia for $66m, and subsequently purchased by DoubleClick, and then Google.

Back in the UK, I founded search agency NetRank in 2000, before it was sold in 2007. 

I then retired for about three minutes before deciding to get back into search in 2008, and founded InfluenceFinder. 

What does InfluenceFinder do? 

The idea behind InfluenceFinder is to uncover for our clients the most influential and valuable links that sit behind competitors’ sites, so we can build similar relationships with these sites. 

Google was stopping us from doing that: it isn't possible to view more than 1,000 backlinks at once, partly because Google wants to protect its algorithm.

InfluenceFinder sets out to get around that problem by using backlink information from a range of other sources. We launched in May. 

How do you find influential sites? 

We have access to 2.2 trillion links, and we can set up focused crawls using this data. So, using Econsultancy as a test subject, we set out to define the most influential websites in the US. 

We then go to Google, type in 'online marketing', and find the top ten websites. We can then crawl the backlinks of these sites, and send our spiders to each site that is providing links, looking at their pages for signals of relevance, influence and authority.
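That pipeline isn't spelled out anywhere public, but a minimal sketch of its shape might look like the following. The helper functions are hypothetical stand-ins for a SERP lookup and a backlink index; none of these names come from InfluenceFinder itself.

```python
# Hypothetical sketch of the focused-crawl pipeline described above.
# get_top_results() and get_backlinks() are placeholders for a SERP
# lookup and a backlink index; they are assumptions, not real APIs.

def get_top_results(query, n=10):
    """Return the top-n ranking URLs for `query` (placeholder)."""
    return []  # in practice: query a search results source

def get_backlinks(url):
    """Return URLs of pages that link to `url` (placeholder)."""
    return []  # in practice: query a backlink index

def collect_candidates(query):
    """Gather every page linking to a top-ten result for `query`.

    Each candidate is then spidered for signals of relevance,
    influence and authority, as described in the interview.
    """
    candidates = set()
    for result in get_top_results(query, n=10):
        candidates.update(get_backlinks(result))
    return candidates

candidates = collect_candidates("online marketing")
```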

We look for placement of keywords, and importantly, for a ‘heartbeat’ to determine how often the site is updated, and whether a human being is updating the site. 

We have come across thousands of dead sites. Google likes human sites that are regularly updated, so it is vital to make this distinction.
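The interview doesn't say how the 'heartbeat' check is implemented, but assuming you can scrape publication dates from a site's feed or archive pages, one simple heuristic is to look at how recently, and how regularly, it publishes. The 90-day threshold and median-gap measure below are my assumptions, not InfluenceFinder's actual rules.

```python
from datetime import date
from statistics import median

def heartbeat(post_dates, today=None, max_silence_days=90):
    """Classify a site as live or dead from its publication dates.

    `post_dates` is a list of datetime.date objects scraped from the
    site; the threshold and the median-gap heuristic are assumptions.
    """
    today = today or date.today()
    if not post_dates:
        return {"alive": False, "median_gap_days": None}
    dates = sorted(post_dates)
    silence = (today - dates[-1]).days           # days since last post
    gaps = [(b - a).days for a, b in zip(dates, dates[1:])]
    return {
        "alive": silence <= max_silence_days,
        "median_gap_days": median(gaps) if gaps else None,
    }

# Three posts in five weeks, most recent 18 days ago: a live site.
print(heartbeat([date(2010, 5, 1), date(2010, 5, 15), date(2010, 6, 2)],
                today=date(2010, 6, 20)))
# {'alive': True, 'median_gap_days': 16.0}
```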

We are then able to slice and dice this data to find the right balance of influence and relevance, and produce a manageable set of sites to target.
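As an illustration of that slicing step, a sketch like the one below blends per-site relevance and influence into one score and keeps a manageable shortlist. The weights, field names and example figures are mine, for illustration only.

```python
def shortlist(sites, w_relevance=0.6, w_influence=0.4, top_n=50):
    """Rank candidate sites by a weighted blend of relevance and
    influence (both assumed normalised to 0..1), dropping any site
    that failed the heartbeat check."""
    live = [s for s in sites if s.get("alive")]
    return sorted(
        live,
        key=lambda s: w_relevance * s["relevance"] + w_influence * s["influence"],
        reverse=True,
    )[:top_n]

targets = shortlist([
    {"url": "example-blog.com",  "relevance": 0.9, "influence": 0.40, "alive": True},
    {"url": "big-news-site.com", "relevance": 0.2, "influence": 0.95, "alive": True},
    {"url": "dead-site.com",     "relevance": 0.8, "influence": 0.80, "alive": False},
])
# example-blog.com (0.70) outranks big-news-site.com (0.50): the more
# relevant site wins despite lower raw influence, echoing the point
# about relevance versus PageRank made later in the interview.
```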

Are there any surprises in lists of influential sites? 

Most are what I'd call the neck of search: many are lying beneath the surface and wouldn't be picked up by normal search engines, but we can still pick up signals of influence from these sites.

While every marketing blogger hankers after a link from Mashable or TechCrunch, and these are valuable sites, they are not necessarily as relevant as other blogs if you are looking for particular links. 

For example, if I’m Cisco looking for a link about a new router, a site with lower PageRank (I’m using PageRank very generally here) than TechCrunch may provide a more relevant and therefore more valuable link. 

The same can apply for sites like the BBC. Since the BBC covers just about everything, you can't get such a relevant link from it, whereas you can with a more focused site with lower PageRank.

What changes are you seeing with Google? 

Google is always changing, but we think its changes are becoming at once more profound and more subtle.

We have started to see some important changes. For example, every year, SEOmoz asks 100 SEO experts to rate search ranking factors in order of importance. 

Between 2007 and 2009, the top ranking factor voted by SEOs was always keyword-focused anchor text from external links. I would have agreed with this myself, until recently. 

In 2009, if you searched on Google for a term like 'digital camera', every result in the top ten, except Amazon, would have had this term in its root domain. This is an example of Google's liking for anchor text: a keyword-rich domain naturally attracts links whose anchor text is that keyword.

This meant that, if you owned the right domain for a particular keyword or phrase, you had a head start over all of your competitors. 

In 2010, though, this is now very different, and only a couple of the top results on Google have the term in their domain name. 

We have done a lot of research on this, helped by people like Nichola Stott, and there is now enough evidence for us to believe that Google is nowhere near as reliant on anchor text as before.

This therefore represents a big change in Google’s algorithm in a relatively short space of time – between 2009 and 2010. 

Any other evidence of changes? 

We also saw Matt Cutts come out in June asking the question: what are the links that stand the test of time?  The answer was that these are normally edited links from relevant and influential sites. 

We believe that this is more evidence that Google is more interested in relevance and context rather than just authority. I think Google is keen on tracking users’ intent and providing more relevance in search results based on previous user journeys. 

So, when a user enters a particular search term, Google can look at what previous searchers wanted, and the results page can become very flexible in response.

Google will also use other signals to determine user intent. For example, if someone in London types in sushi on Google using a smartphone, then it is relatively easy to guess the intent. 

What are the effects of these changes on SEO?

I think that SEO isn’t evolving at the rate that it needs to in order to keep track of changes. 

This isn't always easy, as Google is very secretive, but we believe that SEO campaigns need to be more about going out looking for links and content deals with influential sites.

On an SMX panel recently, we were asked what the most important thing would be for SEO for the rest of 2010, and Andrew Girdwood and I both replied that it would be relationship building. 

SEO will be more about finding influential sites and building relationships, and therefore links that will stand the test of time. In this sense, SEO could be morphing towards PR, as a big part of the job will be about creating relationships. 

In the same way, PR should be morphing into SEO, but this isn’t happening, as PR agencies are often uncertain about search. 

Google is relentlessly changing its algorithm, and SEOs need to be able to track the changes and adapt their ways of working in response.