SEOs seek to improve the rankings of the sites they work on with a combination of on-page and off-page factors. In the early days, a site’s position in the rankings was determined by a simple combination of on-page factors and manual review.
Over time, sites became more numerous, their owners became more savvy and the search engines became more discerning, which resulted in many more factors coming into play. Today, search-engine algorithms encompass a huge range of factors: on-page content and code, PageRank, anchor text used in links, link context, domain authority, domain age, social signals and more.
Trying to optimise for those factors is enough of a job in itself. But an equally important factor – more important, in some cases – is competitor activity. Because ranking isn’t just a function of what you do with your site – it’s largely determined by what other people do with theirs.
Google is believed to take over 200 factors into account when ranking sites. Its algorithm puts them all into the pot and boils them down into a single number – we could call it the ‘SEO score’ – which is then used to rank sites in order. And despite all the advances in the way it analyses sites (and the recent Google Places changes), the format of Google’s results is still pretty much the same: a ranked list or ‘Top 10’.
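As a toy illustration of that boiling-down, imagine a handful of factors combined as a weighted sum. The factor names and weights below are entirely invented – Google's real factors and weights are secret – but the mechanism of many inputs collapsing into one sortable number is the point:

```python
# Invented factor weights -- purely illustrative, not Google's real ones.
weights = {"content": 0.40, "links": 0.35, "domain_age": 0.10, "social": 0.15}

def seo_score(factors):
    """Collapse per-factor scores (assumed 0-100) into a single number."""
    return sum(weights[name] * factors.get(name, 0) for name in weights)

site = {"content": 80, "links": 60, "domain_age": 90, "social": 40}
print(seo_score(site))  # -> 68.0
```

Whatever the real recipe looks like, the output is one number per site, and the results page is simply those numbers sorted in descending order.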
We intuitively understand the Top 10 format as an evenly spaced series, where the members are separated by equal ‘gaps’. But this is rarely the case. In fact, search rankings are a lot like the singles chart. Being #1 in the charts doesn’t mean you’ve achieved any particular sales benchmark; it just means you sold more than everyone else that week. It might take 100,000 sales to be #1 in January, but only 50,000 to be #1 in June. Similarly, being #2 means you sold somewhere between #1 and #3, which could encompass a wide range of values. #1 could be out in front of #2 by tens of thousands of copies, or by just one. In other words, the simple ranking format obscures the relative standings of its constituents.
Search works exactly the same way. The ‘SEO scores’ of the sites appearing in the top 10 may be very unevenly distributed. And that means that the result of future search efforts can’t be projected on a ‘pro rata’ basis.
For simplicity, let’s say that your SEO consists solely of link-building, and not any of the other tasks that make up a balanced search strategy. Let’s also suppose that there is no such thing as link quality – all links are created equal, no link is better than any other link. In this simplified SEO world, achieving rankings is simply a question of amassing as many links as possible.
It might have taken you 20 links to get to #10, and a further 20 links to get to #8. But there’s no guarantee that another 20 links will get you to #6. It all depends on how many backlinks the sites in positions #7 and #6 have. It might take you 60 links to surpass them. But once you do, if the differences between sites are less marked higher up the rankings, you might find that it only takes 10 links to move from #6 to #2. (A top ten where the constituents are regularly swapping places with each other suggests this kind of ‘first among equals’ situation.)
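Under the simplified ‘all links are equal’ model, this is easy to sketch. The backlink counts below are invented for illustration – real competitor counts can never be known for sure – but they reproduce the uneven gaps described above:

```python
# Hypothetical backlink counts for the sites currently at #1..#10.
# Invented numbers, chosen to illustrate uneven gaps between positions.
competitor_links = {1: 120, 2: 110, 3: 109, 4: 108, 5: 107,
                    6: 100, 7: 95, 8: 35, 9: 25, 10: 15}

def rank_for(my_links, field=competitor_links):
    """Rank under the 'all links are equal' model: you sit below
    every competitor holding more links than you."""
    return 1 + sum(1 for links in field.values() if links > my_links)

for total in (20, 40, 60, 101, 111):
    print(f"{total} links -> #{rank_for(total)}")
```

Note how the same 20-link increment moves you from #10 (at 20 links) to #8 (at 40), then stalls: 60 links still leaves you at #8, because #7 and #6 sit far ahead. Yet once you clear them at 101 links, a final 10 links leapfrogs four positions to #2 in one move.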
Let’s look at a couple of examples:
The graph shows ‘SEO Score’ against Google ranking. In this example, the sites in positions #10–#2 have scores that rise gradually. These sites might well be able to overtake each other relatively easily, and it could also be easy for a newcomer to break into the top ten. But site #1 is way out in front, meaning that #2 is probably the best that the other players can hope for, barring an absolutely huge SEO effort.
This is the position with a search term I monitor myself: ‘copywriter’ at Google.co.uk. Places #10–#2 are mainly held by freelance copywriters much like myself, with positions shifting around fairly regularly. We all have blogs, we all build links, we’ve all been around a few years and as a result we are all pretty much equal in Google’s eyes. But the #1 slot is held by Wikipedia, which in terms of authority and links is way beyond what any of us can achieve. It’s unlikely that any of us will ever overtake it.
Now look at a different example:
Here, positions #10–#6 have comparable scores, but the slope gets suddenly steeper around #5. SEOs looking to rank in this top ten might find it easy to break in and secure a position in the lower half, then stall for a while before reaching #5. But once there, it should take relatively little effort to rise to the top of the heap.
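The two shapes can be made concrete with a pair of invented score profiles – the numbers are illustrative only, since real ‘SEO scores’ are never visible. Listing the score gap you must close to overtake each position shows where the effort goes:

```python
# Two hypothetical 'SEO score' profiles for a top ten (index 0 = #1).
# Invented numbers: one runaway leader, one steep mid-table climb.
runaway_leader = [95, 52, 50, 48, 46, 44, 42, 40, 38, 36]  # first example
steep_midfield = [90, 85, 80, 72, 60, 30, 28, 26, 24, 22]  # second example

def climb_costs(scores):
    """Score gap to close at each step, from #2-over-#3 down to #9-over-#10."""
    return [scores[i] - scores[i + 1] for i in range(len(scores) - 1)]

print(climb_costs(runaway_leader))  # the only big jump is at the very top
print(climb_costs(steep_midfield))  # the big jump sits in the middle
```

In the first profile every overtake costs 2 points except the last, which costs 43; in the second, breaking into the bottom half is cheap, the step past #5 costs 30, and the summit is close once you're over it.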
The commercial implications aren’t particularly cheerful. Because competitors’ standing with Google can never be known for sure, SEOs have no way of knowing how much work it will take to reach the top of the search mountain. And that makes SEO, as a business, highly unpredictable. The SEO asks for payment not for results, but simply for carrying out a certain set of tasks – with no guarantee of success. It’s a bit like paying a taxi driver who might only get you halfway there. But until Google reveals its algorithm, which it never will, that’s the way things have to be.