Websites that depend on search engine traffic rely heavily on detailed keyword research to reach their target audience. Whether the resulting information is used for PPC, SEO or featured ads is beside the point. Simply put, if you want to exploit search traffic, you need accurate data on the number of searches carried out for each particular keyword.
Some companies will subcontract keyword research to a specialist firm while others will tackle it in-house. Regardless of who performs the research, a large number of people will use the information provided by the Overture Keyword Assistant as the foundation of the project. I’ve been of the view for some time that the data Overture provides is often inflated, especially for primary keywords. Recently I have been conducting tests to ascertain the accuracy of Overture’s data, in an effort to prove my suspicions and to see how big the problem is. The results so far are way beyond what I expected.
The SEO Research
Approximately one year ago I set up a new website focused on VoIP phone systems (www.ip-phone-system.co.uk). The website was built to rank highly on Yahoo for the search phrase « Phone System » and a number of other keyword phrases. According to Overture, the phrase « phone system » has 350,066 searches performed each month in the UK alone. The website is currently on the first page of results on Yahoo.com and in the top three positions for Yahoo’s UK-only search.
With the keyword tool reporting this number of searches, and given the website’s position, you would expect the site to be receiving a large volume of traffic. Put simply, it does not. For example, over the last two months the site has received only three visits from people searching for « phone system ».
This test is not conclusive, because the majority of searches for « phone system » could be performed on another engine that Overture pulls its results from, such as MSN. But you would have to agree that it’s not very likely, especially when you consider the site ranks in the top three positions for the search phrase « phone system » on MSN.
Overture’s keyword tool pulls its results from a number of sources, Yahoo and MSN being the largest in terms of traffic. The site has a large number of top-three listings on apparently high-traffic phrases, e.g. « IP Phone », « Business Phone System », « Office Phone System », yet receives only a very small number of visitors.
So what’s causing the highly inflated number of impressions the tool returns? I can’t say for sure, but I can certainly name a few things that could be contributing significantly to the effect. I’m also going to try to coin a phrase here and call the phenomenon « Phantom Traffic », meaning non-genuine traffic: searches conducted for reasons other than a genuine interest in finding a site related to a keyword’s particular theme. I strongly believe both of the examples below are affecting Overture’s data and are contributors of phantom traffic.
1. Manual SEO Position Checking
People manually check the search results to ascertain a website’s position. Search phrases perceived to be high-traffic will, in theory, have more people optimising for them and therefore more people manually checking their positions. More people manually checking their positions inflates the number of impressions (phantom traffic). This is self-perpetuating: the more people check the results, the more the impression counts inflate, causing even more people to target the phrase and manually check their positions, and so on.
I’m certain this is impacting the Overture Keyword Suggestion tool significantly enough to cause many sites to chase phantom traffic. I also believe it to be the biggest contributing source of phantom traffic. Many webmasters manually check their rankings every day, and some even more often.
2. Auto-Generated Pages Compiled from SERPs (Search Engine Results Pages)
Spam sites gather keyword-rich content from the SERPs. These sites automatically query search engines for their most sought-after keywords (probably researched via the Overture keyword suggestion tool) and automatically copy the results pages, which are highly keyword-focused. Quite often these sites will auto-generate tens of thousands of pages, all focused on a select number of keyword variations. These keyword-rich pages are normally buried quite deep in the site because they have no value to human visitors. Each page is linked using rich anchor text and passes relevance back to one of the main pages via an anchor text link.
The idea behind this SEO trick is simply to produce large amounts of optimised content that’s linked together in a favourable fashion. Some of the programming that goes into these kinds of practices can be very clever, while some is very basic indeed. The problem is that virtually every page being generated (or regenerated, for that matter) influences Overture’s data, unless the programmer is using an API key (which is unlikely).
This is very worrying to me, as there must be a large number of people who base their entire keyword research campaigns on data from Overture. This may cause their entire marketing campaign to focus on nothing more than Phantom Traffic. So what can one do to avoid targeting phrases that mainly consist of phantom traffic?
First of all, it’s wise to use a combination of data sources. Wordtracker provides similar data to Overture but gathers it from different sources. Comparing the two can sometimes highlight phantom traffic. If you notice keywords with an extremely high number of impressions, ask yourself whether it’s believable. Common sense goes a long way in this game.
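As a rough sketch of what this cross-check might look like, the snippet below compares impression counts for the same phrases from two sources and flags any phrase where one source reports an order of magnitude more searches than the other. The figures and the helper function are hypothetical, invented for illustration; they are not real Overture or Wordtracker exports, and the 10× threshold is just a starting point for your own common-sense judgement.

```python
# Hypothetical monthly impression counts for the same phrases from two
# different keyword tools (illustrative numbers only, not real data).
overture = {"phone system": 350066, "office phone system": 41200, "voip handset": 900}
wordtracker = {"phone system": 4100, "office phone system": 5000, "voip handset": 850}

def flag_phantom(source_a, source_b, ratio_threshold=10):
    """Flag phrases where one source reports vastly more searches than
    the other -- a possible sign of phantom (non-genuine) traffic."""
    suspicious = []
    for phrase in source_a.keys() & source_b.keys():  # phrases present in both
        hi = max(source_a[phrase], source_b[phrase])
        lo = min(source_a[phrase], source_b[phrase])
        if lo > 0 and hi / lo >= ratio_threshold:
            suspicious.append(phrase)
    return sorted(suspicious)

print(flag_phantom(overture, wordtracker))  # prints ['phone system']
```

A flagged phrase isn’t proof of phantom traffic, only a prompt to ask the « is it believable? » question before building a campaign around it.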
Personally, I’ve always advised clients to target sub-primary keyword phrases first and, once rankings are achieved, to move on to the next sub-primary phrase. If you intelligently select sub-primary phrases that include the primary keywords, you are optimising the primary keyword at the same time.
A good example of this is the sub-primary keyword phrase « web design Manchester ». The company I work for is currently listed on the first page of the major search engines for this phrase. The primary phrase, « web design », is also being optimised at the same time because the words are contained in « web design Manchester » (we’re currently holding position 11 on Yahoo UK for the search phrase « web design »). The search phrase « web design Manchester » is also one of our best-performing keywords because it is so targeted: anyone searching on that term is specifically interested in web design in the Manchester area.
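The selection step described above can be sketched as a simple filter: from a candidate keyword list, keep only the phrases that contain the primary phrase as a whole (so optimising them also optimises the primary keyword), and drop the primary phrase itself. The candidate list and helper name here are hypothetical, and plain substring containment is a deliberately crude proxy for « includes the primary keywords ».

```python
def sub_primary_phrases(candidates, primary):
    """Keep candidate phrases that contain the primary phrase,
    excluding the primary phrase itself (case-insensitive)."""
    primary = primary.lower()
    return [p for p in candidates
            if primary in p.lower() and p.lower() != primary]

candidates = ["web design Manchester", "cheap web design",
              "Manchester SEO", "web design"]
print(sub_primary_phrases(candidates, "web design"))
# prints ['web design Manchester', 'cheap web design']
```

In practice you would also weigh each surviving phrase by its (sanity-checked) search volume and competitiveness before deciding which to target first.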
Optimising in this fashion has several benefits. First of all, sub-primary phrases should be less affected by Phantom Traffic, and the number of impressions you see should be similar to the number of genuine searches carried out. Sub-primary phrases also tend to be less competitive, with fewer people specifically optimising for them (though this is not always the case), so reaching a traffic-generating position is easier and faster, resulting in faster ROI.
Once enough sub-primary phrases are optimised, the site should begin to rank well for the primary keywords too. The campaign will already be bringing in targeted traffic, so there is much less pain and wasted effort if the primary keyword turns out to be heavily affected by Phantom Traffic.
The other advantage is that sub-primary phrases are often more targeted, so the traffic they bring tends to convert much better. I have personally seen this time and time again: sites with little traffic that enjoy a conversion ratio of one in three because the traffic they do receive comes from extremely targeted sub-primary keyword phrases. These websites often outperform sites receiving ten times the traffic from primary keywords. It all comes down to specifics, though, and what works for some may not work for you. As mentioned before, common sense goes a very long way in this game. Just don’t get caught up chasing phantom traffic.