03 January 2007

The value(?) of external metrics

I posted earlier about the difficulties of comparing the results of external metrics. Here is a rather more rigorous evaluation than my gut feel: a pretty exhaustive look at a number of sites using a range of sources, including Technorati, SEOmoz Page Strength, inlinks reported by Yahoo! Site Explorer, Bloglines subscriptions, Alexa rank, Netcraft rank, Newsgator subscribers, Compete.com rank, Ranking.com rank and Google PageRank. The authors then compared each metric's output with the traffic actually recorded at the site and tried to establish a correlation between the metric and the traffic across the range of sites in the study.
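The comparison the study describes boils down to computing a correlation coefficient per metric: one list of metric scores, one list of traffic figures. A minimal sketch of that calculation, with entirely made-up numbers (the study's underlying data is not public here):

```python
# Sketch of the per-metric analysis: correlate a metric's scores with
# measured traffic. The site figures below are hypothetical.

def pearson(xs, ys):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical inputs: one metric's score and actual monthly visits
# for five sites.
metric_scores = [120, 310, 45, 800, 260]
monthly_visits = [15000, 40000, 9000, 70000, 22000]

r = pearson(metric_scores, monthly_visits)
print("r = %.2f" % r)
```

Repeat that for each of the twelve metrics and you get a league table like the one below; a rank-based coefficient (Spearman) would arguably suit the rank-style metrics better, but the idea is the same.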

Their findings, with the correlation coefficient for each metric in brackets:
  1. Number of Technorati Links (0.74)
  2. SEOmoz Page Strength (0.60)
  3. Number of Links to the Blog URL via Yahoo! Site Explorer (0.56)
  4. Number of Links to the Domain via Yahoo! Site Explorer (0.54)
  5. Bloglines Subscriptions (0.49)
  6. Technorati Rank (0.49)
  7. Alexa Rank (0.49)
  8. Netcraft Rank (0.43)
  9. Newsgator Subscribers (0.39)
  10. Compete.com Rank (0.38)
  11. Ranking.com Rank (0.36)
  12. Google PageRank (0.21)

Their conclusion: none of the metrics are accurate enough to use, even in combination, to help predict a site's level of traffic or its relative popularity, even in a small niche with similar competitors ... it appears that the external metrics available for competitive intelligence on the web today simply do not provide a significant source of value ... anyone who applies this data for competitive analysis/research [should] do so with the following limitations in mind:

  • Unless the discrepancy between the metrics is high and universal, they cannot be taken to mean that one website, blog or page is necessarily more popular than another
  • Generally speaking, the more well-linked to a page/site/domain, the higher its traffic levels, but there will be a significant number of exceptions
  • Services like Alexa, Ranking.com, Compete.com & Netcraft are nearly useless when it comes to predicting traffic or comparing relative levels of popularity, even when used on a highly comparable set of sites in a similar field

Interpreting correlation coefficients is always slightly arbitrary, since it depends on where the analyst chooses to set the cut-off points between "weak", "moderate" and "strong". On most people's scales only Technorati shows a strong positive correlation, while more than half of these metrics would be classed as showing only a weak positive correlation.
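To make the cut-off point concrete, here is one plausible banding (an assumption on my part, chosen to match the reading above; other analysts draw the lines elsewhere) applied to the twelve coefficients from the findings list:

```python
def strength(r):
    """Label |r| under one possible banding; the cut-offs are a judgment call."""
    r = abs(r)
    if r >= 0.70:
        return "strong"
    if r >= 0.50:
        return "moderate"
    if r >= 0.20:
        return "weak"
    return "negligible"

# The twelve coefficients from the study, in the order listed above.
findings = [0.74, 0.60, 0.56, 0.54, 0.49, 0.49, 0.49,
            0.43, 0.39, 0.38, 0.36, 0.21]

labels = [strength(r) for r in findings]
print(labels.count("strong"), "strong,", labels.count("weak"), "weak")
# → 1 strong, 8 weak
```

Under this banding only Technorati's 0.74 rates as strong, and eight of the twelve metrics fall in the weak band; shift the cut-offs a little and the headline changes, which is exactly the arbitrariness at issue.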

Perhaps they should have added coin tossing to the list!


