I have recently started reading CopyBlogger as it provides a lot of valuable insights into the world of SEO, copywriting and online marketing (among other things). Brian Clark surely does know what he is doing.
Yesterday I came across a relatively new blog post at CopyBlogger which talked about the components of Google’s ranking algorithm, i.e. how Google decides where and when to rank a certain webpage.
Now, before we look at the actual breakdown, I want to mention that this chart was provided by SEOMoz, and I would hate to not give them credit. SEOMoz is a great source of information on various SEO tactics.
But I digress.  Let’s take a look at the ranking algorithm chart:

[Chart: SEOMoz's breakdown of Google's ranking algorithm]

I really do not know what kind of testing or experimentation SEOMoz conducted in order to come up with the different segments, but for the most part it directly reflects what my understanding was all along.

Let's get into the details:

1. Domain authority: This is by far the most important variable in the ranking algorithm. If you have a trusted domain with a lot of visitors, content pages and inbound links, you can actually get a junky, unimportant page to rank for very competitive terms. In essence, you can beat many other low-authority websites using this technique even if they have spent a lot more time optimizing that page for the search engines. It is for this reason that Wikipedia ranks #1-#3 for extremely competitive generic keyword phrases which other companies would spend thousands to rank for.

The lesson here must be obvious: Google loves long-term assets. If you can somehow turn your website into an authority domain, you can rank at the top of Google for very competitive phrases without a lot of effort. The problem however is the time and energy it takes to develop an authority domain.

2. Link popularity of the page: By this we mean the quality AND the quantity of inbound links to that page. You can build thousands of junky links to a page, but it will never help in the long run. The idea here is to strike a balance between link popularity and link quality. Many successful SEOs can manipulate this factor (and the factor mentioned below) in order to get ranked for competitive terms. New webmasters usually have a hard time attending to this point properly, either because they use too many junky links or because they do not vary their anchor text enough (among other factors).

3. External anchors to the page: This is the third most important factor. You can optimize the webpage for as many terms as you want, but unless you get the right anchors from quality sources, you will never rank for your keyword term. Why? Simply because Google does not know where to rank you unless you have anchored links coming into your page.

4. On-page keyword usage: I must admit, I always ignore this factor when I do SEO. As far as I remember, I have never consciously taken into account how keyword rich my content is. I personally feel that this factor should get a little less weight as I have seen many webpages ranking for keywords which were not even mentioned on that page. Also, I have been able to rank many pages near the top without caring about this factor. But if you want to play it safe and cover all bases, go for a keyword density of 1%-2.5%.
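If you do want to check that 1%-2.5% target, the arithmetic is simple: count how many of the page's words belong to occurrences of the keyword phrase and divide by the total word count. Here is a minimal sketch of that calculation (my own illustration, not any tool Google or SEOMoz provides; the simple word-splitting is an assumption):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the percentage of words in `text` that belong to
    non-overlapping occurrences of the keyword phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw_words = keyword.lower().split()
    if not words or not kw_words:
        return 0.0
    n = len(kw_words)
    # Count each position where the full phrase appears.
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == kw_words)
    return 100.0 * hits * n / len(words)

sample = ("Organic coffee beans taste better. Our organic coffee "
          "is roasted daily, and organic coffee lovers agree.")
print(f"{keyword_density(sample, 'organic coffee'):.1f}%")  # → 37.5%
```

A 37.5% density like the toy example above would be wildly over-optimized, of course; for a real page you would aim for the 1%-2.5% range mentioned.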

5. Registration & hosting data: I believe they mean the domain registration stats here. For example, a domain which is registered for 10 years might be weighted more by Google, because spammers will NEVER register a domain for more than a year. Also, .info domains might be devalued and find it harder to rank because they are cheap to buy and have been exploited by spammers. By hosting data they might mean the bandwidth, the downtime of the website, as well as any bottlenecks while accessing the website. So in essence, there are many different factors to look out for.

6. Traffic & click-through data: This factor needs more weight. I say that because I have had many websites ranking #2-#5 for competitive keyword terms. The only problem was, the info on them was not targeted, which led to visitors instantly hitting the back button on their browsers. The result? Those websites now rank #100-#150. I can still get them back up by working on the content, but I am lazy so I will wait. I feel that a horrible score on this factor can lead to a quick decrease in rankings even if all the other factors are adequately catered to.

7. Social graph metrics: Google is slowly (but surely) incorporating social aspects into the ranking algo. This means that you need to start using Twitter, Facebook and websites like Digg, Propeller, etc. to gain traction for your content. Google knows when your content is liked by the masses and will give you a boost in rankings when that happens. The sad part is, I have not started using this factor to my benefit, as it takes much more time and the results are a little uncertain.

So there you go. If you want your webpage to rank high, you have to do much more than what most people do. SEO takes time but it can give you free traffic if you do it right.

Did I miss anything? Let me know if you hold opposing views to what I have written here. Feel free to comment!