Our Findings: 5 Reasons Why Websites Get Penalized By Google In 2013

First and foremost, I should stress that just because a site gets penalized by Google does not mean that it becomes worthless. Nor does it mean that you have committed any kind of crime. Websites get delisted or dropped in the rankings all the time, and this doesn't mean your life is over. It does mean that if you want Google to love you again, you'll need to do something a little different.

Things Have Changed in the Past Few Years

The other thing to remember is that things have changed over the past few years and what was fine a few years ago is no longer fine today. In other words, I’m not going to discuss keyword stuffing your meta tags because that’s old news. Nor will I discuss other obvious kinds of black hat techniques such as hiding keywords in code and the like. The following five things are relatively new issues which can get your site penalized:

Junk Content

I know that spinning is still really, really popular these days. The Warrior Forum still gets lots of people talking about how to spin content they find online. The fact is, though, that spinning is generally a bad idea. It just doesn't work the way it used to. Even content which is human-spun but poorly written can get you penalized by the Google gods.

The way this works is that Google uses a system along the lines of latent semantic indexing. The best way to understand it is to think about Watson, the IBM supercomputer which beat a number of Jeopardy champions a few years back.

Watson was an IBM project designed to do something truly revolutionary: understand nuance. When someone asks us, "did you hear about Paris Hilton?" we can figure out that they are talking about the famous socialite. To a computer, though, it could just as easily be a question about the Hilton hotel in Paris, France.

Watson was specifically designed to parse thousands of possible answers and to understand the nuance of the English language – it can understand that when someone asks a question like that, they are likely referring to the person and not the hotel. That’s why Watson mopped the floor with Jeopardy champions.

Google, in essence, uses a similar concept to weed out junky content on the web. It is capable of reading what you wrote on your site and recognizing that it is poorly written. It's obviously not perfect; the system doesn't have quite the same capacity as a human being, which means an errant sentence or two won't trigger it.

The odds are good that even if you have whole articles on a blog which are poorly written they won’t get your site penalized as long as they are an extreme minority. But put up dozens of junk articles on your blog and watch the rankings drop regardless of how many backlinks you create. It’s a brave new world and spinning is simply not going to cut it anymore.
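To make the idea concrete, here is a toy sketch of the kind of term-association math that semantic indexing techniques build on. This is purely an illustration of the concept, not Google's actual system, and the sentences are invented for the example. It compares word-count vectors with cosine similarity: text that is genuinely about a topic shares more vocabulary, in the right proportions, with other text on that topic.

```python
from collections import Counter
import math

def cosine_similarity(text_a, text_b):
    """Compare two texts by the angle between their word-count vectors."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    shared = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in shared)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

reference = "paris hilton the socialite attended a celebrity party"
on_topic  = "the socialite paris hilton was seen at a celebrity event"
off_topic = "the hilton hotel in paris offers rooms near the river"

# The on-topic sentence scores closer to the reference than the hotel one.
print(cosine_similarity(reference, on_topic) >
      cosine_similarity(reference, off_topic))
```

Real systems work on far larger vocabularies and reduce the dimensionality, but the underlying signal is the same: junk and spun content drifts away from the vocabulary patterns of well-written text on the topic.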

Junk Backlinks

Speaking of junk, another thing which we are seeing sites get penalized for is heavy reliance on junky backlinks. This means that if you buy those cheap packages of junk links on Fiverr and never bother to do some quality backlinking, you could also end up being penalized by the good folks at Google.

Now I know that some people are thinking about the issue of Google Bowling. I wrote a piece on this a while back and, to be honest, I'm still of two minds about the whole issue. The fact is that nobody knows for certain exactly how many bad backlinks will trigger the wrath of the Google gods. What we do know is that sites which rely heavily on such links tend to fare badly in the Google SERPs.

Fortunately, there is a way to do something about this. Google unveiled a new tool called Disavow. In essence, it allows you to tell Google that specific links which point to your website should be utterly ignored because they are junk links which someone else posted (or which you posted yourself but which you now regret).

The bad news is that using the Disavow tool to find and neutralize all those bad links does take a great deal of time. It’s not going to be easy to get rid of all of them so you may need to hire a seasoned SEO professional to help you eliminate them if you’re not sure what you’re doing.
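For reference, the file you upload through the Disavow tool is plain text, one entry per line: a full URL to disavow a single page, or a domain: prefix to disavow every link from a site. Lines starting with # are comments. The domains below are placeholders, not real examples:

```
# Junk links bought in a Fiverr package; owner never responded to removal requests
domain:spammy-directory-example.com

# Individual junk pages pointing at our site
http://blog-network-example.net/page1.html
http://blog-network-example.net/page2.html
```

In most cases it is worth disavowing the whole domain rather than hunting down individual URLs, since junk sites tend to link from many pages.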

By the way, I can definitely see the opportunity for some hanky panky here. Not only could an unscrupulous SEO "expert" use the crackdown on junk links to beat up on their clients' competitors, but they could also use it to blackmail potential new customers.

The way it could work is that they would themselves create the junk backlinks and then contact the company who they did it to, offering to remove them for a fee. They wouldn’t even have to admit to having done it. They could simply claim to have found the junk links pointing to your site and be looking for new business.

I see no real way to stop this from happening either, because there are plenty of legitimate businesses that do cold sales and which genuinely will have just found the bad links and be offering their help in eliminating them. It is something to keep in mind, though, if someone contacts you out of the blue and offers specifically to help deal with junk links on your behalf.

Building Too Many Links Too Quickly

This one is actually a bit tricky to explain because it's not necessarily bad in itself, so pay a bit of attention here to understand what the real issue is. In short, if you create too many links too quickly, Google's systems are likely to notice that something is up and penalize you for it. The thing is, it's not quite as simple as that.

First and foremost, there is the question of how new your website actually is. A brand new website which gets thousands upon thousands of backlinks all of a sudden is much more likely to get penalized by the folks at Google than a website which has been around for a while. The logic is that your brand new website is very unlikely to garner so much attention so quickly.

In addition to this, there is the question of link quality and link diversity. In essence, if you create thousands of blog comments and little else, you are much more likely to get flagged by Google. Remember that SEO is technically not something Google wants to see; in theory at least, they would prefer that all links be completely natural.

In practice though, Google's engineers know that people artificially create links all the time, and they tolerate it to a point. However, when they see very obvious efforts to game the system and use one type of link exclusively, they know very well that you have been cheating and they'll penalize you for it. On top of that, your links need good diversity in their anchor text.

I recently worked with a client who had been publishing articles (quality articles, mind you, which were well written), every single one of which ended with an exhortation to contact the company if you needed their particular kind of service. The anchor text was always the name of the company.

I can’t reveal many details about it because of confidentiality concerns, but what I can say is that this company was spending a great deal of money to have these articles written and placed. I told them they were throwing money away if they were having the articles written that way and they changed their ways pretty quickly.

In essence, if all you do is use a single anchor text thousands of times, you may as well hold up a giant sign for Google’s spiders saying “look at me, look at me, I’m spam.”
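As a rough illustration, you can sanity-check your own anchor-text distribution with a few lines of code. The link list, the brand name, and the 50% threshold below are all made up for the example; the point is spotting the skew, not any particular number:

```python
from collections import Counter

def anchor_distribution(anchors):
    """Return each anchor text's share of a backlink profile."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {anchor: n / total for anchor, n in counts.items()}

# Hypothetical backlink profile: one brand anchor repeated relentlessly.
anchors = (["Acme Widgets"] * 90 +
           ["click here", "widgets guide", "acme", "this article"] * 2 +
           ["https://example.com"] * 2)

dist = anchor_distribution(anchors)
top_anchor, top_share = max(dist.items(), key=lambda kv: kv[1])
if top_share > 0.5:  # made-up threshold; the skew is the footprint
    print(f"'{top_anchor}' is {top_share:.0%} of all anchors - that's a footprint")
```

A natural link profile is messy: brand names, bare URLs, "click here", partial phrases. When one exact phrase dominates like this, the pattern is trivially machine-detectable.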

Bottom line: create your links carefully. Stagger them so they don't all appear at exactly the same time, and make them into something worthwhile.

Using Blog Networks

This is another technique which was really popular for a number of years but which has lost its shine. Blog networks such as Linklicious have gone out of style since Google began cracking down on them. I have read that some of these old blog networks are trying to get around this by avoiding some of the footprints which made them vulnerable, but I'm not sure those efforts will work.

What I think may be safe are the non-network networks, where you post on real blogs which have real content and which simply accept guest posts in exchange for being able to place guest posts on other sites. These seem a bit less spammy, but again, you shouldn't use such networks as your exclusive method of building backlinks.

By the way, speaking of guest posts, generally doing guest posts still works pretty well and I do recommend trying to place them.

Overdoing Your Advertising

Finally, believe it or not, if you overdo it with the advertising, Google won't like that and may penalize you for it. I know it's tempting to squeeze every ounce of advertising money out of your site, especially if you run something like a newspaper, but Google's system is designed to look for sites which are especially top-heavy with advertising, and it will penalize such sites.

Not to mention that if you have too much advertising, you’ll also lose some of your readers because people get turned off by the constant bombardment of ads all over your site. So tone it down and try other methods of monetization if you want to avoid being penalized by the Google gods in 2013.
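Google hasn't published a threshold, but "top-heavy" is easy to reason about as an above-the-fold ratio: how much of the first screenful is ads versus content. Here is a sketch of that calculation; the layout, block heights, and fold size are all invented for the example:

```python
def above_fold_ad_ratio(blocks, fold_px=600):
    """blocks: (kind, height_px) tuples in top-to-bottom page order.
    Returns the fraction of the first fold_px pixels occupied by ads."""
    used = ad = 0
    for kind, height in blocks:
        if used >= fold_px:
            break
        visible = min(height, fold_px - used)
        used += visible
        if kind == "ad":
            ad += visible
    return ad / used if used else 0.0

# Hypothetical newspaper-style layout: two ad units before the article even starts.
layout = [("nav", 80), ("ad", 250), ("ad", 150), ("content", 400), ("ad", 300)]
print(round(above_fold_ad_ratio(layout), 2))  # 0.67 - two thirds of the fold is ads
```

If a visitor has to scroll past 400 pixels of ads to reach 120 pixels of article, an algorithm doesn't need to be Watson to classify the page as ad-heavy, and neither does a reader.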
