The Answer to Google’s First Algorithm Update of 2014


On the 8th of January, when the first 2014 update came along, it hit one of my blog sites like a freight train. I dropped from Page 1 into the abyss (Page 13, spot 131, to be exact). There was absolutely no doubt that an update had occurred. Fortunately, after some testing, I took it all the way back from rank 131 to rank 6. Check out the image below:

[Image: SERP ranking history for the affected page]

Background

Before we get into exactly what I did, let’s give some background to anyone who hasn’t been paying attention.

On the 8th and 9th of January, we started seeing some irregular shifts in the SERPs. SERoundTable covered this when they noticed lots of chatter in the WebmasterWorld community. A week later, more people were noticing these changes, and the fluctuations reached their peak on January 17th, which, according to DejanSEO, was the 6th-largest fluctuation we’ve seen in a year.

The search engines fluctuate all the time, and fluctuations have become more of a regularity over the last two years. Most of the time you just can’t worry about them; you have to move on, or they’ll end up sucking up all your time. This one was different, though: I had skin in the game, so it mattered. Hence I started testing.

Quick Rant

Whenever an update like this comes along, people start asking why. The worst answer I always hear is:

“I can tell you that you should spend less time worrying about penguin and links, and more time focusing on giving people good reasons to link to your sites.”

What he’s implying is that you should work on your content quality, but as you’ll see, you can have the best content in the world and still get hit. For some businesses that could mean thousands of dollars, and a response like that just doesn’t cut it. The devil’s in the details, and the details matter.

(Note: I’m not saying it’s not important to have high quality content).

What got hit

Without giving away the exact keyword, I can give you some metrics that might be just as useful to you.

Keyword: product + “reviews”

Average Monthly Searches: 27,100 in the United States

For the sake of this post, let’s say the keyword was “Lagostina Skillet Reviews”. I should also mention that the whole site was not affected, but rather just that specific review page. This makes a huge difference.

Test 1 – Penguin Test

We all know that the Penguin update was all about anchor text. Perhaps this was just another Penguin refresh, so I hopped into Ahrefs and checked out my anchor text profile. One of my 8 anchor texts (12.5%) was an exact match, so naturally I checked out that site first.
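As a rough sketch of the check described above (the anchor texts below are illustrative stand-ins built around our sample keyword, not my actual backlink profile), computing the exact-match share of an anchor text profile looks like this:

```python
# Hypothetical anchor-text profile, as exported from a tool like Ahrefs.
anchors = [
    "Lagostina Skillet Reviews",  # exact match for the target keyword
    "this review",
    "example.com",
    "click here",
    "lagostina cookware",
    "read more",
    "here",
    "a great skillet",
]

keyword = "lagostina skillet reviews"

# Count anchors that exactly match the target keyword (case-insensitive).
exact_matches = sum(1 for a in anchors if a.lower() == keyword)
ratio = exact_matches / len(anchors)

print(f"{exact_matches} of {len(anchors)} anchors ({ratio:.1%}) are exact match")
# prints: 1 of 8 anchors (12.5%) are exact match
```

Anything much above a small single-digit percentage of exact-match anchors is the kind of pattern Penguin was built to catch, which is why that one link stood out.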

I checked out its Moz metrics, and they looked decent. The Majestic metrics weren’t so great, though. Here they are, in case you’re interested.

[Image: Decent Moz metrics]

[Image: Terrible Majestic metrics]

The Trust Flow and Citation Flow are low by anyone’s standards. Not impressed.

I took a look at the site itself and noticed that it wasn’t the highest-quality site, and that it was a PR0. Since there had been a recent PageRank update, the fact that it was still a PR0 made me feel like Google didn’t value this site very much.

This was it; it had all the signs: poor metrics plus an exact-match anchor text. If I were Google looking at this, I’d definitely consider it a junk link. Maybe junk enough to even deserve a penalty?

So I contacted the webmaster and had the link removed thinking this was surely it.

[Image: SERP history after the link removal]

The link was removed on January 19th, and as you can see: no dice. At one point I dropped back another 70 or so spots. Yikes.

Test 2 – Panda Test

I’m going to start off by saying that my affected page was definitely not low-quality content. It was a high-quality article (written by myself), with beautiful, unique images. There was no keyword stuffing (in fact, my keyword density was only about 0.53% for this page) or any of that junk from 2008. This page was definitely written with the audience in mind, and because of that I had nothing to change, except one thing.
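For reference, keyword density is just how often the target phrase appears relative to the total word count. Definitions vary slightly between tools; a minimal sketch of one common formulation (the sample text and keyword below are stand-ins, not my actual page):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Share of the page's words taken up by occurrences of the keyword phrase.

    One common definition: (occurrences x words in keyword) / total words.
    """
    words = text.lower().split()
    kw_words = keyword.lower().split()
    total = len(words)
    # Slide a window the length of the keyword phrase across the text.
    hits = sum(
        1
        for i in range(total - len(kw_words) + 1)
        if words[i : i + len(kw_words)] == kw_words
    )
    return (hits * len(kw_words)) / total if total else 0.0


sample = "This Lagostina skillet heats evenly and the skillet cleans up fast."
print(f"{keyword_density(sample, 'skillet'):.2%}")  # prints: 18.18%
```

A real page with 0.53% density means the keyword shows up only a handful of times across the whole article, which is nowhere near stuffing territory.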

My header tags. Were they over-optimized? By “header tags” I mean the title, H1, meta description, and URL.

This is what I had prior to making any changes (Note: I’ve altered the image below using our sample keyword for this post).

Pre-change rendition

I’ve got the exact keyword in the title, H1 tag, and URL. It’s also partially in the description. Perhaps this is too much? Perhaps it’s too obvious that I’m targeting the keyword “Lagostina Skillet Reviews”, even though it all reads perfectly well.

I decided to switch it up. I changed the URL, and did a redirect from the old URL to the new one so all my backlinks would flow over.
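For anyone wanting to do the same, here’s one way the redirect could be set up, assuming an Apache server with .htaccess support and a permanent (301) redirect so link equity passes to the new address. The paths below are hypothetical, built from our sample keyword, not my real URLs:

```apache
# .htaccess: permanently redirect the old over-optimized URL to the new one
# so existing backlinks continue to flow to the page.
Redirect 301 /lagostina-skillet-reviews/ /lagostina-skillet/
```

If you’re on WordPress, a redirect plugin accomplishes the same thing without touching server config.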

This is what it looked like now:

Post-change rendition

Not a big difference right?

The results are huge though.

SERP - panda test_small

Wow!

What does this mean?

The obvious: keep your header tags from being over-optimized.

The less obvious: this change shows us where Google is heading. They want to reward the sites with the best content, and in their mind those are the sites that read naturally and aren’t trying to game the system.

We want our sites to look like they aren’t doing any SEO, as if they don’t have the slightest clue what SEO means. None of this should be a surprise, since Google has been saying it since the beginning of time, but at least now we have a better indication of what they’re looking for.

Have you been hit by this penalty as well? Let me know your experience with it below!