Monday, March 31st, 2014

Did the Softer Panda Update Arrive on March 24, 2014? SMBs Showing Modest Recovery Across Industries

Softer Panda Update on March 24, 2014

As a consultant helping a number of companies with Panda recovery, I’ve been eagerly awaiting the March Panda update.  Based on the data I have access to, I was able to pick up and analyze Panda UJan14, UFeb14, and the infamous UFeb14Rev (where Google re-rolled out the algorithm update after mistakenly hammering movie blogs).  Needless to say, it’s been an interesting beginning to the year Panda-wise. And if you’re wondering what U{Month}{Year} is, that’s the naming convention I’m using for unconfirmed Panda updates.

And in case you forgot, Google announced in July of 2013 that they wouldn’t be confirming Panda updates anymore.  As I explained in a post soon after that, unconfirmed Panda updates can cause mass chaos and can drive webmasters dealing with mysterious traffic losses insane.  But, I also explained that if you have access to a lot of Panda data, you can sometimes pick up the updates.  And that’s where SEOs helping a lot of companies with Panda can come in very handy.  Those SEOs have turned into human Panda barometers and can help identify when the specific updates roll out. Remember, we know that Panda is supposed to roll out monthly, and can take about ten days to roll out.  It’s not real-time, but Google trusts the algorithm enough to unleash it once per month.

I have been able to identify a number of updates since July of 2013, including UJan14 from January 11th and the flawed UFeb14 from February 11th (which was the movie blog fiasco I mentioned earlier).  But it’s been relatively quiet since then from a Panda standpoint.  I’ve only seen some moderate movement around 3/11/14, but nothing that I could nail down as Panda.  But then the 24th arrived, and it quickly became clear that something widespread was taking place.  Just like a Panda update.

Upward Movement Across Panda Victims
During the week of March 24th, I was checking organic search trending across clients and quickly noticed a number of increases from Google Organic.  The first one that caught my attention was a website that had been battling Panda for a long time.  It’s an ecommerce site that has seen many ups and downs since February of 2011 when Panda first arrived (and more downs than ups if you know what I mean).  The 25th showed an 18% increase in traffic, and it has consistently remained higher since then.  Google Webmaster Tools now shows increases in impressions and clicks starting on the 24th.  Comparing the entire week to previous weeks reveals Google Organic traffic was up 15%.

An example of a bump in Google Organic traffic starting on 3/24/14:

Panda Recovery Begins on 3/24/14

And that site wasn’t alone.  I was seeing similar lifts in Google Organic traffic across a number of Panda victims I have been helping.  Those lifts ranged from 9% to 24%, with a few outliers that saw much larger increases (45%+).  Note, the sites seeing larger increases didn’t have massive amounts of traffic, so it was easier for them to show a much larger percentage lift.  That being said, the lift was significant for them.  But overall, I mostly saw moderate recoveries versus significant ones during this update.  And that leads me to think we just might have seen the “softer” Panda update that was supposed to help small to medium sized businesses (SMBs).

 

Matt Cutts and a “Softer” Panda
At SMX West, Matt Cutts from Google explained that they were working on a “softer” version of Panda that would make it less of an issue for certain websites.  Matt said the “next generation” Panda would be aimed at helping small businesses that might be affected by Panda.  Well, based on what I’m seeing, it sure looks like the new Panda could have rolled out.  Almost all of the companies I analyzed that were positively impacted by the 3/24 update could be categorized as SMBs.  They aren’t big brands or major corporations, they don’t have a lot of brand recognition, and some are run by just a few people.

In addition, most of the recoveries fell into the 10-20% range, which were modest increases.  Don’t get me wrong, that’s still a nice lift for some of the companies that previously got hit by Panda, but it’s not a massive recovery like you might see during other Panda updates.  For example, a company I was helping that got hit by Phantom in May of 2013 ended up recovering in August and surged by 68% in Google Organic.  That’s a big recovery.  So, modest recoveries line up with a “softer” algo that could help small businesses (in my opinion).


March 24, 2014 – A Good Day for (SMB) Panda Victims
Below, I have included some screenshots of Google Organic trending for companies impacted by Panda UMarch14.  You can clearly see a lift starting on the 24th and remaining throughout the week.  Note, these companies span various industries, so it wasn’t tied to one specific niche.

Panda Recovery SMB on 3/24/14 GA Data

 

Panda Recovery on 3/24/14 Google Webmaster Tools

 

Panda Recovery on 3/24/14 Google Analytics

 

Panda Recovery on 3/24/14 GWT

 

Panda Recovery on 3/24/14 Searchmetrics

 

 

Common Characteristics and Drivers for Recovery
If you have been impacted by Panda in the past, or if you are simply interested in the algorithm update, then I’m sure you are wondering why these specific companies recovered on 3/24.  And no, not all companies I’m helping with Panda recovered.   Now, only Google knows the refinements they made to the Panda algorithm to soften its blow on small businesses.  That said, I think it’s important to understand what Panda victims have addressed in order to better understand how the algorithm works.

Below, I’ll cover some of the common problems I’ve been helping companies tackle over the past several months Panda-wise (the companies that recovered during UMarch14).  I’m not singling out certain factors as the trigger for this specific update and recovery.  But I do think it’s worth covering several of the factors that were causing serious problems Panda-wise, and that were rectified over the past few months leading up to the recoveries.

Over-optimized Thin Pages
Several of the websites that experienced recovery had serious problems with thin pages that were over-optimized.  For example, pages with very little content combined with over-optimized title tags, meta descriptions, body copy, etc.  And the body copy was typically only a paragraph or two and was clearly written for search engines.

Over-optimized Titles and Panda

Doorway Pages
Along the same lines, several of the companies employed doorway pages to try and gain organic search traffic across target keywords.  For example, they would reproduce pages and simply change the optimization to target additional keywords.  For some of the sites I was helping, this was rampant.

Duplicate Pages and Panda

Keyword Stuffing
Some of the companies that saw recovery were keyword stuffing pages throughout their sites.  For example, all core page elements excessively contained target keywords.  The copy was extremely unnatural, the on-page titles (which were often the h1s) were clearly targeting keywords, the navigation was all using exact match anchor text, and the footer was crammed with more keyword-rich content and exact match anchor text links.  And many times, the target keywords were repeatedly bolded throughout the content.  It was obvious what the goal was while analyzing the pages… it was all for SEO.

Keyword Stuffing and Panda

Excessive Linking Using Exact Match Anchor Text
Some of the websites that saw recovery were previously weaving exact match anchor text links into every page on the site.  So, you would visit a page and immediately find exact match or rich anchor text links from the copy to other pages on the site.  It was excessive, unnecessary, and made for a horrible user experience.  And as I explained above, several of the sites were also employing spammy footers with exact match anchor text links (and many of them).

Affiliate Links
Several of the companies that saw recovery were including followed affiliate links in the site content.  Those links should absolutely have been nofollowed.  During initial audits I would uncover followed affiliate links, flag them, and document them in a spreadsheet.  When sharing my findings with my clients, some of the links were so old that my clients didn’t even remember they were there!  “Affiliate creep” can cause big problems post-Panda.  Nofollowing all affiliate links or removing them was important for sure.

Followed Affiliate Links and Panda
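
If you want to spot-check for this type of “affiliate creep” yourself, here’s a rough sketch of the kind of audit described above.  It assumes Python with the requests and BeautifulSoup libraries, and the affiliate URL patterns are just placeholders for whatever networks a site actually uses, so treat it as an illustration rather than a finished tool.

# Rough audit helper: flag outbound links that look like affiliate links
# and are NOT nofollowed. The URL patterns below are hypothetical markers.
import csv
import requests
from bs4 import BeautifulSoup

AFFILIATE_PATTERNS = ["amazon.com/", "tag=", "affid=", "/ref="]  # placeholders

def audit_followed_affiliate_links(urls, out_file="affiliate-links.csv"):
    with open(out_file, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["page", "link", "rel"])
        for page in urls:
            html = requests.get(page, timeout=10).text
            soup = BeautifulSoup(html, "html.parser")
            for a in soup.find_all("a", href=True):
                href = a["href"]
                rel = " ".join(a.get("rel", []))
                # Flag links that look like affiliate links AND are followed
                if any(p in href for p in AFFILIATE_PATTERNS) and "nofollow" not in rel:
                    writer.writerow([page, href, rel or "(followed)"])

audit_followed_affiliate_links(["http://www.example.com/some-post/"])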

Nuking Duplicate Content or Noindexing Thin Pages
Some of the companies that saw recovery had an excessive amount of duplicate or thin content.  Upon surfacing the problematic urls, my clients either removed or noindexed those pages.  In some cases, that impacted tens of thousands of pages (or more).  Addressing “low-quality” content is one of the most important things a company can do Panda-wise.  And that’s especially the case if some of those pages were ranking well in Google (prior to the Panda hit).  You can read my post about the sinister surge in traffic before Panda strikes to learn more about that phenomenon.

Noindexing Content and Panda
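
If you can’t easily edit the templates for those problematic urls, one way to noindex them is via the X-Robots-Tag HTTP header, which Google treats like a meta robots tag.  Here’s a minimal sketch assuming a Python/Flask-style setup; the flagged paths are placeholder data, and the real implementation depends entirely on your CMS or server.

# Minimal sketch: return a noindex directive for URLs flagged as thin/duplicate.
# Flask and the list of flagged paths are assumptions for illustration only.
from flask import Flask, request

app = Flask(__name__)

# Paths flagged as thin or duplicate during the content audit (placeholder data)
NOINDEX_PATHS = {"/tag/old-thin-tag-page/", "/print/some-duplicate-page/"}

@app.after_request
def add_noindex_header(response):
    # X-Robots-Tag works like a meta robots noindex tag, but via an HTTP header
    if request.path in NOINDEX_PATHS:
        response.headers["X-Robots-Tag"] = "noindex"
    return response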

 

Warning: Some Sites Slipped Through The Panda Cracks
I also wanted to quickly mention something that can happen with algorithm updates.  There were two sites I analyzed that showed a modest recovery that shouldn’t have recovered at all.  They were rogue sites that some of my clients had set up in the past that were simply sitting out there.  Those sites are not getting a lot of attention from my clients, and there has been very little work on those sites from a Panda standpoint.  Needless to say, I was surprised to see those sites positively impacted by Panda UMarch14.  Sure, they didn’t surge in traffic, but they definitely increased starting on the 24th.  This also leads me to believe that we saw the softer Panda update that Matt Cutts mentioned.

False Panda Recovery on 3/24/14

Summary – Be In It For The Long Haul
As I explained earlier, Matt Cutts promised a “softer Panda” at SMX West that could help small businesses.  Based on what I have seen, that new update might have rolled out on 3/24.  I saw a number of companies that were dealing with Panda problems recover to some extent starting on that date.

If you have been hit by Panda, then the recoveries I documented above should signal hope.  The companies that saw recovery have worked hard to rectify a range of “content quality” problems.  Audits were completed, problems were identified, and a lot of work was completed over the past few months.

The good news is that a number of the websites making significant changes saw a positive impact from Panda UMarch14.  I think it underscores a Panda philosophy I have been preaching for a long time.  You must be in it for the long haul.  Short-term thinking will not result in recovery.  You need to have the proper analysis completed, identify all content-related problems, and work hard to rectify them as quickly as  you can.  And Google crafting an algorithm update that softens the blow of Panda sure helps.  So thank you Matt Cutts.  From what I can see, there are companies seeing more traffic from Google today than they did a week ago.  And that’s never a bad thing.

GG

 

Thursday, March 27th, 2014

Smartphone Rankings Demotion in Google Search Results Based on Faulty Redirects [Case Study]

Smartphone Rankings Demotion in Google Search Results

In June of 2013, Pierre Far from Google explained that providing a poor mobile experience could impact a site’s rankings in the smartphone search results.  Basically, if a site is making mistakes with how it is handling mobile visits, then that site risks being demoted in the search results when users are searching from their smartphones.  And as smartphones boom, that message scared a lot of people.

Specifically, Pierre listed two common mistakes that could cause a poor user experience.  The first is faulty redirects, which can force users to irrelevant content, or just to the mobile homepage of a website.  (The second, smartphone-only errors, is covered later in this post.)  For example, imagine searching for a specific product, service, review, blog post, etc., and finding that in the search results.  But as you click through, the site redirects you to the mobile homepage.  That sounds really annoying, right?  But it happens more than you think.  And that’s especially true since the problem is hidden from desktop users.

But that June day in 2013 passed, and businesses moved on.  Sure, mobile is important, it’s taking over, blah blah blah.  In addition, I’m sure many wondered if Google would really demote a site in the smartphone search results.  I mean, why move a powerful site like yours down in the results when your pages really should rank highly (like they do on desktop)?  Google would probably only do that to low-quality sites, right?  I think you see where I’m going with this.

 

Faulty Redirects – An Example Caught in the Wild
Last week, I was checking Techmeme for the latest technology news and clicked through to an article written by Electronista.  I forget which story the article was about, but Electronista was listed first for the news at hand.  So I clicked through and was immediately redirected to the mobile homepage.  I navigated back to Techmeme, clicked the listing again, and was promptly redirected again.  So I visited another site listed for the story on Techmeme and got the information I was looking for.

*ALERT* – That’s exactly the user experience Google is trying to prevent for people searching Google.  And that’s one of the core scenarios that Pierre listed that could result in a rankings demotion.  So that got me thinking.  What about other pages on Electronista?  Were they also redirecting mobile users to the mobile homepage?  And if this problem was widespread, were they being demoted in the smartphone search results?  And so I dug in.

Side note: I’m not targeting Electronista by writing this.  Actually, I hope this post helps them.  I can only imagine that if they fix the problem, then their traffic from smartphone users on Google will skyrocket.


An Example of a Faulty Redirect on Electronista

I’m sure you are wondering how this looks.  Here’s a quick example.  Let’s say I was researching a Nexus 7 tablet and comparing it to an iPad mini.  Electronista has an article focused on that topic.  On desktop or tablet, I visit that url and can view the entire post.  But on my smartphone, visiting that url redirects me to the mobile homepage (via a 302 redirect).

Desktop URL resolves correctly:

Desktop URL on Electronista.com

When searching on my smartphone, the site incorrectly redirects me to the mobile homepage:

Redirect to Mobile Homepage on Electronista.com

 

Here is the 302 redirect in action:
302 Redirect to Mobile Homepage on Electronista.com
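
If you want to reproduce this kind of check yourself, here’s a quick sketch using Python’s requests library.  The URL, the iPhone user-agent string, and the homepage heuristic are all illustrative placeholders, not exact values.

# Quick check: does a URL 301/302 smartphone visitors to the mobile homepage?
import requests

SMARTPHONE_UA = ("Mozilla/5.0 (iPhone; CPU iPhone OS 7_0 like Mac OS X) "
                 "AppleWebKit/537.51.1 (KHTML, like Gecko) Version/7.0 Mobile Safari/537.51.1")

def check_mobile_redirect(url):
    resp = requests.get(url, headers={"User-Agent": SMARTPHONE_UA}, allow_redirects=False)
    location = resp.headers.get("Location", "")
    print(url, resp.status_code, location)
    # Rough heuristic: a redirect whose target is a bare domain (the homepage)
    # rather than an equivalent mobile page is the faulty redirect pattern above.
    return resp.status_code in (301, 302) and location.rstrip("/").endswith(".com")

check_mobile_redirect("http://www.example.com/some-article/")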

 

Examples of Demoted Smartphone Rankings
Electronista.com has 143K pages indexed in Google.  And every url I checked on my smartphone is redirecting to the mobile homepage.  So it wouldn’t take Google very long to pick up the problem, and across many pages on the site.  But now I needed evidence of rankings being demoted based on this problem.

So I fired up SEMRush and checked the organic search reporting for Electronista.com.  I started picking keywords that the site ranked highly for (on page 1 of Google).  Then I started searching on my desktop and smartphone using Chrome for Android (incognito mode).  And lo and behold, I noticed the problem almost immediately.  Smartphone rankings were either much lower or non-existent for content that was ranking highly on desktop.  Almost all of the keyword/ranking combinations I checked revealed the demotion in the smartphone search rankings.

Note, not every Electronista listing was being demoted.  There were a few outliers where the page still ranked well (as well as it did on desktop search).  But the user experience was still horrible.  I was redirected to the mobile homepage and forced to fend for myself.  Needless to say, I wasn’t going to start searching the mobile site for the url I expected to see.  I just bounced.  And again, Google doesn’t want its users to have to deal with this situation.  Instead, Google will just demote the search rankings on smartphones.

A picture is worth a thousand words, so let’s take a look at some examples.  Below, I have provided screenshots of the demotion in action.  You’ll see the desktop search results first and then the smartphone search results below that.

Red Camera For Sale (ranks #8 on desktop and N/A on smartphone)

Desktop search results:
Desktop Search for Red Camera on Google

Mobile search results:
Mobile Search for Red Camera on Sale on Google

 

LTE Microcell (ranks #10 on desktop and N/A on smartphone)

Desktop search results:
Desktop Search for LTE Microcell on Google

Mobile search results:
Mobile Search for LTE Microcell on Google

 

HTC Vivid Radar (ranks #3 on desktop and #20 on smartphone)

Desktop search results:
Desktop Search for HTC Vivid Radar on Google

Mobile search results:
Mobile Search for HTC Vivid Radar on Google

 

Google Nexus 7 Versus iPad mini (ranks #8 on desktop and #18 on smartphone)

Desktop search results:
Desktop Search for Google Nexus 7 Versus iPad Mini on Google

Mobile search results:
Mobile Search for Google Nexus 7 Versus iPad Mini on Google

Skullcandy Pipe Review (ranks #5 on desktop and #10 on smartphone)

Desktop search results:
Desktop Search for Skullcandy Pipe Review on Google

Mobile search results:
Mobile Search for Skullcandy Pipe Review on Google

 

And here are a few where the rankings were not demoted.  They should be demoted, but they weren’t (at least not yet):

 

Commodore 64 remake
Mobile Search for Commodore 64 Remake Review on Google 

 

Windows 8 touch screen requirements
Mobile Search for Windows 8 Touch Screen Requirements on Google 

 

 

How To Avoid Demoted Smartphone Search Rankings (Listen up Electronista)

The solution to this problem is fairly straightforward.  If you are using separate webpages for mobile content, then you should redirect smartphone users from each desktop page directly to the equivalent mobile url for that content.  Do not redirect all requests from smartphones to the mobile homepage.  As Google explains, “This kind of redirect disrupts a user’s workflow and may lead them to stop using the site and go elsewhere.”  And by the way, Google also says that it’s better to show smartphone users the desktop content than to implement a faulty redirect to the mobile homepage.  I completely agree.
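
To make that concrete, here’s a minimal sketch of the per-URL approach, assuming a Python/Flask-style app and an m-dot mobile host.  The hostnames, route, and the crude user-agent check are placeholders; real sites typically use a proper device-detection library and whatever URL mapping matches their own structure.

# Sketch: redirect smartphone visitors from a desktop URL to the *equivalent*
# mobile URL, never the mobile homepage. Flask and the m-dot mapping are assumptions.
from flask import Flask, request, redirect

app = Flask(__name__)

MOBILE_HOST = "http://m.example.com"

def is_smartphone(user_agent):
    ua = (user_agent or "").lower()
    # Very rough check, for illustration only
    return "mobile" in ua or "iphone" in ua or "android" in ua

@app.route("/reviews/<slug>/")
def review(slug):
    if is_smartphone(request.headers.get("User-Agent")):
        # Same path on the mobile host, not the mobile homepage
        return redirect(f"{MOBILE_HOST}/reviews/{slug}/", code=302)
    return f"Desktop page for {slug}"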

In addition, make sure you use rel alternate on your desktop pages pointing to your mobile pages.  And then use rel canonical on your mobile pages pointing to your desktop pages.  You can read Google’s documentation for handling various mobile setups here.
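
And here’s a small sketch (in Python, just to generate the tags) of what those bidirectional annotations look like for a hypothetical desktop/mobile URL pair.  The media value follows Google’s documentation for separate mobile URLs, and the hostnames are placeholders.

# Sketch: generate the bidirectional annotations for a desktop/mobile URL pair.
# The desktop page gets rel="alternate" pointing to the mobile URL, and the
# mobile page gets rel="canonical" pointing back to the desktop URL.
def mobile_annotations(desktop_url, mobile_url):
    desktop_tag = (f'<link rel="alternate" media="only screen and (max-width: 640px)" '
                   f'href="{mobile_url}">')
    mobile_tag = f'<link rel="canonical" href="{desktop_url}">'
    return desktop_tag, mobile_tag

d, m = mobile_annotations("http://www.example.com/page-1/",
                          "http://m.example.com/page-1/")
print(d)  # goes in the <head> of the desktop page
print(m)  # goes in the <head> of the mobile page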

Update: Pierre Far from Google provided some feedback based on reading this case study.  I asked Pierre how quickly Google would remove the demotion once the redirect problem was fixed.  Here is what Pierre said:
“When a fix is implemented, we’d detect it as part of the usual crawling and processing of each URL.”

So, it seems that once the redirects are corrected, Google will detect the proper setup as it recrawls the site.  As it does that, the pages should return to their normal rankings.  If Electronista makes the necessary changes, I’ll try and figure out how quickly their smartphone rankings return to normal. Stay tuned.

Avoid Smartphone-only Errors
I covered faulty redirects and the impact they can have on search rankings, but there’s another scenario that can get you in trouble.  Google explains that smartphone-only errors can also result in demoted smartphone rankings.  And in my experience with auditing websites, these types of errors can go unnoticed for a long time.

For example, if you incorrectly handle Googlebot for smartphones, then you could incorrectly present error pages to users.  In addition, the code that handles mobile pages could be bombing, which would also present errors to smartphone users.  Needless to say, I highly recommend testing your setup thoroughly via a number of devices, checking your site via mobile emulators, and crawling your site as Googlebot for smartphones.  The combination will often reveal problems lying below the mobile surface.
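
As a starting point for that kind of testing, here’s a rough sketch that fetches a list of URLs with a Googlebot-for-smartphones style user-agent and flags pages that work for desktop visitors but error out for the mobile crawler.  The user-agent string and URLs are illustrative placeholders; this is not a substitute for real device testing.

# Sketch: flag smartphone-only errors by comparing a normal fetch with a fetch
# using a Googlebot-for-smartphones style user-agent (string is illustrative).
import requests

GOOGLEBOT_SMARTPHONE_UA = ("Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) "
                           "AppleWebKit/536.26 (KHTML, like Gecko) Version/6.0 "
                           "Mobile/10A5376e Safari/8536.25 "
                           "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)")

def find_smartphone_only_errors(urls):
    problems = []
    for url in urls:
        desktop = requests.get(url, timeout=10)
        mobile = requests.get(url, timeout=10,
                              headers={"User-Agent": GOOGLEBOT_SMARTPHONE_UA})
        # Flag pages that are fine on desktop but return an error for the mobile crawler
        if desktop.status_code < 400 <= mobile.status_code:
            problems.append((url, mobile.status_code))
    return problems

print(find_smartphone_only_errors(["http://www.example.com/some-page/"]))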

Note, Google Webmaster Tools also recently added smartphone crawl errors.  The report provides a wealth of information about the errors that Googlebot for Smartphones is running into.  And that includes server errors, 404s, soft 404s, faulty redirects, and blocked urls.  I highly recommend you check out your reporting today.  You never know what you’re going to find.

Smartphone Crawl Errors Reporting in Google Webmaster Tools

 

Summary – Turning Demotions into Promotions
As mobile booms, more and more people are searching from their smartphones.  Google is well aware of the problems that mobile users can face while searching for, and viewing, content on their phones.  And in response to those problems, Google will demote your rankings in the smartphone search results.  Electronista is currently implementing faulty redirects, and based on that setup, its rankings are being heavily demoted.  Don’t let this happen to you.  Check your setup, view your reporting in Google Webmaster Tools, and then quickly fix any problems you are presenting to mobile users.  Think of all the traffic you might be losing by not having the right mobile setup in place.  The good news is it’s a relatively easy fix.  Now fire up those smartphones and visit your site.  :)

GG

Sunday, March 2nd, 2014

Flawed Google Algorithm Updates, Movie Blogs, and Copyright Infringement – Tracking The Panda Updates From February 2014

Summary:  Google ended up rolling out two major algorithm updates in February of 2014.  The first, which seemed like the monthly Panda update, caused serious collateral damage with a number of prominent movie blogs. After hearing from one movie blog owner, Matt Cutts of Google got involved, and Google refined the algorithm.  Two days later, Google Organic traffic surged back to the movie blogs, signaling that Google rolled out the algorithm update again. This post provides details, analysis, and findings based on analyzing movie blogs impacted by the UFeb14 and UFeb14Rev algorithm updates.

Flawed Panda Update from February 2014

I’ve been following a fascinating SEO situation over the past ten days.  And based on my analysis, I might have found something big.  As in Panda big.  So if you’re the type of person that’s interested in algorithm updates, or if you have been impacted by algo updates in the past, then this post is for you.

As I explained in my last post about the UJan14 update, Panda rolls out monthly, but Google won’t confirm those updates anymore.  In addition, it could take ten days to fully roll out.  That combination makes for a confusing situation for webmasters dealing with Panda attacks.  But, SEOs neck deep in Panda work can often see those updates, as they are helping a number of companies recover, while they also have companies with fresh hits reach out to them.

Those SEOs can act as human barometers for Panda updates.  And since I’m helping a number of companies deal with Panda hits, and I often have companies hit by algo updates reach out to me, I’m fortunate to have access to a lot of Panda data.  And that data can often signal when Panda rolls out each month.  In my last post, I documented the UJan14 update, based on seeing several companies recover and hearing from those on the flip side (the ones that got hit).  Those websites unfortunately got hammered, typically by 25-40% – overnight.

At the end of that post, I mentioned that the February Panda update looked like it was rolling out (right around February 11, 2014).  That made a lot of sense, since it was exactly one month from the previous Panda update.  By the way, I am using the naming convention U{Month}{Year} to track unconfirmed updates by Google, so February’s update would be UFeb14.

Well, it seems I was right.  After my post, I saw a number of companies impacted heavily by UFeb14, and most saw that impact beginning around February 11th through the 14th.  Based on seeing those hits and recoveries in early February, it was already a big deal that Panda was rolling out.  But little did I know what was coming…  and it was big.

Peter Sciretta of SlashFilm Tweets and Matt Cutts Responds

On February 21, 2014, Peter Sciretta from SlashFilm tweeted the following to Matt Cutts:

 

Boy, that got my attention for sure.  I always look for common themes when analyzing Panda to see if any new factors have been added to the algo, if there was collateral damage, etc.  As many in SEO know, Matt definitely responds to some people reaching out with messages, so I waited to see if he would respond.  And he did, on February 22nd, Matt responded with the following tweet:

 

OK, so that response was very interesting.  First, he hopes to dig into this soon… Really?  Wow, so Matt is digging into an SEO situation based on a tweet?  That was the first signal that Panda could have gone rogue.  Second, he apologized for the delay in responding.  Yes, another sign that Panda could have eaten some bad bamboo and gone ballistic on sites that weren’t actually Panda targets.

So I ran to SEMRush and SearchMetrics to check out the damage.  And there was damage all right… Serious damage.  Check out the trending from SEMRush for SlashFilm:

SlashFilm Drop on February 14, 2014

Which led me to check out other sites in that niche.  And I found ScreenRant also had a huge drop.

ScreenRant Drop on February 14, 2014

And they weren’t alone.  A number of prominent movie blogs got absolutely crushed during UFeb14.  Based on SEMRush data, the movie blogs I analyzed lost between 40% and 50% of their Google Organic traffic overnight.  Boom.

U-Shaped Trending – The Blogs Bounce Back
What happened up to that point was already enough to have me heavily analyze the movie blogs, but the story was about to get better.  Each morning following Matt’s tweet, I planned to quickly check the trending for the movie blogs I was monitoring to see if there were any signs of recovery.  If Matt was checking on the situation, and if it was indeed a flaw in the algorithm, then Google could possibly roll out that algorithm update again.

The 23rd was quiet.  No changes there.  And then the 24th arrived, and what I saw blew me away.  SlashFilm’s trending popped.  Yes, it absolutely looked like they started to recover.  Check it out below:

SlashFilm Recovery on February 24, 2014

And ScreenRant showed the same exact jump.  Wow, this was big.  We just witnessed a flawed algo get rolled out, cause serious collateral damage, get re-analyzed, tweaked, and then rolled out again less than two days later.  And then the movie blogs recovered.  I don’t know about you, but that’s the fastest Panda recovery in the history of Panda!  :)  Fascinating, to say the least.

So, I tweeted Barry Schwartz and Peter from SlashFilm about what I saw, and Peter did confirm they were seeing a big recovery.  He also said the following, which I thought was interesting:

And that’s some error… It’s a great example of how catastrophic major algorithm updates can be, especially when there’s a flaw in the algorithm that causes collateral damage.  Losing 40-50% of your organic search traffic overnight could end some companies.  And then there’s the most important question that Panda victims have been asking themselves since this happened.  What if Peter didn’t complain to Matt Cutts?  Would Google have picked up the problem on its own?  How long would that have taken?  And how much damage would those movie blogs have experienced traffic-wise, business-wise, etc.?  All great questions, and only Google knows.

Digging Into the Panda Data
For those of you that are familiar with my work, my blogging, etc., you probably know what’s coming next.  There’s no way in heck I would let this situation pass by without heavily analyzing the movie blogs that experienced a serious drop in traffic.  I had many questions.  Why did they get hit?  Were there any consistencies across the websites?  What factors could have led to the flawed drop on 2/14/14?  And what was the flaw in the algorithm that triggered Panda hits on the movie blogs?

So I started collecting data immediately, and I would refresh that data each day.  That’s until I had time in my schedule to analyze the situation (which based on my chaotic schedule wasn’t until 5AM on Saturday morning).  But I’ve now spent a lot of time going through data from movie blogs that got hammered on 2/14/14 and that recovered on 2/24/14.  I’ve also dug into sites that only saw changes on one of those dates (to give me even more data to analyze).

I used SEMRush to uncover all of the keywords that dropped significantly in the rankings starting on February 14, 2014.  I also was able to export the landing pages for each of the keywords.  That was key, as Panda targets low-quality content.  Analyzing that content could help me uncover problems that could have caused the Panda attack.  I did this heavily for both SlashFilm and ScreenRant, as they both experienced a heavy drop on 2/14 and then a large recovery on 2/24.  But I also analyzed other sites in that niche that experienced problems and recoveries during February.  As I mentioned earlier, there were a number of movie websites impacted.
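
For anyone who wants to run a similar comparison on their own data, here’s a rough sketch using pandas.  The CSV files and column names (keyword, position, landing_page) are hypothetical placeholders standing in for whatever your ranking tool exports; the idea is simply to join a pre-update and post-update export and surface the biggest drops along with the landing pages that were ranking.

# Sketch of the ranking comparison: join "before" and "after" keyword exports
# and surface the biggest drops with their landing pages (column names are placeholders).
import pandas as pd

before = pd.read_csv("rankings-2014-02-13.csv")   # pre-update export
after = pd.read_csv("rankings-2014-02-15.csv")    # post-update export

merged = before.merge(after, on="keyword", suffixes=("_before", "_after"))
merged["drop"] = merged["position_after"] - merged["position_before"]

# Largest drops first, with the landing page that was ranking before the hit
biggest_drops = (merged.sort_values("drop", ascending=False)
                       .loc[:, ["keyword", "position_before", "position_after",
                                "landing_page_before", "drop"]]
                       .head(50))
print(biggest_drops)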

Analysis-wise, I heavily analyzed both sites manually and via crawls.  The crawls were used to flag certain problems SEO-wise, which could lead me down new paths.  My manual analysis was based on my extensive work helping companies with Panda (knowing certain criteria that can cause Panda attacks).  The combination of the two helped me identify some very interesting factors that could have led to the faulty Panda hits.

Here’s what I found… and stick with me.  I’ll take you through some red flags before explaining what I think the actual cause was.   Don’t jump to conclusions until you read all of the information.
 
1. Thin Content
It was hard to overlook the overwhelming amount of thin content I was coming across during my analysis.  And since Panda targets low-quality content, which is often extremely thin content, that had my attention for sure.  For example, pages with simply an image, blog posts that were just a few sentences, etc.

Thin Content on Movie Blogs

But, this was not a unique factor for February (or for just movie blogs), which is what I was looking for.  Previous Panda updates could have absolutely crushed these blogs for thin content already… so why now?  That led me to believe that thin content, although a big problem with the movie blogs I was analyzing, wasn’t the cause of the UFeb14 hit they took on 2/14/14.   It met the “consistency” factor, since it was across the movie blogs, but wasn’t unique to this update.  Let’s move on.

 

2. Affiliate Links
I’ve helped a number of companies with Panda that were engaged in affiliate marketing.  Unfortunately, many affiliate marketers have gotten crushed since February of 2011 when Panda first rolled out.  So, it was interesting to see what looked to be followed affiliate links to Amazon on a number of pages I analyzed.  Those pages were thin, provided a quick mechanism to send along affiliate traffic to Amazon, and could absolutely get a website in trouble SEO-wise.

Affiliate Links on Movie Blogs

But two things stuck out…  First, compared to overall indexation on the sites I was analyzing, the number of pages with affiliate links was low (at least for the affiliate links I picked up).  Second, I did not find the same type of links across movie blogs that were hit.  So, the “consistency” factor was not there.  Time to move on (although I would caution the movie blogs that providing followed affiliate links violates Google Webmaster Guidelines).

3. Zergnet and Other Content Syndication Networks
Moving from inconsistency to consistency, I found a common thread across almost every movie blog I analyzed.  I found Zergnet links at the bottom of each post.  Zergnet is a content syndication network (similar to Outbrain, Zemanta, etc.).  On the surface, and in their most current form, these networks shouldn’t impact SEO negatively.  The links are nofollowed, which they should be.

But, in the past some of the networks were used to gain followed links from relevant websites across the web.  And that violates Google Webmaster Guidelines.  Actually, I’m helping several companies right now clean up older pages that still have followed links via Zemanta.  Here’s what the Zergnet links look like on the movie blogs:

Zergnet Links on Movie Blogs

But, like I explained above, the current implementation of Zergnet links is fine right now.  All of the links are nofollowed, which should shield the sites from any Google damage.  Let’s move on.

4. Videos, Trailers, and Copyright Infringement – Bingo
When the movie blogs got hit, a number of people in SEO (including myself) started making the connection between YouTube, copyright infringement, and the algo hits.  Since these are movie blogs, one can only imagine that there are a lot of posts about movies that contain video clips, trailers, etc.  So, I was interested in seeing how much video footage I would come across during my analysis, and how much of that was problematic copyright infringement-wise.

And since we live in an embeddable world (with YouTube and other video networks making it easy to embed video clips on your own website), questions started to arise about how Google could treat the various parties involved in copyright infringement. In other words, who is the source of copyright infringement?  And how can you police others SEO-wise that might be part of the problem?  What about websites that simply embed public YouTube clips?  All good questions, and I was eager to dig in.

It wasn’t long before I came across webpages with video clips that had copyright infringement problems.  Now, those clips were typically sourced at YouTube or other video networks like Yahoo Video.  The helpful part about YouTube clips that have been taken down is that the player literally displays a message explaining that the video, or the account associated with it, has been removed due to copyright problems.  That made my analysis easier, to say the least.

So, trailer by trailer, video clip by video clip, I came across more and more examples of videos and users removed due to copyright infringement.  Here are some screenshots based on my research.  Notice the copyright infringement messages on pages that got hammered during the UFeb14 algorithm update:

Copyright Infringement Notice for YouTube Videos

 

More Copyright Infringement Notice for YouTube Videos

And ladies and gentlemen, this is where I think the flawed algo incorrectly targeted the movie blogs.  SlashFilm, ScreenRant, and others weren’t the source of copyright infringement.  They were simply end users that embedded those clips in their own posts.  So, if YouTube originally let the clips reside on its own network, and freely let users embed those clips on their own sites, could Google actually penalize those destination websites?

That wouldn’t be right… The penalty should simply be a bad user experience for visitors of the blogs, since the clips won’t play.  Now, if the movie blogs were creating their own videos that violated copyright laws, then I get it.  But shouldn’t that damage come via the Pirate Update?  I heavily analyzed the Pirate Algorithm recently, and you can read more about my findings by following that link.

So, it was interesting to see copyright-driven factors severely impact websites during what seemed to be the monthly Panda update.  Is Google incorporating more Pirate factors into Panda?  Are we seeing the maturation of Pirate into a rolling monthly update like Panda?  Was this the first time Google tried to incorporate Pirate into the monthly update?  All good questions.

Back to Video & More Embed Problems…
Visit after visit, page after page, I came across all types of video embed problems on the movie blogs.  For example, I saw copyright notices, blank videos (where the videos had been removed from the services being used), raw embed code displayed on the page instead of the videos, messages that a video was now marked as private, etc.  All of this could very well be tied to copyright infringement.

Video Embed Problems on Movie Blogs

 

More YouTube Embed Problems on Movie Blogs
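
If you publish a lot of embedded video, this is the kind of check worth scripting.  Here’s a rough sketch that scans a post for YouTube embeds and checks whether each video is still available via YouTube’s public oEmbed endpoint.  The page URL is a placeholder, and the exact error codes for removed or blocked videos vary, so treat any non-200 response as “needs a manual look” rather than a definitive answer.

# Sketch: find YouTube embeds on a page and check availability via oEmbed.
import re
import requests

OEMBED = "https://www.youtube.com/oembed"

def check_youtube_embeds(page_url):
    html = requests.get(page_url, timeout=10).text
    video_ids = set(re.findall(r"youtube\.com/embed/([\w-]{11})", html))
    results = {}
    for vid in video_ids:
        watch_url = f"https://www.youtube.com/watch?v={vid}"
        status = requests.get(OEMBED, params={"url": watch_url, "format": "json"},
                              timeout=10).status_code
        results[vid] = "ok" if status == 200 else f"unavailable ({status})"
    return results

print(check_youtube_embeds("http://www.example.com/some-trailer-post/"))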

CinemaBlend and Recovery During UFeb14Rev
And the plot thickens…  The examples listed earlier were based on analyzing sites that experienced a major hit on 2/14 and then recovered on 2/24.  But what about other movie sites during that timeframe?  Did any experience unusual declines or surges?  Yes, they did.  I started checking many sites in the movie blog niche, and one in particular caught my attention. Check out the trending for CinemaBlend.com:

CinemaBlend Panda Recovery

Wow, they experienced a huge surge in traffic once UFeb14Rev rolled out (the revised algorithm update that rolled out once Matt Cutts got involved).  It looks like they originally got hit in January (I’m assuming by UJan14).  Connecting the dots, if CinemaBlend recovered during the revised February update, then could they have been wrongly impacted in January?  Why would they recover during UFeb14Rev and not just UFeb14?  Yes, I had to dig in.  Down the rabbit hole I went…  And yes, this was becoming my own version of Inception.  One SEO rabbit hole led to another rabbit hole.  Maybe I should create a movie trailer about it and embed it here.  :)

I began digging into the data based on CinemaBlend’s recovery, and was eager to see if video clips, trailers, and copyright infringement would surface.  But what I found was really interesting… The pages looked clean… but almost too clean.  There were pages optimized for trailers, when in fact, there were no videos embedded on the page.  At least not anymore.  The more pages I checked, the stranger that situation became…   Many trailer pages either contained blank spots where videos once resided, or the pages just contained no videos at all.  Very strange.

So it begs the question, did CinemaBlend quickly deal with their Panda hit from January?  Did they analyze the landing pages that saw a drop and remove dead video clips, videos that were flagged for copyright infringement, etc.?  I can’t say for sure, but the crime scene looked too pristine to me.

Video Trailer Page on CinemaBlend

What This Means For SEOs, Panda Victims, Movie Sites, and Google
Based on what happened during February, there are some important points I wanted to list.  If you are dealing with a Panda situation, if you are susceptible to Panda, if you are an SEO helping others with Panda, or if your business simply relies on Google Organic traffic, then the bullets below should be of extreme importance to you.

  • Google Does Make Mistakes
    And when those mistakes are tied to major algorithm updates, collateral damage could occur (in grand ways).  Based on what happened with the movie blogs, I think all of the website owners owe Peter Sciretta a drink (or a bonus).  Without Peter speaking up, it’s hard to say how long that ugly situation would have gone on.  Instead, it was only ten days.
  • Know What You Post (and the source of that information)
    As more and more algorithm updates are being crafted in Google labs, and subsequently injected into the real-time algorithm, it’s more important than ever to know your site inside and out.  Know what you are posting, where it’s from, if it’s original, if you are breaking any copyright laws, if it’s scraped, etc.  If you don’t, you are leaving yourself susceptible to future Panda and Pirate attacks.  Talk about a shot across your bow.  :)  Be vigilant.
  • Unconfirmed Updates Create Madness in Webmasters
    I called this when it was first announced, but Panda updates without confirmation can be disastrous for webmasters.  It’s hard enough for SEOs neck deep in Panda work to decipher what’s going on, but it’s exponentially harder for people outside of SEO to know what happened.  It’s one of the reasons I’ve been writing more and more about Panda updates.  I want to make sure we document major algo updates that seem to be Panda (roll out once per month, target low-quality content, etc.).  Without some form of identification, we’ll be living in a quasi post-apocalyptic web.  Cue another trailer, this time with Matt Cutts standing in for Mad Max.  :)
  • Track and Document Everything You Can (And Speak Up)
    It’s more important than ever to analyze your website, your analytics reporting, Google Webmaster Tools data, etc.  Use annotations in Google Analytics to mark dips and surges in traffic, add information about confirmed and unconfirmed algorithm updates, export your data regularly, and monitor the competition.  If you end up in a situation like the movie blogs, you’ll have a lot of data to analyze, to hand SEOs that are helping you, and even to provide Google if it comes to that.

 

A Quick Note About UFeb14 and UFeb14Rev
I know a number of people have reached out to me since UFeb14 rolled out on 2/11/14 asking for more details.  I focused this post on the movie blog situation, based on how unique and fascinating it was.  But, I do plan to write more about the latest Panda update (so stay tuned).  As I said earlier, it’s important to document as many algorithm updates as we can so webmasters impacted by those updates can have some idea what hit them, what the root causes of their problems are, etc.

Summary – Flawed Algorithms, Movie Blogs, and Collateral Damage
Based on my experience with Panda, the past ten days have been fascinating to analyze.  Needless to say, you don’t often see algorithm updates roll out, only to get refined days later before a second rollout.  But that’s exactly what we saw here with UFeb14 and UFeb14Rev.  On the one hand, it was great to see Google move quickly to rectify a flaw in the UFeb14 algorithm update.  But on the other hand, it makes you wonder how many other flaws are out there, and how many sites have been wrongly impacted by those flaws.

Peter Sciretta and his fellow movie bloggers dodged a serious bullet.  Actually, it was a magic bullet.  One that first passed right through their hearts, pulled a 180, headed back to Google, was taken apart and refined, and then shot back out across the web.  But how many other flawed bullets have been shot?  Wait, it sounds like a great storyline for a new movie.  Maybe Peter can connect me with some movie producers.  :)

GG

Monday, February 17th, 2014

Panda UJan14 – Uncovering the Panda Update from January 11, 2014 (with a note about Expedia)

Panda Update on January 11, 2014

As of July 2013, Google will not confirm Panda updates anymore.  And as I explained in my post about unconfirmed Panda updates, this can lead to serious confusion for webmasters.  For example, if Panda updates are not documented, then it becomes that much harder to understand why a serious drop in organic search traffic occurred.  Was it Panda, a smaller algo change, was the drop due to links, or other factors?  Even when Panda updates were confirmed, it was still a confusing topic for business owners.  And now it’s even more confusing since those updates are cloaked.

According to John Mueller of Google (via a webmaster hangout video), the Panda algorithm is now trusted enough that Google feels comfortable rolling it out once per month.  That link should jump you to 22:58 in a video where John speaks about Panda.  It’s not real-time like some people think, it’s simply trusted more than it once was (and Google can bypass some of the testing it used to implement prior to rolling out Panda).  The new Panda can take ten days to fully roll out, and again, Google will not provide confirmation of the updates.  So yes, Panda updates have been occurring since the last confirmed update, but it’s just harder to pinpoint those exact dates.

Human Panda Barometers
In my post about unconfirmed Panda updates, I explained that SEOs well-versed in Panda can typically shed some light on new updates.  That’s because they have access to a lot of data.  And not just any data, but Panda data.  The more companies an SEO is helping with Panda, the more that SEO has visibility into when Panda actually rolls out.  In addition, SEOs heavily working with Panda might have more companies reach out to them that were impacted by subsequent Panda updates.  That’s even more Panda data to analyze.

That’s why I believe SEOs heavily involved with algorithm updates can act like human Panda barometers, and can help determine when new updates roll out.  Based on my work with Panda, I’ve had the opportunity to see when some cloaked Panda updates rolled out (like the August and September Panda updates that I documented in my post from November).  The reason I can identify some of the newer Panda updates is because some of the companies I’m helping see recovery, while other companies that were just hit by Panda reach out to me for help.  The combination of the two enables me to pick up when some Panda updates roll out.


Welcoming 2014 with a Panda Update – January 11th Specifically
So, 2014 kicked off and I was wondering when the first major algorithm update would happen.  And it didn’t take long… as January 11th was a tough day for many webmasters.  Right around the 11th, I noticed an uptick in webmaster chatter about an update occurring, which quickly led me to Google Analytics to trend Google organic search traffic across several websites dealing with Panda problems.   Lo and behold, there was significant movement.

Check out the SEO visibility for a company that got hit by the January 2014 update:

Website Impacted by Panda UJan14

In addition to companies I am currently helping, my inbox also confirmed something was going on.  I had several new companies reaching out to me after the 11th explaining that they saw a major hit starting on that date.  Upon checking their reporting, you could clearly see a significant drop beginning on January 11, 2014.  And digging deeper revealed that a number of those companies had battled with Panda in the past.  A few had also exchanged blows with Phantom on May 8, 2013.

This led me to believe that we were witnessing our first Panda update of 2014.  And since I’m a big believer in naming updates to document them specifically, I’m going to name this one too.  I’m calling it Panda UJan14, for “Unconfirmed January 2014”.  I think this naming convention works extremely well, since the new Panda is supposed to roll out monthly.  Providing the month and year in the update name will help clarify when those updates rolled out.

And based on the data I have analyzed since July, here are the Panda updates I believe have rolled out since the last confirmed update in July 2013.  Notice how they are approximately one month apart:

  • Panda UAug2013 – on August 26th
  • Panda USep2013 – on September 16th
  • Panda UNov2013 – on November 18th (Note: I don’t have a lot of data backing this update, but several sites I analyzed saw significant movement on the 18th.)
  • Panda UDec2013 – on December 17th
  • The latest – Panda UJan2014 – on January 11, 2014

The Impact of Panda UJan14
Let’s start with the negative impact of the latest Panda update.  The companies that reached out to me after getting hit by Panda UJan14 saw a big drop in Google Organic search traffic.  Those drops ranged from 20-35% and began on January 11, 2014.  Here’s the SEO visibility of another site hit by the January Panda update:

Negative Impact from Panda UJan2014

As mentioned earlier, a number of those companies had previous battles with Panda.  Clearly, they had content quality issues from a Panda standpoint.  When I spoke with the business owners about the drop, they all explained that they had implemented changes over the years while dealing with previous Panda updates.  But as I explained in a post about the grey area of Panda, if you don’t significantly tackle the content quality situation, you could very well get hit again (or not recover in the first place).  It’s extremely important to make significant changes in order to exit the grey area.  If you don’t, you could sit in the grey area of Panda forever, never knowing how close you are to recovery.  Several of the companies that were hit in January did not do enough to clean up their Panda issues, and were subsequently hit with another Panda update.

Now the positive impact from UJan2014.  On the flip side of the Panda hits were some positive stories.  A few companies I have been helping saw increases ranging from 15-25% based on the January 11, 2014 update.  These were companies that experienced previous Panda and/or Phantom hits, have been working on fixing their content problems, and saw an increase in Google organic traffic during the UJan14 update.

Notice the uptick in impressions and clicks starting on January 11th:

Positive Impact from Panda UJan2014

It’s important to note that several of the companies did not recover fully to pre-Panda or pre-Phantom levels, but they definitely saw a nice increase.  Remember, there’s a reason the sites got hit by Panda in the first place.  The content that was once ranking well and driving traffic shouldn’t have been ranking that well in the first place…  which led to a lot of traffic with serious engagement issues.  And serious engagement issues (like extremely low dwell time) can cause a Panda attack.  More context about that situation in my Search Engine Watch column about the sinister surge before Panda strikes.

Others Benefiting From the Drop
In addition to websites recovering from Panda, I noticed a number of companies simply benefiting from the hits others were taking.  For example, if certain companies drop out of the rankings, then others take their place.  Those companies were simply benefiting from the drop in rankings of January Panda victims.

For example, here’s the trending for a site that directly competes with a Panda victim I analyzed.  Notice the jump starting around January 11, 2014.

A website benefiting from Panda UJan2014

A Note About Expedia – Was it Panda or a Manual Action?
It sure looks like Panda to me.  Nobody knows for sure other than Google and Expedia, but the drop occurred exactly when I saw the January Panda update.  Check out the trending below based on Searchmetrics data.

Was Expedia Hit by Panda UJan2014?

That’s just something to think about since many people believe that Expedia was hit by an unnatural links penalty.  I tend to think it was Panda instead.  That said, I would have to heavily analyze the keywords that were impacted, the content that was once ranking, etc. to better determine if that was the case.

Summary – The Importance of Monitoring Cloaked Panda Updates
As I explained above, it’s getting extremely difficult to identify Panda updates.  They are supposed to roll out monthly, take ten days to fully roll out, but Google won’t confirm when the updates occur.  For the average business owner, this is a recipe for serious confusion when organic search trending takes a dive.

My goal with posts like this one is to provide as much data as I can with regard to major algorithm updates so webmasters can take the appropriate actions to rectify the problems at hand.  Without understanding the specific algorithm update that hits a website, companies could struggle with deciphering the root cause of the problem.  And that could easily lead to spinning wheels, or in a worst case scenario, implementing changes that actually make the situation worse SEO-wise.  Moving forward, I’ll try and document subsequent Panda updates the best I can.

But hold on… has the next Panda update already rolled out??  There’s a lot of chatter about a February update and I am seeing movement across sites hit by Panda (starting around February 11th).  It very well could be Panda UFeb14.  The timing makes a lot of sense as well, since the last update was exactly one month ago.  I’ll know more in a few days once more data comes in.  Stay tuned.  :)

GG

 

 

Monday, January 27th, 2014

In-SERP Hover Cards – How Google Could Surface Your Answers, Products, Downloads, Reviews, Events, and More Directly in the Search Results

New Google Hover Card in SERPs for Beats Music
Last Wednesday, Google rolled out new functionality in the search results, which sure got the attention of SEOs across the industry.  Now when searching for information, you will sometimes see an additional link directly in the search results for specific organizations and/or websites.  Users can click on that link to view additional information about that organization right in the search results (via data from Google’s Knowledge Graph).

Google states that this can occur for websites that are “widely recognized as notable online, when there is enough information to show or when the content may be handy for you.”  When clicking the link next to the URL in the search snippet, a small window opens providing the information.  It’s basically a hover card that provides additional information.  This is an important move by Google, since users don’t need to leave the search results to find more information.

Here’s an example of the info card for Netflix:
New Google Hover Card in SERPs for Netflix

The information displayed in the hover card is based on Google’s Knowledge Graph, or data that Google has collected about “real world things”.  Knowledge Graph data comes from a variety of trusted sources, including Freebase (which Google acquired), Wikipedia, CIA World Factbook, etc. As of July of 2012, Google had collected information about 570 million entities, including 18 billion facts and connections.

To quickly summarize the new addition to the search engine results pages (SERPs), if you are searching for answers, and Google has information in its Knowledge Graph about the sites ranking in the search results, you just might see that new link appear directly within the search listing.  And if you click that link, you’ll see Knowledge Graph data in a small window directly in the search results.

Hover Creep: Your Content, Answers, Products, and Downloads Directly in the Search Results?
As I was testing these new “Info Cards”, I started to think deeper about what was occurring, and how this might be the next phase of a monumental shift for Google.  Over the past few years, SEOs have seen Google provide more and more information directly in the search results.  For example, check out all of the types of answers Google will provide right in the results (courtesy of Pete Meyers).  Based on this shift to the all-knowing SERP, many SEOs believe that at some point, Google won’t need to drive users to third party websites anymore.  Instead, maybe it could provide all the information directly in the results.

Don’t believe me?  How about this search for “calories in bananas”:
Nutrition Information in the Search Results

 

Expanding Hover Cards – Coming Soon to a SERP Near You
Based on how much information Google is already providing in the search results (driven by Knowledge Graph data), combined with the new hover card functionality in the search results, is it really far-fetched to think Google could expand this approach?  Sure, it won’t happen overnight, but as Google collects and trusts more information from third parties, it could absolutely start providing that data right in the search results.

And that little popup window (hover card) is the first sign that Google isn’t afraid to add more information directly in the SERPs for specific listings.  Let’s face it, providing author details (based on authorship markup) is one thing.  But using a hover card to provide more content per search listing is another.

And maybe this is just a test to see how users react before rolling out more and more content directly in the search results.  And maybe it’s not limited to content… maybe other types of functionality are coming, like ecommerce functionality, downloads, sign-ups, etc.  Now that would be interesting, unless of course, you’re the owner of that content, download, etc. who gets cut out of the process.  Yes, beware the hover card.

So, let’s have some fun and explore what this could look like and how it could work.  It just might be closer than you think.


Trusted Sources, and a Note About Publishership
Some of you reading this post might be wondering how Google could verify typical websites, especially since it’s using trusted data for the recent release of “info cards”.   For example, Google trusts the data in its Knowledge Graph, so it’s comfortable providing the popup window with more information about certain entities.  But will it do this for the average site on the web?  If Google is going to provide more information directly in the search results, then it’s going to have to trust those third party websites, and their content, to do so.

Although many website owners have been focused on authorship markup, where author details can show up in the search results, there is publishership as well.  By claiming publishership (rel=publisher), a website can be connected to an entity in Google Plus (similar to the way an author is tied to a G+ profile).  That connection could possibly be the basis for providing more content in the search results.  And yes, this could drive even more people to Google+ over the next few years.
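
If you’re curious whether a site has already claimed publishership, the markup is simply a link element pointing at the site’s Google+ page.  Here’s a minimal Python sketch that checks a page for it.  To be clear, this is just an illustration: it assumes the requests and BeautifulSoup libraries are installed, and the URL is a placeholder.

import requests
from bs4 import BeautifulSoup

def find_publisher_link(url):
    """Fetch a page and return the href of a rel=publisher link element, if present."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    for link in soup.find_all("link", href=True):
        # BeautifulSoup parses the rel attribute as a list of values.
        if "publisher" in (link.get("rel") or []):
            return link["href"]
    return None

# Example usage (example.com is just a placeholder):
print(find_publisher_link("http://www.example.com/") or "No rel=publisher link found")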

By the way, just last year Google tested out showing publisher images in the search results (similar to author details).  I saw the test live, and others did too.  I almost fell out of my seat when I saw client logos in the search results.  That test was removed quickly once word started getting out, but here’s a screenshot of what that looked like.  Check out the publisher image in the search results below:

Publisher Markup in the Search Results

So, if Google understands more about a website via publishership, maybe it can use data from the website to provide more information directly in the SERPs.  Hey, it’s entirely possible.

Now, if this were the case, at least website owners could remove publishership from their sites (if they didn’t like Google providing more data directly in the search results).  But that could be a double-edged sword for content owners.  Sure, you could stop Google from providing your answers in the search results, but maybe Google won’t rank your listings as highly anymore (since it’s getting more engagement from listings that provide the in-SERP functionality).  Who knows, I’m just thinking out loud here…

Now let’s take a look at what could potentially appear in the SERPs if this comes to fruition.

Hover Cards and Google – The Various Types of Content and Functionality That Could Appear Directly in the Search Results
Based on what I explained above, how could Google implement additional content or functionality directly in the search results?  And what would it look like?  I started brainstorming a few different ways this could happen and have provided some possibilities below.  Note, these are just some logical options based on what I’ve seen happening with Google and its search results over the past few years.  There are definitely more possibilities than what I’m listing below, but this is a good start.

And yes, in-SERP content and functionality could have a huge impact on websites and businesses.  I’ll cover more about that later in the post.

1. Direct Answers (From Your Site)
There are a lot of companies receiving traffic from users based on queries for direct answers to questions.  Again, Google is already providing many answer boxes for various topics (as covered earlier).  But that’s not per listing in the search engine results pages…  it’s usually via an answer box at the top of the search results.  That’s much different than a hover card per search listing (or for certain listings in the SERPs).

Let’s use my website as an example.  How about a search for “how many dmca requests google impact”?  That’s a search related to the Pirate Update, which I covered extensively in a post in December.  If Google provides the answer in the SERP via an “Answer Card”, it could look like this:

Google Answer Card in the Search Results

If this type of answer card rolls out, and the hover card provides enough of the answer, users will never hit your site.  So, if you are hoping that users visit your site to find the answer, and then take some other action on your website, good luck.  You better start thinking of another way to get that to happen.

2. How-Tos or Tutorial Segments
If someone searches for how to perform a certain task, and that task is limited in steps, then maybe that information could show up in the search results via a “Tutorial Card”.  Or maybe someone is searching for a specific step in a tutorial.  Google could provide just that step in a hover card directly in the SERPs.

Google Tutorial Card in the Search Results

3. Product or Service Information
If someone is interested in a certain product category or service, then maybe that information is pulled directly from sites in that niche.  For example, if someone searches for “IT consulting” or “computer science” or “4K television”, Google could provide that information directly in the SERPs via a “Product or Service Card”.  For example:

Google Category Card in the Search Results

4. Ecommerce – Fighting Amazon via the “Ecommerce Card”
Information is great, but let’s talk about ecommerce.  Google and Amazon battle heavily in the ecommerce space.  Sure, Google doesn’t sell anything directly, but they make a boatload of money via paid search.  And product listing ads (PLAs) are at the heart of that growth right now.  On the flipside, many people go directly to Amazon to search for products.  That’s the result of a huge inventory, a boatload of review data, and Prime membership (with free, two-day shipping).

But, what if Google decided to provide one-click ecommerce functionality directly in the SERPs?  This could be handled by connecting your Google profile to Google Wallet and buying products directly in the SERPs via the “Ecommerce Card”.  This would be amazing for people that already know which product they want to buy.  It could look like this:

Google Ecommerce Card in the Search Results

And yes, this would be like AdWords on steroids since Google could generate revenue via the organic listings by earning a percentage of the sale.  Holy cow.  :)  More about the ecommerce impact later in this post.

 

5. Reviews
Going even further with our ecommerce example, if someone searched for reviews of a product or service, Google could surface that information and provide it directly in the “Review Card”.   For some people, the review snippet below would be enough.  And that could drastically impact the downstream traffic to pcmag.com.

Google Review Card in the Search Results

6. Downloads
Along the same lines, what if you were looking to download content via pdfs (or other formats)?  Imagine Google provided this download functionality via a “Download Card” directly in the search results.  Google could scan each file for malware and tee it up for users to download.  And if you want to charge for that file, then you can combine the “Ecommerce Card” with the “Download Card”.  That would be a smart combination for sure.

Google Download Card in the Search Results

7. Sign-ups/Registration
Looking to sign up for a webinar, join an email list, or confirm you’ll be attending an event?  Registration functionality could also be provided directly in the search results.  Actually, Google has already been testing functionality for joining email lists in AdWords (via ads in the search results).  This could easily be included in a “Registration Card” directly in the organic search results.

Google Registration Card in the Search Results

I can keep going here… but I think you get the picture.  And these hover cards don’t have to be limited to Knowledge Graph data.  If Google can verify certain entities, then it can feel comfortable providing more information to users directly in the search results.  That data could be answers, information, coupon codes, medical information, pricing, reviews, downloads, list signups, ecommerce functionality, and more.

 

What Happens if this Rolls Out?
Website owners will riot in the streets.  :)  Ok, maybe not literally, but this could cause serious problems for many business owners.

Publishers with an Ad-Driven Model
Let’s start with websites earning advertising revenue based on traffic.  Well, if a site is charging a CPM (or cost per thousand impressions), and 40% of its traffic goes away, its revenue will take a huge hit.  And as its traffic numbers plummet, so will its ability to sell advertising on the site.  Publishers will once again need to figure out other ways to monetize, which is no easy feat.

Ecommerce Retailers
Next on the list are ecommerce retailers.  The once pure, ROI-driven organic results will now be asking for a commission.  If Google does roll out the ability to buy directly from the search results via one-click “ecommerce cards”, then it will surely want a cut of the sale.  Remember, advertising comprises a huge percentage of Google’s revenue and product listing ads are doing extremely well for them (via AdWords).  But having the ability to sell via the much larger set of organic listings could be huge for Google.

Blogs and Resources
For those writing great content on blogs and resource websites, the possibility of having that content surfaced in “answer cards” could be a big problem (although not as big of a problem as it is for large publishers and ecommerce retailers).  The real downside here would be users getting answers based on your hard work, without needing to visit your site.

And if they don’t visit your site, they can’t find out more about you, subscribe to your feed, find your social accounts, or contact you.  I’m sure some users will decide to visit the site, but a certain percentage surely won’t.  This could lead to a drop in awareness, which could impact multiple channels for content owners, i.e. fewer subscribers, Twitter followers, Facebook fans, etc.  And of course, this could impact leads and new business for the organizations publishing content.

Hover Card Extensions – A Note About Ads
It’s hard to write about Google without bringing up advertising.  Again, advertising drives ~96% of Google’s revenues, so these new hover cards would probably have some type of advertising component.  I already mentioned the revenue that ecommerce cards could drive (via a percentage of the sale), but Google could absolutely add sponsored information to hover cards.

For example, imagine having the ability to promote certain pages on your site (to increase click-through), let users subscribe to a feed, follow you on Google+, etc., right from the various hover cards.  This type of ad extension could easily be included in the AdWords platform.  And if that happens, Google could expand AdWords-like functionality to the organic listings.  As long as it’s clearly labeled, and it’s actually helpful to users, it’s a huge win-win: users get what they are looking for, and Google adds a massive new source of revenue.

Hover Card Ad Extensions in Google

 

Summary – Hover Cards and the All-Powerful SERP
The addition of “info cards” in the search results caught serious attention last week across the industry.  But is this just the beginning?  Is it merely a test to see how users react to providing more information directly in the search results per listing?  And if it works well, it’s hard to say how much information and functionality Google could provide in the SERPs.

Time will tell how much of what I listed above becomes a reality.  Until then, I recommend continuing to diversify your digital efforts.  If not, you run the risk of transforming from a website with a lot of traffic into a hover card sitting in the search results.  And there’s not much room to play with there.

GG

 

 

Tuesday, January 7th, 2014

Rap Genius Recovery: Analyzing The Keyword Gains and Losses After The Google Penalty Was Lifted

Rap Genius Recovers From Google Penalty

On Christmas Day, Rap Genius was given a heck of a gift from Google.  A penalty that sent their rankings plummeting faster than an anvil off the Eiffel tower.  The loss in traffic has been documented heavily as many keywords dropped from page one to page five and beyond.  And many of those keywords used to rank in positions #1 through #3 (or prime real estate SEO-wise).  Once the penalty was in place, what followed was a huge decrease in visits from Google organic, since most people don’t even venture to page two and beyond.  It’s like Siberia for SEO.

Gaming Links
So what happened that Google had to tear itself away from eggnog and a warm fire to penalize a lyrics website on Christmas Day?  Rap Genius was gaming links, and badly.  No, not just badly, but with such disregard for the consequences that they were almost daring Google to take action.  That is, until Matt Cutts learned of the matter and took swift action on Rap Genius.

That was Christmas Day. Ho, ho, ho.  You get coal in your lyrical stocking.   I won’t go nuts here explaining the ins and outs of what they were doing.  That’s been documented heavily across the web.  In a nutshell, they were exchanging tweets for links.  If bloggers added a list of rich anchor text links to their posts, then Rap Genius would tweet links to their content.  The bloggers got a boatload of traffic and Rap Genius got links (many of them using rich anchor text like {artist} + {song} + lyrics).  Here’s a quick screenshot of one page breaking the rules:

Rap Genius Unnatural Links

A 10 Day Penalty – LOL
Now, I help a lot of companies with algorithmic hits and manual actions.  Many of the companies contacting me for help broke the rules and are seeking help in identifying and then rectifying their SEO problems.  Depending on the situation, recovery can take months of hard work (or longer).  From an unnatural links standpoint, you need to analyze the site’s link profile, flag unnatural links, remove as many as you can manually, and then disavow the rest.  If you only have 500 links leading to your site, this can happen relatively quickly.  If you have 5 million, it can be a much larger and nastier project.

Rap Genius has 1.5 million links showing in Majestic’s Fresh Index.  And as you start to drill into the anchor text leading to the site, there are many questionable links.  You can reference their own post about the recovery to see examples of what I’m referring to.  Needless to say, they had a lot of work to do in order to recover.

So, you would think that it would take some time to track down, remove, and then disavow the unnatural links that caused them so much grief.  And then they would need to craft a serious reconsideration request documenting how they broke the rules, how they fixed the problem, and of course, offer a sincere apology for what they did (with a guarantee they will never do it again).   Then Google would need to go through the recon request, check all of the removals and hard work, and then decide whether the manual action should be lifted, or if Rap Genius had more work to do.  This should take at least a few weeks, right?  Wrong.  How about 10 days.

Rap Genius Recovers After 10 Days

Only 10 days after receiving a manual action, Rap Genius is back in Google.  As you can guess, the SEO community was not exactly thrilled with the news.  Screams of special treatment rang through the twitterverse, as Rap Genius explained that Google helped them, to some degree, understand how best to tackle the situation and what to target.  Believe me, that’s rare.  Really rare…

Process for Removing and Disavowing Links
Rap Genius wrote a post about the recovery on January 4th, which included the detailed process for identifying and then dealing with unnatural links.  They had thousands of links to deal with, beginning with a master list of 178K.  From that master list, they started to drill into specific domains to identify unnatural links.   Once they did, Rap Genius removed what they could and disavowed the rest using Google’s Disavow Tool.   Following their work, Google removed the manual action on January 4th and Rap Genius was back in Google.

But many SEOs wondered how much they came back, especially since Rap Genius had to nuke thousands of links.  And many of those links were to deeper pages with rich anchor text.  Well, I’ve been tracking the situation from the start, checking which keywords dropped during the penalty, and now tracking which ones returned to high rankings after the penalty was lifted.  I’ll quickly explain the process I used for tracking rankings and then provide my findings.

My Process for Analyzing Rankings (With Some Nuances)
When the penalty was first applied to Rap Genius, I quickly checked SEMRush to view the organic search trending and to identify keywords that were “lost” and ones that “declined”.  Rap Genius ranks for hundreds of thousands of keywords according to SEMRush and its organic search reporting identified a 70K+ keyword loss based on the penalty.

Note, you can’t compare third party tools to a website’s own analytics reporting, and SEMRush won’t cover every keyword leading to the site.  But, for larger sites with a lot of volume, SEMRush is a fantastic tool for viewing the gains and losses for a specific domain.  I’ve found it to be extremely thorough and accurate.

The lost and declined keywords that SEMRush was reporting lined up with my manual checks.  Those keywords definitely took a plunge, with Rap Genius appearing on page five or beyond.  And as I mentioned earlier, that’s basically Siberia for organic search.

When the penalty was lifted, I used the same process for checking keywords, but this time I checked the “new” and “improved” categories.  The reporting showed 43K+ keywords in the “new” category, which means those keywords did not rank the last time SEMRush checked each query.

I also used Advanced Web Ranking to check 500 of the top keywords that were ranking prior to the penalty (and that dropped after the manual action was applied).  The keywords I checked were all ranking in the top ten prior to the penalty.  Once the penalty was lifted, I ran the rankings for those keywords.  I wanted to see how much of an improvement there was for the top 500 keywords.

Then I dug into the data based on both SEMRush and Advanced Web Ranking to see what I could find.  I have provided my findings below.   And yes, this is a fluid situation, so rankings could change.  But we have at least a few days of data now.  Without further ado, here’s what I found.
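
For those who want to run a similar comparison on their own rankings data, here’s a rough Python sketch of the kind of analysis I’m describing.  It assumes you’ve exported pre-penalty and post-recovery rankings to two CSV files; the file names and column names are placeholders, not the actual exports from SEMRush or Advanced Web Ranking.

import pandas as pd

# Hypothetical exports: one row per keyword with its ranking position.
before = pd.read_csv("rankings_before_penalty.csv")   # columns: keyword, position
after = pd.read_csv("rankings_after_recovery.csv")    # columns: keyword, position

merged = before.merge(after, on="keyword", suffixes=("_before", "_after"))

# How many of the previously top ten keywords returned to the top ten?
top_ten_before = merged[merged["position_before"] <= 10]
recovered = top_ten_before[top_ten_before["position_after"] <= 10]
print(f"{len(recovered)} of {len(top_ten_before)} former top ten keywords are back in the top ten")

# And which ones are still buried on page two or beyond?
still_buried = top_ten_before[top_ten_before["position_after"] > 10]
print(still_buried.sort_values("position_after", ascending=False).head(20))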

 

Branded Keywords
This was easy. Branded keywords that were obliterated during the penalty returned quickly with strong rankings.  This was completely expected.  For example, if you search for rap genius, rapgenius, or any variation, the site now ranks at the top of the search results.  And the domain name ranks with sitelinks. No surprises here.

Rap Genius Branded Keywords

Category Keywords
For category keywords, like “rap lyrics”, “favorite song lyrics”, and “popular song lyrics”, I saw mixed results after the recovery.  For example, the site now ranks #1 for “rap lyrics”, which makes sense, but does not rank well for “favorite song lyrics” and “popular song lyrics”.  And it ranked well for each of those prior to the penalty.  Although specific song lyric queries are a driving force for rap genius (covered soon), category keywords can drive a lot of volume.  It’s clear that the site didn’t recover for a number of key category keywords.

Rap Genius Category Keywords

 

Artist Keywords
I noticed that the site ranked for a lot of artists prior to the penalty (just the artist name with no modifiers).  For example, “kirko bangz”, “lil b”, etc.  Similar to what I saw with category keywords, I saw mixed results with artists.  Searching for the two artists I listed above does not yield high rankings anymore, when they both ranked on page one prior to the penalty.  Some increased in rankings, but not to page one.  For example, “2 chainz” ranks #12 after the penalty was lifted.  But it was MIA when the penalty was in effect.  Another example is “Kendrick Lamar”, which Rap Genius ranked #8 for prior to the penalty.  The site is not ranking well at all for that query now.  So again, it seems that Rap Genius recovered for some artist queries, but not all.

Rap Genius Artist Keywords

Lyrics Keywords
Based on my research, I could clearly see the power of {song} + lyrics queries for Rap Genius.  It’s a driving force for the site.  And Rap Genius is now ranking again for many of those queries.  When the penalty was first lifted, I started checking a number of those queries and saw Rap Genius back on page one, and sometimes #1.  But when I started checking at scale, it was clear that not all keywords returned to high rankings.

Rap Genius High Rankings for Lyrics Keywords

For example, “hallelujah lyrics”, “little things lyrics”, and “roller coaster lyrics” are still off of page one.  Then there are keywords that skyrocketed back up the charts, I mean search rankings.  For example, “swimming pool lyrics”, “marvins room lyrics”, and “not afraid lyrics” all returned once the penalty was lifted, after being buried during it.  So, it seems that many song lyrics keywords returned, but some still rank on page two and beyond.

Rap Genius Low Rankings for Lyrics Keywords

What About Keywords That Were Gamed?
I’m sure some of you are wondering how Rap Genius fared for keywords that were gamed via unnatural links.  For example, “22 two’s lyrics” yields extremely strong rankings for Rap Genius, even though it was one of the songs gamed via the link scheme.  Actually, Rap Genius ranks twice in the top 5.  Go figure.

Rap Genius Rankings for Gamed Links - Jay Z

Ditto for “timbaland know bout me”, which was also one of the songs that made its way into the spammy list of links at the end of articles and posts.  Rap Genius ranks #3 right now.

Rap Genius Rankings for Gamed Links - Timbaland

And then there’s Justin Bieber, which I can’t cover with just one sentence.  Rap Genius currently ranks on page 3 for “Justin Bieber song lyrics”, when it used to rank #8!  And then “Justin Bieber baby lyrics” now ranks #12 on page 2, when it used to rank #8.  But for “Justin Bieber lyrics”, Rap Genius is #10, on page one.

Rap Genius Rankings for Justin Bieber Lyrics

Overall, I saw close to 100 Justin Bieber keywords pop back into the top few pages of Google after the penalty was lifted.  But, many were not on page one anymore… I saw many of those keywords yield rankings on page two or beyond for Rap Genius.  See the screenshot below:

Rap Genius Keywords for Justin Bieber

 

Summary – Rap Genius Recovers, But The Scars Remain
So there you have it.  A rundown of where Rap Genius is after the penalty was lifted.  Again, I can’t see every keyword that was lost or gained during the Christmas Day fiasco, but I could see enough of the data.  It seems that Rap Genius came back strong, but not at full blast.  I saw many keywords return, but a number still remain buried in Google.

But let’s face it, a 10 day penalty is a slap on the wrist for Rap Genius.  They now have a clean(er) platform back, and can build on that platform.  That’s a lot better than struggling for months (or longer) with horrible rankings.  As I explained earlier, most business owners aren’t as lucky as Rap Genius.  10 days and help from Google can really speed up the recovery process.  That’s for sure.

I’ll end with one more screenshot to reinforce the fact that Rap Genius is back.  And it’s a fitting query. :)

Rap Genius I'm Sorry

GG

 

 

Wednesday, December 18th, 2013

Panda Report – How To Find Low Quality Content By Comparing Top Landing Pages From Google Organic

Top Landing Pages Report in Google Analytics

Note, this tutorial works in conjunction with my Search Engine Watch column, which explains how to analyze the top landing pages from Google Organic prior to, and then after, Panda arrives.  With the amount of confusion circling Panda, I wanted to cover a report webmasters can run today that can help guide them down the right path while on their hunt for low-quality content.

My Search Engine Watch column covers an overview of the situation, why you would want to run the top landing pages report (with comparison), and how to analyze the data.  And my tutorial below covers how to actually create the report.  The posts together comprise a two-headed monster that can help those hit by Panda get on the right track.   In addition, my Search Engine Watch column covers a bonus report from Google Webmaster Tools that can help business owners gather more information about content that was impacted by the mighty Panda.

Why This Report is Important for Panda Victims
The report I’m going to help you create today is important, since it contains the pages that Google was ranking well and driving traffic to prior to a Panda attack.  And that’s where Google was receiving a lot of intelligence about content quality and user engagement.  By analyzing these pages, you can often find glaring Panda-related problems.  For example, thin content, duplicate content, technical problems causing content issues, low-quality affiliate content, hacked content, etc.  It’s a great way to get on the right path, and quickly.

There are several ways to run the report in Google Analytics, and I’ll explain one of those methods below.  And remember, this should not be the only report you run… A rounded analysis can help you identify a range of problems from a content quality standpoint.  In other words, pages not receiving a lot of traffic could also be causing Panda-related problems.  But for now, let’s analyze the top landing pages from Google Organic prior to a Panda hit (which were sending Google the most data before Panda arrived).

And remember to visit my Search Engine Watch column after running this report to learn more about why this data is important, how to use it, red flags you can identify, and next steps for websites that were impacted.  Let’s get started.

How To Run a Top Landing Pages Report in Google Analytics (with date comparison): 

  • First, log into Google Analytics and click the “All Traffic” tab under “Acquisition”.  Then click “Google / Organic” to isolate that traffic source.
    Accessing Google Organic Traffic in Google Analytics
  • Next, set your timeframe to the date after Panda arrived and extend that for a decent amount of time (at least a few weeks if you have the data).  If time allows, I like to set the report to 4-6 weeks after Panda hit.  If this is right after an algorithm update, then use whatever data you have (but make sure it’s at least one week).  I’m using a date range after the Phantom update hit (which was May 8th).
    Setting a Timeframe in Google Analytics
  • Your next move is to change the primary dimension to “Landing Page” to view all landing pages from Google organic search traffic.  Click the “Other” link next to “Primary Dimension” and select “Acquisition”, and then “Landing Page”.  Now you will see all landing pages from Google organic during that time period.
    Primary Dimension to Landing Page in Google Analytics
  • Now let’s use some built-in magic from Google Analytics.  In the timeframe calendar, you can click a checkbox for “Compare to” and leave “Previous period” selected.  Once you click “Apply”, you are going to see all of the metrics for each landing page, but with a comparison of the two timeframes.  And you’ll even have a nice trending graph up top to visualize the Panda horror.
    Comparing Timeframes in Google Analytics
  • As you start going down the list of urls, pay particular attention to the “% Change” column.  Warning, profanity may ensue.  When you start seeing pages that lost 30%, 40%, 50% or more traffic when comparing timeframes, then it would be wise to check out those urls in greater detail.  Again, if Google was sending a lot of traffic to those urls, then it had plenty of user engagement data from those visits.  You might just find that those urls are seriously problematic from a content quality standpoint.
    Viewing The Percent Change in Traffic in Google Analytics

 

Bonus 1: Export to Excel for Deeper Analysis

  • It’s ok to stay within Google Analytics to analyze the data, but you would be better off exporting this data to Excel for deeper analysis.  If you scroll to the top of the Google Analytics interface, you will see the “Export” button.  Click that button and then choose “Excel (XLSX)”.  Once the export is complete, it should open in Excel.  Navigate to the “Dataset” worksheet to see your landing page data (which is typically the second worksheet).
    Exporting A Report In Google Analytics
  • At this point, you should clean up your spreadsheet by deleting columns that aren’t critical for this report.  Also, you definitely want to space out each column so you can see the data clearly (and the data headers).
    Clean Up Google Analytics Export in Excel
  • You’ll notice that each url has two rows, one for the current timeframe, and one for the previous timeframe.  This enables you to see all of the data for each url during both timeframes (the comparison).
    Two Rows For Each URL Based on Timeframe
  • That’s nice, but wouldn’t it be great to create a new column that shows the percentage decrease or increase for visits (like we saw in Google Analytics)?  Maybe even with highlighting to show steep decreases in traffic?  Let’s do it.  Create a new column to the right of “Visits” and before “% New Visits”.  I would title this column “% Change” or something similar.
    Creating a New Column for Percent Change in Excel
  • Next, let’s create a formula that provides the percentage change based on the two rows of data for each url.  Find the “Visits” column and the first landing page url (which will have two rows).  Remember, there’s one row for each timeframe.  If your visits data is in column C, then the post-Panda data is in row 2, and the pre-Panda data is in row 3 (see screenshot below).  You can enter the following formula in the first cell of the new “% Change” column:  =(C3-C2)/C3.  Again, C3 is the traffic level from the previous timeframe, C2 is the traffic level from the current timeframe (after the Panda hit), and you are dividing by the previous traffic level to come up with the percentage change.  For example, if a url dropped from 5,450 visits to 640 visits, then your percentage drop would be 88%.  And yes, you would definitely want to investigate that url further!  (And if you’d rather script this calculation instead of using Excel, see the sketch after this list.)
    Creating a Formula to Calculate Percent Change in Excel
  • Don’t worry about the floating decimal point.  We’ll tackle that soon.  Now we need to copy that formula to the rest of the column (but by twos).  Remember, we have two records for each url, so you’ll need to highlight both cells before double clicking the bottom right corner of the second cell to copy the formula to all rows.  Once you do, Excel automatically copies the two rows to the rest of the cells in that column.  Now you should have percentage drops (or increases) for all the urls you exported.  Note, you can also highlight the two cells, copy them, and then highlight the rest of that column, and click paste.  That will copy the formula to the right cells in the column as well.
    Copying a Formula to All Rows in Excel
  • Now, you will see a long, floating decimal point in our new column labeled “% Change”.  That’s an easy fix, since we want to see the actual percentage instead.  Highlight the column, right click the column, and choose “Format Cells”.  Then choose “Percentage” and click “OK”.  That’s it.  You now have a column containing all top landing pages from Google organic, with the percentage drop after the Panda hit.
    Formatting Cells in Excel
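
By the way, if your export is large (or you simply prefer scripting), the same comparison can be handled in a few lines of Python with pandas.  This is just a sketch, and it assumes you’ve exported each timeframe to its own CSV of landing pages and visits; the file names and column names below are placeholders.

import pandas as pd

# Hypothetical exports: one CSV per timeframe with landing page and visits columns.
before = pd.read_csv("landing_pages_before_panda.csv")  # columns: landing_page, visits
after = pd.read_csv("landing_pages_after_panda.csv")    # columns: landing_page, visits

merged = before.merge(after, on="landing_page", suffixes=("_before", "_after"))

# Same math as the Excel formula: (previous - current) / previous.
merged["pct_drop"] = (merged["visits_before"] - merged["visits_after"]) / merged["visits_before"]

# Surface the landing pages that lost 50% or more of their Google organic traffic.
problem_pages = merged[merged["pct_drop"] >= 0.5].sort_values("pct_drop", ascending=False)
print(problem_pages[["landing_page", "visits_before", "visits_after", "pct_drop"]].head(25))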

 

Bonus 2: Highlight Cells With A Steep Drop in Red

  • If you want the data to “pop” a little more, then you can use conditional formatting to highlight cells that exceed a certain percentage drop in traffic.  That can easily help you and your team quickly identify problematic landing pages.
  • To do that, highlight the new column we created (titled “% Change”), and click the “Conditional Formatting” button in your Home tab in Excel (located in the “Styles” group).  Then select, “Highlight Cells Rules”, and then select, “Greater Than”.  When the dialog box comes up, enter a minimum percentage that you want highlighted.  And don’t forget to add the % symbol!  Choose the color you want to highlight your data with and click “OK”.  Voila, your problematic urls are highlighted for you.  Nice.
    Applying Conditional Formatting in Excel
    Applying Conditional Formatting by Percentage in Excel

 

Summary – Analyzing Panda Data
If you made it through this tutorial, then you should have a killer spreadsheet containing a boatload of important data.  Again, this report will contain the percentage increase or decrease for top landing pages from Google Organic (prior to, and then after, a Panda hit).  This is where Google gathered the most intelligence based on user engagement.  It’s a great place to start your analysis.

Now it’s time to head over to my Search Engine Watch column to take a deeper look at the report, what you should look for, and how to get on the right track with Panda recovery.  Between the tutorial and my Search Engine Watch column, I hope to clear up at least some of the confusion about “content quality” surrounding Panda updates.  Good luck.

GG

 

 

Monday, December 9th, 2013

Google’s Pirate Algorithm and DMCA Takedowns | Exploring the Impact Threshold

Google Pirate Algorithm

In August of 2012, Google announced an update to its search algorithm that targeted websites receiving a high number of DMCA takedown requests.  The update was unofficially called “The Pirate Update”, based on the concept of pirating someone else’s content like music, movies, articles, etc.  With the update, Google explained that “sites receiving a lot of removal notices may appear lower in our results.”   For most websites, this wasn’t a big deal.  But for others, this was more than just a proverbial “shot across the bow”.  It was a full-blown cannon shot right through the hull of a ship.

I do a lot of algorithm update work, including Panda, Penguin, and Phantom work, so it’s not unusual for website owners to contact me about major drops in traffic that look algorithmic.  And I’ve had several companies contact me since August 2012 that believed the Pirate update could be the cause of their drop.   Regarding dates, the update first rolled out in August of 2012, and the impact could be seen almost immediately.  I’ll cover more about how I know that soon.

My goal with this post is to introduce the Pirate update, explain how you can analyze DMCA takedown requests (via data Google provides), and explore the threshold of removal requests that could get a site algorithmically impacted (or what I’m calling “the impact threshold”).

So without further ado, it’s time to sail into dangerous waters.

 

DMCA Takedowns
So, what’s a DMCA takedown?  It’s essentially a notice sent to an online service provider explaining that infringing material resides on its network, and that the infringing url(s) or website should be taken down.  As you can imagine Google receives many of these takedown requests on a regular basis, and it provides a simple process for filing takedowns.  Actually, Google provides a complete transparency report where it lists a slew of data regarding removal requests, copyright owners, domains specified in DMCA notices, etc.  I’ll explain more about that data next.

Google Transparency Report

For the purposes of this post (focused on the Pirate update), DMCA takedowns are sent to Google when someone or some entity believes urls on your website contain their copyrighted material.  And of course, those urls are probably ranking for target queries.  So, companies can go through the process of filing a copyright complaint, Google will investigate the issue, and take action if warranted (which means Google will remove the url(s) from the search results).  In addition, every request is documented, so Google can start to tally up the number of DMCA notices that target your domain.  And that’s extremely important when it comes to the Pirate algorithm.

And jumping back to Google’s original post about the Pirate Update, Google says, “Sites with high numbers of removal notices may appear lower in our results.” So every time a new takedown notice comes in, you have one more strike against you.  Now, we don’t know how many strikes a site needs to receive before the Pirate algorithm kicks in, and I try and shed some light on that later in this post.

Google Transparency Report – Requests to Remove Content
I mentioned earlier that Google provides a Transparency Report, where it lists requests to remove content from its services (from governments, and due to copyright).  The section of the Transparency Report focused on copyright requests provides a wealth of data regarding takedown notices, domains being specified in those takedowns, top copyright owners, etc.  You can see on the site that over 5M urls were requested to be removed from Google just last week, and 24M in the past month!  Yes, it’s a big problem (and a huge undertaking by Google).

Copyright Removal Requests

Being a data nut, I was like a kid in a candy store when I started going through this data.  This is the “smoking gun”, so to speak, when analyzing sites that could have been hit by Pirate.  By clicking the “Domain Specified” link in the left navigation, you can scroll through a list of the domains being targeted via DMCA takedown notices.  You can see the number of copyright owners that have filed notices, the number of reporting organizations (which work on behalf of copyright owners), and the number of urls submitted (that allegedly contain copyrighted material).  You can filter this data by week, month, year, or “all available”.  And more importantly, you can download the data as a .csv file.  This is where it gets interesting.

Domains Listed in Google Transparency Report

Working with the .csv file
First, and most importantly, the file holding domains contains 14M records. So if you try and simply open the file in Excel, you won’t get very far. Each worksheet in Excel can only contain 1M rows, so you have far too much data to run a simple import.  To get around this issue, I imported the file into Access, so I could work with the data in various ways.  Note, Access is a database program that enables you to import larger sets of data, and then query that data based on various criteria.  It’s a robust desktop database program from Microsoft that comes with certain versions of Microsoft Office.  So, you might already have Access installed and not even know it.

Using Microsoft Access to Analyze DMCA Takedown Requests

My goal was to analyze the domains getting hit by the Pirate algorithm, and then also try to identify the threshold Google is using when choosing to target a domain.  For example, how many requests needed to be filed, how many urls needed to be targeted, and what’s the “url to total indexed ratio”?  More about that last metric soon.
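
And if you don’t have Access available, the same type of aggregation can be scripted.  Here’s a rough Python sketch that reads the huge .csv file in chunks (so the 14M rows never need to fit in memory at once) and tallies the urls specified per domain.  The file name and column names are placeholders, so check the actual headers in the export you download.

import pandas as pd

CSV_FILE = "domains.csv"   # placeholder file name
DOMAIN_COL = "Domain"      # placeholder column names; verify against the real export
URL_COUNT_COL = "URLs"

totals = {}

# Read the file in chunks so a multi-million-row export never has to fit in memory at once.
for chunk in pd.read_csv(CSV_FILE, chunksize=500000):
    grouped = chunk.groupby(DOMAIN_COL)[URL_COUNT_COL].sum()
    for domain, count in grouped.items():
        totals[domain] = totals.get(domain, 0) + count

# Print the top 25 domains by number of urls specified in takedown requests.
for domain, count in sorted(totals.items(), key=lambda item: item[1], reverse=True)[:25]:
    print(f"{domain}\t{count}")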

Tracking The Pirate Update via Data
Now that I had Pirate data, it was time to start analyzing it.  I began looking at the top domains in the list and cross-referencing their organic search trending via SEMRush.  I wanted to make sure I could spot the impact from the Pirate algorithm for these specific domains.  That turned out to be easier than I thought.  Check out the trending below for several of the websites that topped the list:

Website Impacted by Pirate Update - Example 1

Website Impacted by Pirate Update - Example 2

Website Impacted by Pirate Update - Example 3

Website Impacted by Pirate Update - Example 4

And I saw many more just like this…

Searching For The Impact Threshold – The Connection Between DMCA Takedowns and Algo Hits
Based on heavily reviewing the organic search trending for sites on the list, I wanted to see if there was a threshold for getting algorithmically impacted.  For example, did there have to be a certain number of complaints before Google impacted a site algorithmically?  Or was that too rudimentary?  Were there other factors involved that triggered the algo hit?  These are all good questions and I try to answer several of them below.

In addition to straight removal notices, it’s hard to overlook a specific metric Google is providing in the transparency report for DMCA takedowns.  It’s listed on the site as “percentage of pages requested to be removed based on total indexed pages”.  Now that metric makes sense! (theoretically anyway).  Understanding the total package could yield better decisions algorithmically than just the pure number of takedown requests.

For example, if the percentage is 1% or less for certain sites, they might be treated differently than a site with 5%, 10%, or even higher.  Note, I saw some sites with greater than 50%!  Based on my research, I saw a strong correlation between sites showing 5% or greater and what looked to be Pirate algorithm hits (i.e. 5% of the total urls on the site were requested to be removed via DMCA takedown requests).  And for the domains that dropped sharply after Pirate was first introduced, the percentage was often higher.  For example, I often saw percentages approaching 50%, and even a few above 50%.

Website With High Percentage Of Removal Requests Based On Total Indexed Pages

I know this sounds obvious, but if half of your indexed urls have been requested to be taken down, you’ve probably got a serious Pirate problem. :)  And it should be no surprise that you’ve been hit by the Pirate update.
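
To make the math concrete, the metric is simply the number of urls specified in removal requests divided by the number of pages Google has indexed for the domain.  Here’s a tiny sketch with made-up numbers, using the ~5% correlation I mentioned above as the flag.

def removal_ratio(urls_requested_removed, total_indexed_pages):
    """Percentage of a domain's indexed pages specified in DMCA removal requests."""
    return urls_requested_removed / total_indexed_pages

# Hypothetical example: 3,200 urls requested for removal out of 40,000 indexed pages.
ratio = removal_ratio(3200, 40000)
print(f"{ratio:.1%}")  # 8.0%
print("Possible Pirate territory" if ratio >= 0.05 else "Below the ~5% level I saw correlate with hits")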

DMCA Takedowns and Google – What To Do If You Have Been Contacted
If a DMCA takedown request has been filed with Google about infringing url(s) on your site, you should receive a message in Google Webmaster Tools explaining the situation, along with links to the infringing content.  At that point, you can file a counter notice, remove the content, or choose to ignore the problem (which I don’t recommend).  If you do remove the content, then you can fill out the “content removed notification form”.   Once you complete the process of removing urls and notifying Google, then you will need to wait to see how your site rebounds.  Note, Google provides links to the forms I mentioned above in their messages via Webmaster Tools.

Example of a DMCA notice in Google Webmaster Tools:
In case you were wondering what a DMCA takedown request from Google looks like, here’s a link to a webmaster forum thread that shows a GWT DMCA message.

Example of DMCA Notice in Google Webmaster Tools

Also, and this is more related to the algorithmic hit you can take, I recommend visiting the transparency report and analyzing the data.  You can search by domain by accessing the search field in the copyright section of the transparency report.  You can also download and import the data into Access to identify the status of your domain (as mentioned earlier).

For example, you can figure out how many requests have been filed and review the % column to see how Google understands your entire domain based on alleged copyright violations.  If you see a large number of urls, and a high(er) percentage of infringing urls based on total indexation, then it could help you determine the cause of the latest algorithm hit that impacted your site.  Or if you’re lucky, you could thwart the next attack by being aggressive with copyright cleanup.

Summary – Walking The Plank With The Pirate Update
I hope this post explained more about the Pirate update, how it can impact a website, how you can research your domain via Google’s Transparency Report, and what to do if you have received a DMCA message from Google.  My recommendation to webmasters is to (obviously) avoid breaking copyright laws, take swift action if you are contacted by Google with DMCA notices (remove content or file a counter notice), and research your domain to better understand the big picture (% of urls requested to be removed based on total indexation).

If not, you could very well be walking the plank into search oblivion.  And let’s face it, nobody wants to sleep in Davy Jones’ locker.  :)

GG

 

Thursday, November 21st, 2013

How To Track Unconfirmed Panda Updates in a Quasi Real-time World

Cloaked Panda Updates

In June I wrote an important blog post, based on the amount of Panda work I help companies with.  The post was about the maturing of Panda, and how Google planned to roll out the algorithm update once per month, while taking up to ten days to fully roll out.  Google also explained that it will not confirm future Panda updates.  After hearing both statements, I couldn’t help but think the new Panda could lead to serious confusion for many webmasters.  And I was right.

Let’s face it, Panda was already confusing enough for the average business owner.  Whenever I speak with Panda victims (which is often), I joke that Panda should have been titled “Octopus” instead.  That’s because there are many tentacles to Panda.  There are a number of reasons a site could get hit, and a deep analysis is often needed to determine what happened, why, and how to rectify the situation.  Sure, Panda focuses on “content quality”, but that could mean several things based on the nature of the website.

For example, I’ve seen affiliate content get hit, doorway pages, scraped content, heavy cross-linking of company-owned domains, duplicate content, thin content with over-optimized titles and metadata, etc.   And then you have technical problems that could cause content problems.  For example, code glitches that replicate content across large sites.  Those technical problems could impact thousands of pages (or more), and it’s one of the reasons I start every Panda engagement with a deep SEO technical audit.  What I find often helps me track down Panda problems, while having the added benefit of identifying other technical problems that can be fixed (and sometimes quickly).

Webmaster Confusion – My Prediction About The New Panda Was Unfortunately Spot-on
Believe me, I don’t want to pat myself on the back about my prediction, because I wish I was wrong.  But I have received many emails from webmasters since June that signal there is serious confusion with the new algorithm update.  And I totally get it.

An example of a Panda hit:
A Typical Panda Hit

For example, if you wake up one morning and see a big drop in Google organic traffic, but have no idea why, then you might start researching the problem.  And when Google doesn’t confirm a major update like Panda, stress and tension increase.   That leads you to various blog posts about Google traffic drops, which only cause more confusion.  Then you grab your SEO, your marketing team, and developers, and hit a war room in your office.  Diagrams are drawn on large white boards, finger pointing begins, and before you know it, you’re in the Panda Zone, or a state of extreme volatility that can drive even the toughest CEO mad.  I know this because I have seen it first-hand many times with companies hit by Panda, from large companies to small.

 

Have There Been Additional Panda Updates As Expected?
Yes, there have been.  They’re just not easy to spot unless you have access to a lot of data.  For example, it’s easier to see the pattern of Panda updates when you are helping a number of companies that were impacted by our cute, black and white friend.  If those companies have been working hard on rectifying their Panda problems, then some may recover during the new, stealthy Panda updates.  Fortunately, I help a number of companies that were impacted by Panda, so I’ve been able to catch a glimpse of the cloaked Panda.

The Last Confirmed Panda Update in July 2013

The last confirmed Panda update was July 18, 2013, even though that was after Google said it wouldn’t confirm any more Panda updates.  Go figure.  And by the way, I have data showing the update began closer to July 15.  Regardless, that was the last confirmed date that Panda rolled out.  But we know that wasn’t the last Panda update, as the algorithm graduated to quasi real-time.  I say “quasi” real-time, but some people incorrectly believe that Panda is continually running (part of the real-time algorithm).  That’s not true, and a recent webmaster hangout video explained more about the new Panda. Check 22:58 through 25:20 in the video to watch John Mueller from Google explain more about how Panda is currently handled.

In the video, John explains that Panda is not real-time.  Yes, read that again.  It is not real-time, but it has progressed far enough along that Google trusts the algorithm more.  That means the typical, heavier testing prior to rollout isn’t necessary like it once was.  Therefore, Google feels comfortable unleashing Panda on a more regular basis (once per month), and Matt Cutts explained that it could take ten days to fully roll out.

This is important to understand, since you cannot be hit by Panda at any given time during the month.  But, you can be impacted each month (positively or negatively) during the ten day rollout.  Based on what I have seen, Panda seems to roll out in the second half of each month.  July was closer to the middle of the month, while the August update was closer to the end of the month.  Here’s a quick timeline, based on Panda clients I have been helping.

Recent Undocumented Panda Sightings:
Cloaked Panda 1 – Monday, August 26

Unconfirmed Panda Update in August 2013


Cloaked Panda 2 – Monday, September 16

Unconfirmed Panda Update in September 2013

Those are two dates on which I saw recoveries across several Panda clients.  Also, if we start with the confirmed July update (which I saw starting on the 15th), you can see all three updates landed during the second half of the month.  That could be random, but it might not be.

Regarding your own website and identifying impact from the new Panda, we need to remember the details of our new, stealthy friend.  If Google is telling us the truth, then it could take ten days for sites to see the impact from Panda.  So if you took a major hit near one of those dates, then you very well could have been hit by Panda.  And again, someone reviewing your site through the lens of Panda would be able to confirm if any content factors were at play (like I mentioned earlier).  That’s why a thorough Panda audit is so important.

Also, Panda hits are typically very apparent.  They aren’t usually slight increases or decreases.  Remember, Google is continually pushing smaller updates to its real-time algorithm, so it’s natural to see slight increases or decreases over time.  But significant changes on a specific date could signal a major algorithm update like Panda or Penguin.

Tips for Tracking Cloaked Panda Updates:
Now, you might be reading this post and saying, “Thanks for the dates Glenn, but how can this help me in the future?”  Great question, and there’s no easy answer.  Remember, the new Panda is hard to spot, and the Panda gatekeepers (Google) won’t tip you off about when it’s released.  But there are some things you can do to monitor the situation, and to hopefully understand when Panda rolls out.  I have provided some tips below.

1. Know Your Dates
First and foremost, identify the exact date of a drop or boost in traffic in order to tie that date with potential algorithm updates.  This goes for Panda, and other algorithm updates like Penguin.   It’s critically important to know which algorithm update hit you, so you can target the correct path to recovery.  For example, Panda targets content quality issues, while Penguin targets unnatural inbound links.  And then there was Phantom, which also targeted low-quality content.
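
If you have your daily Google organic traffic exported (to a CSV, for example), you can even script a quick check that flags the exact dates of significant drops or boosts.  Here’s a rough Python sketch; the file name and column names are placeholders, and the 25% threshold is just an example.

import pandas as pd

# Hypothetical export of daily Google organic visits with columns: date, visits.
daily = pd.read_csv("google_organic_daily.csv", parse_dates=["date"]).sort_values("date")

# Compare each day to the same weekday one week earlier to smooth out weekly seasonality.
daily["week_over_week"] = daily["visits"].pct_change(periods=7)

# Flag days with a 25%+ drop versus the same day last week. These are candidates to check
# against known (or suspected) algorithm update dates.
suspects = daily[daily["week_over_week"] <= -0.25]
print(suspects[["date", "visits", "week_over_week"]])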

Moz has an algorithm change history, which can be very helpful for webmasters.  But, it’s hard for Moz to add the stealthy Panda updates, since Google isn’t confirming them.  Just keep that in mind while reviewing the timeline.

Moz Algorithm Update History

2. Visit Webmaster Forums
Monitor Google Webmaster Forums to see if others experienced similar effects on the same date.  For example, when Penguin hits, you can see many webmasters explaining the same situation in the forums.  That’s a clear sign the algorithm update was indeed rolled out.  Now, Google is continually updating its algorithm, and sites can be impacted throughout the month or year.  So you must identify the same date and the same type of update.  It’s not foolproof, but it might help you track down the Loch Ness Panda.

Google Webmaster Forums

3. Connect With SEOs Well-Versed in Panda
Follow and engage SEOs that are focused on algorithm updates and who have access to a lot of data.  And keep an eye on the major industry blogs and websites.  SEOs that are well-versed in algorithm work have an opportunity to analyze various updates across industries and geographic locations.  They can often see changes quickly, and confirm those changes with data from similar websites and situations.

4. Take Action with an SEO Audit
Have an SEO Audit conducted through the lens of a specific algorithm update. The audit can help you confirm content quality problems that Panda could have targeted.  I’ve said this a thousand times before, but a thorough technical SEO audit is worth its weight in gold.  Not only can it help you understand the problems impacting your site Panda-wise, but you will undoubtedly find other issues that can be fixed relatively quickly.

So, you can better identify what happened to your site, build a roadmap for Panda recovery (if applicable), and clean up several other technical problems that could also be causing SEO issues.  During my career, I’ve seen many webmasters spinning their wheels working on the wrong SEO checklist.  They spent months trying to fix the wrong items, only to see no change at all in their Google organic trending.  Don’t let this happen to you.

5. Check Algorithm Tracking Tools
Monitor the various algorithm weather report tools like MozCast and Algoroo, which can help you identify SERP volatility over time.  The tools by themselves won’t fix your problems, but they can help you identify when the new Panda rolls out (or other major algorithm updates).

Mozcast Algorithm Weather Report

It’s Only Going to Get Worse and More Confusing
I wish I could tell you that the situation is going to get better.  But it isn’t.  Panda has already gone quasi real-time, but other algorithm updates will follow.  I do a lot of Penguin work, and once Google trusts that algorithm more, it too will launch monthly and without confirmation.  And then we’ll have two serious algorithm updates running monthly with no confirmation.

And who knows, maybe Panda will actually be part of the real-time algorithm at that point.  Think about that for a minute… two major algo updates running throughout the month, neither of them confirmed, and webmasters losing traffic overnight.   Yes, chaos will ensue.  That’s even more reason for business owners to fix their current situation sooner than later.

By the way, if you take a step back and analyze what Google is doing with Panda, Penguin, Pirate, Above the Fold, etc., it’s incredibly powerful.  Google is crafting external algorithms targeting various aspects of webspam and then injecting them into the real-time algorithm.  That’s an incredibly scalable approach and should scare the heck out of webmasters that are knowingly breaking the rules.

Summary – Tracking Cloaked Pandas Can Be Done
Just because Google hasn’t confirmed recent Panda updates doesn’t mean they aren’t occurring.  I have seen what looks to be several Panda updates roll out since July.  Unfortunately, you need to be analyzing the right data (and enough data) in order to see the new, cloaked Panda.  The tips I provided above can help you better track Panda updates, even when Google won’t confirm each release.  And knowing a major algorithm update like Panda rolled out is critically important to understanding what’s impacting your website.  That’s the only way to form a solid recovery plan.

So, from one Panda tracker to another, may the algorithmic wind always be at your back, keep your eyes peeled, stay caffeinated, and monitor the wounded.  Good luck.

GG

 

 

 

 

Tuesday, November 12th, 2013

A Double Penguin Recovery (via 2.1 Update) – But Does It Reveal A Penguin Glitch?

Summary: I analyzed the first double Penguin recovery I have come across during my research (after the Penguin 2.1 update). But what I found could reveal a glitch in the Penguin algorithm. And that glitch could be providing a false sense of security to some business owners.

Double Penguin Recovery is a Double-Edged Sword

If you have followed my blog and Search Engine Watch column, then you know I do a lot of Penguin work.  I started heavily analyzing Penguin 1.0 on April 24, 2012, and have continued to analyze subsequent Penguin updates to learn more about our icy friend.  I’ve had the opportunity to help many companies deal with Penguin hits, and have helped a number recover (and you can read more about those recoveries via several case studies I have written).  It’s been fascinating for sure.  But it just got even more interesting, based on analyzing a site that recovered during Penguin 2.1.   Read on.

Penguin 2.1 rolled out on October 4, 2013, and based on my analysis, it was bigger and badder than Penguin 2.0.   Matt Cutts confirmed that was the case during Pubcon (which was great to hear, since it backed up what I was seeing).  But as I documented in one of my recent Search Engine Watch columns, Penguin 2.1 wasn’t all bad.  There were recoveries, although they often get overshadowed by the carnage.  And one particular recovery during 2.1 caught my attention and deserved further analysis. That’s what I’ll cover in this post.

Ladies and Gentlemen, Introducing The Double Penguin Recovery
I believe it’s important to present the good and the bad when discussing Penguin updates, since there are still some people in the industry who don’t believe you can recover.  But you definitely can recover, so it’s important to document cases where companies bounce back after completing hard Penguin recovery work.

An example of a Penguin recovery:
Example of a Penguin Recovery

Now, there is one thing I hadn’t seen during my past research, and that’s an example of a company recovering twice from Penguin.  I’m not referring to a company that recovers once, gets hit again, and recovers a second time.  Instead, I’m referring to a company that initially recovers from Penguin, only to gain even more during a subsequent Penguin update.

Now that would be an interesting case to discuss… and that’s exactly what I saw during Penguin 2.1.  Interested?  I was too.  :)

Double Penguin Recoveries Can Happen
After Penguin 2.1, I analyzed a website that experienced its second Penguin recovery.  The site was first hit by Penguin 1.0 on April 24, 2012, and recovered in the fall of 2012.  And now, with 2.1 on 10/4/13, the site experienced another surge in impressions and clicks from Google Organic.

The second Penguin recovery on October 4, 2013:
Second Penguin Recovery During 2.1 Update

I’ve done a boatload of Penguin work since 2012, and I have never seen a double Penguin recovery.  So as you can guess, I nearly fell out of my seat when I saw the distinct bump on October 4, 2013.

Penguin Recoveries Lead To Penguin Questions
Based on the second recovery, the big questions for me (and I’m sure for you as well) revolve around the reason(s) for the double recovery.  Why did this specific site see another surge from Penguin when it had already recovered in the past (after hard recovery work)?  Were there any specific factors that could have led to the second recovery?  For example, did the site build more natural links, add high-quality content, disavow more links, etc.?  Or was this just an anomaly?  And most importantly, did Penguin help this website a second time when it never should have?  In other words, was this a false negative (with the added benefit of a recovery)?  All good questions, and I hope to answer several of them below.

The Lack of Penguin Collateral Damage
I’ve always said that I’ve never seen collateral damage with Penguin.  Every site I’ve analyzed that was hit by Penguin (now 312 of them) should have been hit.  I have yet to see any false positives.  But with this double recovery, we are talking about another angle with Penguin.  Could a site that shouldn’t see a recovery actually recover?  And again, this site already recovered during a previous Penguin update.  Could this second recovery be a glitch in Penguin, or were there other factors at play?

History with Penguin
Let’s begin with a quick Penguin history for the website at hand.  It’s an ecommerce website that was devastated by Penguin 1.0 on April 24, 2012.   The site lost close to 80% of its Google Organic traffic overnight.

Initial Penguin Hit on April 24, 2012:
Initial Penguin Hit on April 24, 2012

The site had built thousands of exact match and rich anchor text links over years from spammy directories.  The link profile was riddled with spam.  After the Penguin hit on 4/24/12, their staff worked hard on removing as many links as they could, contacted many directory owners (with some success), and then disavowed what they could not manually remove.  Yes, the disavow tool was extremely helpful for this situation.

The site recovered relatively quickly from Penguin (within two months of finishing the recovery work), but only to about 40% of its original Google Organic traffic.  That made sense, since the site had lost the majority of the links that were once helping it rank for competitive keywords.  Now that the unnatural links were removed, the site would not (and did not) return to full power.  That’s because it never should have ranked highly for many of those keywords in the first place.  And this is where the site remained until Penguin 2.1.

Initial Penguin recovery in 2012:
Initial Penguin Recovery in 2012

And Along Came Penguin 2.1
After Penguin 2.1 hit, the site experienced an immediate surge in impressions and traffic from Google Organic (and this was crystal clear to see in Google Webmaster Tools).  I’m not sure anyone was expecting a second Penguin recovery, but there it was…  as clear as day.

Impressions were up over 50% and clicks were up close to 60% (when comparing the timeframe after Penguin 2.1 to the timeframe prior).  Checking Webmaster Tools revealed extremely competitive keywords that were once targeted by Penguin now gaining in average position, impressions, and clicks.  Certain keywords jumped by 10-15 spots in average position.  Some that were buried in Google were now on page one or page two.  Yes, Penguin 2.1 was providing a second shot in the arm for the site in question.

Impressions and Clicks Increased Greatly After Penguin 2.1 Recovery:
Increase in Impressions and Clicks After Penguin 2.1 Recovery
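For anyone who wants to run a similar check on their own data, here’s a hedged sketch of how you could quantify that keyword-level movement.  It assumes two CSV exports of top search queries (one covering the period before 10/4/13 and one after) with “query” and “avg_position” columns; the file names and column names are illustrative, not the actual Webmaster Tools export format.

```python
# A hedged sketch: surface keywords that jumped in average position after a
# Penguin update.  Assumes two CSV exports of top queries with "query" and
# "avg_position" columns -- file and column names are illustrative only.
import csv

def load_queries(path):
    with open(path, newline="") as f:
        return {row["query"]: float(row["avg_position"]) for row in csv.DictReader(f)}

before = load_queries("queries_before_penguin21.csv")
after = load_queries("queries_after_penguin21.csv")

# Positive values mean the keyword moved up (lower average position) post-update.
movers = sorted(
    ((before[q] - pos, q) for q, pos in after.items() if q in before),
    reverse=True,
)

# Keywords gaining ten or more spots in average position after the update
for jump, query in movers:
    if jump >= 10:
        print(f"{query}: +{jump:.0f} positions")
```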

It was amazing to analyze, but I couldn’t stop several key questions from overpowering my brain.  What changed recently (or over time) that sent the right signals to Google?  Why would the site recover a second time from Penguin?  And could other websites learn from this in order to gain the infamous double Penguin recovery?  I dug into the site to learn more.

What Changed, Why a Second Recovery?
What you’re about to hear may shock you.  It sure shocked me.  Let’s start with what might seem logical.  Since Penguin is hyper-focused on links, I reviewed the site’s latest links across Google Webmaster Tools, Majestic SEO, and Open Site Explorer.

If the site experienced a second Penguin recovery, then I would assume that new links were built (and that they were a heck of a lot better than what got the site initially hit by Penguin).  Google Webmaster Tools revealed a doubling of inbound links as compared to the timeframe when the site first got hammered by Penguin (April 2012).  Majestic SEO and Open Site Explorer did not show as much movement, but did show an increase.

I exported all of the new(er) links and crawled them to double-check anchor text, nofollow status, 404s, etc.  And I paid special attention to the links from Google Webmaster Tools, since it showed the greatest number of new links since the first Penguin recovery.  It’s also worth noting that Majestic showed a distinct increase in backlinks in early 2013 (and that includes both the raw number of links being created and the number of referring domains).

Backlinks History Reveals More Unnatural Links Built in Early 2013:
New Unnatural Links Built in Early 2013
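As a side note, the link crawl I mentioned above doesn’t need to be fancy.  Here’s a minimal sketch of that type of check: fetch each linking page, confirm the link to your domain is still live, and record its HTTP status, anchor text, and nofollow status.  The input file, target domain, and timeout are placeholder assumptions.

```python
# A minimal sketch of auditing exported inbound links: fetch each linking
# page and record HTTP status, anchor text, and nofollow status for the
# first link pointing at your domain.  "exported_links.txt" (one URL per
# line) and "example.com" are placeholders for your own export and domain.
import requests
from bs4 import BeautifulSoup

TARGET_DOMAIN = "example.com"

def audit_link(linking_url):
    try:
        resp = requests.get(linking_url, timeout=10)
    except requests.RequestException as exc:
        return {"url": linking_url, "status": f"error: {exc}"}
    result = {"url": linking_url, "status": resp.status_code, "link_found": False}
    if resp.status_code == 200:
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            if TARGET_DOMAIN in a["href"]:
                result["link_found"] = True
                result["anchor_text"] = a.get_text(strip=True)
                result["nofollow"] = "nofollow" in (a.get("rel") or [])
                break
    return result

with open("exported_links.txt") as f:
    for line in f:
        if line.strip():
            print(audit_link(line.strip()))
```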

Surely the natural, stronger linkbuilding was the reason the site experienced a double Penguin recovery, right?  Not so fast, and I’ll explain more about this next.  It seems Penguin might be glitchy.

More Unnatural Links = Double Penguin Recovery?  Crazy, But True
Believe me, I was really hoping to find stronger, natural links when checking the site’s latest link reporting.  But that wasn’t the case.  I found more spammy links from similar sources that got the site initially hit by Penguin in 2012.  Spammy directories were the core problem then, and they are the core problem now.  Actually, I could barely find any natural links in the new batch I checked.  And that was disturbing.

With all of my Penguin work (having now analyzed 312 websites hit by Penguin), I have yet to come across a false positive (a site that was hit but shouldn’t have been).  But how about a site recovering that shouldn’t recover?  That’s exactly what this case looks like.  The site built more spammy links after initially recovering from Penguin, only to experience a surge in traffic during Penguin 2.1.  That’s two Penguin recoveries, and again, it’s the first time I have seen this.

The Danger of Heavily Relying on the Disavow Tool
To clarify, I don’t know if the site’s owner or marketing staff meant to build the newer spammy links.  Unnatural links tend to have an uncanny way of replicating across other low-quality sites.  And that’s especially the case with directories and/or article marketing.  So it’s possible that the older, spammy links found their way to other directories.

When You Disavow Links, They Still Remain (and Can Replicate):
The Danger of Relying on the Disavow Tool

This is why I always recommend removing as many links as possible versus relying on the disavow tool for all of them.  If you remove them, they are gone.  If you disavow them, they remain, and can find their way to other spammy sites.
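For the links you truly cannot get removed, the disavow file itself is just a plain text file: comment lines start with “#”, and each entry is either a full URL or a “domain:” line.  Here’s a minimal sketch of generating one; the domains and URLs are hypothetical examples, while the syntax notes reflect Google’s documented disavow file format.

```python
# A minimal sketch of building a disavow file for links you could not get
# removed manually.  The syntax (comments with "#", "domain:" entries, and
# individual URLs) follows Google's documented disavow file format; the
# domains and URLs below are hypothetical examples.
spammy_domains = ["spammy-directory-1.example", "spammy-directory-2.example"]
spammy_urls = ["http://low-quality-articles.example/widgets-page"]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Removal requests sent; directory owners did not respond\n")
    for domain in spammy_domains:
        f.write(f"domain:{domain}\n")
    for url in spammy_urls:
        f.write(url + "\n")
```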

What Does This Tell Us About Penguin?
To be honest, I’m shocked that Penguin was so wrong.  The initial Penguin recovery in 2012 was spot on, as the company worked hard to recover.  They manually removed a significant percentage of the unnatural links and disavowed the rest.  Then they recovered.  But the second recovery was based on the site building more unnatural links (and from very similar sources to the ones that got it hit in 2012).

So, is this a case of Penguin not having enough data on the new directories yet?  Also, did the company really test the unnatural link waters again by building more spammy links?  As mentioned above, I’ve seen spammy links replicate themselves across low-quality sites before, and that’s especially the case with directories and/or article marketing.  That very well could have happened, although it does look like the links were built during a specific timeframe (early 2013).  It’s hard to say exactly what happened.

Also, will the company eventually get hit by Penguin again (for a second time)?  It’s hard to say, but my guess is the surge in traffic based on Penguin 2.1 will be short-lived.  I cannot believe that the newer, unnatural links will go undetected by our icy friend.  I’m confident the site will get hit again (unless they move quickly now to remove and/or disavow the latest unnatural links).  Unfortunately, the site is teed up to get hit by Penguin.

Summary – Penguin 2.1 Was Wrong (for Now)
This was a fascinating case to analyze.  I have never seen a double Penguin recovery, and I have analyzed hundreds of sites hit by Penguin since April of 2012.  The website’s second recovery looks to be a mistake, as Penguin must have judged the new links as “natural” and “strong”.  But in reality the links were the same old spammy ones that got the site hit from the start.  They were just on different websites.

But as I said earlier, the site is now teed up to get hit by Penguin again. And if that happens, they will lose the power and traffic they have built up since recovering from the first Penguin attack.  If that’s the case, the site will have done a 360 from Penguin attack to Penguin recovery to second Penguin recovery and back to Penguin attack.  And that’s never a good place to be.

GG