Friday, May 23rd, 2014

Panda 4.0 Analysis | Nuclear Option Rewarded, Phantom Victims Recover, and Industry Experts Rise

Panda 4.0 Rolls Out

On May 20th, 2014, Google’s Matt Cutts announced that Panda 4.0 was rolling out.  Leading up to that tweet, there was a lot of chatter across the industry about an algorithm update rolling out (based on reports of rankings volatility and traffic gains/losses).  I was also seeing lots of movement across clients that had been impacted by previous algorithm updates, while also having new companies contact me about massive changes in rankings and traffic.  I knew something serious was happening, but didn’t know exactly what it was.  I thought for a while that it could be the pre-rollout and testing of Penguin, but it ended up being a new Panda update instead.

Matt Cutts Announces Panda 4.0

When Panda 4.0 was officially announced, I had already been analyzing sites seeing an impact (starting on Saturday May 17th, 2014).  I was noticing major swings in rankings and traffic with companies I’ve been helping with previous algo trouble.  And like I said above, several companies started reaching out to me via email about new hits starting that weekend.

And I was glad to hear a confirmation from Matt Cutts about Panda 4.0 rolling out.  That enabled me to hone my analysis.  I’ve mentioned in the past how unconfirmed Panda updates can drive webmasters insane.  When you have confirmation, it’s important to analyze the impact through the lens of a specific algorithm update (when possible).  In other words, content quality for Panda, unnatural links for Penguin, ad ratio and placement for Top Heavy, etc.

And by the way, since Google named this update Panda 4.0, we must assume it’s a new algorithm.  That means new factors could have been added or other factors refined.  Needless to say, I was eager to dig into sites that had been impacted to see if I could glean any insights about our new bamboo-eating friend.

Digging into the Panda 4.0 Data (and the Power of Human Barometers)
I’ve written before about the power of having access to a lot of Panda data.  For example, working with many sites that had been previously impacted by Panda.  It’s often easier to see unconfirmed Panda updates when you can analyze many sites impacted previously by the algorithm update.  I’ve helped a lot of companies with Panda hits since February of 2011 when Panda first rolled out.  Therefore, I can often see Panda fluctuations, even when those updates aren’t confirmed.  That’s because I can analyze the Panda data set I have access to in addition to new companies that reach out to me after getting hit by those Panda updates.  The fresh hits enable me to line up dates with Panda recoveries to better understand when Google rolls out unconfirmed updates.   I’ve documented several of the unconfirmed updates here on my blog (in case you wanted to go back and check the dates against your own data).

So, when Google announced Panda 4.0, I was able to quickly start checking all the clients I have helped with Panda recovery (in addition to the ones I was already seeing jump in the rankings).  And it didn’t take long to see the impact.  A number of sites were clearly being positively impacted by P4.0.

Panda 4.0 Recovery

Then, I analyzed new sites that were negatively impacted, based on those companies reaching out to me after getting hit (starting on 5/17/14).  Between the recoveries and the fresh hits, I have been able to analyze a boatload of Panda 4.0 data.  And it’s been fascinating to dig through.

I have now analyzed 27 websites impacted by Panda 4.0.  The sites I analyzed ranged from large sites receiving a lot of Google Organic traffic (1M+ visits per month) to medium-sized ecommerce retailers and publishers (receiving tens of thousands of visits per month) to niche blogs focused on very specific topics (seeing 5K to 10K visits per month).  It was awesome to be able to see how Panda 4.0 affected sites across industries, categories, volume of traffic, etc.  And as usual, I was able to travel from one Panda 4.0 rabbit hole to another as I uncovered more sites impacted per category.

 

What This Post Covers – Key Findings Based on Heavily Analyzing Websites That Were Impacted by Panda 4.0
I can write ten different posts about Panda 4.0 based on my analysis over the past few days, but that’s not the point of this initial post.  Instead, I want to provide some core findings based on helping companies with previous Panda or Phantom hits that recovered during Panda 4.0.  Yes, I said Phantom recoveries. More on that soon.

In addition, I want to provide findings based on analyzing sites that were negatively impacted by Panda 4.0.  The findings in this post strike a nice balance between recovery and negative impact.  As many of you know, there’s a lot you can learn about the signature of an algorithm update from fresh hits.

Before I provide my findings, I wanted to emphasize that this is simply my first post about Panda 4.0.  I plan to write several additional posts focused on specific findings and scenarios.  There were several websites that were fascinating to analyze and deserve their own dedicated posts.  If you are interested in learning about those cases, then definitely subscribe to my feed (and make sure you check my Search Engine Watch column).  There’s a lot to cover for sure.  But for now, let’s jump into some Panda 4.0 findings.

 

Panda 4.0 Key Findings

The Nuclear Option – The Power of Making Hard Decisions and Executing
When new companies contact me about Panda, they often want to know their chances of recovery.  My answer sometimes shocks them.  I explain that once the initial audit has been completed, there will be hard decisions to make.  I’m talking about really hard decisions that can impact a business.

Beyond the hard decisions, they will need to thoroughly execute those changes at a rapid pace (which is critically important).  I explain that if they listen to me, make those hard decisions, and execute fully, then there is an excellent chance of recovery.  But not all companies make hard decisions and execute thoroughly.  Unfortunately, those companies often sit in the grey area of Panda, never knowing how close they are to recovery.

Well, Panda 4.0 reinforced my philosophy (although there were some anomalies which I’ll cover later).  During P4.0, I had several clients recover that implemented HUGE changes over a multi-month period.  And when I say huge changes, I’m talking significant amounts of work.  One of my Panda audits yielded close to 20 pages of recommendations in Word.  When something like that is presented, I can tell how deflated some clients feel.  I get it, but it’s at that critical juncture that you can tell which clients will win.  They either take those recommendations and run, or they don’t.

To give you a feel for what I’m talking about, I’ve provided some of the challenges that those clients had to overcome below:

  • Nuking low-quality content.
  • Greatly improving technical SEO.
  • Gutting over-optimization.
  • Removing doorway pages.
  • Addressing serious canonicalization problems.
  • Writing great content. Read that again. :)
  • Revamping internal linking structure and navigation.
  • Hunting down duplicate content and properly handling it.
  • Hunting down thin content and noindexing or nuking it (see the sketch after this list).
  • Removing manual actions (yep, I’ve included this here).
  • Stopping the scraping of content and removing content that was scraped.
  • Creating mobile-friendly pages or going responsive.
  • Dealing with risky affiliate marketing setups.
  • Greatly increasing page speed (and handling bloated pages, file size-wise).
  • Hunting down rogue, risky pages and subdomains and properly dealing with that content.
  • And in extreme cases, completely redesigning the site. And several of my clients did just that. That’s the nuclear option, by the way.  More about that soon.
  • And even more changes.
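Since “hunting down thin content” comes up in almost every Panda audit, here’s a rough sketch of how you could start flagging thin pages programmatically.  Note, this is just an illustration, not a definitive method: the URLs are placeholders and the 250-word threshold is an arbitrary starting point you would tune for your own site.

```python
# Rough sketch: flag potentially thin pages by visible word count.
# The URL list and the 250-word threshold are placeholders.
import requests
from bs4 import BeautifulSoup

THIN_WORD_COUNT = 250  # arbitrary cutoff; tune for your site

def visible_word_count(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Drop script/style blocks so code and CSS don't inflate the count.
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    return len(soup.get_text(separator=" ").split())

urls_to_audit = [
    "http://www.example.com/page-1",
    "http://www.example.com/page-2",
]

for url in urls_to_audit:
    count = visible_word_count(url)
    if count < THIN_WORD_COUNT:
        print(f"Possible thin page ({count} words): {url}")
```

A low word count alone doesn’t make a page low-quality (forums are a great example, as you’ll see below), so treat the output as a list of pages to review manually, not a nuke list.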

Now, when I recommend a boatload of changes, there are various levels of client execution. Some clients implement 75% of the changes, while some can only implement 25%.  As you can guess, the ones that execute more have a greater chance at a quicker recovery.

But then there are those rare cases where clients implement 100% of the changes I recommend.  And that’s freaking awesome from my standpoint.  But with massive effort comes massive expectations.  If you are going to make big changes, you want big results.  And unfortunately, that can take time.

Important Note: This is an incredibly important point for anyone dealing with a massive Panda or Penguin problem.  If you’ve been spamming Google for a long time (years), providing low-quality content that’s over-optimized, using doorway pages to gain Google traffic, etc., then you might have to wait a while after changes have been implemented.  John Mueller is on record saying you can expect to wait 6 months or longer to see recovery.  I don’t think his estimate is far off.  Sure, I’ve seen some quicker recoveries, but in extreme spamming cases, it can take time to see recovery.

Fast forward to Panda 4.0.  It was AWESOME to see clients that made massive changes see substantial recovery during P4.0.  And several of those clients chose the nuclear option of completely redesigning their websites.  One client is up 130% since 5/17, while another that chose the nuclear option is up 86%.  Here’s a quick screenshot of the bump starting on 5/17:

A Second Panda 4.0 Recovery

 

Side Note: The Nuclear Option is a Smart One When Needed
For some of the companies I was helping, there were so many items to fix that a complete redesign was a smart option.  And no, that doesn’t come cheap.  It takes time, effort, resources, and budget versus just making changes to specific areas.  It’s a big deal, but can pay huge dividends down the line.

One client made almost all of the changes I recommended, including going responsive.  The site is so much better usability-wise, content-wise, and mobile-wise.  And with Panda 4.0, they are up 110% since 5/18 (when they first started seeing improvement).

I’ve mentioned before that for Panda recovery, SEO band-aids won’t work.  Well, the clients that fully redesigned their sites and are seeing big improvements underscore the point that the nuclear option may be your best solution (if you have massive changes to make).  Keep that in mind if you are dealing with a massive Panda problem.

 

Phantom Victims Recover
On May 8th, 2013, I picked up a significant algorithm update.  After analyzing a number of websites hit by the update, I decided to call it “Phantom”.  It simply had a mysterious, yet powerful signature, so Phantom made sense to me.  Hey, it stuck. :)

Phantom was a tough algorithm update.  Some companies lost 60% of their traffic overnight.  And after auditing a number of sites hit by Phantom, my recommendations were often tough to hear (for business owners).  Phantom targeted low-quality content, similar to Panda.  But I often found scraped content, over-optimized content, doorway pages, cross-linking of company-owned domains, etc.  I’ve helped a number of Phantom victims recover, but there were still many out there that never saw a big recovery.

The interesting part about Panda 4.0 was that I saw six Phantom victims recover (out of the 27 sites I analyzed with previous content quality problems).  It’s hard to say exactly what P4.0 took into account that led to those Phantom recoveries, but those victims clearly had a good day.  It’s worth noting that 5 out of the 6 sites impacted by Phantom actively made changes to rectify their content problems.

One of the sites did nothing to fix the problems and ended up recovering anyway.  This could be due to the softening of Panda.  There were definitely some sites I analyzed that showed increases after Panda 4.0 without necessarily tackling many of the problems they were facing.  But in this situation, the site was a forum, which I cover next.  Note, you can read my post about the softening of Panda and what I saw during the March 24, 2014 Panda update to learn more about the situation.

Phantom Victim Recovers During Panda 4.0

Forums Rebound During Panda 4.0
My next finding was interesting, since I’ve helped a number of forums deal with previous Panda and/or Phantom hits.  I came across four different forums that recovered during Panda 4.0.  Three were relatively large forums, while one was a smaller niche forum run by a category expert.

One of the larger forums (1M+ visits per month) made a boatload of changes to address thin content, spammy user-generated content, etc.   They were able to gut low-quality pages, noindex thinner ones, and hunt down user-generated spam.  They greatly increased the quality of the forum overall (from an SEO perspective).  And they are up 24% since Panda 4.0 rolled out.

Noindexing Low Quality Content on a Forum

A second forum (1.5M visits per month) tackled some of the problems I picked up during an audit, but wasn’t able to tackle a number of items (due to a lack of resources).  And it’s important to know that they are a leader in their niche and have some outstanding content and advice.  During my audit I found they had some serious technical issues causing duplicate and thin content, but I’m not sure they ever deserved to get hammered like they did.  But after Panda 4.0, they are up 54%.

And the expert-run forum that experienced both Panda and Phantom hits rebounded nicely after Panda 4.0.  The site has some outstanding content, advice, conversations, etc.  Again, it’s run by an expert that knows her stuff.  Sure, some of the content is shorter in nature, but it’s a forum that will naturally have some quick answers.  It’s important to note that the website owner did nothing to address the previous Panda and Phantom problems.  And that site experienced a huge uptick based on Panda 4.0.  Again, that could be due to the softening of Panda or a fix to Panda that cut down on collateral damage.  It’s hard to say for sure.  Anyway, the site is up 119% since May 17th.

Forums Recover During Panda 4.0

Industry Experts Rise
During my research, I saw several examples of individual bloggers that focus heavily on niche areas seeing nice bumps in Google Organic traffic after Panda 4.0 rolled out.  Now, Matt Cutts explained Google was looking to boost the rankings of experts in their respective industries.  I have no idea if what I was seeing during my research was that “expert lift”, but it sure looked like it.

Here’s an example of a marketing professional that saw a 38% lift after Panda 4.0:
Bloggers Recover During Panda 4.0

And here’s a sports medicine expert that has shown a 46% lift:
Niche Expert Recovers During Panda 4.0

It was great to see these bloggers rise in the rankings, since their content is outstanding, and they deserved to rank higher!  They just didn’t have the power that some of the other blogs and sites in their industries had.  But it seems Google surfaced them during Panda 4.0.  I need to analyze more sites like this to better understand what’s going on, but it’s worth noting.

Update: I reached out to Matt Cutts via Twitter to see if Panda 4.0 incorporated the “authority” algo update I mentioned earlier.  Matt replied this afternoon and explained that they are working on that independently.  So, it doesn’t seem like the bloggers I analyzed benefited from the “authority” algo, but instead, benefited from overall quality signals.  It was great to get a response from Matt.  See screenshot below.

Matt Cutts Tweet About Subject Matter Expert Algorithm

 

An Indexation Reality Check – It’s Not The Quantity, But the Quality That Matters
After conducting a laser-focused Panda audit, it’s not uncommon for me to recommend nuking or noindexing a substantial amount of content.  That is usually an uncomfortable decision for clients to make.  It’s hard to nuke content that you created, that ranked well at one point, etc.  But nuking low-quality content is a strong way to proceed when you have a Panda problem.

So, it was awesome to see clients that removed large amounts of content recover during Panda 4.0. As an extreme example, one client removed 83% of their content from Google’s index.  Yes, you read that correctly.  And guess what, they are getting more traffic from Google than when they had all of that low-quality and risky content indexed.  It’s a great example of quality versus quantity when it comes to Panda.

Indexation Impact and Panda 4.0

On the other hand, I analyzed a fresh Panda 4.0 hit, where the site has 40M+ pages indexed.  And you guessed it, it has serious content quality problems.  They got hammered by Panda 4.0, losing about 40% of their Google organic traffic overnight.

If you have been impacted by Panda, and you have a lot of risky content indexed by Google, then have a content audit completed now.  I’m not kidding.  Hunt down thin pages, duplicate pages, low-quality pages, etc. and nuke them or noindex them.  Make sure Google has the right content indexed.

 

Engagement and Usability Matter
While analyzing the fresh hits, it was hard to overlook the serious engagement issues I was coming across.  For example, there was stimulus overload on the pages that were receiving a lot of Google organic traffic prior to the hit.  There were ads that expanded into or over the content, double-serving of video ads, stacked “recommended articles” on the page, a lack of white space, a long and confusing navigation, etc.  All of this made me want to bounce off the page faster than a superball on concrete.  And again, high bounce rates and low dwell times can get you killed by Panda.  Avoid that like the plague.

Check out the bounce rates and pages per session for a site crushed by Panda 4.0:

Low Engagement Invites Panda


Side Note: To hunt down low-quality content, you can run this Panda report in Google Analytics.  My post walks you through exporting data from GA and then using Excel to isolate problematic landing pages from Google Organic.
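And if you’re comfortable working outside of Excel, here’s a minimal sketch of the same idea using Python and pandas.  It assumes you’ve exported two CSVs of Google Organic landing page traffic (pre-hit and post-hit), and the column names below are hypothetical; they depend on how you build the export.

```python
# Minimal sketch: compare Google Organic landing page traffic before and
# after a suspected Panda hit. Assumes two GA CSV exports with hypothetical
# "Landing Page" and "Sessions" columns.
import pandas as pd

before = pd.read_csv("ga_organic_before.csv")
after = pd.read_csv("ga_organic_after.csv")

merged = before.merge(after, on="Landing Page", suffixes=("_before", "_after"))
merged["pct_change"] = (
    merged["Sessions_after"] - merged["Sessions_before"]
) / merged["Sessions_before"]

# Landing pages that lost 50%+ of their Google Organic traffic are strong
# candidates for a content quality review.
problem_pages = merged[merged["pct_change"] <= -0.5].sort_values("pct_change")
print(problem_pages[["Landing Page", "Sessions_before", "Sessions_after", "pct_change"]])
```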

Downstream Matters
While analyzing fresh Panda 4.0 hits, it was also hard to overlook links and ads that drove me to strange and risky sites that were auto-downloading software, files, etc.  You know, those sites where it feels like your browser is being taken over by hackers.  This can lead to users clicking the back button twice and returning to Google’s search results.  And if they do, that can send bad signals to Google about your site and content.  In addition, risky downstream activity can lead to some people reporting your site to Google or to other organizations like Web of Trust (WOT).

And as I’ve said several times in this post, Panda is tied to engagement.  Engagement is tied to users.  Don’t anger users.  It will come back to bite you (literally).

 

Summary – Panda 4.0 Brings Hope
As I said earlier, it was fascinating to analyze the impact of Panda 4.0.  And again, this is just my first post on the subject.  I plan to write several more about specific situations I’ve analyzed.  Based on what I’ve seen so far, it seems Panda 4.0 definitely rewarded sites that took the time to make the necessary changes to improve content quality, engagement, usability, etc.  And that’s awesome to see.

But on the flip side, there were sites that got hammered by P4.0.  All I can say to them is pull yourself up by your bootstraps and get to work.  It takes time, but Panda recovery is definitely possible.  You just need to make hard decisions and then execute.  :)

GG

 

Monday, May 12th, 2014

How To Remarket 70+ Ways Using Segments and Conditions in Google Analytics

Remarketing in Google Analytics Using Conditions and Segments

I know what you’re thinking. Can you really remarket more than 70 different ways using segments in Google Analytics?  Yes, you can!  Actually, when you combine the methods I’ll cover today, there are many more types of Remarketing lists you can build!  So the total number is much greater than 70.

My post today is meant to introduce you to segments in Google Analytics (GA), explain how you can use them to remarket to people who already visited your site, and provide important Remarketing tips along the way.  I hope once you read this post, you’re ready to kick off some Remarketing campaigns to drive more sales, leads, phone calls, etc.

What Are Segments in Google Analytics?
Many digital marketers know about Remarketing already.  That’s where you can reach people who already visited your website with ads as they browse the web.  For example, if John visited Roku’s website, browsed various products, and left, then Roku could use Remarketing to advertise to John as he browses the Google Display Network (GDN).  The Google Display Network is a massive network of sites that run Google advertising, and includes Google-owned properties like YouTube, Google Maps, Gmail, etc.  According to Google, the GDN reaches 90% of internet users worldwide.

Remarketing via The Google Display Network (GDN)

By the way, if you’ve ever visited a website and then saw ads from that website as you browsed the web, then you’ve been remarketed to.  As you can guess, this can be an incredibly powerful way to drive more sales, leads, etc.  It can also be extremely frustrating and/or shocking to users.  So be careful when crafting your Remarketing strategy!

When Remarketing first rolled out, you could only set up Remarketing lists in the AdWords interface.  That was ok, but didn’t provide a massive amount of flexibility.  That’s when Google enabled marketers to set up Remarketing lists via Google Analytics.  That opened up an incredible amount of opportunity to slice and dice visitors to create advanced-level Remarketing lists.  For example, you could create Remarketing lists based on users who visited a certain section of your website, or lists based on users completing a certain conversion goal, etc.  Needless to say, tying Google Analytics to Remarketing was an awesome addition.

Now, I started using Google Analytics Remarketing functionality immediately to help clients build advanced Remarketing lists, but I had a feeling that Google was going to make it even more powerful.  And they did.

Along Came Segments… Remarketing Options Galore
You might already be familiar with segments in Google Analytics, which was originally named “Advanced Segmentation”.  In July of 2013, Google released a new version in Google Analytics and simply called it “Segments”.  But don’t get fooled by the simpler name.  Segments enable marketers to slice and dice their users and traffic to view reporting at a granular level.  For example, I often set up a number of segments for clients, based on their specific goals. Doing so enables me to quickly view granular reporting by removing a lot of the noise residing in standard reports.

Using Segments to Create Remarketing Lists in Google Analytics

But starting in January of 2014, Google rolled out an update that enabled marketers to use those segments to create Remarketing lists.  Yes, now marketers had an incredible number of options available when creating Remarketing lists.  In addition, you could easily import segments you are already using! This means you could leverage the hard work you’ve already put in when creating segments in Google Analytics.

Although I thought I had a lot of flexibility in creating Remarketing lists leading up to that point, the ability to use segments opened the targeting flood gates.  I remember checking out the list of options when segments for Remarketing first launched, and I was blown away.

For example, using segments you could create Remarketing lists based on:

  • Demographics like age, gender, language, location, and more.
  • Technology options like operating system, browser, device category, mobile device model or branding, and more.
  • Behavior like the number of sessions per user, days since last session, transactions, and session duration.
  • “Date of First Session” where you could create lists based on the initial session date or a range (sessions that started between two dates).
  • Traffic Sources based on campaign, medium, source, or keyword.
  • Ecommerce options like transaction id, revenue, days to transaction, product purchased, or product category.
  • And you can combine any of these options to create even more advanced Remarketing lists.

 

Now, the options listed above are based on the major categories of segments in Google Analytics.  But you can also set Remarketing lists based on conditions.  Using conditions, you could leverage many of the dimensions or metrics available in Google Analytics to build advanced Remarketing lists.  Actually, there are so many options via “conditions” that I can’t even list them all here in this post.

For example, there are eight major categories of dimensions and metrics you could choose from, including Acquisition, Advertising, Behavior, Custom Variables, Ecommerce, Time, Users, and Other.  And each category has a number of dimensions or metrics you can select to help craft your Remarketing lists.

Using Conditions to Create Remarketing Lists in Google Analytics

Note, it can definitely be overwhelming to review the list of options when you first check this out.  Don’t worry, I provide some tips for getting started later in this post.  For now, just understand that you can use segments and conditions in Google Analytics to craft Remarketing lists based on a number of factors (or a combination of factors).  Basically, you have the power to remarket however you like.  And that’s awesome.

Examples of What You Can Do
Enough with the introduction.  Let’s get specific.  I’m sure you are wondering how segments in Google Analytics can be used in the real world.  I’ll provide a few examples below of Remarketing lists you can build to get back in front of people who already visited your website.  Note, the lists you build should be based on your specific business and website.  I’m just covering a few options below so you can see the power of using segments to build Remarketing lists.

Example 1: Remarket to users who came from a specific referral path (page).
Imagine you knew that certain referring webpages drove a lot of high-quality traffic on a regular basis.  Based on the quality of traffic coming through those referring pages, you decide that you would love to remarket to those users as they browse the web (since you have a strong feel for the type of user they are based on the content at hand).

Using segments, you could create a Remarketing list based on the original referral path (i.e. the referring pages).  And once that list reaches 100 members, then you can start getting targeted ads in front of those users and driving them to your preferred landing page (whether that’s current content, campaign landing pages, etc.)

Using Referring Path to Create Remarketing Lists

And if you find several referring pages that target similar categories of content, then you could use Boolean operators to combine those pages from across different websites.  For example, {referring path A} AND {referring path B}.  So if three referring pages are all about Category A, then you could combine them to create a Remarketing list.  You can also use regular expressions to match certain criteria (see the sketch below).  Yes, the sky’s the limit.

Using Boolean Operators to Create Advanced Remarketing Lists
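And since I mentioned regular expressions, here’s a quick, hypothetical illustration of the kind of pattern you might build for a referral path condition.  I’m using Python just to demonstrate the matching; in practice you would paste the pattern into the segment builder (and GA’s regex support is close to, but not identical to, what you see here).

```python
# Illustrative only: the kind of regex you might use in a segment condition
# to match a group of referring pages. The referral paths are made up.
import re

# Match any path under /reviews/ or two specific roundup posts.
pattern = re.compile(r"^/(reviews/.+|best-widgets-2014|top-10-widgets)$")

referral_paths = [
    "/reviews/widget-pro",
    "/best-widgets-2014",
    "/about-us",
]

for path in referral_paths:
    result = "matches" if pattern.match(path) else "does not match"
    print(f"{path} {result}")
```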

Example 2: Reach a certain demographic that has visited your website.
Let’s say you just launched a new product targeting 18-25 year olds and wanted to remarket to users who already visited your website that fit into this category.  You know they showed some interest in your company and products already (since they already visited your site), so you want to reach them via display advertising as they browse the web.

Using segments, you could create a Remarketing list based on age using the Demographics category.  Simply click the checkbox next to the age category you want to target.

Creating Remarketing Lists Based on Demographics

Or to get even more targeted, you could combine age with gender to test various messaging or visuals in your ads.  Going even further, you could add location as another selection to target users based on age, gender, and geographic location (down to the city level if you wanted).

Combining Demographics to Create Advanced Remarketing Lists

Example 3: Target users of specific campaigns, ad groups, or keywords.
Let’s say you are already using AdWords to drive targeted users to your website.  Using segments in Google Analytics, you could build a Remarketing list based on specific campaigns, ad groups, or keywords.  For example, if you have an ad group targeting a specific category or product, then you could create a list containing the users that already searched Google and clicked through your ads related to that category.  It’s a great way to get back in front of a targeted audience.

Creating Remarketing Lists Based on Previous Campaigns

And by combining the targeting listed above with ecommerce conditions like the number of transactions or amount of revenue generated, you could create advanced Remarketing lists targeting very specific types of users.

Creating Remarketing Lists Based on Revenue

Example 4: Pages or Page Titles
If you have been building a lot of new content and want to reach those visitors as they browse the web, then you could create a Remarketing list based on Pages or Page Titles.  For example, let’s say you have 25 blog posts about a certain category of content.  They rank very well, have built up a nice amount of referral traffic, etc.  You could build a Remarketing list by selecting a grouping of pages via urls or via page titles. Then you could reach those users as they browse the web and drive them to a targeted landing page, knowing they were interested in a certain post (or group of posts) about a certain subject.

Creating Remarketing Lists Based on Page Titles

And you can combine those pages with conversion goals to add users to a list that completed some type of important action on the site.  For example, users that signed up for your email newsletter, triggered an event, downloaded a study, etc.

Creating Remarketing Lists Based on Page Titles and Conversion

Remarketing Tips

Based on the examples listed above, I hope you see the power in using segments and conditions to craft Remarketing lists.  But as I said earlier, it can quickly become overwhelming (especially for marketers new to Remarketing).  Below, I’ve listed several important tips to keep in mind while crafting your campaigns.

  1. Remarketing Lists Require 100 Members
    A list requires at least 100 members before you can start showing ads to users.  Keep this in mind when building lists to ensure you can reach that number.  If not, you will never get back in front of those users.
  2. Start Simple, Then Increase in Complexity
    Based on the 100 member requirement, start with simpler Remarketing lists and increase your targeting as you get more comfortable with Remarketing.  Don’t start with the most granular targeting possible, only to have a list of 3 people.
  3. Refine Your Tracking Snippet
    Google requires that you refine your Google Analytics tracking code in order to take advantage of Remarketing.  Review the documentation to ensure you have the proper technical setup.
  4. Craft a Strategy First, and Your Lists Should Support Your Strategy
    Don’t create lists for the sake of creating lists. Always start by mapping out a strong Remarketing strategy before jumping into list creation. Your strategy should dictate your Remarketing lists, and not the other way around.  Spend time up front mapping out who you want to target, and why.  And once you have a solid plan mapped out, you can easily build your lists via Google Analytics segments and conditions.
  5. Use Display Advertising In Addition to Text Ads
    Remarketing enables you to use both image ads and text ads.  Definitely use both when crafting your campaigns.  There are a number of sizes and formats you can use.  I recommend hiring a designer to build your ads unless you have in-house staff that is capable of designing high-quality ads.  Use image ads where possible to grab the user’s attention and provide text ads as a backup when a site doesn’t support image ads.  You don’t have to choose one or the other.
  6. Measure Your Results! Don’t “Set It and Forget It”.
    Remarketing is advertising.  And advertising campaigns should have a goal.  Don’t simply set up Remarketing without knowing the intended action you want users to take.  Instead, make sure you set up conversion goals to track how those users convert.  Do not set up the campaign and let it run without analyzing the results.  Understand the ROI of the campaign.  That’s the only way you’ll know if it worked, if the campaign should keep running, and if you should base other campaigns on the original.

 

Summary – New and Powerful Ways to Remarket
After reading this post, I hope you see the power in using segments and conditions for creating Remarketing lists.  In my opinion, too many marketers keep going after new eyeballs and easily forget about the eyeballs that already showed an interest in their company, products, or services.  I believe that’s a mistake.  Instead, marketers can craft advanced Remarketing lists to get back in front of a targeted audience.  Doing so provides another chance at converting them.

Remember, a warm lead is always more powerful than a cold call.  Good luck.

GG

 

Wednesday, April 23rd, 2014

April 2014 Google Algorithm Updates Heavily Targeted Song Lyrics and MP3 Websites (4/05 and 4/18)

Summary: Google has rolled out multiple algorithm updates in April that heavily impacted song lyrics and mp3 websites. This post provides more information about those updates, documents specific sites that were hit, and provides some possible problems that the algo targeted. I plan to update this post as I analyze more sites impacted by the UApril14 updates. 

Google Algorithm Updates From April 2014

I was not planning on writing a post this week, since my schedule is crazy right now.  In addition to my client work, I’ve been building my presentation for the Weber Shandwick Data Salon on Thursday about Google Algorithm Updates, how to recover from them, etc.  That’s ironic because I just stumbled across yet another fascinating algorithm update by Google that has done some serious damage (a set of updates actually).

If you’ve been following my posts, then you probably remember the flawed algorithm update from February.  That update severely impacted movie blogs based on an upstream copyright infringement issue at YouTube.  Google subsequently rolled out a second update in late February, which fixed the problem and returned traffic to normal levels (for the lucky ones).  Some never recovered.

Well, here we go again.  But this time it’s song lyrics websites that got hammered.  I received an email from the owners of songmeanings.com, which provides lyrics, meanings, etc.  I could tell by the messages I received that something serious had gone down.  And it didn’t take long to see the damage.  I fired up SEMRush and saw the massive drop in traffic starting on 4/18.  It looked like they lost 50% of their Google traffic overnight.

Songmeanings.com Impacted by Google Algo Update

And they weren’t alone.  Upon checking other lyrics websites, I saw a number of them had gotten hit just like songmeanings.com.  More about the destruction of lyrics websites soon.  Let’s take a step back and talk Panda for a second.

Claims of Panda Updates in Early April
To take a step back, there was a lot of webmaster chatter in early April about a potential Panda update.  I documented the March Panda update, which looked like the softer Panda that Matt Cutts had mentioned during SMX West.  And once the guys at songmeanings.com reached out to me, it was clear that April was an extremely volatile month as well.  I am seeing multiple updates based on the analysis I have conducted.

First, it was crystal clear that an algorithm update was rolled out on 4/18 (based on analyzing songmeanings.com and the song lyrics niche).  A number of websites all seeing massive drops in traffic overnight is a clear signal that Google rolled something out.  In addition, a lot of websites in one niche getting hit signals that Google was targeting something very specific with the update.  So I told the owners of songmeanings.com to sit tight.  I needed a midnight work session to analyze the site (and the niche).  They signed off and I started burning the midnight oil.  What I found was fascinating, complex, and sometimes confusing.  But it’s important to document this, so webmasters that are impacted can start troubleshooting the situation.

Song Lyrics Niche Heavily Targeted
Just like when the movie blog niche was targeted in February, this update seemed to heavily target song lyrics websites.  Songmeanings.com was not alone when 50% of its Google traffic exited stage right on 4/18.  I quickly saw that others lost significant traffic as well, including lyricsfreak.com, azlyrics.com, lyricsmode.com, sing365.com, etc.

Lyricsfreak.com Impacted by Google Algo Update

Lyricsmode.com Impacted by Google Algo Update

And here’s sing365.com, which got hammered on 4/5, only to get hit even more on 4/18:

Sing365.com Impacted Twice by Google Algo Updates

And one really caught my eye.  It showed the same exact trending that slashfilm.com experienced in February with the flawed algo update!  Anysonglyrics.com got hammered on 4/5, only to recover on 4/17.  Check out the screenshot below.

Flashback to SlashFilm – Check out this trending!
Anysonglyrics.com Impacted by Google Algo Updates and then Recovers

Now, I thought February would be a rare occurrence.  It’s not often you see Google roll out an update, only to refine and re-roll that update out just a few weeks later.  But it seems that’s exactly what happened again!  Is this a trend?  Is Google rolling out updates that aren’t fully baked, only to refine and re-roll them back out?  If so, that’s freaking scary.  Just ask Peter from slashfilm.com how business was going during the ten-day downturn in traffic.  I’m sure he lost a few nights of sleep, to say the least.

And just like I wondered when the flawed UFeb14Rev came rolling back out, how many other sites were wrongly targeted?  How many won’t recover like SlashFilm did, and how many will ultimately go out of business based on the algo update?  All good questions and only Google knows.  But one thing is for sure.  One algo update can rock your world.  Losing 50%+ of your traffic overnight, and possibly due to a flawed algo, is a tough pill to swallow.

Not All Lyrics Websites Were Negatively Impacted
Similar to the movie blog situation, not all websites in the niche were negatively impacted.  Some actually increased in traffic during the 4/18 update.  And of course, that got me wondering about the signature of this algorithm update.  What was it targeting?  Why did some websites get slammed while others remained intact?  It was time to roll up my sleeves and research some song lyrics.  Maybe “Sympathy for the Devil” by the Stones or “Free Fallin” by Tom Petty?

Let’s Add More Complexity – mp3 Sites Also Impacted, But Starting on 3/30
OK, now I’m starting to sound crazy, right? Can you see why unconfirmed algorithm updates can be maddening?  While analyzing the lyrics websites, I found that several had relationships with mp3 websites (you know, the ones that illegally let you download music).  Well, checking the trending for those sites revealed big drops starting around 3/30, which was a few days before the lyrics sites started getting hit (on 4/5).

For example, I saw relationships with mp3raid.com, which has 426K DMCA takedowns filed (urls requested to be taken down). I also saw links to 49mp3.com, which has 385K urls requested to be taken down via DMCA.  Yes, that’s a lot of DMCA takedowns, especially compared to some of the song lyrics websites (which often revealed just a handful).  I’m not sure if Google is hammering upstream sites linking to those mp3 websites, or if there’s something else at play.  That said, it’s very interesting to see those mp3 sites get hammered just days before the lyrics websites got hit (and again, the sites are connected via links, and possibly affiliate relationships).

MP3Raid.com Impacted by Google Algo Update on 3/30

An Important Note About Quickly Rebounding
I mentioned anysonglyrics.com earlier and how it rebounded already (dropping on 4/5, but recovering on 4/17, presumably as Google rolled out a second update).  Well, they weren’t alone. I saw that trending a few times during my analysis.  That got me thinking that the update was targeting something that could be turned off pretty quickly by the websites that were impacted.  Now, I’m not saying that’s 100% the case, but it could be.

For example, were they linking to websites or downloads that Google didn’t like?  I did notice many links to toolbars like RadioRage, which has a horrible WOT score (see screenshot below).  It sounds like malware has been a big issue with RadioRage (and similar products).  If Google feels sites are heavily driving users to malware, or are a conduit for malware, then I could definitely see them taking action.  And for the sites that rebounded, was there something they did or changed during that downturn?  Hard to say.

Linking to Toolbars That Distribute Malware

 

Identifying Common Traits That Could Be Targeted
So as the midnight oil burned, I started digging into song lyrics websites. My goal was to identify common traits across sites negatively impacted, while also checking out the sites that were spared.  I had no idea if I would find a smoking gun, but I had hopes of identifying several possible causes.

Disclaimer: Only Google knows what it targeted during the updates in April.  I can only give my best guess based on helping many companies with Panda and other algorithm updates.  With that out of the way, here are some interesting issues that surfaced during my analysis.

DMCA Takedown Notices
Based on the nature of the websites, I quickly checked Google’s Transparency Report for DMCA takedowns filed against the domains.  Several of the sites were listed, but some only had a few.  For example, songmeanings.com only had three urls listed.  Others had more, like lyricsmode.com with 993, but there wasn’t a consistently high number associated with all of the sites that were hit.  Also, some sites that were spared had DMCA takedowns filed against them (like DirectLyrics.com with 14).

But the mp3 sites that were targeted had many DMCA takedowns filed (as I mentioned earlier).  And if the lyrics websites are affiliates, or are simply driving users to illegally download files, then maybe Google targeted that.  Hard to say, but it was an interesting find. Now you would think that issue would be taken care of via the Pirate update, and not necessarily Panda or a separate update, but it’s entirely possible.  Let’s move on.

DMCA Takedown Notices and Algo Updates

Followed Affiliate Links and Heavy Cross-linking
Analyzing the sites hit by the 4/18 update revealed a number of affiliate links.  And some were definitely followed affiliate links (which violates Google Webmaster Guidelines).  But, this didn’t look like a new issue, and there wasn’t much consistency.  For example, there were sometimes followed affiliate links on sites that weren’t hit by the update.  Therefore, I’m not sure the affiliate links were the cause of the algo hit (although I would recommend to all the lyrics websites that they nofollow all affiliate links).

Beyond the obvious affiliate links, there was a boatload of cross-linking going on between lyrics websites.  I’m not sure if many are owned by the same network, but it was pretty clear that some were trying to drive traffic and SEO power to the others.  And many of those links were followed.  Without digging into the history of all the domains, it’s hard to identify all of the relationships (which websites are owned by one company, which have long-standing affiliate relationships, etc.)  But I saw this enough across lyrics websites that I wanted to bring it up here.

Affiliate Links and Algo Updates
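If you want to audit the nofollow recommendation above at scale, here’s a rough sketch of one way to surface followed affiliate links.  The affiliate domain list is hypothetical; you would swap in the networks and partner domains you actually use.

```python
# Rough sketch: find followed (non-nofollow) links pointing at known
# affiliate domains on a page. The domain list is hypothetical.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

AFFILIATE_DOMAINS = {"affiliate-network.example.com", "partner.example.net"}

def followed_affiliate_links(page_url):
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
    flagged = []
    for anchor in soup.find_all("a", href=True):
        host = urlparse(anchor["href"]).netloc
        rel = anchor.get("rel") or []  # BeautifulSoup returns rel as a list
        if host in AFFILIATE_DOMAINS and "nofollow" not in rel:
            flagged.append(anchor["href"])
    return flagged

for link in followed_affiliate_links("http://www.example.com/"):
    print("Followed affiliate link:", link)
```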

Duplicate Content and Thin Content
We know that Pandas love eating duplicate content, thin content, etc.  I can’t say whether this was a Panda update, or something more sinister, but I did notice some typical Panda issues across several sites. I definitely found duplicate content issues across lyrics websites (and some were relatively extreme).  I also found many thin pages, with some containing almost no content at all (beyond the site template).  But, this was not a new issue, and I ran into the consistency problem again.  Not all sites hit had the same level of duplicate or thin content, and some sites that had those problems were unscathed.

Therefore, I’m not confident that duplicate content was the cause.  But again, I would definitely fix the content problems asap.  Just because I don’t think it was the cause of this hit doesn’t mean it couldn’t cause another hit.  Like I said in my last Search Engine Watch column, make your site the anti-bamboo.  :)

Duplicate and Thin Content and Panda
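For the duplicate content piece, one rough way to surface exact duplicates across a large site is to fingerprint the visible text of each page and group identical hashes.  This is just a starting point: it only catches exact (post-normalization) duplicates, not near-duplicates, and the URLs below are placeholders.

```python
# Sketch: group pages by a normalized content fingerprint to surface
# exact-duplicate content. Near-duplicates need fuzzier techniques.
import hashlib
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

def content_fingerprint(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    # Normalize whitespace and case before hashing.
    text = " ".join(soup.get_text(separator=" ").lower().split())
    return hashlib.md5(text.encode("utf-8")).hexdigest()

urls = [
    "http://www.example.com/lyrics/song-a",
    "http://www.example.com/lyrics/song-a/print",
]

groups = defaultdict(list)
for url in urls:
    groups[content_fingerprint(url)].append(url)

for fingerprint, dupes in groups.items():
    if len(dupes) > 1:
        print("Possible duplicates:", dupes)
```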

Page Speed and Serious Performance Issues
Now here’s an interesting problem I saw across a number of lyrics websites negatively impacted by the 4/5 and 4/18 updates.  Many were experiencing serious performance issues.  I’m not talking about taking a few seconds to load.  I’m talking about NEVER fully loading.  You could see Chrome and Firefox still trying to load something even a full minute or two into rendering the page.

And when I tried running page speed tests, they wouldn’t even run!  I can tell you, I rarely come across that during my audits.  So, could extreme performance issues have caused the algo hit?  Hard to say, since I’m not analyzing the sites on a regular basis.  But let’s face it, Google definitely doesn’t want to send users to sites that take forever to load.  I’ll mark this down as “maybe”.  But if I were the owners of the lyrics websites, I would definitely take a hard look at performance and try to rectify the excessive load times.

Page Speed and Algo Updates
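When the standard page speed tools won’t even run, a crude first check is simply timing the raw HTML download.  Here’s a minimal sketch, keeping in mind this only measures the document response, not the full rendering (so it will understate what a real browser experiences on a bloated page).

```python
# Crude check: time the raw HTML download. This ignores images, scripts,
# and rendering, so it understates real-world load time.
import time
import requests

def time_html_download(url):
    start = time.time()
    response = requests.get(url, timeout=60)
    elapsed = time.time() - start
    return elapsed, len(response.content)

elapsed, size = time_html_download("http://www.example.com/")
print(f"HTML downloaded in {elapsed:.1f}s ({size / 1024:.0f} KB)")
```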

YouTube Upstream Copyright Issues Again?
I noticed that several of the sites negatively impacted had video sections (or contained videos on the lyrics pages for each song).  Based on what I saw with SlashFilm and the movie blog niche, it wouldn’t shock me if the same upstream copyright infringement issue was at play here.  For example, videos that had been taken down or flagged for copyright infringement were still embedded on the lyrics sites.

Just like I said with the movie blog situation, that’s not really the fault of the websites that are embedding the videos… since it’s more of a YouTube problem.  But I saw this heavily during my movie blog analysis and I know the lyrics websites contain YouTube videos.  It’s worth looking into if you’re a lyrics website impacted by these recent updates.

Video Copyright Infringement and Algo Updates


A Note About Unnatural Links and/or Paid Text Links
Checking the link profiles of various lyrics websites revealed an unnatural links problem. I won’t go into too much detail here, but you could see red flags for sure.  But based on what I’m seeing trending-wise, it’s hard to believe this was some type of an unnatural links algo update.  Some sites rebounded just a few weeks later (or even days later), so I’m not sure this reflects some type of algorithmic move by Google to hammer sites gaming links.

From a manual actions standpoint, I don’t know how many of these sites have manual actions, but I do know several of them don’t.  So, I’ll just leave the unnatural links discussion here… But a warning to lyrics websites about Penguin and unnatural links: I’d tackle that situation sooner rather than later.

Unnatural Links and Song Lyrics Websites

 

Moving Forward – Next Steps for Lyrics Websites Impacted
Like I said earlier, it’s been fascinating to analyze the latest algo updates pushed out in April. As you can see, song lyrics websites were hit pretty hard.  Some have recovered, but a number of them still remain impacted.  Also, mp3 websites were hit hard too, but it looks like that update started closer to 3/30.  Remember what I said about the complexity of algorithm updates?

For sites that have been impacted, I recommend moving quickly to track down all possible problems.  Then I would begin fixing them asap.  The quicker you can get your site in order, the quicker you can experience recovery.  And since some sites have recovered already, it’s possible that can happen to your site as well.  Since I couldn’t identify a smoking gun, I would review all of the problems I documented in this post.  That’s a great place to start.  Good luck.

GG

 

Wednesday, April 16th, 2014

I’m Speaking at the Weber Shandwick Data Salon on April 24th – Learn About Google Algorithm Updates, Manual Penalties, and More

Weber Shandwick Data Salon on April 24, 2014

I’m excited to announce that I’ll be speaking at the Weber Shandwick Data Salon on Thursday, April 24th in New York City (from 6:00PM to 7:30PM).  Each month, Weber Shandwick invites leaders from various areas of digital marketing to speak, to spark conversation, and to share ideas.  I’m thrilled to be presenting next week about the latest in SEO.

My presentation will cover some extremely important topics that I’m neck deep in on a regular basis, including Google algorithm updates, manual penalties, and the war for organic search traffic that’s going on each day.  I’ll be introducing various algorithm updates like Panda and Penguin, explaining what manual actions are, and providing case studies along the way.  I’ll also introduce the approach that Google is using to fight webspam algorithmically, while also covering how manual penalties work, how to recover from them, and how to ensure websites stay out of the danger zone.

My goal is to get the audience thinking about content quality, webspam, unnatural links, and webmaster guidelines now, before any risky tactics being employed can get them in trouble.  Unfortunately, I’ve spoken with hundreds of companies over the past few years that were blindsided by algo updates or manual actions simply because they never thought about the repercussions of their tactics, didn’t understand Google’s stance on webspam, or didn’t know about the various algorithm updates it was crafting.  Many of them learned too late the dangers of pushing the envelope SEO-wise.

So join me next Thursday, April 24th at 6PM for a deep dive on algorithm updates, manual penalties, and more from the dynamic world of SEO.  You can register today via the following link:

Register for Weber Shandwick’s Data Salon on April 24th:
https://www.surveymonkey.com/s/N6G5K5B

Below I have provided the session overview.  I hope to see you there!

Weber Shandwick Data Salon #3
April 24, 2014 from 6:00PM to 7:30PM
Speaker: Glenn Gabe of G-Squared Interactive
Moderator: Kareem Harper of Weber Shandwick
909 Third Avenue, 5th Floor
*Refreshments will be available starting at 6:00pm
  


Front Lines of SEO

The Frontlines of SEO – Google Algorithm Updates, Penalties, and the Implications for Marketers
Explore Google’s war on webspam, learn about key changes and updates occurring in Search right now, and fully understand the implications for digital marketers.

There’s a battle going on every day in Search that many people aren’t aware of.  With millions of dollars in revenue on the line, some businesses are pushing the limits of what’s acceptable from an SEO perspective.  In other words, gaming Google’s algorithm to gain an advantage in the search results.

Google, with its dominant place in Search, is waging war against tactics that attempt to manipulate its algorithm.  From crafting specific algorithm updates that target webspam to applying manual actions to websites, Google has the ability to impact the bottom line of many businesses across the world.  And that includes companies ranging from large brands to small local businesses.  This session will introduce the various methods Google is using to address webspam in order to keep its search results as pure as possible. Specific examples will be presented, including case studies of companies that have dealt with algorithm updates like Panda and Penguin.  Manual penalties will be discussed as well.

Beyond battling webspam, the major search engines have been innovating at an extremely rapid pace.  The smartphone and tablet boom has impacted how consumers search for data (and how companies can be found).  And now the wearable revolution has begun, which will add yet another challenge for marketers looking to reach targeted audiences.   Glenn will introduce several of the key changes taking place and explain how marketers can adapt.  Glenn is also a Glass Explorer and will provide key insights into how Google Glass and other wearables could impact marketing and advertising.

Register today to learn more about Google’s war on webspam, to better understand the future of Search, and to prepare your business for what’s coming next.

You can register online by clicking the following link:
https://www.surveymonkey.com/s/N6G5K5B

 

 

Monday, March 31st, 2014

Did the Softer Panda Update Arrive on March 24, 2014? SMBs Showing Modest Recovery Across Industries

Softer Panda Update on March 24, 2014

As a consultant helping a number of companies with Panda recovery, I’ve been eagerly awaiting the March Panda update.  Based on the data I have access to, I was able to pick up and analyze Panda UJan14, UFeb14, and the infamous UFeb14Rev (where Google re-rolled out the algorithm update after mistakenly hammering movie blogs).  Needless to say, it’s been an interesting beginning to the year Panda-wise. And if you’re wondering what U{Month}{Year} is, that’s the naming convention I’m using for unconfirmed Panda updates.

And in case you forgot, Google announced in July of 2013 that they wouldn’t be confirming Panda updates anymore.  As I explained in a post soon after that, unconfirmed Panda updates can cause mass chaos and can drive webmasters dealing with mysterious traffic losses insane.  But, I also explained that if you have access to a lot of Panda data, you can sometimes pick up the updates.  And that’s where SEOs helping a lot of companies with Panda can come in very handy.  Those SEOs have turned into human Panda barometers and can help identify when the specific updates roll out. Remember, we know that Panda is supposed to roll out monthly, and can take about ten days to roll out.  It’s not real-time, but Google trusts the algorithm enough to unleash it once per month.

I have been able to identify a number of updates since July of 2013, including UJan14 from January 11th and the flawed UFeb14 from February 11th (which was the movie blog fiasco I mentioned earlier).  But it’s been relatively quiet since then from a Panda standpoint.  I’ve only seen some moderate movement around 3/11/14, but nothing that I could nail down as Panda.  But then the 24th arrived, and it quickly became clear that something widespread was taking place.  Just like a Panda update.

Upward Movement Across Panda Victims
During the week of March 24th, I was checking organic search trending across clients and quickly noticed a number of increases from Google Organic.  The first one that caught my attention was a website that had been battling Panda for a long time.  It’s an ecommerce site that has seen many ups and downs since February of 2011 when Panda first arrived (and more downs than ups if you know what I mean).  The 25th showed an 18% increase in traffic, and it has consistently remained higher since then.  Google Webmaster Tools now shows increases in impressions and clicks starting on the 24th.  Comparing the entire week to previous weeks reveals Google Organic traffic was up 15%.

An example of a bump in Google Organic traffic starting on 3/24/14:

Panda Recovery Begins on 3/24/14

And that site wasn’t alone.  I was seeing similar lifts in Google Organic traffic across a number of Panda victims I have been helping.  That lift ranged from 9% to 24%, with a few outliers that saw much larger increases (45%+).  Note, those sites seeing larger increases didn’t have massive amounts of traffic, so it was easier to show a much larger lift.  That being said, the lift was significant for them.  But overall, I mostly saw moderate recoveries versus significant ones during this update.  And that leads me to think we just might have seen the “softer” Panda update that was supposed to help small to medium sized businesses (SMBs).

 

Matt Cutts and a “Softer” Panda
At SMX West, Matt Cutts from Google explained that they were working on a “softer” version of Panda that would make it less of an issue for certain websites.  Matt said the “next generation” Panda would be aimed at helping small businesses that might be affected by Panda.  Well, based on what I’m seeing, it sure looks like the new Panda could have rolled out.  Almost all of the companies I analyzed that were positively impacted by the 3/24 update could be categorized as SMBs.  They aren’t big brands or major corporations, they don’t have a lot of brand recognition, and some are run by just a few people.

In addition, most of the recoveries fell into the 10-20% range, which I would call modest increases.  Don’t get me wrong, that’s still a nice lift for some of the companies that previously got hit by Panda, but it’s not a massive recovery like you might see during other Panda updates.  For example, a company I was helping that got hit by Phantom in May of 2013 ended up recovering in August and surged by 68% in Google Organic.  That’s a big recovery.  So, modest recoveries line up with a “softer” algo that could help small businesses (in my opinion).


March 24, 2014 – A Good Day for (SMB) Panda Victims
Below, I have included some screenshots of Google Organic trending for companies impacted by Panda UMarch14.  You can clearly see a lift starting on the 24th and remaining throughout the week.  Note, these companies span various industries, so it wasn’t tied to one specific niche.

Panda Recovery SMB on 3/24/14 GA Data

 

Panda Recovery on 3/24/14 Google Webmaster Tools

 

Panda Recovery on 3/24/14 Google Analytics

 

Panda Recovery on 3/24/14 GWT

 

Panda Recovery on 3/24/14 Searchmetrics

 

 

Common Characteristics and Drivers for Recovery
If you have been impacted by Panda in the past, or if you are simply interested in the algorithm update, then I’m sure you are wondering why these specific companies recovered on 3/24.  And no, not all companies I’m helping with Panda recovered.   Now, only Google knows the refinements they made to the Panda algorithm to soften its blow on small businesses.  That said, I think it’s important to understand what Panda victims have addressed in order to better understand how the algorithm works.

Below, I’ll cover some of the common problems I’ve been helping companies tackle over the past several months Panda-wise (the companies that recovered during UMarch14).  I’m not singling out certain factors as the trigger for this specific update and recovery.  But I do think it’s worth covering several of the factors that were causing serious problems Panda-wise, and that were rectified over the past few months leading up to the recoveries.

Over-optimized Thin Pages
Several of the websites that experienced recovery had serious problems with thin pages that were over-optimized.  For example, pages with very little content combined with over-optimized title tags, meta descriptions, body copy, etc.  And the body copy was typically only a paragraph or two and was clearly written for search engines.

Over-optimized Titles and Panda

Doorway Pages
Along the same lines, several of the companies employed doorway pages to try and gain organic search traffic across target keywords.  For example, they would reproduce pages and simply change the optimization to target additional keywords.  For some of the sites I was helping, this was rampant.

Duplicate Pages and Panda

Keyword Stuffing
Some of the companies that saw recovery were keyword stuffing pages throughout their sites.  For example, all core page elements excessively contained target keywords.  The copy was extremely unnatural, the on-page titles (which were often the h1s) were clearly targeting keywords, the navigation was all using exact match anchor text, and the footer was crammed with more keyword-rich content and exact match anchor text links.  And many times, the target keywords were repeatedly bolded throughout the content.  It was obvious what the goal was while analyzing the pages… it was all for SEO.

Keyword Stuffing and Panda

Excessive Linking Using Exact Match Anchor Text
Some of the websites that saw recovery were previously weaving exact match anchor text links into every page on the site.  So, you would visit a page and immediately find exact match or rich anchor text links from the copy to other pages on the site.  It was excessive, unnecessary, and made for a horrible user experience.  And as I explained above, several of the sites were also employing spammy footers with exact match anchor text links (and many of them).

Affiliate Links
Several of the companies that saw recovery were including followed affiliate links in the site content.  Those links should absolutely have been nofollowed.  During initial audits I would uncover followed affiliate links, flag them, and document them in a spreadsheet.  When sharing my findings with my clients, some of the links were so old that my clients didn’t even remember they were there!  “Affiliate creep” can cause big problems post-Panda.  Nofollowing all affiliate links or removing them was important for sure.

Followed Affiliate Links and Panda
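By the way, flagging followed affiliate links doesn’t have to be manual.  Here’s a rough sketch of that type of check, assuming the requests and beautifulsoup4 packages and a hypothetical list of affiliate domains (swap in the networks your site actually uses):

    import requests
    from bs4 import BeautifulSoup

    # Assumption: the affiliate networks/domains you link to.
    AFFILIATE_DOMAINS = ("amazon.com", "shareasale.com")

    def flag_followed_affiliate_links(url):
        """Print any affiliate link on the page that is missing rel="nofollow"."""
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        for a in soup.find_all("a", href=True):
            if any(domain in a["href"] for domain in AFFILIATE_DOMAINS):
                rel = [value.lower() for value in (a.get("rel") or [])]
                if "nofollow" not in rel:
                    print("Followed affiliate link: %s -> %s" % (url, a["href"]))

    flag_followed_affiliate_links("http://www.example.com/some-page/")

Run a check like that across the urls from a site crawl and you’ll have the spreadsheet I mentioned above in minutes.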

Nuking Duplicate Content or Noindexing Thin Pages
Some of the companies that saw recovery had an excessive amount of duplicate or thin content.  Upon surfacing the problematic urls, my clients either removed or noindexed those pages.  In some cases, that impacted tens of thousands of pages (or more).  Addressing “low-quality” content is one of the most important things a company can do Panda-wise.  And that’s especially the case if some of those pages were ranking well in Google (prior to the Panda hit).  You can read my post about the sinister surge in traffic before Panda strikes to learn more about that phenomenon.

Noindexing Content and Panda
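And surfacing those problematic urls can be scripted as well.  A minimal sketch, assuming requests, beautifulsoup4, and a hypothetical word-count threshold for “thin” (tune it for your own site):

    import requests
    from bs4 import BeautifulSoup

    THIN_WORD_COUNT = 200  # assumption: what counts as "thin" varies by site

    def is_thin_and_indexable(url):
        """True if the page is below the word-count threshold and not noindexed."""
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        robots = soup.find("meta", attrs={"name": "robots"})
        noindexed = bool(robots) and "noindex" in (robots.get("content") or "").lower()
        word_count = len(soup.get_text(" ", strip=True).split())
        return (not noindexed) and word_count < THIN_WORD_COUNT

    for url in ["http://www.example.com/thin-page/"]:
        if is_thin_and_indexable(url):
            print("Candidate for noindex or removal:", url)

Note, word count alone isn’t a quality metric, so treat the output as a list of candidates to review manually, not pages to nuke automatically.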

 

Warning: Some Sites Slipped Through The Panda Cracks
I also wanted to quickly mention something that can happen with algorithm updates.  There were two sites I analyzed that showed a modest recovery that shouldn’t have recovered at all.  They were rogue sites that some of my clients had set up in the past that were simply sitting out there.  Those sites were not getting a lot of attention from my clients, and there had been very little work on those sites from a Panda standpoint.  Needless to say, I was surprised to see those sites positively impacted by Panda UMarch14.  Sure, they didn’t surge in traffic, but they definitely increased starting on the 24th.  This also leads me to believe that we saw the softer Panda update that Matt Cutts mentioned.

False Panda Recovery on 3/24/14

Summary – Be In It For The Long Haul
As I explained earlier, Matt Cutts promised a “softer Panda” at SMX West that could help small businesses.  Based on what I have seen, that new update might have rolled out on 3/24.  I saw a number of companies that were dealing with Panda problems recover to some extent starting on that date.

If you have been hit by Panda, then the recoveries I documented above should signal hope.  The companies that saw recovery have worked hard to rectify a range of “content quality” problems.  Audits were completed, problems were identified, and a lot of work was completed over the past few months.

The good news is that a number of the websites making significant changes saw a positive impact from Panda UMarch14.  I think it underscores a Panda philosophy I have been preaching for a long time.  You must be in it for the long haul.  Short-term thinking will not result in recovery.  You need to have the proper analysis completed, identify all content-related problems, and work hard to rectify them as quickly as you can.  And Google crafting an algorithm update that softens the blow of Panda sure helps.  So thank you, Matt Cutts.  From what I can see, there are companies seeing more traffic from Google today than they did a week ago.  And that’s never a bad thing.

GG

 

Thursday, March 27th, 2014

Smartphone Rankings Demotion in Google Search Results Based on Faulty Redirects [Case Study]

Smartphone Rankings Demotion in Google Search Results

In June of 2013, Pierre Far from Google explained that providing a poor mobile experience could impact a site’s rankings in the smartphone search results.  Basically, if a site is making mistakes with how it is handling mobile visits, then that site risks being demoted in the search results when users are searching from their smartphones.  And with smartphone usage booming, that message scared a lot of people.

Specifically, Pierre listed two common mistakes that could cause a poor user experience.  First, faulty redirects could force users to irrelevant content, or just to the mobile homepage of a website.  For example, imagine searching for a specific product, service, review, blog post, etc., and finding it in the search results.  But as you click through, the site redirects you to the mobile homepage.  That sounds really annoying, right?  But it happens more than you think.  And that’s especially true since the problem is hidden from desktop users.

But that June day in 2013 passed, and businesses moved on.  Sure, mobile is important, it’s taking over, blah blah blah.  In addition, I’m sure many wondered if Google would really demote a site in the smartphone search results.  I mean, why move a powerful site like yours down in the results when your pages really should rank highly (like they do on desktop)?  Google would probably only do that to low-quality sites, right?  I think you see where I’m going with this.

 

Faulty Redirects – An Example Caught in the Wild
Last week, I was checking Techmeme for the latest technology news and clicked through to an article written by Electronista.  I forget which story the article covered, but Electronista was listed first for the news at hand.  So I clicked through and was immediately redirected to the mobile homepage.  I navigated back to Techmeme, clicked the listing again, and was promptly redirected again.  So I visited another site listed for the story on Techmeme and got the information I was looking for.

*ALERT* – That’s exactly the user experience Google is trying to prevent for people searching Google.  And that’s one of the core scenarios that Pierre listed that could result in a rankings demotion.  So that got me thinking.  What about other pages on Electronista?  Were they also redirecting mobile users to the mobile homepage?  And if this problem was widespread, were they being demoted in the smartphone search results?  And so I dug in.

Side note: I’m not targeting Electronista by writing this.  Actually, I hope this post helps them.  I can only imagine that if they fix the problem, then their traffic from smartphone users on Google will skyrocket.


An Example of a Faulty Redirect on Electronista

I’m sure you are wondering how this looks.  Here’s a quick example.  Let’s say I was researching a Nexus 7 tablet and comparing it to an iPad mini.  Electronista has an article focused on that topic.  On desktop or tablet, I visit that url and can view the entire post.  But on my smartphone, visiting that url redirects me to the mobile homepage (via a 302 redirect).

Desktop URL resolves correctly:

Desktop URL on Electronista.com

When searching on my smartphone, the site incorrectly redirects me to the mobile homepage:

Redirect to Mobile Homepage on Electronista.com

 

Here is the 302 redirect in action:
302 Redirect to Mobile Homepage on Electronista.com
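You can verify this type of redirect yourself.  Here’s a quick sketch using the requests package and a typical iPhone User-Agent string (the UA string and article url below are placeholders, so substitute your own):

    import requests

    MOBILE_UA = ("Mozilla/5.0 (iPhone; CPU iPhone OS 7_0 like Mac OS X) "
                 "AppleWebKit/537.51.1 (KHTML, like Gecko) Version/7.0 "
                 "Mobile/11A465 Safari/9537.53")

    url = "http://www.electronista.com/articles/your-article-here/"  # placeholder
    resp = requests.get(url, headers={"User-Agent": MOBILE_UA},
                        allow_redirects=False)
    print(resp.status_code)              # 302 for the faulty redirect
    print(resp.headers.get("Location"))  # the mobile homepage, not the article

If the Location header points at the mobile homepage instead of a mobile version of the article, you’re looking at a faulty redirect.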

 

Examples of Demoted Smartphone Rankings
Electronista.com has 143K pages indexed in Google.  And every url I checked on my smartphone was redirecting to the mobile homepage.  So it wouldn’t take Google very long to pick up the problem, and across many pages on the site.  But now I needed evidence of rankings being demoted based on this problem.

So I fired up SEMRush and checked the organic search reporting for Electronista.com.  I started picking keywords that the site ranked highly for (on page 1 of Google).  Then I started searching on my desktop and smartphone using Chrome for Android (incognito mode).  And lo and behold, I noticed the problem almost immediately.  Smartphone rankings were either much lower or non-existent for content that was ranking highly on desktop.  Almost all of the keyword/ranking combinations I checked revealed the demotion in the smartphone search rankings.

Note, not every Electronista listing was being demoted.  There were a few outliers where the page still ranked well (as well as it did on desktop search).  But the user experience was still horrible.  I was redirected to the mobile homepage and forced to fend for myself.  Needless to say, I wasn’t going to start searching the mobile site for the url I expected to see.  I just bounced.  And again, Google doesn’t want its users to have to deal with this situation.  Instead, Google will just demote the search rankings on smartphones.

A picture is worth a thousand words, so let’s take a look at some examples.  Below, I have provided screenshots of the demotion in action.  You’ll see the desktop search results first and then the smartphone search results below that.

Red Camera For Sale (ranks #8 on desktop and N/A on smartphone)

Desktop search results:
Desktop Search for Red Camera on Google

Mobile search results:
Mobile Search for Red Camera on Sale on Google

 

LTE Microcell (ranks #10 on desktop and N/A on smartphone)

Desktop search results:
Desktop Search for LTE Microcell on Google

Mobile search results:
Mobile Search for LTE Microcell on Google

 

HTC Vivid Radar (ranks #3 on desktop and #20 on smartphone)

Desktop search results:
Desktop Search for HTC Vivid Radar on Google

Mobile search results:
Mobile Search for HTC Vivid Radar on Google

 

Google Nexus 7 Versus iPad Mini (ranks #8 on desktop and #18 on smartphone)

Desktop search results:
Desktop Search for Google Nexus 7 Versus iPad Mini on Google

Mobile search results:
Mobile Search for Google Nexus 7 Versus iPad Mini on Google

Skullcandy Pipe Review (ranks #5 on desktop and #10 on smartphone)

Desktop search results:
Desktop Search for Skullcandy Pipe Review on Google

Mobile search results:
Mobile Search for Skullcandy Pipe Review on Google

 

And here are a few where the rankings were not demoted.  They should be demoted, but they weren’t (at least not yet):

 

Commodore 64 remake
Mobile Search for Commodore 64 Remake Review on Google 

 

Windows 8 touch screen requirements
Mobile Search for Windows 8 Touch Screen Requirements on Google 

 

 

How To Avoid Demoted Smartphone Search Rankings (Listen Up, Electronista)

The solution to this problem is fairly straightforward.  If you are using separate webpages for mobile content, then you should redirect smartphone users from each desktop page directly to the mobile url for that content.  Do not redirect all requests from smartphones to the mobile homepage.  As Google explains, “This kind of redirect disrupts a user’s workflow and may lead them to stop using the site and go elsewhere.”  And by the way, Google also says that it’s better to show smartphone users the desktop content versus implementing a faulty redirect to the mobile homepage.  I completely agree.

In addition, make sure you use rel alternate on your desktop pages pointing to your mobile pages.  And then use rel canonical on your mobile pages pointing to your desktop pages.  You can read Google’s documentation for handling various mobile setups here.
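To make that concrete, here’s a minimal sketch of a correct separate-urls setup, using Flask purely for illustration (the url patterns, detection logic, and domains are all assumptions, not anyone’s production code):

    from flask import Flask, request, redirect

    app = Flask(__name__)
    MOBILE_HINTS = ("iPhone", "Android")  # crude smartphone detection for the sketch

    @app.route("/article/<slug>/")
    def desktop_article(slug):
        mobile_url = "http://m.example.com/article/%s/" % slug
        ua = request.headers.get("User-Agent", "")
        if any(hint in ua for hint in MOBILE_HINTS):
            # Redirect smartphone users to the equivalent mobile url,
            # never to the mobile homepage.
            return redirect(mobile_url, code=302)
        # The desktop page announces its mobile counterpart via rel alternate.
        alternate = ('<link rel="alternate" media="only screen and '
                     '(max-width: 640px)" href="%s">' % mobile_url)
        return "<html><head>%s</head><body>Article: %s</body></html>" % (alternate, slug)

The mobile version of each page would then include a rel canonical tag pointing back to the desktop url (e.g. <link rel="canonical" href="http://www.example.com/article/your-slug/">).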

Update: Pierre Far from Google provided some feedback based on reading this case study.  I asked Pierre how quickly Google would remove the demotion once the redirect problem was fixed.  Here is what Pierre said:
“When a fix is implemented, we’d detect it as part of the usual crawling and processing of each URL.”

So, it seems that once the redirects are corrected, Google will detect the proper setup as it recrawls the site.  As it does that, the pages should return to their normal rankings.  If Electronista makes the necessary changes, I’ll try and figure out how quickly their smartphone rankings return to normal. Stay tuned.

Avoid Smartphone-only Errors
I covered faulty redirects and the impact they can have on search rankings, but there’s another scenario that can get you in trouble.  Google explains that smartphone-only errors can also result in demoted smartphone rankings.  And in my experience with auditing websites, these types of errors can go unnoticed for a long time.

For example, if you incorrectly handle Googlebot for smartphones, then you could incorrectly present error pages to users.  In addition, the code that handles mobile pages could be bombing, which would also present errors to smartphone users.  Needless to say, I highly recommend testing your setup thoroughly via a number of devices, checking your site via mobile emulators, and crawling your site as Googlebot for smartphones.  The combination will often reveal problems lying below the mobile surface.
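Here’s a hedged sketch of that kind of check across a list of urls, assuming the requests package and the Googlebot for Smartphones User-Agent that Google has published (verify the current string in Google’s documentation before relying on it):

    import requests

    GOOGLEBOT_SMARTPHONE = (
        "Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) "
        "AppleWebKit/536.26 (KHTML, like Gecko) Version/6.0 Mobile/10A5376e "
        "Safari/8536.25 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)")

    def report_smartphone_only_errors(urls):
        """Flag urls that work for desktop but error out for smartphone crawlers."""
        for url in urls:
            desktop = requests.get(url, timeout=10).status_code
            mobile = requests.get(url, timeout=10, headers={
                "User-Agent": GOOGLEBOT_SMARTPHONE}).status_code
            if desktop < 400 <= mobile:
                print("Smartphone-only error: %s (desktop %s, smartphone %s)"
                      % (url, desktop, mobile))

    report_smartphone_only_errors(["http://www.example.com/page-1/"])

It’s not a substitute for testing on real devices, but it will catch the ugly smartphone-only 404s and 500s quickly.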

Note, Google Webmaster Tools also recently added smartphone crawl errors.  The report provides a wealth of information about the errors that Googlebot for Smartphones is running into.  And that includes server errors, 404s, soft 404s, faulty redirects, and blocked urls.  I highly recommend you check out your reporting today.  You never know what you’re going to find.

Smartphone Crawl Errors Reporting in Google Webmaster Tools

 

Summary – Turning Demotions into Promotions
As mobile booms, more and more people are searching from their smartphones.  Google is well aware of the problems that mobile users can face while searching for, and viewing, content on their phones.  And in response to those problems, Google will demote your rankings in the smartphone search results.  Electronista is currently implementing faulty redirects, and based on that setup, its rankings are being heavily demoted.  Don’t let this happen to you.  Check your setup, view your reporting in Google Webmaster Tools, and then quickly fix any problems you are presenting to mobile users.  Think of all the traffic you might be losing by not having the right mobile setup in place.  The good news is it’s a relatively easy fix.  Now fire up those smartphones and visit your site.  :)

GG

Sunday, March 2nd, 2014

Flawed Google Algorithm Updates, Movie Blogs, and Copyright Infringement – Tracking The Panda Updates From February 2014

Summary:  Google ended up rolling out two major algorithm updates in February of 2014.  The first, which seemed like the monthly Panda update, caused serious collateral damage with a number of prominent movie blogs. After hearing from one movie blog owner, Matt Cutts of Google got involved, and Google refined the algorithm.  Two days later, Google Organic traffic surged back to the movie blogs, signaling that Google rolled out the algorithm update again. This post provides details, analysis, and findings based on analyzing movie blogs impacted by the UFeb14 and UFeb14Rev algorithm updates.

Flawed Panda Update from February 2014

I’ve been following a fascinating SEO situation over the past ten days.  And based on my analysis, I might have found something big.  As in Panda big.  So if you’re the type of person that’s interested in algorithm updates, or if you have been impacted by algo updates in the past, then this post is for you.

As I explained in my last post about the UJan14 update, Panda rolls out monthly, but Google won’t confirm those updates anymore.  In addition, it could take ten days to fully roll out.  That combination makes for a confusing situation for webmasters dealing with Panda attacks.  But SEOs neck-deep in Panda work can often see those updates, since they are helping a number of companies recover, while companies with fresh hits also reach out to them.

Those SEOs can act as human barometers for Panda updates.  And since I’m helping a number of companies deal with Panda hits, and I often have companies hit by algo updates reach out to me, I’m fortunate to have access to a lot of Panda data.  And that data can often signal when Panda rolls out each month.  In my last post, I documented the UJan14 update, based on seeing several companies recover and hearing from those on the flip side (the ones that got hit).  Those websites unfortunately got hammered, typically by 25-40% – overnight.

At the end of that post, I mentioned that the February Panda update looked like it was rolling out (right around February 11, 2014).  That made a lot of sense, since it was exactly one month from the previous Panda update.  By the way, I am using the naming convention U{Month}{Year} to track unconfirmed updates by Google, so February’s update would be UFeb14.

Well, it seems I was right.  After my post, I saw a number of companies impacted heavily by UFeb14, and most saw that impact beginning around February 11th through the 14th.  Based on seeing those hits and recoveries in early February, it was already a big deal that Panda was rolling out.  But little did I know what was coming…  and it was big.

Peter Sciretta of SlashFilm Tweets and Matt Cutts Responds

On February 21, 2014, Peter Sciretta from SlashFilm tweeted the following to Matt Cutts:

 

Boy, that got my attention for sure.  I always look for common themes when analyzing Panda to see if any new factors have been added to the algo, if there was collateral damage, etc.  As many in SEO know, Matt definitely responds to some people reaching out with messages, so I waited to see if he would respond.  And he did, on February 22nd, Matt responded with the following tweet:

 

OK, so that response was very interesting.  First, he hopes to dig into this soon… Really?  Wow, so Matt is digging into an SEO situation based on a tweet?  That was the first signal that Panda could have gone rogue.  Second, he apologized for the delay in responding.  Yes, another sign that Panda could have eaten some bad bamboo and gone ballistic on sites that weren’t actually Panda targets.

So I ran to SEMRush and SearchMetrics to check out the damage.  And there was damage all right… Serious damage.  Check out the trending from SEMRush for SlashFilm:

SlashFilm Drop on February 14, 2014

Which led me to check out other sites in that niche.  And I found ScreenRant also had a huge drop.

ScreenRant Drop on February 14, 2014

And they weren’t alone.  A number of prominent movie blogs got absolutely crushed during UFeb14.  Based on SEMRush data, the movie blogs I analyzed lost between 40% and 50% of their Google Organic traffic overnight.  Boom.

U-Shaped Trending – The Blogs Bounce Back
What happened up to that point was already enough to have me heavily analyze the movie blogs, but the story was about to get better.  Each morning following Matt’s tweet, I planned to quickly check the trending for the movie blogs I was monitoring to see if there were any signs of recovery.  If Matt was checking on the situation, and if it was indeed a flaw in the algorithm, then Google could possibly roll out that algorithm update again.

The 23rd was quiet.  No changes there.  And then the 24th arrived, and what I saw blew me away.  SlashFilm’s trending popped.  Yes, it absolutely looked like they started to recover.  Check it out below:

SlashFilm Recovery on February 24, 2014

And ScreenRant showed the same exact jump.  Wow, this was big.  We just witnessed a flawed algo get rolled out, cause serious collateral damage, get re-analyzed and tweaked, and then get rolled out again less than two days later.  And then the movie blogs recovered.  I don’t know about you, but that’s the fastest Panda recovery in the history of Panda!  :)  Fascinating, to say the least.

So, I tweeted Barry Schwartz and Peter from SlashFilm about what I saw, and Peter did confirm they were seeing a big recovery.  He also said the following, which I thought was interesting:

And that’s some error… It’s a great example of how catastrophic major algorithm updates can be, especially when there’s a flaw in the algorithm that causes collateral damage.  Losing 40-50% of your organic search traffic overnight could end some companies.  And then there’s the most important question that Panda victims have been asking themselves since this happened.  What if Peter didn’t complain to Matt Cutts?  Would Google have picked up the problem on its own?  How long would that have taken?  And how much damage would those movie blogs have experienced traffic-wise, business-wise, etc.?  All great questions, and only Google knows.

Digging Into the Panda Data
For those of you that are familiar with my work, my blogging, etc., you probably know what’s coming next.  There’s no way in heck I would let this situation slip by without heavily analyzing those movie blogs that experienced a serious drop in traffic.  I had many questions.  Why did they get hit?  Were there any consistencies across the websites?  What factors could have led to the flawed drop on 2/14/14?  And what was the flaw in the algorithm that triggered Panda hits on the movie blogs?

So I started collecting data immediately, and I would refresh that data each day until I had time to analyze the situation (which, based on my chaotic schedule, wasn’t until 5AM on Saturday morning).  But I’ve now spent a lot of time going through data from movie blogs that got hammered on 2/14/14 and that recovered on 2/24/14.  I’ve also dug into sites that only saw changes on one of those dates (to give me even more data to analyze).

I used SEMRush to uncover all of the keywords that dropped significantly in the rankings starting on February 14, 2014.  I was also able to export the landing pages for each of the keywords.  That was key, as Panda targets low-quality content.  Analyzing that content could help me uncover problems that could have caused the Panda attack.  I did this heavily for both SlashFilm and ScreenRant, as they both experienced a heavy drop on 2/14 and then a large recovery on 2/24.  But I also analyzed other sites in that niche that experienced problems and recoveries during February.  As I mentioned earlier, there were a number of movie websites impacted.

Analysis-wise, I heavily analyzed both sites manually and via crawls.  The crawls were used to flag certain problems SEO-wise, which could lead me down new paths.  My manual analysis was based on my extensive work with helping companies with Panda (knowing certain criteria that can cause Panda attacks).  The combination of the two helped me identify some very interesting factors that could have led to the faulty Panda hits.

Here’s what I found… and stick with me.  I’ll take you through some red flags before explaining what I think the actual cause was.   Don’t jump to conclusions until you read all of the information.
 
1. Thin Content
It was hard to overlook the overwhelming amount of thin content I was coming across during my analysis.  And since Panda targets low-quality content, which is often extremely thin content, that had my attention for sure.  For example, pages with simply an image, blog posts that were just a few sentences, etc.

Thin Content on Movie Blogs

But, this was not a unique factor for February (or for just movie blogs), which is what I was looking for.  Previous Panda updates could have absolutely crushed these blogs for thin content already… so why now?  That led me to believe that thin content, although a big problem with the movie blogs I was analyzing, wasn’t the cause of the UFeb14 hit they took on 2/14/14.   It met the “consistency” factor, since it was across the movie blogs, but wasn’t unique to this update.  Let’s move on.

 

2. Affiliate Links
I’ve helped a number of companies with Panda that were engaged in affiliate marketing.  Unfortunately, many affiliate marketers have gotten crushed since February of 2011 when Panda first rolled out.  So, it was interesting to see what looked to be followed affiliate links to Amazon on a number of pages I analyzed.  Those pages were thin, provided a quick mechanism to send along affiliate traffic to Amazon, and could absolutely get a website in trouble SEO-wise.

Affiliate Links on Movie Blogs

But two things stuck out…  First, compared to overall indexation on the sites I was analyzing, the number of pages with affiliate links was low (at least for the affiliate links I picked up).  Second, I did not find the same type of links across movie blogs that were hit.  So, the “consistency” factor was not there.  Time to move on (although I would caution the movie blogs that providing followed affiliate links violates Google Webmaster Guidelines).

3. Zergnet and Other Content Syndication Networks
Moving from inconsistency to consistency, I found a common thread across almost every movie blog I analyzed.  I found Zergnet links at the bottom of each post.  Zergnet is a content syndication network (similar to Outbrain, Zemanta, etc.).  On the surface, and in their most current form, these networks shouldn’t impact SEO negatively.  The links are nofollowed, which they should be.

But, in the past some of the networks were used to gain followed links from relevant websites across the web.  And that violates Google Webmaster Guidelines.  Actually, I’m helping several companies right now try to clean up older pages that still have followed links via Zemanta.  Here’s what the Zergnet links look like on the movie blogs:

Zergnet Links on Movie Blogs

But, like I explained above, the current implementation of Zergnet links is fine right now.  All of the links are nofollowed, which should shield the sites from any Google damage.  Let’s move on.

4. Videos, Trailers, and Copyright Infringement – Bingo
When the movie blogs got hit, a number of people in SEO (including myself) started making the connection between YouTube, copyright infringement, and the algo hits.  These are movie blogs, so you can imagine how many posts contain video clips, trailers, etc.  So, I was interested in seeing how much video footage I would come across during my analysis, and how much of it was problematic copyright infringement-wise.

And since we live in an embeddable world (with YouTube and other video networks making it easy to embed video clips on your own website), questions started to arise about how Google could treat the various parties involved in copyright infringement. In other words, who is the source of copyright infringement?  And how can you police others SEO-wise that might be part of the problem?  What about websites that simply embed public YouTube clips?  All good questions, and I was eager to dig in.

It wasn’t long before I came across webpages with video clips that had copyright infringement problems.  Now, those clips were typically sourced at YouTube or other video networks like Yahoo Video.  The helpful part about YouTube clips that have been taken down is that YouTube literally displays a message in the player explaining that the video, or the account associated with it, was removed due to copyright problems.  That made my analysis easier, to say the least.

So, trailer by trailer, video clip by video clip, I came across more and more examples of videos and users removed due to copyright infringement.  Here are some screenshots based on my research.  Notice the copyright infringement messages on pages that got hammered during the UFeb14 algorithm update:

Copyright Infringement Notice for YouTube Videos

 

More Copyright Infringement Notice for YouTube Videos

And ladies and gentlemen, this is where I think the flawed algo incorrectly targeted the movie blogs.  SlashFilm, ScreenRant, and others weren’t the source of copyright infringement.  They were simply end users that embedded those clips in their own posts.  So, if YouTube originally let the clips reside on its own network, and freely let users embed those clips on their own sites, could Google actually penalize those destination websites?

That wouldn’t be right… The penalty should simply be a bad user experience for visitors of the blogs, since the clips won’t play.  Now, if the movie blogs were creating their own videos that violated copyright laws, then I get it.  But shouldn’t that damage come via the Pirate Update?  I heavily analyzed the Pirate Algorithm recently, and you can read more about my findings by following that link.

So, it was interesting to see copyright-driven factors severely impact websites during what seemed to be the monthly Panda update.  Is Google incorporating more Pirate factors into Panda?  Are we seeing the maturation of Pirate into a rolling monthly update like Panda?  Was this the first time Google tried to incorporate Pirate into the monthly update?  All good questions.

Back to Video & More Embed Problems…
Visit after visit, page after page, I came across all types of video embed problems on the movie blogs.  For example, I saw copyright notices, blank videos (where the videos had been removed from the services being used), raw embed code sitting on the page instead of the videos, messages that a video was now marked as private, etc.  All of this could very well be tied to copyright infringement.

Video Embed Problems on Movie Blogs

 

More YouTube Embed Problems on Movie Blogs
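By the way, you don’t have to check these embeds by hand.  YouTube’s public oEmbed endpoint returns a 200 for playable public videos and an error status (typically 401 or 404) for videos that are private, removed, or otherwise unavailable.  A small sketch, assuming the requests package:

    import requests

    def youtube_embed_status(video_id):
        """Return the oEmbed status code for a YouTube video id."""
        watch_url = "https://www.youtube.com/watch?v=" + video_id
        resp = requests.get("https://www.youtube.com/oembed",
                            params={"url": watch_url, "format": "json"},
                            timeout=10)
        return resp.status_code  # 200 = playable; 4xx = removed, private, or blocked

    print(youtube_embed_status("dQw4w9WgXcQ"))  # a public video should return 200

Extract the video ids from the embed code on your landing pages, run them through a check like this, and you’ll have a list of dead embeds to clean up.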

CinemaBlend and Recovery During UFeb14Rev
And the plot thickens…  The examples listed earlier were based on analyzing sites that experienced a major hit on 2/14 and then recovered on 2/24.  But what about other movie sites during that timeframe?  Did any experience unusual declines or surges?  Yes, they did.  I started checking many sites in the movie blog niche, and one in particular caught my attention. Check out the trending for CinemaBlend.com:

CinemaBlend Panda Recovery

Wow, they experienced a huge surge in traffic once UFeb14Rev rolled out (the revised algorithm update that rolled out once Matt Cutts got involved).  It looks like they originally got hit in January (I’m assuming by UJan14).  Connecting the dots, if CinemaBlend recovered during the revised February update, then could they have been wrongly impacted in January?  Why would they recover during UFeb14Rev and not just UFeb14?  Yes, I had to dig in.  Down the rabbit hole I went…  And yes, this was becoming my own version of Inception.  One SEO rabbit hole led to another rabbit hole.  Maybe I should create a movie trailer about it and embed it here.  :)

I began digging into the data, based on CinemaBlend’s recovery, and was eager to see if video clips, trailers, and copyright infringement would surface.  But what I found was really interesting… The pages looked clean… but almost too clean.  There were pages optimized for trailers when, in fact, there were no videos embedded on the page.  At least not anymore.  The more pages I checked, the stranger that situation became…  Many trailer pages either contained blank spots where videos once resided, or the pages just contained no videos at all.  Very strange.

So it begs the question, did CinemaBlend quickly deal with their Panda hit from January?  Did they analyze the landing pages that saw a drop and remove dead video clips, videos that were flagged for copyright infringement, etc.?  I can’t say for sure, but the crime scene looked too pristine to me.

Video Trailer Page on CinemaBlend

What This Means For SEOs, Panda Victims, Movie Sites, and Google
Based on what happened during February, there are some important points I wanted to list.  If you are dealing with a Panda situation, if you are susceptible to Panda, if you are an SEO helping others with Panda, or if your business simply relies on Google Organic traffic, then the bullets below should be of extreme importance to you.

  • Google Does Make Mistakes
    And when those mistakes are tied to major algorithm updates, collateral damage could occur (in grand ways).  Based on what happened with the movie blogs, I think all of the website owners owe Peter Sciretta a drink (or a bonus).  Without Peter speaking up, it’s hard to say how long that ugly situation would have gone on.  Instead, it was only ten days.
  • Know What You Post (and the source of that information)
    As more and more algorithm updates are being crafted in Google labs, and subsequently injected into the real-time algorithm, it’s more important than ever to know your site inside and out.  Know what you are posting, where it’s from, if it’s original, if you are breaking any copyright laws, if it’s scraped, etc.  If you don’t, you are leaving yourself susceptible to future Panda and Pirate attacks.  Talk about a shot across your bow.  :)  Be vigilant.
  • Unconfirmed Updates Create Madness in Webmasters
    I called this when it was first announced, but Panda updates without confirmation can be disastrous for webmasters.  It’s hard enough for SEOs neck-deep in Panda work to decipher what’s going on, but it’s exponentially harder for people outside of SEO to know what happened.  It’s one of the reasons I’ve been writing more and more about Panda updates.  I want to make sure we document major algo updates that seem to be Panda (roll out once per month, target low-quality content, etc.).  Without some form of identification, we’ll be living in a quasi, post-apocalyptic web.  Cue another trailer, this time with Matt Cutts standing in for Mad Max.  :)
  • Track and Document Everything You Can (And Speak Up)
    It’s more important than ever to analyze your website, your analytics reporting, Google Webmaster Tools data, etc.  Use annotations in Google Analytics to mark dips and surges in traffic, add information about confirmed and unconfirmed algorithm updates, export your data regularly, and monitor the competition.  If you end up in a situation like the movie blogs, you’ll have a lot of data to analyze, to hand SEOs that are helping you, and even to provide Google if it comes to that.

 

A Quick Note About UFeb14 and UFeb14Rev
I know a number of people have reached out to me since UFeb14 rolled out on 2/11/14 asking for more details.  I focused this post on the movie blog situation, based on how unique and fascinating it was.  But, I do plan to write more about the latest Panda update (so stay tuned).  As I said earlier, it’s important to document as many algorithm updates as we can so webmasters impacted by those updates can have some idea what hit them, what the root causes of their problems are, etc.

Summary – Flawed Algorithms, Movie Blogs, and Collateral Damage
Based on my experience with Panda, the past ten days have been fascinating to analyze.  Needless to say, you don’t often see algorithm updates roll out, only to get refined days later before a second rollout.  But that’s exactly what we saw here with UFeb14 and UFeb14Rev.  On the one hand, it was great to see Google move quickly to rectify a flaw in the UFeb14 algorithm update.  But on the other hand, it makes you wonder how many other flaws are out there, and how many sites have been wrongly impacted by those flaws.

For Peter Sciretta, and his fellow movie bloggers, they dodged a serious bullet.  Actually, it was a magic bullet.  One that first passed right through their hearts, pulled a 180, headed back to Google, was taken apart and refined, and then shot back out across the web.  But how many other flawed bullets have been shot?  Wait, it sounds like a great storyline for a new movie.  Maybe Peter can connect me with some movie producers.  :)

GG

Monday, February 17th, 2014

Panda UJan14 – Uncovering the Panda Update from January 11, 2014 (with a note about Expedia)

Panda Update on January 11, 2014

As of July 2013, Google will not confirm Panda updates anymore.  And as I explained in my post about unconfirmed Panda updates, this can lead to serious confusion for webmasters.  For example, if Panda updates are not documented, then it becomes that much harder to understand why a serious drop in organic search traffic occurred.  Was it Panda, a smaller algo change, unnatural links, or other factors?  Even when Panda updates were confirmed, it was still a confusing topic for business owners.  And now it’s even more confusing since those updates are cloaked.

According to John Mueller of Google (via a webmaster hangout video), the Panda algorithm is now trusted enough that Google feels comfortable rolling it out once per month.  That link should jump you to 22:58 in a video where John speaks about Panda.  It’s not real-time like some people think, it’s simply trusted more than it once was (and Google can bypass some of the testing it used to implement prior to rolling out Panda).  The new Panda can take ten days to fully roll out, and again, Google will not provide confirmation of the updates.  So yes, Panda updates have been occurring since the last confirmed update, but it’s just harder to pinpoint those exact dates.

Human Panda Barometers
In my post about unconfirmed Panda updates, I explained that SEOs well-versed in Panda can typically shed some light on new updates.  That’s because they have access to a lot of data.  And not just any data, but Panda data.  The more companies an SEO is helping with Panda, the more visibility that SEO has into when Panda actually rolls out.  In addition, SEOs heavily working with Panda might have more companies that were impacted by subsequent updates reach out to them.  That’s even more Panda data to analyze.

That’s why I believe SEOs heavily involved with algorithm updates can act like human Panda barometers, and can help determine when new updates roll out.  Based on my work with Panda, I’ve had the opportunity to see when some cloaked Panda updates rolled out (like the August and September Panda updates that I documented in my post from November).  The reason I can identify some of the newer Panda updates is because some of the companies I’m helping see recovery, while other companies that were just hit by Panda reach out to me for help.  The combination of the two enables me to pick up when some Panda updates roll out.


Welcoming 2014 with a Panda Update – January 11th Specifically
So, 2014 kicked off and I was wondering when the first major algorithm update would happen.  And it didn’t take long… as January 11th was a tough day for many webmasters.  Right around the 11th, I noticed an uptick in webmaster chatter about an update occurring, which quickly led me to Google Analytics to trend Google organic search traffic across several websites dealing with Panda problems.  Lo and behold, there was significant movement.

Check out the SEO visibility for a company that got hit by the January 2014 update:

Website Impacted by Panda UJan14

In addition to companies I am currently helping, my inbox also confirmed something was going on.  I had several new companies reaching out to me after the 11th explaining that they saw a major hit starting on that date.  Upon checking their reporting, you could clearly see a significant drop beginning on January 11, 2014.  And digging deeper revealed that a number of those companies had battled with Panda in the past.  A few had also exchanged blows with Phantom on May 8, 2013.

This led me to believe that we were witnessing our first Panda update of 2014.  And since I’m a big believer in naming updates to document them specifically, I’m going to name this one too.  I’m calling it Panda UJan14, for “Unconfirmed January 2014”.  I think this naming convention works extremely well, since the new Panda is supposed to roll out monthly.  Providing the month and year in the update will help clarify when those updates rolled out.

And based on the data I have analyzed since July, here are the Panda updates I believe have rolled out since the last confirmed update in July 2013.  Notice how they are approximately one month apart:

  • Panda UAug2013 – on August 26th
  • Panda USep2013 – on September 16th
  • Panda UNov2013 – on November 18th (Note, I don’t have a lot of data backing this update, but several sites I analyzed saw significant movement on the 18th.)
  • Panda UDec2013 – on December 17th
  • The latest – Panda UJan2014 – on January 11, 2014

The Impact of Panda UJan14
Let’s start with the negative impact of the latest Panda update.  The companies that reached out to me after getting hit by Panda UJan14 saw a big drop from Google Organic search traffic.  That ranged from 20-35% and began on January 11, 2014.  Here’s the SEO visibility of another site hit by the January Panda update:

Negative Impact from Panda UJan2014

As mentioned earlier, a number of those companies had previous battles with Panda.  Clearly, they had content quality issues from a Panda standpoint.  When speaking with the business owners about the drop, they all explained that they had implemented changes over the years when dealing with previous Panda updates.  But as I explained in a post about the grey area of Panda, if you don’t significantly tackle the content quality situation, you could very well get hit again (or not recover in the first place).  It’s extremely important to make significant changes in order to exit the grey area.  If you don’t, you could sit in the grey area of Panda forever, never knowing how close you are to recovery.  Several of the companies that were hit in January did not do enough to clean up their Panda issues, and were subsequently hit with another Panda update.

Now for the positive impact from UJan14.  On the flip side of the Panda hits were some positive stories.  A few companies I have been helping saw increases ranging from 15-25% based on the January 11, 2014 update.  These were companies that experienced previous Panda and/or Phantom hits, have been working on fixing their content problems, and saw an increase in Google organic traffic during the UJan14 update.

Notice the uptick in impressions and clicks starting on January 11th:

Positive Impact from Panda UJan2014

It’s important to note that several of the companies did not recover fully to pre-Panda or pre-Phantom levels, but they definitely saw a nice increase.  Remember, there’s a reason the sites got hit by Panda in the first place.  The content that was once ranking well and driving traffic shouldn’t have been ranking that well in the first place… which led to a lot of traffic with serious engagement issues.  And serious engagement issues (like extremely low dwell time) can cause a Panda attack.  There’s more context about that situation in my Search Engine Watch column about the sinister surge before Panda strikes.

Others Benefiting From the Drop
In addition to websites recovering from Panda, I noticed a number of companies simply benefiting from the hits others were taking.  For example, if certain companies drop out of the rankings, then others take their place.  Those companies were simply benefiting from the drop in rankings of January Panda victims.

For example, here’s the trending for a site that directly competes with a Panda victim I analyzed.  Notice the jump starting around January 11, 2014.

A website benefiting from Panda UJan2014

A Note About Expedia – Was it Panda Versus a Manual Action?
It sure looks like Panda to me.  Nobody knows for sure other than Google and Expedia, but the drop occurred exactly when I saw the January Panda update.  Check out the trending below based on Searchmetrics data.

Was Expedia Hit by Panda UJan2014?

That’s just something to think about since many people believe that Expedia was hit by an unnatural links penalty.  I tend to think it was Panda instead.  That said, I would have to heavily analyze the keywords that were impacted, the content that was once ranking, etc. to better determine if that was the case.

Summary – The Importance of Monitoring Cloaked Panda Updates
As I explained above, it’s getting extremely difficult to identify Panda updates.  They are supposed to roll out monthly, take ten days to fully roll out, but Google won’t confirm when the updates occur.  For the average business owner, this is a recipe for serious confusion when organic search trending takes a dive.

My goal with posts like this one is to provide as much data as I can with regard to major algorithm updates so webmasters can take the appropriate actions to rectify the problems at hand.  Without understanding the specific algorithm update that hits a website, companies could struggle with deciphering the root cause of the problem.  And that could easily lead to spinning wheels, or in a worst case scenario, implementing changes that actually make the situation worse SEO-wise.  Moving forward, I’ll try and document subsequent Panda updates the best I can.

But hold on… has the next Panda update already rolled out??  There’s a lot of chatter about a February update and I am seeing movement across sites hit by Panda (starting around February 11th).  It very well could be Panda UFeb14.  The timing makes a lot of sense as well, since the last update was exactly one month ago.  I’ll know more in a few days once more data comes in.  Stay tuned.  :)

GG

 

 

Monday, January 27th, 2014

In-SERP Hover Cards – How Google Could Surface Your Answers, Products, Downloads, Reviews, Events, and More Directly in the Search Results

New Google Hover Card in SERPs for Beats Music
Last Wednesday, Google rolled out new functionality in the search results, which sure got the attention of SEOs across the industry.  Now when searching for information, you will sometimes see an additional link directly in the search results for specific organizations and/or websites.  Users can click on that link to view additional information about that organization right in the search results (via data from Google’s Knowledge Graph).

Google states that this can occur for websites that are “widely recognized as notable online, when there is enough information to show or when the content may be handy for you.”  When clicking the link next to the URL in the search snippet, a small window opens providing the information.  It’s basically a hover card that provides additional information.  This is an important move by Google, since users don’t need to leave the search results to find more information.

Here’s an example of the info card for Netflix:
New Google Hover Card in SERPs for Netflix

The information displayed in the hover card is based on Google’s Knowledge Graph, or data that Google has collected about “real world things”.  Knowledge Graph data comes from a variety of trusted sources, including Freebase (which Google acquired), Wikipedia, CIA World Factbook, etc. As of July of 2012, Google had collected information about 570 million entities, including 18 billion facts and connections.

To quickly summarize the new addition to the search engine results pages (SERPs), if you are searching for answers, and Google has information in its Knowledge Graph about the sites ranking in the search results, you just might see that new link appear directly within the search listing.  And if you click that link, you’ll see Knowledge Graph data in a small window directly in the search results.

Hover Creep: Your Content, Answers, Products, and Downloads Directly in the Search Results?
As I was testing these new “Info Cards”, I started to think deeper about what was occurring, and how this might be the next phase of a monumental shift for Google.  Over the past few years, SEOs have seen Google provide more and more information directly in the search results.  For example, check out all of the types of answers Google will provide right in the results (courtesy of Pete Meyers).  Based on this shift to the all-knowing SERP, many SEOs believe that at some point, Google won’t need to drive users to third party websites anymore.  Instead, maybe it could provide all the information directly in the results.

Don’t believe me?  How about this search for “calories in bananas”:
Nutrition Information in the Search Results

 

Expanding Hover Cards – Coming Soon to a SERP Near You
Based on how much information Google is already providing in the search results (driven by Knowledge Graph data), combined with the new hover card functionality in the search results, is it really far-fetched to think Google could expand this approach?  Sure, it won’t happen overnight, but as Google collects and trusts more information from third parties, it could absolutely start providing that data right in the search results.

And that little popup window (hover card) is the first sign that Google isn’t afraid to add more information directly in the SERPs for specific listings.  Let’s face it, providing author details (based on authorship markup) is one thing.  But using a hover card to provide more content per search listing is another.

And maybe this is just a test to see how users react before rolling out more and more content directly in the search results.  And maybe it’s not limited to content… maybe other types of functionality are coming, like ecommerce, downloads, sign-ups, etc.  Now that would be interesting, unless of course, you’re the owner of that content, download, etc. who gets cut out of the process.  Yes, beware the hover card.

So, let’s have some fun and explore what this could look like and how it could work.  It just might be closer than you think.


Trusted Sources, and a Note About Publishership
Some of you reading this post might be wondering how Google could verify typical websites, especially since it’s using trusted data for the recent release of “info cards”.   For example, Google trusts the data in its Knowledge Graph, so it’s comfortable providing the popup window with more information about certain entities.  But will it do this for the average site on the web?  If Google is going to provide more information directly in the search results, then it’s going to have to trust those third party websites, and their content, to do so.

Although many website owners have been focused on authorship markup, where author details can show up in the search results, there is publishership as well.  By claiming publishership (rel=publisher), Google can connect a website to an entity in Google Plus (similar to the way an author is tied to a G+ profile).  That connection could possibly be the basis for providing more content in the search results.  And yes, this could drive even more people to Google+ over the next few years.
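If you want to check whether publishership is claimed on a site (yours or a competitor’s), it only takes a few lines to find the markup.  A quick sketch, assuming requests and beautifulsoup4:

    import requests
    from bs4 import BeautifulSoup

    def publisher_href(url):
        """Return the rel="publisher" target (a Google+ page url), if present."""
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        for link in soup.find_all("link"):
            if "publisher" in (link.get("rel") or []):
                return link.get("href")
        return None

    print(publisher_href("http://www.example.com/"))

If that returns a Google Plus page url, the site has claimed publishership.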

By the way, just last year Google tested out showing publisher images in the search results (similar to author details).  I saw the test live, and others did too.  I almost fell out of my seat when I saw client logos in the search results.  That test was removed quickly once word started getting out, but here’s a screenshot of what that looked like.  Check out the publisher image in the search results below:

Publisher Markup in the Search Results

So, if Google understands more about a website via publishership, maybe it can use data from the website to provide more information directly in the SERPs.  Hey, it’s entirely possible.

Now, if this was the case, at least website owners could remove publishership from their sites (if they didn’t like Google providing more data directly in the search results).  But that could be a double-edged sword for content owners.  Sure, you could stop Google from providing your answers in the search results, but maybe Google won’t rank your listings highly anymore (since it’s getting more engagement from listings that provide the in-SERP functionality).    Who knows, I’m just thinking out loud here…

Now let’s take a look at what could potentially appear in the SERPs if this comes to fruition.

Hover Cards and Google – The Various Types of Content and Functionality That Could Appear Directly in the Search Results
Based on what I explained above, how could Google implement additional content or functionality directly in the search results?  And what would it look like?  I started brainstorming a few different ways this could happen and have provided some possibilities below.  Note, these are just some logical options based on what I’ve seen happening with Google and its search results over the past few years.  There are definitely more possibilities than what I’m listing below, but this is a good start.

And yes, in-SERP content and functionality could have a huge impact on websites and businesses.  I’ll cover more about that later in the post.

1. Direct Answers (From Your Site)
There are a lot of companies receiving traffic from users based on queries for direct answers to questions.  Again, Google is already providing many answer boxes for various topics (as covered earlier).  But that’s not per listing in the search engine results pages…  it’s usually via an answer box at the top of the search results.  That’s much different than a hover card per search listing (or for certain listings in the SERPs).

Let’s use my website as an example.  How about a search for “how many dmca requests google impact”?  That’s a search related to the Pirate Update, which I covered extensively in a post in December.  If Google provides the answer in the SERP via an “Answer Card”, it could look like this:

Google Answer Card in the Search Results

If this type of answer card rolls out, and the hover card provides enough of the answer, users will never hit your site.  So, if you are hoping that users will visit your site to find the answer, and then take some other action on your website, good luck.  You’d better start thinking of another way to make that happen.

2. How-Tos or Tutorial Segments
If someone searches for how to perform a certain task, and that task has a limited number of steps, then maybe that information could show up in the search results via a “Tutorial Card”.  Or maybe someone is searching for a specific step in a tutorial.  Google could provide just that step in a hover card directly in the SERPs.

Google Tutorial Card in the Search Results

3. Product or Service Information
If someone is interested in a certain product category or service, then maybe that information is pulled directly from sites in that niche.  For example, if someone searches for “IT consulting” or “computer science” or “4K television”, Google could provide that information directly in the SERPs via a “Product or Service Card”.  For example:

Google Category Card in the Search Results

4. Ecommerce – Fighting Amazon via the “Ecommerce Card”
Information is great, but let’s talk about ecommerce.  Google and Amazon battle heavily in the ecommerce space.  Sure, Google doesn’t sell anything directly, but it makes a boatload of money via paid search.  And product listing ads (PLAs) are at the heart of that growth right now.  On the flip side, many people go directly to Amazon to search for products.  That’s the result of a huge inventory, a boatload of review data, and Prime membership (with free two-day shipping).

But what if Google decided to provide one-click ecommerce functionality directly in the SERPs?  This could be handled by connecting your Google profile to Google Wallet and buying products directly in the SERPs via the “Ecommerce Card”.  This would be amazing for people who already know which product they want to buy.  It could look like this:

Google Ecommerce Card in the Search Results

And yes, this would be like AdWords on steroids since Google could generate revenue via the organic listings by earning a percentage of the sale.  Holy cow.  :)  More about the ecommerce impact later in this post.
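And if an “Ecommerce Card” like this ever did roll out, the product data would have to come from somewhere.  One plausible source (purely my speculation) is schema.org Product markup, which Google already parses for rich snippets.  A minimal microdata sketch with placeholder values:

    <div itemscope itemtype="http://schema.org/Product">
      <span itemprop="name">Example 4K Television</span>
      <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
        <meta itemprop="priceCurrency" content="USD" />
        $<span itemprop="price">1299.00</span>
        <link itemprop="availability" href="http://schema.org/InStock" />In stock
      </div>
    </div>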


5. Reviews
Going even further with our ecommerce example, if someone searches for reviews of a product or service, Google could surface that information and provide it directly in a “Review Card”.  For some people, the review snippet below would be enough.  And that could drastically impact the downstream traffic to pcmag.com.

Google Review Card in the Search Results
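Again, speculating about the data source: review rich snippets are already powered by schema.org markup, so a “Review Card” could plausibly be built from the same structured data.  A minimal aggregate rating example (placeholder values):

    <div itemscope itemtype="http://schema.org/Product">
      <span itemprop="name">Example Smartphone</span>
      <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
        Rated <span itemprop="ratingValue">4.5</span> out of <span itemprop="bestRating">5</span>
        based on <span itemprop="reviewCount">127</span> reviews
      </div>
    </div>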

6. Downloads
Along the same lines, what if you were looking to download content via PDFs (or other formats)?  Imagine Google provided this download functionality via a “Download Card” directly in the search results.  Google could scan each file for malware and tee it up for users to download.  And if you want to charge for that file, then you could combine the “Ecommerce Card” with the “Download Card”.  That would be a smart combination for sure.

Google Download Card in the Search Results

7. Sign-ups/Registration
Looking to sign up for a webinar, join an email list, or confirm you’ll be attending an event?  Registration functionality could also be provided directly in the search results.  Actually, Google has already been testing functionality for joining email lists in AdWords (via ads in the search results).  This could easily be included in a “Registration Card” directly in the organic search results.

Google Registration Card in the Search Results

I can keep going here… but I think you get the picture.  And these hover cards don’t have to be limited to Knowledge Graph data.  If Google can verify certain entities, then it can feel comfortable providing more information to users directly in the search results.  That data could be answers, information, coupon codes, medical information, pricing, reviews, downloads, list signups, ecommerce functionality, and more.


What Happens if this Rolls Out?
Website owners will riot in the streets.  :)  Ok, maybe not literally, but this could cause serious problems for many business owners.

Publishers with an Ad-Driven Model
Let’s start with websites earning advertising revenue based on traffic.  If a site sells advertising on a CPM basis (cost per thousand impressions) and 40% of its traffic goes away, its revenue will take a huge hit.  And as its traffic numbers plummet, so will its ability to sell advertising on the site.  Publishers will once again need to figure out other ways to monetize, which is no easy feat.
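To put rough, purely illustrative numbers on it: a publisher serving 2 million ad impressions per month at a $10 CPM earns about $20,000 per month (2,000,000 ÷ 1,000 × $10).  Lose 40% of that traffic and monthly revenue drops to roughly $12,000, before even factoring in the weaker position for selling future campaigns.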

Ecommerce Retailers
Next on the list are ecommerce retailers.  The once pure, ROI-driven organic results will now be asking for a commission.  If Google does roll out the ability to buy directly from the search results via one-click “ecommerce cards”, then it will surely want a cut of the sale.  Remember, advertising comprises a huge percentage of Google’s revenue, and product listing ads are doing extremely well (via AdWords).  But having the ability to sell via the much larger set of organic listings could be huge for Google.

Blogs and Resources
For those writing great content on blogs and resource websites, the possibility of having that content surfaced in “answer cards” could be a big problem (although not as big a problem as it would be for large publishers and ecommerce retailers).  The real downside here would be users getting answers based on your hard work, without needing to visit your site.

And if they don’t visit your site, they can’t find out more about you, subscribe to your feed, find your social accounts, or contact you.  I’m sure some users will decide to visit the site, but a certain percentage surely won’t.  This could lead to a drop in awareness, which could impact multiple channels for content owners, i.e. fewer subscribers, Twitter followers, Facebook fans, etc.  And of course, this could impact leads and new business for the organizations publishing content.

Hover Card Extensions – A Note About Ads
It’s hard to write about Google without bringing up advertising.  Again, advertising drives the vast majority of Google’s revenue, so these new hover cards would probably have some type of advertising component.  I already mentioned the revenue that ecommerce cards could drive (via a percentage of the sale), but Google could absolutely add sponsored information to hover cards.

For example, imagine having the ability to promote certain pages on your site (to increase click-through), provide the ability to subscribe to a feed, follow you on Google+, etc., right from the various hover cards.  This type of ad extension could easily be included in the AdWords platform.  And if that happens, Google could expand AdWords-like functionality to the organic listings.  As long as it’s clearly labeled, and actually helpful to users, it’s a win-win: users get what they are looking for, and Google adds a massive new source of revenue.

Hover Card Ad Extensions in Google


Summary – Hover Cards and the All-Powerful SERP
The addition of “info cards” in the search results caught serious attention across the industry last week.  But is this just the beginning?  Is it merely a test to see how users react to more information being provided directly in the search results, per listing?  If it works well, it’s hard to say how much information and functionality Google could eventually provide in the SERPs.

Time will tell how much of what I listed above becomes a reality.  Until then, I recommend continuing to diversify your digital efforts.  If not, you run the risk of transforming from a website with a lot of traffic into a hover card sitting in the search results.  And there’s not much room to play with there.

GG


Tuesday, January 7th, 2014

Rap Genius Recovery: Analyzing The Keyword Gains and Losses After The Google Penalty Was Lifted

Rap Genius Recovers From Google Penalty

On Christmas Day, Rap Genius was given a heck of a gift from Google: a penalty that sent their rankings plummeting faster than an anvil off the Eiffel Tower.  The loss in traffic has been documented heavily, as many keywords dropped from page one to page five and beyond.  And many of those keywords used to rank in positions #1 through #3 (prime real estate, SEO-wise).  Once the penalty was in place, what followed was a huge decrease in visits from Google organic, since most people don’t even venture to page two and beyond.  It’s like Siberia for SEO.

Gaming Links
So what happened that Google had to tear itself away from eggnog and a warm fire to penalize a lyrics website on Christmas Day?  Rap Genius was gaming links, and badly.  No, not just badly, but with such disregard for the consequences that they were almost daring Google to take action.  That is, until Matt Cutts learned of the matter and took swift action on Rap Genius.

That was Christmas Day.  Ho, ho, ho.  You get coal in your lyrical stocking.  I won’t go nuts here explaining the ins and outs of what they were doing; that’s been documented heavily across the web.  In a nutshell, they were exchanging tweets for links.  If bloggers added a list of rich anchor text links to their posts, then Rap Genius would tweet links to their content.  The bloggers got a boatload of traffic and Rap Genius got links (many of them using rich anchor text like {artist} + {song} + lyrics).  Here’s a quick screenshot of one page breaking the rules:

Rap Genius Unnatural Links

A 10-Day Penalty – LOL
Now, I help a lot of companies with algorithmic hits and manual actions.  Many of the companies contacting me for help broke the rules and are seeking help in identifying and then rectifying their SEO problems.  Depending on the situation, recovery can take months of hard work (or longer).  From an unnatural links standpoint, you need to analyze the site’s link profile, flag unnatural links, remove as many as you can manually, and then disavow the rest.  If you only have 500 links leading to your site, this can happen relatively quickly.  If you have 5 million, it can be a much larger and nastier project.

Rap Genius has 1.5 million links showing in Majestic’s Fresh Index.  And as you start to drill into the anchor text leading to the site, there are many questionable links.  You can reference their own post about the recovery to see examples of what I’m referring to.  Needless to say, they had a lot of work to do in order to recover.

So, you would think that it would take some time to track down, remove, and then disavow the unnatural links that caused them so much grief.  And then they would need to craft a serious reconsideration request documenting how they broke the rules, how they fixed the problem, and of course, offer a sincere apology for what they did (with a guarantee they will never do it again).   Then Google would need to go through the recon request, check all of the removals and hard work, and then decide whether the manual action should be lifted, or if Rap Genius had more work to do.  This should take at least a few weeks, right?  Wrong.  How about 10 days.

Rap Genius Recovers After 10 Days

Only 10 days after receiving a manual action, Rap Genius is back in Google.  As you can guess, the SEO community was not exactly thrilled with the news.  Screams of special treatment rang through the twitterverse, as Rap Genius explained that Google had helped them, to some degree, understand how best to tackle the situation and what to target.  Believe me, that’s rare.  Really rare…

Process for Removing and Disavowing Links
Rap Genius wrote a post about the recovery on January 4th, which included the detailed process for identifying and then dealing with unnatural links.  They had a massive number of links to deal with, beginning with a master list of 178K.  From that master list, they started to drill into specific domains to identify unnatural links.  Once they did, Rap Genius removed what they could and disavowed the rest using Google’s Disavow Tool.  Following their work, Google removed the manual action on January 4th and Rap Genius was back in Google.
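For those who haven’t been through this process, the disavow file itself is just a plain text file uploaded via Google’s Disavow Tool, one entry per line.  The domains and URLs below are placeholders, but the format follows Google’s documentation:

    # Lines starting with # are comments.
    # Disavow every link from an entire domain:
    domain:spammy-link-network.example.com
    # Disavow a single page:
    http://blog.example.org/some-post-with-paid-links.html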

But many SEOs wondered how fully the site came back, especially since Rap Genius had to nuke thousands of links.  And many of those links pointed to deeper pages with rich anchor text.  Well, I’ve been tracking the situation from the start, checking which keywords dropped during the penalty, and now tracking which ones returned to high rankings after the penalty was lifted.  I’ll quickly explain the process I used for tracking rankings and then provide my findings.

My Process for Analyzing Rankings (With Some Nuances)
When the penalty was first applied to Rap Genius, I quickly checked SEMRush to view the organic search trending and to identify keywords that were “lost” and ones that “declined”.  Rap Genius ranks for hundreds of thousands of keywords according to SEMRush, and its organic search reporting identified a 70K+ keyword loss based on the penalty.

Note, you can’t directly compare third-party tools to a website’s own analytics reporting, and SEMRush won’t cover every keyword leading to the site.  But for larger sites with a lot of volume, SEMRush is a fantastic tool for viewing the gains and losses for a specific domain.  I’ve found it to be extremely thorough and accurate.

The lost and declined keywords that SEMRush reported lined up with my manual checks.  Those keywords definitely took a plunge, with Rap Genius appearing on page five or beyond.  And as I mentioned earlier, that’s basically Siberia for organic search.

When the penalty was lifted, I used the same process for checking keywords, but this time I checked the “new” and “improved” categories.  The reporting showed 43K+ keywords in the “new” category, which means those keywords did not rank the last time SEMRush checked each query.

I also used Advanced Web Ranking to check 500 of the top keywords that were ranking prior to the penalty (and that dropped after the manual action was applied).  The keywords I checked were all ranking in the top ten prior to the penalty.  Once the penalty was lifted, I ran the rankings for those keywords.  I wanted to see how much of an improvement there was for the top 500 keywords.
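As a quick aside, the bookkeeping for this kind of before/after comparison is easy to script.  Here’s a minimal sketch, assuming hypothetical CSV exports with “keyword” and “position” columns (not the actual SEMRush or Advanced Web Ranking export formats):

    import csv

    def load_rankings(path):
        """Return a dict mapping keyword -> ranking position from a CSV export."""
        with open(path) as f:
            return {row["keyword"]: int(row["position"]) for row in csv.DictReader(f)}

    before = load_rankings("rankings_pre_penalty.csv")   # hypothetical filename
    after = load_rankings("rankings_post_penalty.csv")   # hypothetical filename

    # Keywords that ranked in the top ten pre-penalty and recovered to page one.
    recovered = [kw for kw, pos in before.items()
                 if pos <= 10 and after.get(kw, 999) <= 10]

    # Keywords that ranked in the top ten pre-penalty but are still buried.
    still_buried = [kw for kw, pos in before.items()
                    if pos <= 10 and after.get(kw, 999) > 10]

    print("Recovered to page one:", len(recovered))
    print("Still on page two or beyond:", len(still_buried))

The same approach works for spot-checking the “new” and “improved” buckets; it’s just set logic over two keyword-to-position maps.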

Then I dug into the data from both SEMRush and Advanced Web Ranking to see what I could find.  And yes, this is a fluid situation, so rankings could change.  But we have at least a few days of data now.  Without further ado, here’s what I found.


Branded Keywords
This was easy. Branded keywords that were obliterated during the penalty returned quickly with strong rankings.  This was completely expected.  For example, if you search for rap genius, rapgenius, or any variation, the site now ranks at the top of the search results.  And the domain name ranks with sitelinks. No surprises here.

Rap Genius Branded Keywords

Category Keywords
For category keywords, like “rap lyrics”, “favorite song lyrics”, and “popular song lyrics”, I saw mixed results after the recovery.  For example, the site now ranks #1 for “rap lyrics”, which makes sense, but it does not rank well for “favorite song lyrics” or “popular song lyrics”.  And it ranked well for each of those prior to the penalty.  Although specific song lyric queries are a driving force for Rap Genius (covered soon), category keywords can drive a lot of volume.  It’s clear that the site didn’t recover for a number of key category keywords.

Rap Genius Category Keywords


Artist Keywords
I noticed that the site ranked for a lot of artists prior to the penalty (just the artist name with no modifiers).  For example, “kirko bangz”, “lil b”, etc.  Similar to what I saw with category keywords, I saw mixed results with artists.  Searching for the two artists listed above no longer yields high rankings, even though both ranked on page one prior to the penalty.  Some queries increased in rankings, but not to page one.  For example, “2 chainz” ranks #12 after the penalty was lifted, but it was MIA while the penalty was in effect.  Another example is “Kendrick Lamar”, which Rap Genius ranked #8 for prior to the penalty.  The site is not ranking well at all for that query now.  So again, it seems that Rap Genius recovered for some artist queries, but not all.

Rap Genius Artist Keywords

Lyrics Keywords
Based on my research, I could clearly see the power of {song} + lyrics queries for Rap Genius.  It’s a driving force for the site.  And Rap Genius is now ranking again for many of those queries.  When the penalty was first lifted, I started checking a number of those queries and saw Rap Genius back on page one, and sometimes #1.  But when I started checking at scale, I could definitely see that not all keywords returned to high rankings.

Rap Genius High Rankings for Lyrics Keywords

For example, “hallelujah lyrics”, “little things lyrics”, and “roller coaster lyrics” are still off page one.  Then there are keywords that skyrocketed back up the charts, I mean search rankings.  For example, “swimming pool lyrics”, “marvins room lyrics”, and “not afraid lyrics” all returned once the penalty was lifted, after being buried.  So, it seems that many song lyrics keywords returned, but some still rank on page two and beyond.

Rap Genius Low Rankings for Lyrics Keywords

What About Keywords That Were Gamed?
I’m sure some of you are wondering how Rap Genius fared for keywords that were gamed via unnatural links.  For example, “22 two’s lyrics” yields extremely strong rankings for Rap Genius, even though it was one of the songs gamed via the link scheme.  Actually, Rap Genius ranks twice in the top 5.  Go figure.

Rap Genius Rankings for Gamed Links - Jay Z

Ditto for “timbaland know bout me”, which was also one of the songs that made its way into the spammy list of links at the end of articles and posts.  Rap Genius ranks #3 right now.

Rap Genius Rankings for Gamed Links - Timbaland

And then there’s Justin Bieber, which I can’t cover in just one sentence.  Rap Genius currently ranks on page 3 for “Justin Bieber song lyrics”, when it used to rank #8!  And “Justin Bieber baby lyrics” now ranks #12 (on page 2), when it also used to rank #8.  But for “Justin Bieber lyrics”, Rap Genius is #10, on page one.

Rap Genius Rankings for Justin Bieber Lyrics

Overall, I saw close to 100 Justin Bieber keywords pop back into the top few pages of Google after the penalty was lifted.  But many were no longer on page one; a good number of those keywords now yield rankings on page two or beyond for Rap Genius.  See the screenshot below:

Rap Genius Keywords for Justin Bieber


Summary – Rap Genius Recovers, But The Scars Remain
So there you have it: a rundown of where Rap Genius stands after the penalty was lifted.  Again, I can’t see every keyword that was lost or gained during the Christmas Day fiasco, but I could see enough of the data.  It seems that Rap Genius came back strong, but not full-blast.  I saw many keywords return, but a number remain buried in Google.

But let’s face it, a 10-day penalty is a slap on the wrist for Rap Genius.  They now have a clean(er) platform back, and they can build on that platform.  That’s a lot better than struggling for months (or longer) with horrible rankings.  As I explained earlier, most business owners aren’t as lucky as Rap Genius.  Ten days, plus help from Google, can certainly quicken the recovery process.  That’s for sure.

I’ll end with one more screenshot to reinforce the fact that Rap Genius is back.  And it’s a fitting query. :)

Rap Genius I'm Sorry

GG