Wednesday, August 13th, 2014

Affiliate Marketer Attacked by Panda 4.0 Sees Temporary Recovery, Gets Hit Again 5 Days Later [Case Study]

Panda Temporary Recovery Case Study

Panda 4.0 arrived in late May with a fury few previous updates had matched. It was a HUGE update, and many sites were decimated by P4.0. Most businesses reaching out to me after the May 20 update saw drops of 50%+, with some losing 80% of their Google organic search traffic overnight. And on the flip side, recoveries were strong too. Some companies I was helping with past Panda attacks saw increases of 200%+, with a few seeing gains of over 400%. Like I said, everything about Panda 4.0 was big.

Panda Games – The Rundown
A few weeks ago, I was analyzing a Panda tremor and saw some very interesting movement across sites I have been helping. More to come on that front, but that’s not the focus of this post today. That same day, a business owner reached out to me explaining that he saw serious fluctuations on a site of his that was crushed by Panda 4.0. Needless to say, between what I was seeing and what he had just explained, I was definitely interested.

So I asked how much of a recovery he saw during the latest Panda tremor, and what I heard shocked me – “Close to a full recovery.”  Whoa, not many have recovered from Panda 4.0 yet, so now he had my attention. Since my schedule was insane, I didn’t have time to dig in too much at that point. I was planning to, but just couldn’t during that timeframe.

But then I heard back from the business owner the following week. I was at the Jersey Shore on vacation when a giant wave crashed at my feet (both literally and figuratively).  The business owner’s email read, “FYI, I just lost all of the gains from the recovery last week”.  Once again, my reaction was “Whoa…” :)

So to quickly recap what happened, a site that got crushed by Panda 4.0 ended up recovering during a Panda tremor (in late July), only to get hammered again five days later. By the way, it was a near-full recovery during the five-day stint (regaining 75% of its Google organic search traffic). In addition, I’ve been analyzing other Panda 4.0 sites that were impacted during the late July 2014 update (which I plan to cover in future blog posts). It was a big tremor.

Quick Note About Temporary Recoveries:
It’s worth noting that I have seen other Panda victims experience increases in Google organic traffic during the recovery phase (almost like the site is being tested). I’ve seen this during Panda work since 2011. I’ll explain more about that phenomenon soon, but I wanted to bring it up now since this site did see a temporary recovery.

Digging In
If you know me at all, you know what came next. I fired up my Keurig and dug into the site. With a cup of Jet Fuel and Black Tiger in me, I wanted to know all I could about this interesting Panda 4.0 case study. In this post, I’ll explain more about the temporary recovery, cover the factors that led to the Panda hit and why I think the site saw a temporary recovery, and end with some key learnings that any business owner dealing with a Panda 4.0 attack should understand. Let’s go.

Panda Factors
Although I want to focus on the temporary recovery, let’s quickly cover the initial Panda 4.0 hit. The site is small, with fewer than 60 pages indexed. It’s a site covering an extremely focused niche and it’s a partial match domain (PMD). After analyzing the site, here are what I believe to be the core factors that led to the Panda hit.

Heavy Affiliate Content:
Looking through the history of the site reveals an increase in content in 2013, and much of that content became affiliate-driven. The site was heavily linking to Amazon.com for products tied to the niche (and some of those were followed affiliate links). So there was a lot of traffic arriving on the site that was quickly going out. That’s never a good situation from a Panda standpoint. Also, the rest of the content funneled visits to the affiliate pages, where the site had a greater chance of converting those visits into sales down the line. And of course, followed affiliate links should be nofollowed.
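To make that last point concrete, here’s a quick illustration of an affiliate link with the nofollow attribute in place (the product URL and affiliate tag below are hypothetical):

```html
<!-- Hypothetical Amazon affiliate link marked nofollow so it doesn't pass PageRank.
     The product URL and the "tag" parameter are made up for illustration. -->
<a href="http://www.amazon.com/dp/B00EXAMPLE/?tag=yoursite-20" rel="nofollow">Check the current price on Amazon</a>
```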

I can’t tell you how many affiliate marketers have reached out to me after getting smoked by Panda since February of 2011. If you aren’t providing a serious value-add, then there’s a strong chance of getting crushed. I’ve seen it a thousand times. That’s a nice segue to the next factor – engagement.

Low Engagement, High Bounce Rates
I’ve mentioned many times in Panda blog posts the importance of strong engagement. Google has several ways to measure user engagement, but one of the easiest ways is via dwell time. If someone clicks through a search result on Google, visits a page, and quickly clicks back to the search results, that’s a pretty clear signal that the user didn’t find what they wanted (or that they didn’t have a positive user experience). Low dwell time is a giant invitation to the mighty Panda.

Checking standard bounce rates for the top landing pages leading up to the Panda attack revealed extremely high percentages. Many of the pages had bounce rates of 90% or higher. I wish the site had implemented Adjusted Bounce Rate (ABR), but it didn’t. ABR provides a much stronger view of actual engagement because it takes time on page into account. Either way, having many top landing pages with 90%+ bounce rates is not good.
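For reference, Adjusted Bounce Rate is typically implemented by firing an event once a visitor has spent a set amount of time on the page, which tells Google Analytics not to count that visit as a bounce. Here’s a minimal sketch assuming the standard Universal Analytics (analytics.js) snippet is already on the page; the 30-second threshold and event labels are just examples:

```html
<script>
  // Adjusted Bounce Rate sketch: after 30 seconds on the page, fire an
  // interaction event so engaged single-page visits are no longer counted
  // as bounces in Google Analytics.
  setTimeout(function() {
    ga('send', 'event', 'Adjusted Bounce Rate', '30 seconds or more');
  }, 30000);
</script>
```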

High Bounce Rates Before Panda Struck

No Frills Design, Broken HTML
The site itself did not help build credibility. It was a basic WordPress design with few credibility-building elements. There weren’t clear signs of who ran the site, which company owned the site, etc. It was basically the kind of shell WordPress site you’ve seen a million times. The “About” page was just a paragraph and didn’t inform the user about who was actually writing the content, who was behind the site, etc. By the way, I find that About pages like that make matters worse, not better.

In addition, there were several pages with broken HTML, where raw HTML tags were showing up on the rendered page itself.

Broken HTML and Design and Google Panda

When you are trying to drive strong engagement, trust definitely matters. The less people trust the site and the company behind the content, the less chance you have of retaining them. And again, the more users that jump back to the search results, the more virtual bamboo you are piling up.

Deceptive Ads (Cloaked)
During my analysis, I found ads throughout the content that were very similar in style and design to the content itself. So, it was easy to think the ads were the actual content, which could trick users into clicking the ads. I’ve seen this a number of times while analyzing Panda attacks (and especially Panda 4.0). In addition, this is even called out in the latest version of Google’s Quality Rater Guidelines.

Deceptive Ads and Panda

I’ve found deception to be an important factor in recent Panda hits, so ads that are cloaked as content can be extremely problematic. Remember, SEOs and digital marketers might pick them up pretty quickly, but we’re not the majority of users browsing the web. Think about what the average person would do if they found those ads… Many would have no idea they were ads and not content. And they sure wouldn’t be happy landing on some advertiser’s website after clicking them.

EMATs
Mixed throughout the content were many exact match anchor text links (EMATs), either pointing to the affiliate pages mentioned before or to off-site authority sites.  For example, a typical landing page would link heavily to the Amazon pages, but also to Wikipedia pages. I’ve seen this tactic used in the past as well with other Panda and Phantom victims (and I’ve even seen this during Penguin analysis).

Typically, the thought process is that if Google sees a site linking to authority sites, then it might trust that site more (the linking site). But it also creates a pattern that’s easy to pick up. It’s not natural to continually link to Wikipedia from many places on your site, and Google’s algorithms can probably pick up the trend when they take all outbound links into account. And the fact that many of the links use exact match anchor text doesn’t help (the links throughout the pages tended to look over-optimized and somewhat spammy).

Authorship Backfiring
While analyzing the site, I noticed many of the top landing pages had authorship implemented. But when checking out the author, I got a feeling he wasn’t real. Sure, there was a G+ profile set up, and even other social accounts, but something didn’t feel right about the author.

And using reverse image search in Google Images, I pulled up the same photo being used elsewhere on the web. In addition, it looked like a stock photo. The one used on the site I was analyzing was cropped to throw off the comparison (which helped make it look more unique).

So, if I had questions about the author, you’d better believe Google did too. Add questionable authorship to the other factors listed above and you can see how credibility problems were pushing this site into the gray area of Panda. The author in the photo might as well have been holding a piece of bamboo.

The Surge, The Hit, The Temporary Recovery, and Subsequent Hit
Below, I’ll quickly detail what happened as the site experienced a roller coaster ride courtesy of the mighty Panda.

Index status revealed a doubling of pages indexed leading into 2014. My guess is that more content was added to cast a wider net from an affiliate marketing standpoint. And again, many of those pages had affiliate links to Amazon to buy various products. That new content worked (in the short-term). Google organic traffic increased nicely on the site.

Then the site experienced the misleading and sinister surge that I wrote about in my Search Engine Watch column. In March of 2014, the site spiked in Google. Many different keywords related to the niche were driving traffic to the site. But unfortunately, that traffic was all leading to the problems I mentioned earlier.

Surge of Traffic Before Panda Attack

The surge I mentioned enables Google to gain a lot of engagement data from real users. And if you have content quality problems, usability problems, ad problems, etc., then you are feeding Panda a lot of bamboo. And that can easily lead to a Panda attack.

And that’s what happened during Panda 4.0. The wave crashed and the site lost 86% of its Google organic traffic overnight. Yes, 86%. Many of the keywords that the site picked up during the surge were lost during Panda 4.0. The landing pages that were once driving a boatload of organic search traffic dropped off a cliff visits-wise. One page in particular dropped by 96% when you compared post-Panda to pre-Panda (with 30 days of data). That’s a serious hit and speaks volumes about how Google was viewing the website.

Interesting Note – Money Term Untouched
While analyzing the keywords that dropped, it was interesting to see that the site’s money keyword was not impacted at all during Panda 4.0 (or even the second hit which I’ll cover shortly). That keyword, which is also in the domain name, stayed as-is. It’s hard to say why that was the case, but it was. Checking trending throughout the roller coaster ride revealed steady impressions, clicks, and average position.

Money Keyword Unaffected by Panda

July 22, 2014 – The Temporary Recovery
Then along came Tuesday, July 22. The site absolutely spiked with what looked to be a near-full Panda recovery. The site jumped up to 75% of its original traffic levels from Google organic.

Temporary Panda Recovery on July 22, 2014

Checking the keywords that surged back, they matched up very well with the keywords from pre-Panda 4.0. There was clearly a Panda update pushed out, although it was hard to say if it was a Panda tremor (minor tweaks) or something larger. It’s worth noting that I saw other sites dealing with Panda 4.0 hits show serious movement on this day. For example, one large site saw almost a full recovery (from a major Panda 4.0 hit).

July 27, 2014 – It was nice while it lasted.
Well, that was fast. It seems yet another Panda tremor came rolling through and the site lost all of its gains. I’ll cover more about that shortly, but it’s important to note that the site dropped back to its post-Panda 4.0 levels. So, the temporary recovery lasted about five days. That’s a tough pill to swallow for the business owner, but taking a look at the situation objectively, it makes a lot of sense.

Second Panda Hit After Temporary Recovery

This situation underscores an important point about Panda recovery. You need to make serious changes in order to see long-term improvement. Band-aids and lack of action will get you nowhere. Or worse, it could yield a misleading, temporary recovery that gets your hopes up, only to come crashing down again. Let’s explore the temporary recovery in more detail.

Temporary Recoveries and Panda Tests
I mentioned earlier that I’ve seen Panda victims experience short bumps in Google organic traffic during the recovery phase. I even documented it in one of my Panda recovery case studies. It’s almost like Google is giving the site a second chance, testing user engagement, analyzing the new traffic, etc. And if it likes what it sees, the recovery could stick. In the case study I just mentioned, the site ended up recovering just a few weeks after the temporary bump occurred.

So, will this website experience a similar recovery? You never know, but I doubt it. The site that ended up recovering long-term made massive changes based on a deep Panda audit. They should have recovered (even quicker than they did in my opinion). The site I just analyzed hasn’t made any changes at all, so I doubt it will recover in its current state.

Key Learnings
I’ll end this post with some key learnings based on what I’ve seen with Panda recovery, tremors, etc. If you are struggling with Panda recovery, or if you are helping others with Panda recovery, then the following bullets are important to understand.

  • Google can, and will, push out minor Panda updates (which I call Panda tremors). Sites can recover during those updates to various degrees. For example, I saw a large-scale Panda 4.0 victim experience a near-full recovery during the July 22 update.
  • Small websites can get hammered by Panda too. I know there’s often a lot of focus on large-scale websites with many pages indexed, but I’ve analyzed and helped a number of small sites with Panda hits. Panda is size-agnostic.
  • When a website stirs up a serious Panda cocktail, it can experience a misleading surge in traffic, followed by a catastrophic Panda attack. Understanding the factors that can lead to a Panda hit is extremely important. You should avoid them like the plague.
  • Be ready for Panda tests. When Google tests your site again, make sure you are ready from a content, ad, and engagement standpoint. Do the right things Panda-wise so you can pass with flying colors. If not, don’t bank on a recovery sticking. It might just be temporary…
  • Once again, I found that deception and trickery contributed to a Panda hit. Cloaked ads, questionable authorship, heavy affiliate linking, and more led to this Panda attack. If you deceive users, expect a visit from the mighty Panda. And no, it probably won’t be pleasant.
  • In some situations, money terms may not be affected by Panda. In this case study, the core money term was not impacted at all. It remained steady throughout the ups and downs. But as documented above, that didn’t stop the site from experiencing a massive drop in Google organic traffic (86%).


Summary: Long-Term Panda Changes = Long-Term Panda Wins
First, I’m glad you made it to the end of this post (I know it was getting long). Second, I hope you found this Panda case study interesting. It was definitely fascinating to analyze. I’ve helped many companies with Panda attacks since February of 2011 and this case had some very interesting aspects to it. As usual, my hope is this situation can help some of you dealing with Panda attacks better understand the fluctuations you are seeing over time. Panda can be a confusing topic for sure.

If there are a few core things you should take away from this post, it’s that temporary recoveries can happen, implementing the right Panda changes over time is extremely important, Google can test your site during the recovery phase, and organic search traffic can come and go like the wind. Just make sure you’re ready when the Panda comes knocking.

GG

 

 

Tuesday, July 22nd, 2014

How To Get More Links, Crawl Errors, Search Queries, and More By Verifying Directories in Google Webmaster Tools

Verify by Directory in Google Webmaster Tools

In my opinion, it’s critically important to verify your website in Google Webmaster Tools (GWT). By doing so, you can receive information directly from Google as it crawls and indexes your website. There are many reports in GWT that can help identify various problems SEO-wise. For example, you can check the crawl errors report to surface problems Googlebot is encountering while crawling your site. You can check the HTML improvements section to view problems with titles, descriptions, and other metadata. You can view your inbound links as picked up by Google (more on that soon). You can check XML sitemaps reporting to view warnings, errors, and the indexed-to-submitted ratio. And you can view indexation by directory via Index Status (forget about using a site: command; Index Status enables you to view your true indexation number).

In addition to the reporting you receive in GWT, Google will communicate with webmasters via “Site Messages”. Google will send messages when it experiences problems crawling a website, when it picks up errors or other issues, and of course, if you’ve received a manual action (penalty). That’s right, Google will tell you when your site has been penalized. It’s just another important reason to verify your website in GWT.

Limit On Inbound Links for Sites With Large Profiles
And let’s not forget about links. Using Google Webmaster Tools, you can view and download the inbound links leading to your site (as picked up by Google). And in a world filled with Penguins, manual actions, and potential negative SEO, it’s extremely important to view your inbound links, and often. Sure, there’s a limit of ~100K links that you can download from GWT, which can be limiting for larger and more popular sites, but I’ll cover an important workaround soon. And that workaround doesn’t just apply to links. It applies to a number of other reports too.

When helping larger websites with SEO, it’s not long before you run into the dreaded limit problem with Google Webmaster Tools. The most obvious limit is with inbound links. Unfortunately, there’s a limit of ~100K links that you can download from GWT. For most sites, that’s not a problem. But for larger sites, that can be extremely limiting. For example, I’m helping one site now with 9M inbound links. Trying to hunt down link problems at the site-level is nearly impossible via GWT with a link profile that large.

Inbound Links in Google Webmaster Tools

 

When you run into this problem, third party tools can come in very handy, like Majestic SEO, ahrefs, and Open Site Explorer. And you should also download your links from Bing Webmaster Tools, which is another great resource SEO-wise. But when you are dealing with a Google problem, it’s optimal to have link data directly from Google itself.

So, how do you overcome the link limit problem in GWT? Well, there’s a workaround that I’m finding many webmasters either don’t know about or haven’t implemented yet – verification by directory.

Verification by Directory to the Rescue
If you’ve been following along, then you can probably see some issues with GWT for larger, complex sites. On the one hand, you can get some incredible data directly from Google. But on the other hand, larger sites inherently have many directories, pages, and links to deal with, which can make your job analyzing that data harder to complete.

This is why I often recommend verifying by directory for clients with larger and more complex websites. It’s a great way to dig deep into specific areas of a website. As mentioned earlier, I’ve found that many business owners don’t even know you can verify by directory!  Yes, you can, and I recommend doing that today (even if you have a smaller site with distinct directories of content you want to monitor). For example, if you have a blog, you can verify the blog subdirectory in addition to your entire site. Then you can view reporting that’s focused on the blog (versus muddying up the reporting with data from outside the blog).

Add A Directory in Google Webmaster Tools

And again, if you are dealing with an inbound links problem, then isolating specific directories is a fantastic way to get granular link data. There’s a good chance the granular reporting by directory could surface new unnatural links that you didn’t find via the site-level reporting in GWT. The good news is that verifying your directories will only take a few minutes. Then you’ll just need to wait for the reporting to populate.

Which Reports Are Available For Directories?
I’m sure you are wondering which reports can be viewed by subdirectory. Well, many are available by directory, but not all. Below, you can view the reports in GWT that provide granular data by directory.

  • Search Queries
  • Top Pages (within Search Queries reporting)
  • Links to Your Site
  • Index Status
  • Crawl Errors (by device type)
  • HTML Improvements
  • Internal Links
  • International Targeting (New!)
  • Content Keywords
  • Structured Data

 

GWT Reporting by Directory – Some Examples

Indexation by Directory
Let’s say you’re having a problem with indexation. Maybe Google has only indexed 60% of your total pages for some reason. Checking the Index Status report is great, but doesn’t give you the information you need to isolate the problem.  For example, you want to try and hunt down the specific areas of the site that aren’t indexed as heavily as others.

If you verify your subdirectories in GWT, then you can quickly check the Index Status report to view indexation by directory. Based on what you find, you might dig deeper to see what’s going on in specific areas of your website. For example, running crawls of that subdirectory via several tools could help uncover potential problems. Are there roadblocks you are throwing up for Googlebot, are you mistakenly using the meta robots tag in that directory, is the directory blocked by robots.txt, is your internal linking weaker in that area, etc? Viewing indexation by directory is a logical first step to diagnosing a problem.
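When you drill into a poorly indexed directory, two of the most common culprits are easy to spot: a meta robots tag accidentally left on the pages in that section, or a robots.txt rule (for example, Disallow: /reviews/ with a hypothetical directory name) blocking Googlebot from crawling it. Finding something like the following in the head of those pages would explain the problem quickly:

```html
<!-- A stray meta robots tag like this on pages within the directory would keep
     them out of Google's index (shown purely as an example of what to look for). -->
<meta name="robots" content="noindex, nofollow">
```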

How To View Index Status by Directory in Google Webmaster Tools

 

Search Queries by Directory
Google Webmaster Tools provides search queries (keywords) that have returned pages on your website (over the past 90 days). Now that we live in a “not provided” world, the search queries reporting is important to analyze and export on a regular basis. You can view impressions, clicks, CTR, and average position for each query in the report.

But checking search queries at the site level can be a daunting task in Google Webmaster Tools. What if you wanted to view the search query data for a specific section instead? If you verify by directory, then all of the search query data will be limited to that directory. That includes impressions, clicks, CTR, and average position for queries leading to content in that directory only.

In addition, the “Top Pages” report will only contain the top pages from that directory. Again, this quickly enables you to home in on the content that’s receiving the most impressions and clicks.

And if you feel like there has been a drop in performance for a specific directory, then you can click the “with change” button to view the change in impressions, clicks, CTR, and average position for the directory. Again, the more granular you can get, the more chance of diagnosing problems.

How To View Search Query Reporting by Directory in Google Webmaster Tools

 

Links by Directory
I started explaining more about this earlier, and it’s an extremely important example. When you have a manual action for unnatural links, you definitely want to see what Google is seeing. For sites with large link profiles, GWT is not ideal. You can only download ~100K links, and those can be watered down by specific pieces of content or sections (leaving other important sections out in the cold).

When you verify by directory, the “links to your site” section will be focused on that specific directory. And that’s huge for sites trying to get a better feel for their link profile, unnatural links, etc. You can see domains linking to your content in a specific directory, your most linked content, and of course, the actual links. And you can download the top ~100K links directly from the report.

In addition, if you are trying to get a good feel for your latest links (like if you’re worried about negative SEO), then you can download the most recent links picked up by Google by clicking the “Download latest links” button.  That report will be focused on the directory at hand, versus a site-level download.

I’m not saying this is perfect, because some directories will have far more than 100K links. But it’s much stronger than simply downloading 100K links at the site level.

How To View Inbound Links by Directory in Google Webmaster Tools

 

Crawl Errors By Directory
If you are trying to analyze the health of your website, then the Crawl Errors reporting is extremely helpful to review. But again, this can be daunting with larger websites (as all pages are reported at the site-level). But if you verify by directory, the crawl errors reporting will be focused on a specific directory. And that can help you identify problems quickly and efficiently.

In addition, you can view crawl errors reporting by Google crawler. For example, Googlebot versus Googlebot for Smartphones versus Googlebot-mobile for Feature Phones. By drilling into crawl errors by directory, you can start to surface problems at a granular level. This includes 404s, 500s, Soft 404s, and more.

How To View Crawl Errors by Directory in Google Webmaster Tools

Summary – Get Granular To View More Google Webmaster Tools Data
Verifying your website in Google Webmaster Tools is extremely important on several levels (as documented above). But verifying by directory is also important, as it enables you to analyze specific parts of a website on a granular basis. I hope this post convinced you to set up your core directories in GWT today.

To me, it’s critically important to hunt down SEO problems as quickly as possible. The speed at which you can identify, and then rectify, those problems can directly impact your overall SEO health (and traffic to your site). In addition, analyzing granular reporting can help surface potential problems in a much cleaner way than viewing site-wide data. And that’s why verifying subdirectories is a powerful way to proceed (especially for large and complex sites).  So don’t hesitate. Go and verify your directories in Google Webmaster Tools now. More data awaits.

GG

 

 

Monday, July 14th, 2014

Panda, Penguin, and Manual Actions – Questions, Tips, and Recommendations From My SES Atlanta Session

SES Atlanta Panda

{Important Update About Penguin: Read John Mueller’s latest comments about the Penguin algorithm.}

I just returned from SES Atlanta, where I presented “How To Avoid and Recover From Panda, Penguin, and Manual Actions”. The conference was outstanding, and included a killer keynote by Duane Forrester along with sessions packed with valuable information about SEO and SEM. By the way, I entered my hotel room in Atlanta and immediately saw a magazine on the desk. The photo above is the cover of that magazine! Yes, a Panda was on the cover. You can’t make this stuff up. :)

During (and after) my presentation about algorithm updates and penalties, I received a number of outstanding questions from audience members. And later in the day, I led a roundtable that focused on Panda and Penguin. There were also some great conversations during the roundtable with business owners and marketers across industries. It’s always interesting to hear marketers’ top concerns about major algorithm updates like Panda and Penguin (especially Panda 4.0, which had just rolled out in late May). We had a lively conversation for sure.

On the plane flight home, I started thinking about the various questions I was asked, which areas were the most confusing for marketers, and the tips and recommendations I was sharing.  And based on that list, I couldn’t help but think a Q&A style blog post could be very helpful for others dealing with Panda, Penguin, and manual actions. So, I decided to write this post covering a number of those questions. I can’t cover everything that I spoke about at SES Atlanta (or this post would be huge), but I can definitely provide some important tips and recommendations based on questions I received during the conference.  Let’s jump in.

Algorithm Updates and Manual Actions – Q&A From SES Atlanta

Question: I’ve been hit by Panda 4.0. What should I do with “thin content” or “low-quality” content I find on my website?  Is it better to nuke the content (404 or 410), noindex it, or should I redirect that content to other pages on my site?

Glenn: I hear this question often from Panda victims, and I know it’s a confusing topic. My recommendation is to remove thin and low-quality content you find on your site. That means 404 or 410 the content or noindex the content via the meta robots tag. When you have a content quality problem on your site, you need to remove that content from Google’s index. In my experience with helping companies recover from Panda, this has been the best path to take.
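To make the noindex option concrete, this is the meta robots tag you would add to the head of each thin page you want removed from the index (leave it in place until Google has recrawled the pages and dropped them):

```html
<!-- Keeps the page available to users, but asks Google to drop it from the index.
     If the page should go away entirely, return a 404 or 410 status code instead. -->
<meta name="robots" content="noindex">
```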

That said, if you find content that’s thin, but you feel you can enhance that content, go for it. If you believe the content could ultimately hold information that people are searching for, then beef it up. Just make sure you do a thorough job of developing the additional content. Don’t replace thin content with slightly thin content. Create killer content instead. If you can’t, then reference my first point about nuking the content.

Also, it’s important to ensure you are removing the right content… I’ve seen companies nuke content that was actually fine thinking it was low-quality for some reason. That’s why it’s often helpful to have an objective third party analyze the situation. Business owners and marketers are often too close to their own websites and content to objectively rate it.

Panda Decision Matrix

 

Question: How come I haven’t seen a Panda recovery yet even though I quickly made changes? I was expecting to recover during the next Panda update once the changes were implemented.

Glenn: This is another common question from Panda victims. It’s important to understand that completing the changes alone isn’t enough. Google first needs to recrawl the site and the changes you implemented.  Then it needs to better understand user engagement based on the changes. I’ve explained many times in my blog posts about Panda that the algorithm is heavily focused on user engagement. So just making changes on your site doesn’t provide Google enough information.

Panda recovery can take time. Just read my case study about 6 months with Panda. That was an extreme situation in my opinion, but it’s a great example of how long it can take to recover.

Second, Panda rolls out roughly once per month. You need an update to occur before you can see changes. But that’s not a hard rule. John Mueller from Google clarified the “Panda Tremors” I have been seeing since Panda 4.0, and explained that there isn’t a fixed frequency for algorithm updates like Panda. Instead, Google can continue to tweak the algo to ensure it yields the desired results. Translation: you might see turbulence after a Panda hit (and you may see increases or decreases as the tremors continue).

Panda Tremors John Mueller

And third, you might see smaller recoveries over time during subsequent updates (versus a full recovery in one shot). I’ve had several clients increase with subsequent Panda updates, but it took 4-5 updates for them to fully recover. So keep in mind that you might not see full recovery in one shot.

 

Question:  We know we have an unnatural links problem, and that we were hit by Penguin, but should we tackle the links problem or just build new links to balance out our link profile?

Glenn: I’ve seen many companies that were hit by Penguin avoid tackling the root problem, and instead, just try and build new links to balance out their link profile. In my opinion, that’s the wrong way to go. I always recommend aggressively handling the unnatural links situation, since that’s the most direct path to Penguin recovery.

And to clarify, you should still be pumping out killer content, using Social to get the word out, etc. I always tell clients impacted by Penguin or Panda to act like they aren’t impacted at all. Keep driving forward with new content, sharing via social media, connecting with users, etc. Fresh links and shares will be a natural side effect, and can help the situation for sure. And then the content they are building while under the Penguin filter could end up ranking well down the line. It’s hard to act like you’re not hit, but that’s exactly what you need to do. You need to be mentally tough.

Address Unnatural Links for Penguin

 

Question: Is it ok to remove content from Google’s index? Will that send strange signals to the engines?

Glenn: Nuke it. It’s totally fine to do so, and I’ll go even further and say it could be a great thing to do. I mentioned this several times in my Panda 4.0 findings, but the right indexation is more important than high indexation. In other words, make sure Google has your best content indexed, and not thin, duplicate, or other low-quality content.

I had one client drop their indexation by 83% after being impacted by Phantom and Panda, and they are doing extremely well now Google organic-wise. I love the screenshot below. It goes against what many marketers would think. Lower indexation = more Google traffic. That’s awesome.

Indexation and Panda

 

Question: We consume a lot of syndicated content. What’s the best way to handle attribution?

Glenn: I saw a number of sites get smoked during Panda 4.0 that were consuming a lot of syndicated content and not handling that properly SEO-wise. The best way to handle attribution for syndicated content is to use the cross-domain canonical URL tag pointing to the original article. If you can’t do that (or don’t want to do that), then you can keep the content out of Google’s index by noindexing it via the meta robots tag.
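For example, the head of the syndicated copy would include a canonical tag pointing back to the original article (the URL below is hypothetical):

```html
<!-- Placed in the <head> of the syndicated copy. The cross-domain canonical tells
     Google that the original publisher's URL is the one that should be indexed
     and credited (the URL is made up for illustration). -->
<link rel="canonical" href="http://www.original-publisher.com/articles/original-story/">
```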

It’s not your content, so you shouldn’t be taking credit for it.  That said, if set up correctly, it’s fine to have syndicated content on your site for users to read. But the proper attribution is important or it can look like you are copying or scraping content. I know that won’t go over well for ad teams looking to rank in organic search (to gain more pageviews), but again, it’s not your content to begin with.

Syndication and Panda

 

Question: Why hasn’t there been a Penguin update since October of 2013? What’s going on? And will there ever be another update?

Glenn: It’s been a long time since the last Penguin update (October 4, 2013). Like many others heavily involved with Penguin work, I’m surprised it has taken so long for another update.

Penguin 2.1 on October 4, 2013

Matt Cutts recently explained at SMX Advanced that they have been heavily working on Panda 4.0, so Penguin has taken a back seat. But he also said that an engineer came up to him recently and said, “it’s probably time for a Penguin update”. That situation is both positive and scary at the same time.

On the one hand, at least someone is thinking about Penguin on the webspam team! But on the flip side, they clearly haven’t been focusing on Penguin for some time (while many Penguin victims sit waiting for an update). On that note, there are many webmasters who have rectified their unnatural link problems, disavowed domains, urls, etc., and are eagerly awaiting a Penguin update. It’s not exactly fair that Google has been making those business owners wait so long for Penguin to roll out again.

Now, there’s always a possibility that there is a problem with the Penguin algorithm. Let’s face it, there’s no reason it should take so long in between updates. I’m wondering if they are testing Penguin and simply not happy with the results. If that’s the case, then I could see why they would hold off on unleashing a new update (since it could wreak havoc on the web). But that’s just speculation.

In my opinion, it’s not cool to let Penguin victims that have worked hard to fix their link problems sit in Penguin limbo. So either Google is seriously punishing them for the long-term, they have put the algo on the back burner while focusing on other algos like Panda, or Penguin is not up to par right now. Remember, if Google isn’t happy with the results, then they don’t have to push it out. And if that’s the case, Penguin victims could sit in limbo for a long time (even longer than the 9 months they have waited so far.)  Not good, to say the least.


Important Penguin Update: Google’s John Mueller provided more information about the Penguin algorithm on today’s Webmaster Central Office Hours Hangout.

John was asked if Penguin would be released again or if it was being retired. And if it was being “retired”, would Google at least run it one more time to free those webmasters that had cleaned up their link profiles? John explained that Penguin was not being retired. Let me say that again. He said Penguin is not being retired. John explained that it can sometimes take longer than expected to prepare the algorithm and update the necessary data. He also explained that if Google were to retire an algorithm, then they would “remove it completely” (essentially removing any effect from the algorithm that was in place).

So we have good news on several fronts. Penguin is still alive and well. And if Google did retire the algo, then the effect from Penguin would be removed. Let’s hope another Penguin update rolls out soon.

You can view the video below (starting at 5:16) or you can watch on YouTube -> https://www.youtube.com/watch?v=8r3IIPCHt0E&t=5m16s

 

Question: We’ve been hit by both Panda and Penguin. We don’t have a lot of resources to help with recovery, so which one do we tackle first?

Glenn: I’ve helped a number of companies with Pandeguin problems over the years, and it’s definitely a frustrating situation for business owners. When companies don’t have resources to tackle both situations at the same time, then I’ve always been a big fan of tackling the most acute situation first, which is Penguin.

Pandeguin Hit

Panda is a beast, and has many tentacles. And Penguin is all about unnatural links (based on my analysis of over 400 sites hit by Penguin since April 24, 2012). That’s why I recommend focusing on Penguin first (if you can’t knock out both situations at once). I recommend aggressively tackling unnatural links: remove as many spammy links as you can, and then disavow the remaining ones you can’t get to manually. Then set up a process for monitoring your link profile over time (to ensure new unnatural links don’t pop up).
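For reference, the disavow file you upload via Google’s Disavow Links tool is just a plain text file with one domain or URL per line (the domains below are made up for illustration):

```
# Hypothetical disavow file example.
# Lines starting with # are comments and are ignored by Google.
# Disavow every link from an entire domain:
domain:spammy-directory-example.com
# Disavow an individual URL:
http://low-quality-forum-example.com/profile.php?id=12345
```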

After that, you can tackle the Panda problem. I would begin with a comprehensive Panda audit, identify the potential problems causing the Panda hit, and aggressively attack the situation (the bamboo). Move quickly and aggressively. Get out of the gray area of Panda (it’s a maddening place to live).

 

Question: Is linkbuilding dead? Should I even focus on building links anymore and how do I go about doing that naturally?

Glenn: Links are not dead! The right links are even more important now. I know there’s a lot of fear and confusion about linkbuilding since Google has waged war on unnatural links, but to me, that makes high quality links even more powerful.

Duane Forrester wrote a post recently on the Bing Webmaster Blog where he explained that if you know where a link is coming from prior to gaining that link, then you are already going down the wrong path. That was a bold statement, but I tend to agree with him.

Duane Forrester Quote About Linkbuilding

I had several conversations about this topic at SES Atlanta. To me, if you build killer content that helps your target audience, that addresses pain points, and teaches users how to accomplish something, then there’s a good chance you’ll build links. It’s not the quantity of links either… it’s the quality. I’d rather see a client build one solid link from a site in their niche versus 1000 junky links. The junky links are Penguin food, while the solid link is gold.

 

Question: I was hit by Panda, but my core competitors have the same problems we do. We followed what they were implementing, and we got hit. Why didn’t they get hit? And moving forward, should we follow others that are doing well SEO-wise?

Glenn: I can’t tell you how many times companies contact me and start showing me competitors that are doing risky things SEO-wise, yet those sites are doing well in Google. They explain that they tried to reproduce what those competitors were doing, and then they ended up getting hit by Panda. That situation reinforces what I’ve told clients for a long time. Competitive analyses can be extremely beneficial for gathering the right intelligence about your competitors, but don’t blindly follow what they are doing. That’s a dangerous road to travel.

Instead, companies should map out a strong SEO strategy based on their own research, expertise, target audience, etc. Ensure you are doing the right things SEO-wise for long-term success. Following other companies blindly is a dangerous thing to do. They could very easily be headed towards SEO disaster and you’ll be following right along.

For example, I had a client always bring up one specific company to me that was pushing the limits SEO-wise (using dark grey hat tactics). Well, they finally got hit during a Panda update in early 2014 and lost a substantial amount of traffic. I sent screenshots to my client which reinforced my philosophy. My client was lucky they didn’t follow that company’s tactics… They would have jumped right off the SEO cliff with them. The screenshot below shows an example of a typical surge in Google before a crash.

Surge in Traffic Before Algo Hit

 

Question: We’ve been working hard on a manual action for unnatural links, but right before filing reconsideration, it expired. What should we do?

Glenn: I’ve seen this happen with several clients I was helping with manual actions. It’s a weird situation for sure. You are working on fixing problems based on receiving a manual action, and right before you file a reconsideration request, the manual action disappears from Google Webmaster Tools. When that happens, is the site ok, do you still need to file a reconsideration request with Google, should you wait, or should you continue working on the manual action?

It’s important to know that manual actions do expire. You can read the article by Marie Haynes on expiring manual actions for more information. Google has confirmed this to be the case (although the length of each manual action is variable). But those manual actions can return if you haven’t tackled the problem thoroughly… So don’t think you’re in the clear so fast.

Expiring Manual Actions

 

That said, if you have tackled the problem thoroughly, then you are probably ok. For example, I was helping a company with a manual action for unnatural links and we had completed the process of removing and disavowing almost all of their unnatural links. We had already written the reconsideration request and were simply waiting on a few webmasters that were supposed to take down more links before filing with Google.

As we were waiting (just a few extra days), the manual action disappeared from Google Webmaster Tools. Since we did a full link cleanup, we simply drove forward with other initiatives. That was months ago and the site is doing great SEO-wise (surging over the past few months).

Just make sure you thoroughly tackle the problem at hand. You don’t want a special surprise in your manual action viewer one day… which would be the return of the penalty. Avoid that situation by thoroughly fixing the problems causing the penalty.

 

Summary – Clarifying Panda and Penguin Confusion
As you can see, there were some outstanding and complex questions asked at SES Atlanta. It confirms what I see every day… that business owners and webmasters are extremely confused with algorithm updates like Panda and Penguin and how to tackle penalties. And when you combine algo updates with manual actions, you have the perfect storm of SEO confusion.

I hope the Q&A above helped answer some questions you might have about Panda, Penguin, and manual actions. And again, there were several more questions asked that I can’t fit into this post! Maybe I’ll tackle those questions in another post. So stay tuned, subscribe to my feed, and keep an eye on my Search Engine Watch column.

And be prepared, I felt a slight chill in the air this past weekend. The next Penguin update could (and should) be arriving soon. Only Google knows, but I hope they unleash the algo update soon. Like I said in my post, there are many webmasters eagerly awaiting another Penguin rollout. Let’s hope it’s sooner than later.

GG

 

Tuesday, June 17th, 2014

Panda 4.0 Case Study – Thin Content, Deception, Mobile Redirects, and The Danger of the Wrong Content Strategy

Panda 4.0 and The Wrong Content Strategy

After Panda 4.0 rolled out, I analyzed many cases of both strong recoveries and severe fresh hits.  Based on analyzing over 40 websites hit by P4.0, I wrote two blog posts detailing my findings.  You can find my initial findings on my blog and then additional P4.0 findings in my Search Engine Watch column. I recommend reading those posts in addition to this case study to get a stronger feel for Panda 4.0, what it targeted, examples of sites it impacted, etc. Note, I’ve now analyzed over 50 websites impacted by Panda 4.0 and I plan to write more posts in the coming weeks.  Stay tuned.

As I explained in my posts about Panda 4.0, I’ve unfortunately seen a number of serious hits.  For example, companies seeing a massive drop in Google organic traffic (60%+).  That’s a horrible situation for sure, and many of those companies didn’t see Panda coming. They were blindsided on May 20 and have been working hard ever since to determine why they became Panda victims.

A Severe Panda 4.0 Hit:
Panda 4.0 Loss of Traffic

Although many companies are blindsided by Panda, you might be wondering if any received fair warning that Panda would strike. That’s typically not the case, which is why one situation stands out from the rest for me. You see, one company had a month’s warning that the mighty Panda would be paying a visit. They weren’t warned by Google, Matt Cutts, or John Mueller; instead, I told them. I’ll explain more about that shortly, including why I was nearly 100% sure they would get hit once I quickly reviewed their website.

In this post, I’ll cover the situation leading up to the Panda hit, what the company was doing wrong, the impact from Panda 4.0, and provide some final recommendations for companies looking to build a strong content strategy.

The Warning
About a month before Panda 4.0 rolled out, I spoke with a company that was looking to expand its SEO efforts. Specifically, they wanted to continue driving more organic search traffic to their site in order to boost mobile app installs (since they had seen a nice uptick from Google organic recently). Upon digging into the site, their current content strategy, keywords leading to the site, landing pages from organic search, etc., I was shocked by what I found. And shocked in a bad way, not a good one.

Based on all the algorithm update work I do, I’ve become hypersensitive to certain website characteristics. For example, spotting unnatural links, thin content, technical problems causing website issues, severe duplicate content, copyright violations, etc.  So when I checked out this website, and specifically where they were driving visitors and how they were handling those visits, I almost fell out of my seat.

To me, the site was a giant piece of scrumptious bamboo. It was as clear as day. They were teed up to get smoked by the mighty Panda, but had no idea yet.

This reminded me of the situation I ran into last year when I started helping a company that was teed up to get hit by Penguin.  Upon finding a serious unnatural links problem, we worked hard to race Penguin by tackling the situation fully.  And we ended up winning the race, as Penguin rolled out and the company did not get hit.  Well, here I was again…  predicting an attack from another black and white animal from the Google Zoo before it was unleashed on the web.

I would typically jump at the chance to help a company thwart a Panda attack, but there were two problems with this specific situation.  First, I had absolutely no time in my schedule to help them. And second, they were extremely unfamiliar with SEO, Google algorithm updates, Google Webmaster Guidelines, etc., so everything I was explaining to them was foreign.  I got the feeling they weren’t too excited about making serious changes (especially when traffic was increasing steadily from Google).

Again, they wanted to expand SEO, not reengineer their current strategy. So I explained how Panda works, how it rolls out monthly, and that I was 99.9% certain they were going to get hit. We had a good conversation about their current situation, but again, I had no time to help them. After we got off the call, I’m sure they were scratching their heads wondering what to do, while I had a horrible feeling they would experience a serious Panda hit.

Why Change When Google Organic Traffic is Surging?
The Traffic Surge Before Algorithm Updates Strike

Content Strategy and Inviting Panda to Dinner
As I explained earlier, the company had seen a spike in Google organic traffic, based on a new content strategy. I began quickly reviewing their rankings, the landing pages receiving Google organic traffic, their trending over time, etc. And again, what I found was shocking.

The company’s goal was to drive prospective customers to their site, only to drive them to their mobile apps in the Google Play Store or the Apple App Store (to download the apps). That sounds fine, but the devil is in the details.

The site had approximately 15K pages indexed and almost all of them were extremely thin, content-wise. The pages consisted of a thumbnail image, a title, and no other content.  For users on desktop, the thumbnail images sometimes linked downstream to videos that the company didn’t own (and were located on a third party site). The company had licensing deals in place to play the video clips in their mobile app, but not on the website. And sometimes the thumbnail images didn’t link to anything at all (the page was a dead end). I’ll cover the mobile experience soon, which was also extremely problematic.

Low Quality Content Wireframe and Flow

So, you had a lot of thin content, and a serious downstream problem. Engagement had to be horrible for most visitors, and the users that did choose to engage were driven off the site quickly. If you’ve read my posts about Panda in the past, you know that poor user engagement is a giant invite to the mighty Panda. But combining crazy thin content with poor user engagement is like breaking into the Panda’s house, drugging him, stuffing him into your SUV, dropping him on your website full of bamboo, and waking him up with a taser gun. It won’t end well, and it didn’t for this company.

It Gets Worse – Mobile Traffic
Understanding that this company’s focus was mobile, I decided to check how the site handled mobile visitors. It was hard to believe the situation could get worse, but it did. The desktop situation alone was sure to lead to a Panda hit (or even a manual action), and things deteriorated further as I dug in from a mobile standpoint. After checking the mobile situation across multiple devices, I found it extremely risky on several levels.

I noticed that as soon as I clicked through to the website from the search results, the site automatically redirected me to either the Google Play Store or the Apple App Store. So I didn’t even get a chance to view anything on the site.  Yes, visitors wouldn’t even land on the site, but instead, mobile traffic (including Google organic traffic) was being immediately redirected to the app stores. They were basically forcing users to download the app.
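To illustrate the pattern (this is a generic sketch of the anti-pattern described above, not the company’s actual code), the logic amounted to sniffing the user agent on page load and immediately sending every mobile visitor off the site to an app store listing:

```html
<script>
  // Anti-pattern sketch: redirecting all mobile visitors off the site to an
  // app store listing before they see any content. The detection logic and
  // URLs are illustrative only.
  var ua = navigator.userAgent;
  if (/iPhone|iPad|iPod/i.test(ua)) {
    window.location.href = 'https://itunes.apple.com/app/id000000000';
  } else if (/Android/i.test(ua)) {
    window.location.href = 'https://play.google.com/store/apps/details?id=com.example.app';
  }
</script>
```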

Mobile Redirects Wireframe and Flow

Talk about risky? They were just asking Google to punish them… On that note, I mentioned the Panda recipe of death in my last Search Engine Watch column. Well, this was a high-octane version of the cocktail. You can substitute grain alcohol for rum in this version of the drink.

A High-Octane Panda Cocktail

Thousands of thin pages were indexed, those pages were ranking well, desktop users spent very little time on the site, and mobile traffic was immediately redirected off the site to the app stores. There was no engagement, no strong content, and no real focus from a content strategy standpoint. In addition, the site was clearly providing boatloads of content for the sole purpose of ranking in Google to attract users searching for popular terms (with the hope of getting some of them to download their apps). Like I said earlier, the site was teed up to get smoked by Panda.

A Note About Google and Smartphone Visitors
Google is already getting tough on websites in the smartphone rankings by demoting sites that implement faulty redirects.  And now they will provide a message in the search results in addition to providing a demotion.  Can you tell Google wants to ensure its users have a positive experience on their phones?

Google and Smartphone Demotions

But the redirects I was seeing on this site were even worse than standard faulty redirects… These redirects took users automatically off the site, basically forcing them to download the mobile apps to view content. Not good, to say the least.

No Time To Help, But A Smart Recommendation
Back to the company needing SEO help. So, I spoke with one of the founders and explained that I didn’t have much availability at the time. But I clearly explained the problems I was seeing. I introduced Panda, I explained how Google was becoming tougher on sites from a mobile standpoint, and I explained how Google wants to protect its users from deception and poor user experience.

I basically explained that their surge in Google organic traffic was going to be short-lived. On that note, you can read my post about the sinister and misleading surge in Google organic traffic before algo updates to learn more about that phenomenon. I knew with almost 100% certainty that their site was going to get hit, and sooner than later. It was obvious to me after analyzing many sites impacted from Panda since February of 2011.

So the call ended and I told them I would get in touch if time opened in my schedule (which it hasn’t, ironically due to the very algorithm update that hit their website). Panda 4.0 rolled out on May 20, 2014, and as I’ve documented in my posts about the update, it was huge. Many websites got smoked during P4.0, and it sure looks like this company was one of the casualties.

Detailing The Panda 4.0 Hit
Checking SEMRush and SearchMetrics for the website at hand, I saw a distinct dip after Panda 4.0 rolled out. And checking both traffic and keyword trending, I could see more Panda tremors in the weeks following Panda 4.0 (which I have seen consistently across sites impacted by P4.0).

Then I started checking the various keywords the site used to rank for, and lo and behold, the site was not ranking at all anymore. Actually, I couldn’t find the site ranking anywhere for those keywords (even when I added the domain name to the search!) Google had indeed smoked the site.

Lost Keywords After Panda 4.0

I tested this across both desktop and mobile and could not find the site ranking at all. So either Panda 4.0 took care of the situation, or they’ve been hit with a manual action. I can’t tell for sure since I don’t have access to the company’s Google Webmaster Tools account (remember, I didn’t have time to help them originally). But the site saw a big drop right around May 20 and has seen subsequent Panda tremors since then. It sure looks like Panda 4.0 to me.

As of Friday, June 13, the site still had thousands of thin pages indexed, and the mobile redirects were still in place. But then Saturday, June 14 revealed a big change.  All of the pages must have been taken down the day prior. The company must have seen the impact from Panda 4.0 and decided to nuke all of the thin content, the mobile redirects, etc. I wish they had done that when I first told them to.  :)

All Thin Pages Removed From Google After Panda Hit

So I’m not sure what the company has planned moving forward, but it’s pretty clear that their previous content strategy took them down a very dangerous path. A path filled with SEO mines and lots of bamboo. They have their work cut out for them if they want to recover, which is a horrible place to be for a startup.

Panda Lessons – Content Strategy and SEO
The term “content strategy” gets thrown around a lot in marketing departments (especially over the past few years). But it should not be taken lightly. Great content is critically important from an SEO standpoint. It’s the reason the mighty Panda was created by Google in the first place. If you don’t have the chops to create unique, killer content, then SEO is probably not for you.

Hey, there is always AdWords, Bing Ads, and Facebook Ads if SEO isn’t your thing. Those can be viable solutions, but you’ll pay for every visitor. That doesn’t sound great, but it’s better than artificially boosting your Google organic traffic in the short-term, only to burn your domain by breaking the rules.

Here’s an important recommendation for any company looking to increase quality traffic from SEO. If you are going to craft a strong content strategy, then absolutely get an experienced SEO involved from the beginning.  In today’s Google environment, it’s way too risky to go down the wrong path, test the Panda waters, only to get smoked by an algorithm update.

If you do, you’ll have months of recovery to deal with. And you would have wasted time, money, and resources on a plan that’s destined to fail. I’ve seen too many companies go down this path and then call me after it’s too late. Those calls are tough… there’s a lot of silence on the other end of the line when I explain what actually happened. Avoid situations like that at all costs.

Tips For Developing The Right Content Strategy
I wanted to end this post with some recommendations for companies that are new to SEO, but want to develop a strong content strategy.  The bullets below are merely a starting point, but they are still incredibly important.

  • Research is Key
    Don’t rush into producing content. Complete the necessary research first. Understand your niche, your target audience, and what people are searching for. It’s a great starting point.
  • Competitive Analysis
    Competitive intelligence is also extremely important.  Fully analyze your competition, what they are ranking for, the landing pages receiving traffic, their overall content strategy, etc. You can glean important insights from what is already out there… and who you are competing with.
  • Brainstorming is Good
    Buy a whiteboard and brainstorm often. For companies I help with content strategy, I love facilitating brainstorming sessions based on data. I’ve always said that if you get your top people in a conference room with a whiteboard and start brainstorming ideas, you’ll have a year’s worth of content mapped out. And add data from the first two bullets above and you’ll be in even better shape.
  • Add An Experienced SEO to the Mix
    Hire an SEO to help develop and then review your content strategy. I don’t care if that’s an in-house SEO, consultant, or agency, but definitely have one involved. Professional SEOs will understand how to effectively research a niche, a target audience, and the competition. In addition, they will be up to speed on Google’s latest algorithms, manual actions, and guidelines. It’s like an insurance plan for your website. If you avoid this step, then proceed at your own peril.
  • Continual Analysis
    Companies should continually analyze their efforts to understand the true impact of a content strategy. For example, analyze organic search traffic, referral traffic, linkbuilding, conversion, revenue, social following growth, etc. The beautiful part about digital marketing is the speed at which you can change. If something isn't working, and change is needed, then you can quickly turn on a dime and go down another path. Continually learn from your successes and failures.  That's how you'll succeed.

Summary – The Right Content Strategy Can Help You Avoid Panda
After reading this case study, I hope you understand the risks involved with rolling out the wrong content strategy. In the Google world we live and play in, the wrong strategy doesn’t just impact short-term organic search traffic. It can also lead to an algorithm hit or a manual action. And if that happens, you’ll have months of recovery work in front of you.

And the time, money, and resources you'll waste on recovery work could have been used to drive more targeted traffic to the site. And by the way, there's no guarantee you'll recover from Panda quickly.  Like I said earlier, avoid this situation at all costs.

Start with the right content strategy, think about users, produce killer content, and avoid the mighty Panda.  Good luck.

GG

 

Friday, May 23rd, 2014

Panda 4.0 Analysis | Nuclear Option Rewarded, Phantom Victims Recover, and Industry Experts Rise

Panda 4.0 Rolls Out

On May 20th, 2014 Google’s Matt Cutts announced that Panda 4.0 was rolling out.  Leading up to that tweet, there was a lot of chatter across the industry about an algorithm update rolling out (based on reports of rankings volatility and traffic gains/losses).  I was also seeing lots of movement across clients that had been impacted by previous algorithm updates, while also having new companies contact me about massive changes in rankings and traffic.  I knew something serious was happening, but didn’t know exactly what it was.  I thought for a while that it could be the pre-rollout and testing of Penguin, but it ended up being a new Panda update instead.

Matt Cutts Announces Panda 4.0

When Panda 4.0 was officially announced, I had already been analyzing sites seeing an impact (starting on Saturday May 17th, 2014).  I was noticing major swings in rankings and traffic with companies I’ve been helping with previous algo trouble.  And like I said above, several companies started reaching out to me via email about new hits starting that weekend.

And I was glad to hear a confirmation from Matt Cutts about Panda 4.0 rolling out.  That enabled me to hone my analysis.  I’ve mentioned in the past how unconfirmed Panda updates can drive webmasters insane.  When you have confirmation, it’s important to analyze the impact through the lens of a specific algorithm update (when possible).  In other words, content quality for Panda, unnatural links for Penguin, ad ratio and placement for Top Heavy, etc.

And by the way, since Google named this update Panda 4.0, we must assume it’s a new algorithm.  That means new factors could have been added or other factors refined.  Needless to say, I was eager to dig into sites that had been impacted to see if I could glean any insights about our new bamboo-eating friend.

Digging into the Panda 4.0 Data (and the Power of Human Barometers)
I’ve written before about the power of having access to a lot of Panda data.  For example, working with many sites that had been previously impacted by Panda.  It’s often easier to see unconfirmed Panda updates when you can analyze many sites impacted previously by the algorithm update.  I’ve helped a lot of companies with Panda hits since February of 2011 when Panda first rolled out.  Therefore, I can often see Panda fluctuations, even when those updates aren’t confirmed.  That’s because I can analyze the Panda data set I have access to in addition to new companies that reach out to me after getting hit by those Panda updates.  The fresh hits enable me to line up dates with Panda recoveries to better understand when Google rolls out unconfirmed updates.   I’ve documented several of the unconfirmed updates here on my blog (in case you wanted to go back and check the dates against your own data).

So, when Google announced Panda 4.0, I was able to quickly start checking all the clients I have helped with Panda recovery (in addition to the ones I was already seeing jump in the rankings).  And it didn’t take long to see the impact.  A number of sites were clearly being positively impacted by P4.0.

Panda 4.0 Recovery

Then, I analyzed new sites that were negatively impacted, based on those companies reaching out to me after getting hit (starting on 5/17/14).  Together, I have been able to analyze a boatload of Panda 4.0 data.  And it’s been fascinating to analyze.

I have now analyzed 27 websites impacted by Panda 4.0.  The sites I analyzed ranged from large sites receiving a lot of Google Organic traffic (1M+ visits per month) to medium-sized ecommerce retailers and publishers (receiving tens of thousands of visits per month) to niche blogs focused on very specific topics (seeing 5K to 10K visits per month).  It was awesome to be able to see how Panda 4.0 affected sites across industries, categories, volume of traffic, etc.  And as usual, I was able to travel from one Panda 4.0 rabbit hole to another as I uncovered more sites impacted per category.

 

What This Post Covers – Key Findings Based on Heavily Analyzing Websites That Were Impacted by Panda 4.0
I can write ten different posts about Panda 4.0 based on my analysis over the past few days, but that's not the point of this initial post.  Instead, I want to provide some core findings based on helping companies with previous Panda or Phantom hits that recovered during Panda 4.0.  Yes, I said Phantom recoveries. More on that soon.

In addition, I want to provide findings based on analyzing sites that were negatively impacted by Panda 4.0.  The findings in this post strike a nice balance between recovery and negative impact.  As many of you know, there’s a lot you can learn about the signature of an algorithm update from fresh hits.

Before I provide my findings, I wanted to emphasize that this is simply my first post about Panda 4.0.  I plan to write several additional posts focused on specific findings and scenarios.  There were several websites that were fascinating to analyze and deserve their own dedicated posts.  If you are interested in learning about those cases, then definitely subscribe to my feed (and make sure you check my Search Engine Watch column).  There's a lot to cover for sure.  But for now, let's jump into some Panda 4.0 findings.

 

Panda 4.0 Key Findings

The Nuclear Option – The Power of Making Hard Decisions and Executing
When new companies contact me about Panda, they often want to know their chances of recovery.  My answer sometimes shocks them.  I explain that once the initial audit has been completed, there will be hard decisions to make.  I’m talking about really hard decisions that can impact a business.

Beyond the hard decisions, they will need to thoroughly execute those changes at a rapid pace (which is critically important).  I explain that if they listen to me, make those hard decisions, and execute fully, then there is an excellent chance of recovery.  But not all companies make hard decisions and execute thoroughly.  Unfortunately, those companies often sit in the grey area of Panda, never knowing how close they are to recovery.

Well, Panda 4.0 reinforced my philosophy (although there were some anomalies which I’ll cover later).  During P4.0, I had several clients recover that implemented HUGE changes over a multi-month period.  And when I say huge changes, I’m talking significant amounts of work.  One of my Panda audits yielded close to 20 pages of recommendations in Word.  When something like that is presented, I can tell how deflated some clients feel.  I get it, but it’s at that critical juncture that you can tell which clients will win.  They either take those recommendations and run, or they don’t.

To give you a feel for what I’m talking about, I’ve provided some of the challenges that those clients had to overcome below:

  • Nuking low-quality content.
  • Greatly improving technical SEO.
  • Gutting over-optimization.
  • Removing doorway pages.
  • Addressing serious canonicalization problems.
  • Writing great content. Read that again. :)
  • Revamping internal linking structure and navigation.
  • Hunting down duplicate content and properly handling it.
  • Hunting down thin content and noindexing or nuking it.
  • Removing manual actions (yep, I’ve included this here).
  • Stopping the scraping of content and removing content that had already been scraped.
  • Creating mobile-friendly pages or going responsive.
  • Dealing with risky affiliate marketing setups.
  • Greatly increasing page speed (and handling bloated pages, file size-wise).
  • Hunting down rogue, risky pages and subdomains and properly dealing with that content.
  • And in extreme cases, completely redesigning the site. And several of my clients did just that. That’s the nuclear option by the way.  More about that soon.
  • And even more changes.

Now, when I recommend a boatload of changes, there are various levels of client execution. Some clients implement 75% of the changes, while some can only implement 25%.  As you can guess, the ones that execute more have a greater chance at a quicker recovery.

But then there are those rare cases where clients implement 100% of the changes I recommend.  And that’s freaking awesome from my standpoint.  But with massive effort comes massive expectations.  If you are going to make big changes, you want big results.  And unfortunately, that can take time.

Important Note: This is an incredibly important point for anyone dealing with a massive Panda or Penguin problem.  If you've been spamming Google for a long time (years) by providing low-quality content that's over-optimized, using doorway pages to gain Google traffic, etc., then you might have to wait a while after changes have been implemented.  John Mueller is on record saying you can expect to wait six months or longer to see recovery.  I don't think that estimate is far off.  Sure, I've seen some quicker recoveries, but in extreme spamming cases, it can take time to see recovery.

Fast forward to Panda 4.0.  It was AWESOME to see clients that made massive changes see substantial recovery during P4.0.  And several of those clients chose the nuclear option of completely redesigning their websites.  One client is up 130% since 5/17, while another that chose the nuclear option is up 86%.  Here’s a quick screenshot of the bump starting on 5/17:

A Second Panda 4.0 Recovery

 

Side Note: The Nuclear Option is a Smart One When Needed
For some of the companies I was helping, there were so many items to fix that a complete redesign was a smart option.  And no, that doesn’t come cheap.  There’s time, effort, resources, and budget involved versus just making changes to specific areas.  It’s a big deal, but can pay huge dividends down the line.

One client made almost all of the changes I recommended, including going responsive.  The site is so much better usability-wise, content-wise, and mobile-wise.  And with Panda 4.0, they are up 110% since 5/18 (when they first started seeing improvement).

I’ve mentioned before that for Panda recovery, SEO band-aids won’t work.  Well, the clients that fully redesigned their sites and are seeing big improvements underscore the point that the nuclear option may be your best solution (if you have massive changes to make).  Keep that in mind if you are dealing with a massive Panda problem.

 

Phantom Victims Recover
On May 8th, 2013, I picked up a significant algorithm update.  After analyzing a number of websites hit by the update, I decided to call it “Phantom”.  It simply had a mysterious, yet powerful signature, so Phantom made sense to me.  Hey, it stuck. :)

Phantom was a tough algorithm update.  Some companies lost 60% of their traffic overnight.  And after auditing a number of sites hit by Phantom, my recommendations were often tough to hear (for business owners).  Phantom targeted low-quality content, similar to Panda.  But I often found scraped content, over-optimized content, doorway pages, cross-linking of company-owned domains, and similar issues.  I've helped a number of Phantom victims recover, but there were still many out there that never saw a big recovery.

The interesting part about Panda 4.0 was that I saw six Phantom victims recover (out of the 27 sites I analyzed with previous content quality problems).  It's hard to say exactly what P4.0 took into account that led to those Phantom recoveries, but those victims clearly had a good day.  It's worth noting that five of the six sites impacted by Phantom actively made changes to rectify their content problems.

One of the sites did nothing to fix the problems and ended up recovering anyway.  That could be due to the softening of Panda.  There were definitely some sites I analyzed that showed increases after Panda 4.0 without necessarily tackling many of the problems they were facing.  But in this situation, the site was a forum, which I cover next.  Note, you can read my post about the softening of Panda and what I saw during the March 24, 2014 Panda update to learn more about the situation.

Phantom Victim Recovers During Panda 4.0

Forums Rebound During Panda 4.0
My next finding was interesting, since I've helped a number of forums deal with previous Panda and/or Phantom hits.  I came across four different forums that recovered during Panda 4.0.  Three were relatively large forums, while one was a smaller niche forum run by a category expert.

One of the larger forums (1M+ visits per month) made a boatload of changes to address thin content, spammy user-generated content, etc.   They were able to gut low-quality pages, noindex thinner ones, and hunt down user-generated spam.  They greatly increased the quality of the forum overall (from an SEO perspective).  And they are up 24% since Panda 4.0 rolled out.

Noindexing Low Quality Content on a Forum

A second forum (1.5M visits per month) tackled some of the problems I picked up during an audit, but wasn't able to tackle a number of items (due to a lack of resources).  And it's important to know that they are a leader in their niche and have some outstanding content and advice.  During my audit I found they had some serious technical issues causing duplicate and thin content, but I'm not sure they ever deserved to get hammered like they did.  But after Panda 4.0, they are up 54%.

And the expert-run forum that experienced both Panda and Phantom hits rebounded nicely after Panda 4.0.  The site has some outstanding content, advice, conversations, etc.  Again, it’s run by an expert that knows her stuff.  Sure, some of the content is shorter in nature, but it’s a forum that will naturally have some quick answers.  It’s important to note that the website owner did nothing to address the previous Panda and Phantom problems.  And that site experienced a huge uptick based on Panda 4.0.  Again, that could be due to the softening of Panda or a fix to Panda that cut down on collateral damage.  It’s hard to say for sure.  Anyway, the site is up 119% since May 17th.

Forums Recover During Panda 4.0

Industry Experts Rise
During my research, I saw several examples of individual bloggers that focus heavily on niche areas seeing nice bumps in Google Organic traffic after Panda 4.0 rolled out.  Now, Matt Cutts explained Google was looking to boost the rankings of experts in their respective industries.  I have no idea if what I was seeing during my research was that "expert lift", but it sure looked like it.

Here’s an example of a marketing professional that saw a 38% lift after Panda 4.0:
Bloggers Recover During Panda 4.0

And here’s a sports medicine expert that has shown a 46% lift:
Niche Expert Recovers During Panda 4.0

It was great to see these bloggers rise in the rankings, since their content is outstanding, and they deserved to rank higher!  They just didn't have the power that some of the other blogs and sites in their industries had.  But it seems Google surfaced them during Panda 4.0.  I need to analyze more sites like this to better understand what's going on, but it's worth noting.

Update: I reached out to Matt Cutts via Twitter to see if Panda 4.0 incorporated the “authority” algo update I mentioned earlier.  Matt replied this afternoon and explained that they are working on that independently.  So, it doesn’t seem like the bloggers I analyzed benefited from the “authority” algo, but instead, benefited from overall quality signals.  It was great to get a response from Matt.  See screenshot below.

Matt Cutts Tweet About Subject Matter Expert Algorithm

 

An Indexation Reality Check – It’s Not The Quantity, But the Quality That Matters
After conducting a laser-focused Panda audit, it’s not uncommon for me to recommend nuking or noindexing a substantial amount of content.  That is usually an uncomfortable decision for clients to make.  It’s hard to nuke content that you created, that ranked well at one point, etc.  But nuking low-quality content is a strong way to proceed when you have a Panda problem.

So, it was awesome to see clients that removed large amounts of content recover during Panda 4.0. As an extreme example, one client removed 83% of their content from Google's index.  Yes, you read that correctly.  And guess what, they are getting more traffic from Google than when they had all of that low-quality and risky content indexed.  It's a great example of quality versus quantity when it comes to Panda.

Indexation Impact and Panda 4.0

On the other hand, I analyzed a fresh Panda 4.0 hit, where the site has 40M+ pages indexed.  And you guessed it, it has serious content quality problems.  They got hammered by Panda 4.0, losing about 40% of their Google organic traffic overnight.

If you have been impacted by Panda, and you have a lot of risky content indexed by Google, then have a content audit completed now.  I’m not kidding.  Hunt down thin pages, duplicate pages, low-quality pages, etc. and nuke them or noindex them.  Make sure Google has the right content indexed.
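As a starting point, here's a minimal Python sketch for flagging potentially thin pages. The URL list and word-count threshold are placeholders, and a raw word count is only a rough proxy for quality, so treat the output as a list of pages to review rather than an automatic kill list.

```python
import re
import requests

# Hypothetical list of URLs pulled from a crawl or an XML sitemap.
URLS = ["https://www.example.com/page-1/", "https://www.example.com/page-2/"]

THIN_THRESHOLD = 150  # rough word-count proxy; tune per site and template

for url in URLS:
    html = requests.get(url, timeout=15).text
    # Strip scripts, styles, and tags to approximate the visible text.
    text = re.sub(r"(?is)<(script|style).*?</\1>", " ", html)
    text = re.sub(r"(?s)<[^>]+>", " ", text)
    word_count = len(text.split())
    # Check whether the page already carries a meta robots noindex tag.
    noindexed = bool(re.search(r'(?i)<meta[^>]+name=["\']robots["\'][^>]*noindex', html))
    if word_count < THIN_THRESHOLD and not noindexed:
        print(f"THIN and still indexable: {url} ({word_count} words)")
```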

 

Engagement and Usability Matter
While analyzing the fresh hits, it was hard to overlook the serious engagement issues I was coming across.  For example, there was stimulus overload on the pages that were receiving a lot of Google organic traffic prior to the hit.  There were ads that expanded into or over the content, double-serving of video ads, stacked "recommended articles" on the page, a lack of white space, a long and confusing navigation, etc.  All of this made me want to bounce off the page faster than a superball on concrete.  And again, high bounce rates and low dwell times can get you killed by Panda.  Avoid that like the plague.

Check out the bounce rates and pages per session for a site crushed by Panda 4.0:

Low Engagement Invites Panda


Side Note: To hunt down low-quality content, you can run this Panda report in Google Analytics.  My post walks you through exporting data from GA and then using Excel to isolate problematic landing pages from Google Organic.
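If you'd rather work outside of Excel, here's a small sketch of the same idea using the pandas library: load an exported landing page report for Google organic traffic and isolate pages with weak engagement. The file name, column names, and thresholds below are assumptions, so adjust them to match your actual export.

```python
import pandas as pd

# Hypothetical export: Landing Pages report (Google organic segment) saved as CSV.
df = pd.read_csv("ga_organic_landing_pages.csv")

# Normalize the bounce rate if it was exported as a string like "87.5%".
if df["Bounce Rate"].dtype == object:
    df["Bounce Rate"] = df["Bounce Rate"].str.rstrip("%").astype(float)

# Flag high-traffic landing pages with weak engagement signals.
problem_pages = df[(df["Sessions"] >= 100) &
                   (df["Bounce Rate"] >= 85) &
                   (df["Pages / Session"] <= 1.2)]

problem_pages = problem_pages.sort_values("Sessions", ascending=False)
print(problem_pages[["Landing Page", "Sessions", "Bounce Rate", "Pages / Session"]].head(25))
```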

Downstream Matters
While analyzing fresh Panda 4.0 hits, it was also hard to overlook links and ads that drove me to strange and risky sites that were auto-downloading software, files, etc.  You know, those sites where it feels like your browser is being taken over by hackers.  This can lead to users clicking the back button twice and returning to Google’s search results.  And if they do, that can send bad signals to Google about your site and content.  In addition, risky downstream activity can lead to some people reporting your site to Google or to other organizations like Web of Trust (WOT).

And as I’ve said several times in this post, Panda is tied to engagement.  Engagement is tied to users.  Don’t anger users.  It will come back to bite you (literally).

 

Summary – Panda 4.0 Brings Hope
As I said earlier, it was fascinating to analyze the impact of Panda 4.0.  And again, this is just my first post on the subject.  I plan to write several more about specific situations I’ve analyzed.  Based on what I’ve seen so far, it seems Panda 4.0 definitely rewarded sites that took the time to make the necessary changes to improve content quality, engagement, usability, etc.  And that’s awesome to see.

But on the flip side, there were sites that got hammered by P4.0.  All I can say to them is pull yourself up by your bootstraps and get to work.  It takes time, but Panda recovery is definitely possible.  You just need to make hard decisions and then execute.  :)

GG

 

Monday, May 12th, 2014

How To Remarket 70+ Ways Using Segments and Conditions in Google Analytics

Remarketing in Google Analytics Using Conditions and Segments

I know what you’re thinking. Can you really remarket more than 70 different ways using segments in Google Analytics?  Yes, you can!  Actually, when you combine the methods I’ll cover today, there are many more types of Remarketing lists you can build!  So the total number is much greater than 70.

My post today is meant to introduce you to segments in Google Analytics (GA), explain how you can use them to remarket to people who already visited your site, and provide important Remarketing tips along the way.  I hope once you read this post, you’re ready to kick off some Remarketing campaigns to drive more sales, leads, phone calls, etc.

What Are Segments in Google Analytics?
Many digital marketers know about Remarketing already.  That’s where you can reach people that already visited your website via advertising as they browse the web.  For example, if John visited Roku’s website, browsed various products, and left, then Roku could use Remarketing to advertise to John as he browses the Google Display Network (GDN).  The Google Display Network is a massive network of sites that run Google advertising, and includes Google-owned properties like YouTube, Google Maps, Gmail, etc.  According to Google, the GDN reaches 90% of internet users worldwide.

Remarketing via The Google Display Network (GDN)

By the way, if you’ve ever visited a website and then saw ads from that website as you browsed the web, then you’ve been remarketed to.  As you can guess, this can be an incredibly powerful way to drive more sales, leads, etc.  It can also be extremely frustrating and/or shocking to users.  So be careful when crafting your Remarketing strategy!

When Remarketing first rolled out, you could only set up Remarketing lists in the AdWords interface.  That was ok, but didn’t provide a massive amount of flexibility.  That’s when Google enabled marketers to set up Remarketing lists via Google Analytics.  That opened up an incredible amount of opportunity to slice and dice visitors to create advanced-level Remarketing lists.  For example, you could create Remarketing lists based on users who visited a certain section of your website, or lists based on users completing a certain conversion goal, etc.  Needless to say, tying Google Analytics to Remarketing was an awesome addition.

Now, I started using Google Analytics Remarketing functionality immediately to help clients build advanced Remarketing lists, but I had a feeling that Google was going to make it even more powerful.  And they did.

Along Came Segments… Remarketing Options Galore
You might already be familiar with segments in Google Analytics, which was originally named “Advanced Segmentation”.  In July of 2013, Google released a new version in Google Analytics and simply called it “Segments”.  But don’t get fooled by the simpler name.  Segments enable marketers to slice and dice their users and traffic to view reporting at a granular level.  For example, I often set up a number of segments for clients, based on their specific goals. Doing so enables me to quickly view granular reporting by removing a lot of the noise residing in standard reports.

Using Segments to Create Remarketing Lists in Google Analytics

But starting in January of 2014, Google rolled out an update that enabled marketers to use those segments to create Remarketing lists.  Yes, now marketers had an incredible number of options available when creating Remarketing lists.  In addition, you could easily import segments you are already using! This means you could leverage the hard work you’ve already put in when creating segments in Google Analytics.

Although I thought I had a lot of flexibility in creating Remarketing lists leading up to that point, the ability to use segments opened the targeting flood gates.  I remember checking out the list of options when segments for Remarketing first launched, and I was blown away.

For example, using segments you could create Remarketing lists based on:

  • Demographics like age, gender, language, location, and more.
  • Technology options like operating system, browser, device category, mobile device model or branding, and more.
  • Behavior like the number of sessions per user, days since last session, transactions, and session duration.
  • “Date of First Session” where you could create lists based on the initial session date or a range (sessions that started between two dates).
  • Traffic Sources based on campaign, medium, source, or keyword.
  • Ecommerce options like transaction id, revenue, days to transaction, product purchased, or product category.
  • And you can combine any of these options to create even more advanced Remarketing lists.

 

Now, the options listed above are based on the major categories of segments in Google Analytics.  But you can also set Remarketing lists based on conditions.  Using conditions, you could leverage many of the dimensions or metrics available in Google Analytics to build advanced Remarketing lists.  Actually, there are so many options via “conditions” that I can’t even list them all here in this post.

For example, there are eight major categories of dimensions and metrics you could choose from, including Acquisition, Advertising, Behavior, Custom Variables, Ecommerce, Time, Users, and Other.  And each category has a number of dimensions or metrics you can select to help craft your Remarketing lists.

Using Conditions to Create Remarketing Lists in Google Analytics

Note, it can definitely be overwhelming to review the list of options when you first check this out.  Don’t worry, I provide some tips for getting started later in this post.  For now, just understand that you can use segments and conditions in Google Analytics to craft Remarketing lists based on a number of factors (or a combination of factors).  Basically, you have the power to remarket however you like.  And that’s awesome.

Examples of What You Can Do
Enough with the introduction.  Let’s get specific.  I’m sure you are wondering how segments in Google Analytics can be used in the real-world.  I’ll provide a few examples below of Remarketing lists you can build to get back in front of people who already visited your website.  Note, the lists you build should be based on your specific business and website.  I’m just covering a few options below so you can see the power of using segments to build Remarketing lists.

Example 1: Remarket to users who came from a specific referral path (page).
Imagine you knew that certain referring webpages drove a lot of high-quality traffic on a regular basis.  Based on the quality of traffic coming through those referring pages, you decide that you would love to remarket to those users as they browse the web (since you have a strong feel for the type of user they are based on the content at hand).

Using segments, you could create a Remarketing list based on the original referral path (i.e. the referring pages).  And once that list reaches 100 members, then you can start getting targeted ads in front of those users and driving them to your preferred landing page (whether that’s current content, campaign landing pages, etc.)

Using Referring Path to Create Remarketing Lists

And if you find several referring pages that target similar categories of content, then you could use Boolean operators to combine those pages from across different websites.  For example, {referring path A} AND {referring path B}.  So if three referring pages are all about Category A, then you could combine them to create a single Remarketing list.  You can also use regular expressions to match certain criteria.  Yes, the sky's the limit.

Using Boolean Operators to Create Advanced Remarketing Lists
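To make the regular expression option more concrete, here's a tiny sketch showing the kind of pattern you might paste into a segment condition to combine several referring pages about the same topic. The paths are hypothetical, and I'm only using Python here to test the pattern before dropping it into Google Analytics.

```python
import re

# Hypothetical referral paths that all cover the same category of content.
pattern = re.compile(r"^/(reviews/widget-roundup|blog/widget-buying-guide|forum/widget-questions)/?$")

sample_paths = [
    "/reviews/widget-roundup",
    "/pricing",
    "/blog/widget-buying-guide/",
]

# Keep only the paths the pattern would match in a GA condition.
matched = [path for path in sample_paths if pattern.match(path)]
print(matched)  # ['/reviews/widget-roundup', '/blog/widget-buying-guide/']
```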

Example 2: Reach a certain demographic that has visited your website.
Let’s say you just launched a new product targeting 18-25 year olds and wanted to remarket to users who already visited your website that fit into this category.  You know they showed some interest in your company and products already (since they already visited your site), so you want to reach them via display advertising as they browse the web.

Using segments, you could create a Remarketing list based on age using the Demographics category.  Simply click the checkbox next to the age category you want to target.

Creating Remarketing Lists Based on Demographics

Or to get even more targeted, you could combine age with gender to test various messaging or visuals in your ads.  Going even further, you could add location as another selection to target users based on age, gender, and geographic location (down to the city level if you wanted).

Combining Demographics to Create Advanced Remarketing Lists

Example 3: Target users of specific campaigns, ad groups, or keywords.
Let’s say you are already using AdWords to drive targeted users to your website.  Using segments in Google Analytics, you could build a Remarketing list based on specific campaigns, ad groups, or keywords.  For example, if you have an ad group targeting a specific category or product, then you could create a list containing the users that already searched Google and clicked through your ads related to that category.  It’s a great way to get back in front of a targeted audience.

Creating Remarketing Lists Based on Previous Campaigns

And by combining the targeting listed above with ecommerce conditions like the number of transactions or amount of revenue generated, you could create advanced Remarketing lists targeting very specific types of users.

Creating Remarketing Lists Based on Revenue

Example 4: Pages or Page Titles
If you have been building a lot of new content and want to reach those visitors as they browse the web, then you could create a Remarketing list based on Pages or Page Titles.  For example, let's say you have 25 blog posts about a certain category of content.  They rank very well, have built up a nice amount of referral traffic, etc.  You could build a Remarketing list by selecting a grouping of pages via urls or via page titles. Then you could reach those users as they browse the web and drive them to a targeted landing page, knowing they were interested in a certain post (or group of posts) about a certain subject.

Creating Remarketing Lists Based on Page Titles

And you can combine those pages with conversion goals to add users to a list that completed some type of important action on the site.  For example, users that signed up for your email newsletter, triggered an event, downloaded a study, etc.

Creating Remarketing Lists Based on Page Titles and Conversion

Remarketing Tips

Based on the examples listed above, I hope you see the power in using segments and conditions to craft Remarketing lists.  But as I said earlier, it can quickly become overwhelming (especially for marketers new to Remarketing).  Below, I’ve listed several important tips to keep in mind while crafting your campaigns.

  1. Remarketing Lists Require 100 Members
    A list requires at least 100 members before you can start showing ads to users.  Keep this in mind when building lists to ensure you can reach that number.  If not, you will never get back in front of those users.
  2. Start Simple, Then Increase in Complexity
    Based on the 100 member requirement, start with simpler Remarketing lists and increase your targeting as you get more comfortable with Remarketing.  Don’t start with the most granular targeting possible, only to have a list of 3 people.
  3. Refine Your Tracking Snippet
    Google requires that you refine your Google Analytics tracking code in order to take advantage of Remarketing.  Review the documentation to ensure you have the proper technical setup.
  4. Craft a Strategy First, and Your Lists Should Support Your Strategy
    Don’t create lists for the sake of creating lists. Always start by mapping out a strong Remarketing strategy before jumping into list creation. Your strategy should dictate your Remarketing lists, and not the other way around.  Spend time up front mapping out who you want to target, and why.  And once you have a solid plan mapped out, you can easily build your lists via Google Analytics segments and conditions.
  5. Use Display Advertising In Addition to Text Ads
    Remarketing enables you to use both image ads and text ads.  Definitely use both when crafting your campaigns.  There are a number of sizes and formats you can use.  I recommend hiring a designer to build your ads unless you have in-house staff that is capable of designing high-quality ads.  Use image ads where possible to grab the user’s attention and provide text ads as a backup when a site doesn’t support image ads.  You don’t have to choose one or the other.
  6. Measure Your Results! Don’t “Set It and Forget It”.
    Remarketing is advertising.  And advertising campaigns should have a goal.  Don’t simply set up Remarketing without knowing the intended action you want users to take.  Instead, make sure you set up conversion goals to track how those users convert.  Do not set up the campaign and let it run without analyzing the results.  Understand the ROI of the campaign.  That’s the only way you’ll know if it worked, if the campaign should keep running, and if you should base other campaigns on the original.

 

Summary – New and Powerful Ways to Remarket
After reading this post, I hope you see the power in using segments and conditions for creating Remarketing lists.  In my opinion, too many marketers keep going after new eyeballs and easily forget about the eyeballs that already showed an interest in their company, products, or services.  I believe that’s a mistake.  Instead, marketers can craft advanced Remarketing lists to get back in front of a targeted audience.  Doing so provides another chance at converting them.

Remember, a warm lead is always more powerful than a cold call.  Good luck.

GG

 

Wednesday, April 23rd, 2014

April 2014 Google Algorithm Updates Heavily Targeted Song Lyrics and MP3 Websites (4/05 and 4/18)

Summary: Google has rolled out multiple algorithm updates in April that heavily impacted song lyrics and mp3 websites. This post provides more information about those updates, documents specific sites that were hit, and provides some possible problems that the algo targeted. I plan to update this post as I analyze more sites impacted by the UApril14 updates. 

Google Algorithm Updates From April 2014

I was not planning on writing a post this week, since my schedule is crazy right now.  In addition to my client work, I’ve been building my presentation for the Weber Shandwick Data Salon on Thursday about Google Algorithm Updates, how to recover from them, etc.  That’s ironic because I just stumbled across yet another fascinating algorithm update by Google that has done some serious damage (a set of updates actually).

If you’ve been following my posts, then you probably remember the flawed algorithm update from February.  That update severely impacted movie blogs based on an upstream copyright infringement issue at YouTube.  Google subsequently rolled out a second update in late February, which fixed the problem and returned traffic to normal levels (for the lucky ones).  Some never recovered.

Well, here we go again.  But this time it’s song lyrics websites that got hammered.  I received an email from the owners of songmeanings.com, which provides lyrics, meanings, etc.  I could tell by the messages I received that something serious had gone down.  And it didn’t take long to see the damage.  I fired up SEMRush and saw the massive drop in traffic starting on 4/18.  It looked like they lost 50% of their Google traffic overnight.

Songmeanings.com Impacted by Google Algo Update

And they weren’t alone.  Upon checking other lyrics websites, I saw a number of them had gotten hit just like songmeanings.com.  More about the destruction of lyrics websites soon.  Let’s take a step back and talk Panda for a second.

Claims of Panda Updates in Early April
There was a lot of webmaster chatter in early April about a potential Panda update.  I documented the March Panda update, which looked like the softer Panda that Matt Cutts had mentioned during SMX West.  And once the guys at songmeanings.com reached out to me, it was clear that April was an extremely volatile month as well.  I am seeing multiple updates based on the analysis I have conducted.

First, it was crystal clear that an algorithm update was rolled out on 4/18 (based on analyzing songmeanings.com and the song lyrics niche).  A number of websites all seeing massive drops in traffic overnight is a clear signal that Google rolled something out.  In addition, a lot of websites in one niche getting hit signals that Google was targeting something very specific with the update.  So I told the owners of songmeanings.com to sit tight.  I needed a midnight work session to analyze the site (and the niche).  They signed off and I started burning the midnight oil.  What I found was fascinating, complex, and sometimes confusing.  But it’s important to document this, so webmasters that are impacted can start troubleshooting the situation.

Song Lyrics Niche Heavily Targeted
Just like when the movie blog niche was targeted in February, this update seemed to heavily target song lyrics websites.  Songmeanings.com was not alone when 50% of its Google traffic exited stage right on 4/18.  I quickly saw that others lost significant traffic as well, including lyricsfreak.com, azlyrics.com, lyricsmode.com, sing365.com, etc.

Lyricsfreak.com Impacted by Google Algo Update

Lyricsmode.com Impacted by Google Algo Update

And here’s sing365.com which got hammered on 4/5, only to get hit even more on 4/18:

Sing365.com Impacted Twice by Google Algo Updates

And one really caught my eye.  It showed the same exact trending that slashfilm.com experienced in February with the flawed algo update!  Anysonglyrics.com got hammered on 4/5, only to recover on 4/17.  Check out the screenshot below.

Flashback to SlashFilm – Check out this trending!
Anysonglyrics.com Impacted by Google Algo Updates and then Recovers

Now, I thought February would be a rare occurrence.  It’s not often you see Google roll out an update, only to refine and re-roll that update out just a few weeks later.  But it seems that’s exactly what happened again!  Is this a trend?  Is Google rolling out updates that aren’t fully baked, only to refine and re-roll them back out?  If so, that’s freaking scary.  Just ask Peter from slashfilm.com how business was going during the ten day downturn in traffic.  I’m sure he lost a few nights of sleep, to say the least.

And just like I wondered when the flawed UFeb14Rev came rolling back out, how many other sites were wrongly targeted?  How many won’t recover like SlashFilm did, and how many will ultimately go out of business based on the algo update?  All good questions and only Google knows.  But one thing is for sure.  One algo update can rock your world.  Losing 50%+ of your traffic overnight, and possibly due to a flawed algo, is a tough pill to swallow.

Not All Lyrics Websites Were Negatively Impacted
Similar to the movie blog situation, not all websites in the niche were negatively impacted.  Some actually increased in traffic during the 4/18 update.  And of course, that got me wondering about the signature of this algorithm update.  What was it targeting?  Why did some websites get slammed while others remained intact?  It was time to roll up my sleeves and research some song lyrics.  Maybe “Sympathy for the Devil” by the Stones or “Free Fallin” by Tom Petty?

Let’s Add More Complexity – mp3 Sites Also Impacted, But Starting on 3/30
OK, now I’m starting to sound crazy, right? Can you see why unconfirmed algorithm updates can be maddening?  While analyzing several lyrics websites, I found that several had relationships with mp3 websites (you know, the ones that let you illegally download music).  Well, checking the trending for those sites revealed big drops starting around 3/30, which was a few days before the lyrics sites started getting hit (on 4/5).

For example, I saw relationships with mp3raid.com, which has 426K DMCA takedowns filed (urls requested to be taken down). I also saw links to 49mp3.com, which has 385K urls requested to be taken down via DMCA.  Yes, that’s a lot of DMCA takedowns, especially compared to some of the song lyrics websites (which often revealed just a handful).  I’m not sure if Google is hammering upstream sites linking to those mp3 websites, or if there’s something else at play.  That said, it’s very interesting to see those mp3 sites get hammered just days before the lyrics websites got hit (and again, the sites are connected via links, and possibly affiliate relationships).

MP3Raid.com Impacted by Google Algo Update on 3/30

An Important Note About Quickly Rebounding
I mentioned anysonglyrics.com earlier and how it rebounded already (dropping on 4/5, but recovering on 4/17, presumably as Google rolled out a second update).  Well, they weren’t alone. I saw that trending a few times during my analysis.  That got me thinking that the update was targeting something that could be turned off pretty quickly by the websites that were impacted.  Now, I’m not saying that’s 100% the case, but it could be.

For example, were they linking to websites or downloads that Google didn’t like?  I did notice many links to toolbars like RadioRage, which has a horrible WOT score (see screenshot below).  It sounds like malware has been a big issue with RadioRage (and similar products).  If Google feels sites are heavily driving users to malware, or acting as a conduit for malware, then I could definitely see them taking action.  And for the sites that rebounded, was there something they did or changed during that downturn?  Hard to say.

Linking to Toolbars That Distribute Malware

 

Identifying Common Traits That Could Be Targeted
So as the midnight oil burned, I started digging into song lyrics websites. My goal was to identify common traits across sites negatively impacted, while also checking out the sites that were spared.  I had no idea if I would find a smoking gun, but I had hopes of identifying several possible causes.

Disclaimer: Now is a good time to run through a quick disclaimer. Only Google knows what it targeted during the updates in April. I can only give my best guess based on helping many companies with Panda and other algorithm updates. With that out of the way, here are some interesting issues that surfaced during my analysis.

DMCA Takedown Notices
Based on the nature of the websites, I quickly checked Google’s Transparency Report for DMCA takedowns filed against the domains.  Several of the sites were listed, but some only had a few.  For example, songmeanings.com only had three urls listed.  Others had more, like lyricsmode.com with 993, but there wasn’t a consistently high number associated with all of the sites that were hit.  Also, some that were spared had DMCA takedowns filed against them as well (like DirectLyrics.com with 14).

But the mp3 sites that were targeted had many DMCA takedowns filed (as I mentioned earlier).  And if the lyrics websites are affiliates, or are simply driving users to illegally download files, then maybe Google targeted that.  Hard to say, but it was an interesting find. Now you would think that issue would be taken care of via the Pirate update, and not necessarily Panda or a separate update, but it’s entirely possible.  Let’s move on.

DMCA Takedown Notices and Algo Updates

Followed Affiliate Links and Heavy Cross-linking
Analyzing the sites hit by the 4/18 update revealed a number of affiliate links.  And some were definitely followed affiliate links (which violates Google Webmaster Guidelines).  But, this didn’t look like a new issue, and there wasn’t much consistency.  For example, there were sometimes followed affiliate links on sites that weren’t hit by the update.  Therefore, I’m not sure the affiliate links were the cause of the algo hit (although I would recommend to all the lyrics websites that they nofollow all affiliate links).

Beyond the obvious affiliate links, there was a boatload of cross-linking going on between lyrics websites.  I’m not sure if many are owned by the same network, but it was pretty clear that some were trying to drive traffic and SEO power to the others.  And many of those links were followed.  Without digging into the history of all the domains, it’s hard to identify all of the relationships (which websites are owned by one company, which have long-standing affiliate relationships, etc.)  But I saw this enough across lyrics websites that I wanted to bring it up here.

Affiliate Links and Algo Updates
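If you want a quick way to surface followed external links on a page (affiliate or otherwise), here's a rough sketch using Python's standard library plus requests. The URL is a placeholder, and deciding which of the flagged links are actually affiliate links still requires human review.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse
import requests

PAGE = "https://www.example-lyrics-site.com/some-song-lyrics/"  # hypothetical URL
SITE_HOST = urlparse(PAGE).netloc

class FollowedExternalLinks(HTMLParser):
    def __init__(self):
        super().__init__()
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        rel = (attrs.get("rel") or "").lower()
        host = urlparse(href).netloc
        # An external link without rel="nofollow" passes PageRank.
        if host and host != SITE_HOST and "nofollow" not in rel:
            self.flagged.append(href)

parser = FollowedExternalLinks()
parser.feed(requests.get(PAGE, timeout=15).text)
for link in parser.flagged:
    print("Followed external link:", link)
```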

Duplicate Content and Thin Content
We know that Pandas love eating duplicate content, thin content, etc.  I can’t say whether this was a Panda update, or something more sinister, but I did notice some typical Panda issues across several sites. I definitely found duplicate content issues across lyrics websites (and some were relatively extreme).  I also found many thin pages, with some containing almost no content at all (beyond the site template).  But this was not a new issue, and I ran into the consistency problem again.  Not all sites that were hit had the same level of duplicate or thin content, and some sites that had those problems were unscathed.

Therefore, I’m not confident that duplicate content was the cause.  But again, I would definitely fix the content problems asap.  Just because I don’t think it was the cause of this hit doesn’t mean it couldn’t cause another hit.  Like I said in my last Search Engine Watch column, make your site the anti-bamboo.  :)

Duplicate and Thin Content and Panda

Page Speed and Serious Performance Issues
Now here’s an interesting problem I saw across a number of lyrics websites negatively impacted by the 4/5 and 4/18 updates.  Many were experiencing serious performance issues.  I’m not talking about taking a few seconds to load.  I’m talking about NEVER fully loading.  You could see Chrome and Firefox still trying to load something even a full minute or two into rendering the page.

And when I tried running page speed tests, they wouldn’t even run!  I can tell you, I rarely come across that during my audits.  So, could extreme performance issues have caused the algo hit?  Hard to say, since I’m not analyzing the sites on a regular basis.  But let’s face it, Google definitely doesn’t want to send users to sites that take forever to load.  I’ll mark this down as “maybe”.  But if I were the owners of the lyrics websites, I would definitely take a hard look at performance and try to rectify the excessive load times.

Page Speed and Algo Updates
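For a first-pass check on extreme load problems, here's a simple sketch that only times the initial HTML response. It won't catch render-blocking scripts, third-party calls, or anything a real page speed tool (or a full browser) would measure, so treat it purely as a way to catch the worst offenders. The URLs are placeholders.

```python
import time
import requests

# Hypothetical URLs to spot-check for extreme load problems.
URLS = ["https://www.example-lyrics-site.com/",
        "https://www.example-lyrics-site.com/popular-song/"]

for url in URLS:
    start = time.time()
    try:
        response = requests.get(url, timeout=60)
        elapsed = time.time() - start
        size_kb = len(response.content) / 1024
        print(f"{url}: {elapsed:.1f}s, {size_kb:.0f} KB, HTTP {response.status_code}")
    except requests.exceptions.Timeout:
        print(f"{url}: did not finish loading within 60 seconds")
```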

YouTube Upstream Copyright Issues Again?
I noticed that several of the sites negatively impacted had video sections (or contained videos on the lyrics pages for each song).  Based on what I saw with SlashFilm and the movie blog niche, it wouldn’t shock me if the same upstream copyright infringement issue was at play here.  For example, videos that had been taken down or flagged for copyright infringement that are being embedded on the lyrics sites.

Just like I said with the movie blog situation, that’s not really the fault of the websites that are embedding the videos… since it’s more of a YouTube problem.  But I saw this heavily during my movie blog analysis and I know the lyrics websites contain YouTube videos.  It’s worth looking into if you’re a lyrics website impacted by these recent updates.

Video Copyright Infringement and Algo Updates


A Note About Unnatural Links and/or Paid Text Links
Checking the link profiles of various lyrics websites revealed an unnatural links problem. I won’t go into too much detail here, but you could see red flags for sure.  But based on what I’m seeing trending-wise, it’s hard to believe this was some type of an unnatural links algo update.  Some sites rebounded just a few weeks later (or even days later), so I’m not sure this reflects some type of algorithmic move by Google to hammer sites gaming links.

From a manual actions standpoint, I don’t know how many of these sites have manual actions, but I do know several of them don’t.  So, I’ll just leave the unnatural links discussion here… But a warning to lyrics websites about Penguin and unnatural links: I’d probably tackle that situation sooner rather than later.

Unnatural Links and Song Lyrics Websites

 

Moving Forward – Next Steps for Lyrics Websites Impacted
Like I said earlier, it’s been fascinating to analyze the latest algo updates pushed out in April. As you can see, song lyrics websites were hit pretty hard.  Some have recovered, but a number of them still remain impacted.  Also, mp3 websites were hit hard too, but it looks like that update started closer to 3/30.  Remember what I said about the complexity of algorithm updates?

For sites that have been impacted, I recommend moving quickly to track down all possible problems.  Then I would begin fixing them asap.  The quicker you can get your site in order, the quicker you can experience recovery.  And since some sites have recovered already, it’s possible that can happen to your site as well.  Since I couldn’t identify a smoking gun, I would review all of the problems I documented in this post.  That’s a great place to start.  Good luck.

GG

 

Wednesday, April 16th, 2014

I’m Speaking at the Weber Shandwick Data Salon on April 24th – Learn About Google Algorithm Updates, Manual Penalties, and More

Weber Shandwick Data Salon on April 24, 2014

I’m excited to announce that I’ll be speaking at the Weber Shandwick Data Salon on Thursday, April 24th in New York City (from 6:00PM to 7:30PM).  Each month, Weber Shandwick invites leaders from various areas of digital marketing to speak, to spark conversation, and to share ideas.  I’m thrilled to be presenting next week on the latest in SEO.

My presentation will cover some extremely important topics that I’m neck deep in on a regular basis, including Google algorithm updates, manual penalties, and the war for organic search traffic that’s going on each day.  I’ll introduce various algorithm updates like Panda and Penguin, explain what manual actions are, and provide case studies along the way.  I’ll also introduce the approach that Google is using to fight webspam algorithmically, while also covering how manual penalties work, how to recover from them, and how to ensure websites stay out of the danger zone.

My goal is to get the audience thinking about content quality, webspam, unnatural links, and webmaster guidelines now, before any risky tactics being employed can get them in trouble.  Unfortunately, I’ve spoken with hundreds of companies over the past few years that were blindsided by algo updates or manual actions simply because they never thought about the repercussions of their tactics, didn’t understand Google’s stance on webspam, or weren’t aware of the various algorithm updates it was crafting.  Many of them learned too late the dangers of pushing the envelope SEO-wise.

So join me next Thursday, April 24th at 6PM for a deep dive on algorithm updates, manual penalties, and more from the dynamic world of SEO.  You can register today via the following link:

Register for Weber Shandwick’s Data Salon on April 24th:
https://www.surveymonkey.com/s/N6G5K5B

Below I have provided the session overview.  I hope to see you there!

Weber Shandwick Data Salon #3
April 24, 2014 from 6:00PM to 7:30PM
Speaker: Glenn Gabe of G-Squared Interactive
Moderator: Kareem Harper of Weber Shandwick
909 Third Avenue, 5th Floor
*Refreshments will be available starting at 6:00pm
  


Front Lines of SEO

The Frontlines of SEO – Google Algorithm Updates, Penalties, and the Implications for Marketers
Explore Google’s war on webspam, learn about key changes and updates occurring in Search right now, and fully understand the implications for digital marketers.

There’s a battle going on every day in Search that many people aren’t aware of.  With millions of dollars in revenue on the line, some businesses are pushing the limits of what’s acceptable from an SEO perspective.  In other words, gaming Google’s algorithm to gain an advantage in the search results.

Google, with its dominant place in Search, is waging war against tactics that attempt to manipulate its algorithm.  From crafting specific algorithm updates that target webspam to applying manual actions to websites, Google has the ability to impact the bottom line of many businesses across the world.  And that includes companies ranging from large brands to small local businesses.  This session will introduce the various methods Google is using to address webspam in order to keep its search results as pure as possible. Specific examples will be presented, including case studies of companies that have dealt with algorithm updates like Panda and Penguin.  Manual penalties will be discussed as well.

Beyond battling webspam, the major search engines have been innovating at an extremely rapid pace.  The smartphone and tablet boom has impacted how consumers search for data (and how companies can be found).  And now the wearable revolution has begun, which will add yet another challenge for marketers looking to reach targeted audiences.   Glenn will introduce several of the key changes taking place and explain how marketers can adapt.  Glenn is also a Glass Explorer and will provide key insights into how Google Glass and other wearables could impact marketing and advertising.

Register today to learn more about Google’s war on webspam, to better understand the future of Search, and to prepare your business for what’s coming next.

You can register online by clicking the following link:
https://www.surveymonkey.com/s/N6G5K5B

 

 

Monday, March 31st, 2014

Did the Softer Panda Update Arrive on March 24, 2014? SMBs Showing Modest Recovery Across Industries

Softer Panda Update on March 24, 2014

As a consultant helping a number of companies with Panda recovery, I’ve been eagerly awaiting the March Panda update.  Based on the data I have access to, I was able to pick up and analyze Panda UJan14, UFeb14, and the infamous UFeb14Rev (where Google re-rolled out the algorithm update after mistakenly hammering movie blogs).  Needless to say, it’s been an interesting beginning to the year Panda-wise.  And if you’re wondering what U{Month}{Year} is, that’s the naming convention I’m using for unconfirmed Panda updates.

And in case you forgot, Google announced in July of 2013 that they wouldn’t be confirming Panda updates anymore.  As I explained in a post soon after that, unconfirmed Panda updates can cause mass chaos and can drive webmasters dealing with mysterious traffic losses insane.  But I also explained that if you have access to a lot of Panda data, you can sometimes pick up the updates.  And that’s where SEOs helping a lot of companies with Panda can come in very handy.  Those SEOs have turned into human Panda barometers and can help identify when the specific updates roll out.  Remember, we know that Panda is supposed to roll out monthly, and that each rollout can take about ten days to complete.  It’s not real-time, but Google trusts the algorithm enough to unleash it once per month.

I have been able to identify a number of updates since July of 2013, including UJan14 from January 11th and the flawed UFeb14 from February 11th (which was the movie blog fiasco I mentioned earlier).  But it’s been relatively quiet since then from a Panda standpoint.  I’ve only seen some moderate movement around 3/11/14, but nothing that I could nail down as Panda.  But then the 24th arrived, and it quickly became clear that something widespread was taking place.  Just like a Panda update.

Upward Movement Across Panda Victims
During the week of March 24th, I was checking organic search trending across clients and quickly noticed a number of increases from Google Organic.  The first one that caught my attention was a website that had been battling Panda for a long time.  It’s an ecommerce site that has seen many ups and downs since February of 2011 when Panda first arrived (and more downs than ups if you know what I mean).  The 25th showed an 18% increase in traffic, and it has consistently remained higher since then.  Google Webmaster Tools now shows increases in impressions and clicks starting on the 24th.  Comparing the entire week to previous weeks reveals Google Organic traffic was up 15%.

An example of a bump in Google Organic traffic starting on 3/24/14:

Panda Recovery Begins on 3/24/14

And that site wasn’t alone.  I was seeing similar lifts in Google Organic traffic across a number of Panda victims I have been helping.  Those lifts ranged from 9% to 24%, with a few outliers that saw much larger increases (45%+).  Note, those sites seeing larger increases didn’t have massive amounts of traffic, so it was easier to show a much larger lift.  That being said, the lift was significant for them.  But overall, I mostly saw moderate recoveries versus significant ones during this update.  And that leads me to think we just might have seen the “softer” Panda update that was supposed to help small to medium sized businesses (SMBs).

 

Matt Cutts and a “Softer” Panda
At SMX West, Matt Cutts from Google explained that they were working on a “softer” version of Panda that would make it less of an issue for certain websites.  Matt said the “next generation” Panda would be aimed at helping small businesses that might be affected by Panda.  Well, based on what I’m seeing, it sure looks like the new Panda could have rolled out.  Almost all of the companies I analyzed that were positively impacted by the 3/24 update could be categorized as SMBs.  They aren’t big brands or major corporations, they don’t have a lot of brand recognition, and some are run by just a few people.

In addition, most of the recoveries fell into the 10-20% range, which were modest increases.  Don’t get me wrong, that’s still a nice lift for some of the companies that previously got hit by Panda, but it’s not a massive recovery like you might see during other Panda updates.  For example, a company I was helping that got hit by Phantom in May of 2013 ended up recovering in August and surged by 68% in Google Organic.  That’s a big recovery.  So, modest recoveries line up with a “softer” algo that could help small businesses (in my opinion).


March 24, 2014 – A Good Day for (SMB) Panda Victims
Below, I have included some screenshots of Google Organic trending for companies impacted by Panda UMarch14.  You can clearly see a lift starting on the 24th and remaining throughout the week.  Note, these companies span various industries, so it wasn’t tied to one specific niche.

Panda Recovery SMB on 3/24/14 GA Data

 

Panda Recovery on 3/24/14 Google Webmaster Tools

 

Panda Recovery on 3/24/14 Google Analytics

 

Panda Recovery on 3/24/14 GWT

 

Panda Recovery on 3/24/14 Searchmetrics

 

 

Common Characteristics and Drivers for Recovery
If you have been impacted by Panda in the past, or if you are simply interested in the algorithm update, then I’m sure you are wondering why these specific companies recovered on 3/24.  And no, not all companies I’m helping with Panda recovered.   Now, only Google knows the refinements they made to the Panda algorithm to soften its blow on small businesses.  That said, I think it’s important to understand what Panda victims have addressed in order to better understand how the algorithm works.

Below, I’ll cover some of the common problems I’ve been helping companies tackle over the past several months Panda-wise (the companies that recovered during UMarch14).  I’m not singling out certain factors as the trigger for this specific update and recovery.  But I do think it’s worth covering several of the factors that were causing serious problems Panda-wise, and that were rectified over the past few months leading up to the recoveries.

Over-optimized Thin Pages
Several of the websites that experienced recovery had serious problems with thin pages that were over-optimized.  For example, pages with very little content combined with over-optimized title tags, meta descriptions, body copy, etc.  And the body copy was typically only a paragraph or two and was clearly written for search engines.

Over-optimized Titles and Panda
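
To illustrate, here is a minimal audit sketch in Python (using the requests and BeautifulSoup libraries) for flagging thin pages at scale.  The URL list and the 250-word threshold are hypothetical placeholders, not values from this case study.

import requests
from bs4 import BeautifulSoup

# Hypothetical URLs to audit - swap in your own list (e.g., from a crawl or sitemap)
urls = ["http://www.example.com/page-1", "http://www.example.com/page-2"]

for url in urls:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    word_count = len(soup.get_text(" ", strip=True).split())
    if word_count < 250:  # arbitrary threshold for illustration
        print(f"Possible thin page: {url} ({word_count} words) - title: {title}")

Pages flagged this way still need a manual review, since word count alone says nothing about over-optimized titles or copy written for engines rather than users.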

Doorway Pages
Along the same lines, several of the companies employed doorway pages to try and gain organic search traffic across target keywords.  For example, they would reproduce pages and simply change the optimization to target additional keywords.  For some of the sites I was helping, this was rampant.

Duplicate Pages and Panda
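
As a rough way to surface this pattern, here is a short Python sketch (requests, BeautifulSoup, and the standard library’s difflib) that compares suspected pages for near-duplicate content.  The URLs and the 90% similarity threshold are hypothetical placeholders.

import difflib
import requests
from bs4 import BeautifulSoup

# Hypothetical URLs suspected of being doorway pages targeting different keywords
urls = [
    "http://www.example.com/widgets-new-york",
    "http://www.example.com/widgets-boston",
    "http://www.example.com/widgets-chicago",
]

texts = {}
for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    texts[url] = soup.get_text(" ", strip=True)

# Compare every pair; a very high ratio suggests boilerplate pages that only
# swap out the target keyword.
for i, a in enumerate(urls):
    for b in urls[i + 1:]:
        ratio = difflib.SequenceMatcher(None, texts[a], texts[b]).ratio()
        if ratio > 0.9:
            print(f"Possible doorway pair ({ratio:.0%} similar): {a} and {b}")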

Keyword Stuffing
Some of the companies that saw recovery were keyword stuffing pages throughout their sites.  For example, all core page elements excessively contained target keywords.  The copy was extremely unnatural, the on-page titles (which were often the h1s) were clearly targeting keywords, the navigation was all using exact match anchor text, and the footer was crammed with more keyword-rich content and exact match anchor text links.  And many times, the target keywords were repeatedly bolded throughout the content.  It was obvious what the goal was while analyzing the pages… it was all for SEO.

Keyword Stuffing and Panda
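
If you want to quantify this during an audit, here is a small Python sketch (requests and BeautifulSoup) that counts how often a target phrase shows up in the core page elements.  The URL and keyword are hypothetical placeholders, and the density calculation is deliberately rough.

import requests
from bs4 import BeautifulSoup

url = "http://www.example.com/blue-widgets"  # hypothetical page
keyword = "blue widgets"                     # hypothetical target phrase

soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
body_text = soup.get_text(" ", strip=True).lower()
title = (soup.title.get_text(strip=True) if soup.title else "").lower()
h1 = (soup.h1.get_text(strip=True) if soup.h1 else "").lower()
anchors = [a.get_text(strip=True).lower() for a in soup.find_all("a")]

occurrences = body_text.count(keyword)
density = occurrences / max(len(body_text.split()), 1) * 100
exact_match_anchors = sum(1 for text in anchors if text == keyword)

print(f"'{keyword}' appears {occurrences} times (~{density:.1f}% of words)")
print(f"In title: {keyword in title} | In h1: {keyword in h1}")
print(f"Exact match anchor text links on the page: {exact_match_anchors}")

There is no magic density number that triggers Panda, so treat the output as a prompt to read the page the way a user would.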

Excessive Linking Using Exact Match Anchor Text
Some of the websites that saw recovery were previously weaving exact match anchor text links into every page on the site.  So, you would visit a page and immediately find exact match or rich anchor text links from the copy to other pages on the site.  It was excessive, unnecessary, and made for a horrible user experience.  And as I explained above, several of the sites were also employing spammy footers with exact match anchor text links (and many of them).

Affiliate Links
Several of the companies that saw recovery were including followed affiliate links in the site content.  Those links should absolutely have been nofollowed.  During initial audits I would uncover followed affiliate links, flag them, and document them in a spreadsheet.  When I shared my findings with my clients, some of the links were so old that they didn’t even remember the links were there!  “Affiliate creep” can cause big problems post-Panda.  Nofollowing all affiliate links, or removing them entirely, was important for sure.

Followed Affiliate Links and Panda
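
Here is a minimal Python sketch (requests and BeautifulSoup) of the kind of check I’m describing: it scans a page for affiliate links that are missing rel="nofollow".  The URL and affiliate domains are hypothetical placeholders.

import requests
from bs4 import BeautifulSoup

# Hypothetical values - swap in your own page and the affiliate domains you link to
url = "http://www.example.com/product-review"
affiliate_domains = ("affiliatenetwork.example.com", "partnerprogram.example.net")

soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
for link in soup.find_all("a", href=True):
    href = link["href"]
    if any(domain in href for domain in affiliate_domains):
        rel_values = link.get("rel") or []
        if "nofollow" not in rel_values:
            print(f"Followed affiliate link found: {href}")

Run something like this across a full crawl and you’ll quickly see how far “affiliate creep” has spread.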

Nuking Duplicate Content or Noindexing Thin Pages
Some of the companies that saw recovery had an excessive amount of duplicate or thin content.  Upon surfacing the problematic urls, my clients either removed or noindexed those pages.  In some cases, that impacted tens of thousands of pages (or more).  Addressing “low-quality” content is one of the most important things a company can do Panda-wise.  And that’s especially the case if some of those pages were ranking well in Google (prior to the Panda hit).  You can read more about the sinister surge in traffic before Panda strikes to learn more about that phenomenon.

Noindexing Content and Panda
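
Once the problematic urls have been noindexed, it’s worth verifying the directive is actually being served.  Here is a minimal Python sketch (requests and BeautifulSoup) that checks both the meta robots tag and the X-Robots-Tag header.  The URL list is a hypothetical placeholder.

import requests
from bs4 import BeautifulSoup

# Hypothetical thin/duplicate URLs that were supposed to be noindexed
urls = ["http://www.example.com/thin-page-1", "http://www.example.com/dupe-page-2"]

for url in urls:
    response = requests.get(url, timeout=10)
    header_noindex = "noindex" in response.headers.get("X-Robots-Tag", "").lower()
    soup = BeautifulSoup(response.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    meta_noindex = bool(meta) and "noindex" in (meta.get("content") or "").lower()
    status = "noindex OK" if (header_noindex or meta_noindex) else "STILL INDEXABLE"
    print(f"{status}: {url}")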

 

Warning: Some Sites Slipped Through The Panda Cracks
I also wanted to quickly mention something that can happen with algorithm updates.  There were two sites I analyzed that showed a modest recovery that shouldn’t have recovered at all.  They were rogue sites that some of my clients had set up in the past that were simply sitting out there.  Those sites are not getting a lot of attention from my clients, and there has been very little work on those sites from a Panda standpoint.  Needless to say, I was surprised to see those sites positively impacted by Panda UMarch14.  Sure, they didn’t surge in traffic, but they definitely increased starting on the 24th.  This also leads me to believe that we saw the softer Panda update that Matt Cutts mentioned.

False Panda Recovery on 3/24/14

Summary – Be In It For The Long Haul
As I explained earlier, Matt Cutts promised a “softer Panda” at SMX West that could help small businesses.  Based on what I have seen, that new update might have rolled out on 3/24.  I saw a number of companies that were dealing with Panda problems recover to some extent starting on that date.

If you have been hit by Panda, then the recoveries I documented above should signal hope.  The companies that saw recovery have worked hard to rectify a range of “content quality” problems.  Audits were completed, problems were identified, and a lot of work was completed over the past few months.

The good news is that a number of the websites making significant changes saw a positive impact from Panda UMarch14.  I think it underscores a Panda philosophy I have been preaching for a long time: you must be in it for the long haul.  Short-term thinking will not result in recovery.  You need to have the proper analysis completed, identify all content-related problems, and work hard to rectify them as quickly as you can.  And Google crafting an algorithm update that softens the blow of Panda sure helps.  So thank you, Matt Cutts.  From what I can see, there are companies seeing more traffic from Google today than they did a week ago.  And that’s never a bad thing.

GG

 

Thursday, March 27th, 2014

Smartphone Rankings Demotion in Google Search Results Based on Faulty Redirects [Case Study]

Smartphone Rankings Demotion in Google Search Results

In June of 2013, Pierre Far from Google explained that providing a poor mobile experience could impact a site’s rankings in the smartphone search results.  Basically, if a site is making mistakes with how it handles mobile visits, then that site risks being demoted in the search results when users are searching from their smartphones.  And with smartphones booming, that message scared a lot of people.

Specifically, Pierre listed two common mistakes that could cause a poor user experience.  First, faulty redirects could force users to irrelevant content, or just to the mobile homepage of a website.  For example, imagine searching for a specific product, service, review, blog post, etc., and finding it in the search results.  But as you click through, the site redirects you to the mobile homepage.  That sounds really annoying, right?  But it happens more than you think.  And that’s especially true since the problem is hidden from desktop users.

But that June day in 2013 passed, and businesses moved on.  Sure, mobile is important, it’s taking over, blah blah blah.  In addition, I’m sure many wondered if Google would really demote a site in the smartphone search results.  I mean, why move a powerful site like yours down in the results when your pages really should rank highly (like they do on desktop)?  Google would probably only do that to low-quality sites, right?  I think you see where I’m going with this.

 

Faulty Redirects – An Example Caught in the Wild
Last week, I was checking Techmeme for the latest technology news and clicked through to an article written by Electronista.  I forget which story it was about, but Electronista was listed first for the news at hand.  So I clicked through and was immediately redirected to the mobile homepage.  I navigated back to Techmeme, clicked the listing again, and was promptly redirected again.  So I visited another site listed for the story on Techmeme and got the information I was looking for.

*ALERT* – That’s exactly the user experience Google is trying to prevent for people searching Google.  And it’s one of the core scenarios Pierre listed that could result in a rankings demotion.  So that got me thinking.  What about other pages on Electronista?  Were they also redirecting mobile users to the mobile homepage?  And if this problem was widespread, were they being demoted in the smartphone search results?  And so I dug in.

Side note: I’m not targeting Electronista by writing this.  Actually, I hope this post helps them.  I can only imagine that if they fix the problem, then their traffic from smartphone users on Google will skyrocket.


An Example of a Faulty Redirect on Electronista

I’m sure you are wondering how this looks.  Here’s a quick example.  Let’s say I was researching a Nexus 7 tablet and comparing it to an iPad mini.  Electronista has an article focused on that topic.  On desktop or tablet, I visit that url and can view the entire post.  But on my smartphone, visiting that url redirects me to the mobile homepage (via a 302 redirect).

Desktop URL resolves correctly:

Desktop URL on Electronista.com

When searching on my smartphone, the site incorrectly redirects me to the mobile homepage:

Redirect to Mobile Homepage on Electronista.com

 

Here is the 302 redirect in action:
302 Redirect to Mobile Homepage on Electronista.com
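
If you want to reproduce this kind of check yourself, here is a minimal Python sketch using the requests library.  It fetches a url with a generic smartphone user-agent and reports where the site redirects.  The article URL and user-agent string are placeholders for illustration, not the exact ones from this case study.

import requests

# Hypothetical article URLs to spot check - swap in your own deep urls
urls = ["http://www.example.com/articles/nexus-7-vs-ipad-mini/"]

# A generic iPhone user-agent string for illustration
smartphone_ua = ("Mozilla/5.0 (iPhone; CPU iPhone OS 7_0 like Mac OS X) "
                 "AppleWebKit/537.51.1 (KHTML, like Gecko) Version/7.0 "
                 "Mobile/11A465 Safari/9537.53")

for url in urls:
    response = requests.get(url, headers={"User-Agent": smartphone_ua},
                            allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    if response.status_code in (301, 302):
        print(f"{url} -> {response.status_code} redirect to {location}")
    else:
        print(f"{url} -> {response.status_code} (no redirect)")

If many different article urls all redirect to the same mobile homepage, that’s the faulty redirect pattern described above.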

 

Examples of Demoted Smartphone Rankings
Electronista.com has 143K pages indexed in Google, and every url I checked on my smartphone redirected to the mobile homepage.  So it wouldn’t take Google very long to pick up the problem across many pages on the site.  But now I needed evidence of rankings being demoted based on this problem.

So I fired up SEMRush and checked the organic search reporting for Electronista.com.  I started picking keywords that the site ranked highly for (on page 1 of Google).  Then I started searching on my desktop and on my smartphone using Chrome for Android (incognito mode).  And lo and behold, I noticed the problem almost immediately.  Smartphone rankings were either much lower or non-existent for content that was ranking highly on desktop.  Almost all of the keyword/ranking combinations I checked revealed the demotion in the smartphone search rankings.

Note, not every Electronista listing was being demoted.  There were a few outliers where the page still ranked well (as well as it did on desktop search).  But the user experience was still horrible.  I was redirected to the mobile homepage and forced to fend for myself.  Needless to say, I wasn’t going to start searching the mobile site for the url I expected to see.  I just bounced.  And again, Google doesn’t want its users to have to deal with this situation.  Instead, Google will just demote the search rankings on smartphones.

A picture is worth a thousand words, so let’s take a look at some examples.  Below, I have provided screenshots of the demotion in action.  You’ll see the desktop search results first and then the smartphone search results below that.

Red Camera For Sale (ranks #8 on desktop and N/A on smartphone)

Desktop search results:
Desktop Search for Red Camera on Google

Mobile search results:
Mobile Search for Red Camera on Sale on Google

 

LTE Microcell (ranks #10 on desktop and N/A on smartphone)

Desktop search results:
Desktop Search for LTE Microcell on Google

Mobile search results:
Mobile Search for LTE Microcell on Google

 

HTC Vivid Radar (ranks #3 on desktop and #20 on smartphone)

Desktop search results:
Desktop Search for HTC Vivid Radar on Google

Mobile search results:
Mobile Search for HTC Vivid Radar on Google

 

Google Nexus 7 Versus iPad mini (ranks #8 on desktop and #18 on smartphone)

Desktop search results:
Desktop Search for Google Nexus 7 Versus iPad Mini on Google

Mobile search results:
Mobile Search for Google Nexus 7 Versus iPad Mini on Google

Skullcandy Pipe Review (ranks #5 on desktop and #10 on smartphone)

Desktop search results:
Desktop Search for Skullcandy Pipe Review on Google

Mobile search results:
Mobile Search for Skullcandy Pipe Review on Google

 

And here are a few where the rankings were not demoted.  They should have been demoted, but they weren’t (at least not yet):

 

Commodore 64 remake
Mobile Search for Commodore 64 Remake Review on Google 

 

Windows 8 touch screen requirements
Mobile Search for Windows 8 Touch Screen Requirements on Google 

 

 

How To Avoid Demoted Smartphone Search Rankings (Listen up Electronista)

The solution to this problem is fairly straightforward.  If you are using separate webpages for mobile content, then you should redirect smartphone users from each desktop page directly to the equivalent mobile url for that content.  Do not redirect all requests from smartphones to the mobile homepage.  As Google explains, “This kind of redirect disrupts a user’s workflow and may lead them to stop using the site and go elsewhere.”  And by the way, Google also says that it’s better to show smartphone users the desktop content than to implement a faulty redirect to the mobile homepage.  I completely agree.
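
To make that concrete, here is a minimal sketch of a path-to-path mobile redirect using Python and Flask.  It is illustrative only: the m.example.com host, the simplistic user-agent check, and the render_desktop stub are all hypothetical, and a production setup would use more robust device detection.

from flask import Flask, redirect, request

app = Flask(__name__)
MOBILE_HOST = "http://m.example.com"  # hypothetical mobile subdomain

def is_smartphone(user_agent):
    # Very rough check for illustration only
    ua = (user_agent or "").lower()
    return "mobile" in ua and "ipad" not in ua

def render_desktop(path):
    # Stub standing in for the real desktop template/rendering logic
    return f"Desktop content for /{path}"

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def desktop_page(path):
    if is_smartphone(request.headers.get("User-Agent")):
        # Redirect to the EQUIVALENT mobile url, never the mobile homepage
        return redirect(f"{MOBILE_HOST}/{path}", code=302)
    return render_desktop(path)

The key line is the redirect target: it preserves the requested path instead of dumping every smartphone visitor on the homepage.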

In addition, make sure you use rel alternate on your desktop pages pointing to your mobile pages.  And then use rel canonical on your mobile pages pointing to your desktop pages.  You can read Google’s documentation for handling various mobile setups here.
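
Here is a minimal Python sketch (requests and BeautifulSoup) that verifies those annotations for a single page: it looks for the rel="alternate" link (with a media attribute) on the desktop url, then confirms the mobile page’s rel="canonical" points back.  The desktop URL is a hypothetical placeholder.

import requests
from bs4 import BeautifulSoup

desktop_url = "http://www.example.com/some-article/"  # hypothetical page to verify

desktop_soup = BeautifulSoup(requests.get(desktop_url, timeout=10).text, "html.parser")

# 1. The desktop page should point to its mobile equivalent via rel="alternate"
mobile_url = None
for link in desktop_soup.find_all("link", href=True):
    rel_values = link.get("rel") or []
    if "alternate" in rel_values and "max-width" in (link.get("media") or ""):
        mobile_url = link["href"]
print("rel=alternate on desktop page points to:", mobile_url)

# 2. The mobile page should point back to the desktop page via rel="canonical"
if mobile_url:
    mobile_soup = BeautifulSoup(requests.get(mobile_url, timeout=10).text, "html.parser")
    canonicals = [l["href"] for l in mobile_soup.find_all("link", href=True)
                  if "canonical" in (l.get("rel") or [])]
    print("rel=canonical on mobile page points to:", canonicals)
    print("Matches the desktop url:", desktop_url in canonicals)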

Update: Pierre Far from Google provided some feedback based on reading this case study.  I asked Pierre how quickly Google would remove the demotion once the redirect problem was fixed.  Here is what Pierre said:
“When a fix is implemented, we’d detect it as part of the usual crawling and processing of each URL.”

So, it seems that once the redirects are corrected, Google will detect the proper setup as it recrawls the site.  As it does that, the pages should return to their normal rankings.  If Electronista makes the necessary changes, I’ll try and figure out how quickly their smartphone rankings return to normal. Stay tuned.

Avoid Smartphone-only Errors
I covered faulty redirects and the impact they can have on search rankings, but there’s another scenario that can get you in trouble.  Google also explains that smartphone-only errors can result in demoted smartphone rankings.  And in my experience auditing websites, these types of errors can go unnoticed for a long time.

For example, if you incorrectly handle Googlebot for smartphones, then you could end up presenting error pages to users.  In addition, the code that handles mobile pages could be bombing, which would also present errors to smartphone users.  Needless to say, I highly recommend testing your setup thoroughly via a number of devices, checking your site via mobile emulators, and crawling your site as Googlebot for smartphones.  The combination will often reveal problems lying below the mobile surface.
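
Here is a minimal Python sketch (requests) for that last check: it fetches a handful of urls with a user-agent approximating Googlebot for smartphones and flags error responses.  The URL list is a hypothetical placeholder, and you should grab the current official user-agent string from Google’s documentation rather than trusting the one below.

import requests

# Hypothetical urls to test for smartphone-only errors
urls = ["http://www.example.com/", "http://www.example.com/some-article/"]

# Approximation of the Googlebot for smartphones user-agent (verify against
# Google's documentation before relying on it)
googlebot_smartphone_ua = ("Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) "
                           "AppleWebKit/536.26 (KHTML, like Gecko) Version/6.0 "
                           "Mobile/10A5376e Safari/8536.25 "
                           "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)")

for url in urls:
    response = requests.get(url, headers={"User-Agent": googlebot_smartphone_ua},
                            timeout=10, allow_redirects=True)
    if response.status_code >= 400:
        print(f"Smartphone-only error candidate: {url} returned {response.status_code}")
    else:
        print(f"OK ({response.status_code}): {url} resolved to {response.url}")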

Note, Google Webmaster Tools also recently added smartphone crawl errors.  The report provides a wealth of information about the errors that Googlebot for Smartphones is running into.  And that includes server errors, 404s, soft 404s, faulty redirects, and blocked urls.  I highly recommend you check out your reporting today.  You never know what you’re going to find.

Smartphone Crawl Errors Reporting in Google Webmaster Tools

 

Summary – Turning Demotions into Promotions
As mobile booms, more and more people are searching from their smartphones.  Google is well aware of the problems that mobile users can face while searching for, and viewing, content on their phones.  And in response to those problems, Google will demote your rankings in the smartphone search results.  Electronista is currently implementing faulty redirects, and based on that setup, its rankings are being heavily demoted.  Don’t let this happen to you.  Check your setup, view your reporting in Google Webmaster Tools, and then quickly fix any problems you are presenting to mobile users.  Think of all the traffic you might be losing by not having the right mobile setup in place.  The good news is it’s a relatively easy fix.  Now fire up those smartphones and visit your site.  :)

GG