Analysis and Findings From The September 2015 Google Algorithm Updates (9/2 and 9/16): Panda 4.2 Tremors, Manual Updates, The Methode Philosophy, and More

September 2015 Google Updates

In my last post, I explained what I have seen during the extended rollout of Panda 4.2. I ended up analyzing over seven weeks of Panda data, since P4.2 is going through an extended rollout. And yes, it’s still rolling out now. More about that soon. Overall, it had been very quiet leading up to that post. Sure, there were some sites that had seen impact and movement, but not like typical Panda updates. It was an underwhelming update to say the least.

Well, something has changed and more significant tremors have been seen across the web.

It’s hard to say if what I’ve been seeing is Panda or a tweak to Google’s core ranking algorithm, but September has been an extremely volatile month algorithm-wise. Specifically, there was movement on 9/2 with some sites seeing large drops in Google organic traffic. And then there was significant movement on 9/16 where some sites surged in traffic, while others dropped heavily.

For example, here’s a site that dropped on 9/16:

Drop During September 16 Google Update

And here’s a site that surged:

Surge During September 16 Google Update

Now, Google pushes hundreds of updates per year, with over one thousand last year according to John Mueller. But the September tremors seem to impact sites with both Panda and Phantom baggage. I’ll cover more about that in this post, but to me it seems like the tremors we have seen are tied to content quality. And again, we have two quality cowboys in town now with Phantom (AKA The Quality Update) and Panda. I definitely saw some sites that got hit hard by Phantom jump during the 9/16 update.

So it’s hard to say it was one over the other. And like I said in my post about Phantom, maybe that was the beginning of Panda being baked into Google’s core ranking algorithm. In other words, they might not be so different…

In this post I’ll cover more about what I’ve seen in September, what Googlers have said about Panda 4.2, I’ll explain more about the connection of these latest updates to Panda and Phantom, and I’ll touch on what I’m calling “The Methode Philosophy”, which is a possible change in how Google views Panda. Let’s jump in.

Confirmed: Panda 4.2 Still Rolling Out & Will For Months
Panda 4.2 started rolling out on 7/18/15 and we knew it would be an extended rollout that could take months. But nobody knew how long that extended rollout would last. Is it still rolling out? Has it already completed? If not, how much longer will it take? All good questions that needed to be answered.

Well, at SMX East Google’s Gary Illyes explained that Panda 4.2 is indeed still rolling out. He last checked on 9/30 and Panda 4.2 was still roaming the web. In addition, he said it will continue to roll out for the next few months. Yes, read that again. Panda 4.2 could have a six month rollout (or longer). As you can imagine, that’s a huge change from previous updates.

Panda 4.2 is still rolling out.

Panda 4.2: Manually Pushed Or Auto-Rollout?
Another question about Panda 4.2 revolves around how it’s being released. Was a giant red button pushed on 7/18 and then P4.2 began its long and extended rollout? Or is Google manually pushing Panda at intervals during the extended rollout? There’s an important distinction between the two.

If Google is pushing the updates manually, then they can refine and tweak the algo based on what they are seeing in the SERPs (to ensure optimal results). That’s similar to what I saw with Panda 4.0 tremors, which was clarified by John Mueller. He said that after a large update, Google can and will tweak that algo to ensure everything is working the way they need it to.

{Updated based on clarification from Gary Illyes:}
So I decided to ask Google’s Gary Illyes on Twitter. At first, Gary explained that Panda 4.2 was being manually pushed at various intervals. But that’s actually not correct. He misunderstood the tweet and didn’t mean that Panda was being pushed manually at various intervals. Instead, Gary explained that Panda 4.2 was manually pushed on 7/18/15 and then automatically rolls out over months. From what he said, there have been no refinements to Panda 4.2 since the initial rollout. That’s important to understand, since it can help us better understand what we are seeing now volatility-wise. And a big thank you to Barry Schwartz for questioning the original response. Without that, I think we would still think Panda 4.2 was being manually pushed.

Both Twitter conversations are below. First, here’s my question from yesterday.

Gary Illyes confirms manual Panda 4.2 rollout.
And here’s the clarification from Gary this morning:
Gary Illyes Clarifies Panda 4.2 Rollout

September Volatility – 9/2 and 9/16 To Be Specific
I’m sure you’re wondering what type of volatility I’ve seen in September and how that manifested itself in Google rankings and traffic. Below, I’ll provide some screenshots of what I’ve seen across websites, categories, and countries. I’ll focus on September 16 and highlight any connections I’ve seen with previous Panda and/or Phantom impact.

Google Analytics Trending For A Site Impacted By The 9/16 Update:

GA Drop During Sep 16 Google Update

Google Search Console Trending For A Site Impacted By The 9/16 Update:

GSC Drop in Clicks During 9/16 Google Update

SEMrush Trending For A Site Impacted By The 9/16 Update:

Drop During September 16 Google Update

Google Analytics trending showing a drop during the 9/2 update:

Drop From Sep 2 Google Algorithm Update

Searchmetrics showing a Phantom hit and then positive impact during the 9/16 update:

Phantom and Sep 16 Impact

Another site showing negative impact during Phantom and then an increase during the 9/16 update:

More Phantom and September 16 Movement

SEMrush Trending For A Site Positively Impacted By The 9/16 Update:

Surge In Traffic During Sep 16 Google Update

Searchmetrics Trending For A Site Positively Impacted By The 9/16 Update:

More Upward Movement During 9/16 Update

SEMRush Trending For A Site Seeing Positive and Negative Movement During The 9/2 and 9/16 Updates:

Ups and Downs During September Google Updates

The September Updates Seemed To Target Content Quality (Again)
When analyzing the drop in traffic across sites impacted by the 9/2 and 9/16 updates, I saw a number of familiar problems from a content quality standpoint. These are problems I’ve seen many times while helping companies with Panda and/or Phantom hits. Although some of the sites impacted had historical link problems, not all sites impacted had link issues. But they all had content quality problems.

User Happiness FTW
If you’ve read my previous posts about Panda or Phantom, then you know user happiness is extremely important. If you frustrate or anger users, then you can send them fleeing from your site. During my analysis of the September 2 and September 16 updates, I saw pages that dropped in rankings that provided horrible user experiences. For example, heavy ads scattered throughout the page, forced pagination (for monetization purposes), low quality supplemental content, and more. Let’s explore several of the problems below. Note, if you’ve been following Panda over the years, these problems should not surprise you.

Slideshows or Paginated Content with Keyword Content Not Visible On-Load
I saw several examples of pages that dropped out for keywords when those keywords were not visible on-load. For example, imagine searching for a topic, clicking a result in the search results, landing on a slideshow or paginated article, and not seeing the keywords or topic anywhere on the page when it loads.

This could result in a number of problems SEO-wise. The most important being users getting frustrated and jumping back to the search results since they can’t easily find the content they searched for. Remember, low dwell time is a giant invitation to Panda. Also, John Mueller has said many times that hidden content will either be heavily discounted or not indexed at all. So if one doesn’t get you, the other will. What’s interesting is that pages which used to rank highly for those queries dropped during the 9/16 update.

September 16 Update Pagination Issues
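As a rough way to audit this on your own pages, here’s a minimal sketch (standard-library Python only) that checks whether a target keyword appears in the visible, on-load text of an HTML document. It’s a simplification I put together for illustration: it only catches script/style blocks and inline `display:none`, assumes reasonably well-formed HTML, and doesn’t execute JavaScript or external CSS, so treat it as a first-pass flag rather than a definitive check.

```python
from html.parser import HTMLParser

class VisibleTextExtractor(HTMLParser):
    """Collects text likely visible on-load. Simplified: skips
    script/style/noscript and elements hidden via inline
    display:none; does not run JavaScript or external CSS."""
    SKIP_TAGS = {"script", "style", "noscript"}
    # Void elements have no closing tag, so never push them on the stack.
    VOID_TAGS = {"br", "img", "hr", "meta", "link", "input", "area",
                 "base", "col", "embed", "source", "track", "wbr"}

    def __init__(self):
        super().__init__()
        self.chunks = []
        self._stack = []       # True for tags that opened a hidden region
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.VOID_TAGS:
            return
        style = (dict(attrs).get("style") or "").replace(" ", "").lower()
        hidden = tag in self.SKIP_TAGS or "display:none" in style
        self._stack.append(hidden)
        if hidden:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if self._stack and self._stack.pop():
            self._skip_depth -= 1

    def handle_data(self, data):
        if self._skip_depth == 0:
            self.chunks.append(data)

def keyword_visible_on_load(html, keyword):
    """True if the keyword appears in the page's visible on-load text."""
    parser = VisibleTextExtractor()
    parser.feed(html)
    text = " ".join(parser.chunks).lower()
    return keyword.lower() in text
```

Run it against the raw HTML your server returns (what Googlebot sees on first fetch) for the queries a page ranks for. If the keyword only exists behind a click or a JavaScript trigger, this returns False.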

Thin Pages, Yet Hiding Content
This was a strange one, and it somewhat relates to the bullet above. I found pages that were inherently thin (given the niche), yet the company still chose to force a user to trigger additional content (versus providing it on-load). I have no idea why they would do this on an already thin page, but they did. And I saw those pages drop. It’s worth noting several of those sites also saw drops during the Quality Update (Phantom) in early May of 2015.

September 16 Update and Thin Content

Technical Problems Causing Content Problems (or Ad Problems)
There were a few sites that saw substantial impact during the 9/16 update based on technical problems causing content and/or ad problems. For example, one site built with AngularJS that serves HTML snapshots ended up having a giant ad plastered across its content in the snapshots that Google indexes.

Thousands of pages were impacted. On 9/16 they lost approximately 40% of their Google organic search traffic. I can’t say 100% that the drop was due to the ad issue, but it sure looks that way. It’s a leading site in its niche with thorough and detailed content. Although I haven’t analyzed the entire site, having your html snapshots polluted by a giant ad is not good. Not good at all actually.

September 16 Update and Ad Problems
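For sites still on the old AJAX crawling scheme, you can spot-check whether the snapshot Google indexes diverges from what users actually see. A hedged sketch: `escaped_fragment_url` follows the `#!` to `?_escaped_fragment_=` mapping from Google’s (now deprecated) AJAX crawling scheme, and the marker list is a hypothetical set of ad-related strings you would supply yourself after inspecting your own snapshots.

```python
from urllib.parse import urlparse, urlunparse, quote

def escaped_fragment_url(url):
    """Map a #! URL to the ?_escaped_fragment_= form Googlebot used
    to fetch HTML snapshots under the (deprecated) AJAX crawling
    scheme -- i.e., the version of the page that gets indexed."""
    parts = urlparse(url)
    frag = parts.fragment
    value = quote(frag[1:]) if frag.startswith("!") else ""
    param = "_escaped_fragment_=" + value
    query = parts.query + "&" + param if parts.query else param
    return urlunparse(parts._replace(query=query, fragment=""))

def snapshot_only_markers(snapshot_html, live_html, markers):
    """Return markers (e.g. ad-network hostnames you choose to watch)
    that appear in the indexed snapshot but not in the live page."""
    return [m for m in markers
            if m in snapshot_html and m not in live_html]
```

Fetch both versions of a sample of URLs, diff them with something like this, and you can catch a “giant ad plastered across the snapshot” problem before Google’s algorithms do.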

Indexed Bulk Thin Content
A large-scale website with 60M+ pages indexed ended up having over 1M ultra-thin, lower quality pages crawled and indexed (by accident). The business owner already nuked those pages, but they had been on the site for a while. This is also a site that typically sees movement with Panda updates based on its niche, size, and complexity. This is a solid example of “bamboo creep” that can occur without showing any major signs. Then boom, it catches up to you very quickly.

September 16 Update and Bulk Thin Content
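To catch “bamboo creep” before it catches up to you, a periodic crawl plus a simple word-count screen can surface ultra-thin pages at scale. A minimal sketch, with one loud caveat: the 150-word floor is an illustrative threshold I picked, not a Google-confirmed number, and a real audit should weigh content quality, not just length.

```python
def flag_thin_pages(pages, min_words=150):
    """Screen a crawl export ({url: extracted_body_text}) for
    ultra-thin pages. The 150-word floor is illustrative, not a
    Google-confirmed number; pair it with a manual quality review
    before nuking anything."""
    return sorted(url for url, text in pages.items()
                  if len(text.split()) < min_words)

def thin_ratio(pages, min_words=150):
    """Share of crawled pages below the floor -- a quick 'bamboo
    creep' gauge worth trending from crawl to crawl."""
    if not pages:
        return 0.0
    return len(flag_thin_pages(pages, min_words)) / len(pages)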

Simple, 1997-like Design… INCREASED
I have to explain one jump I saw, since it shocked me at first. The site surged during the 9/16 update, so I dug into the increase in rankings and traffic. When I first visited the site, I almost fell out of my seat. It was so simple, so plain, and so 1997-like, that I couldn’t believe the traffic numbers and strong rankings across competitive keywords.

Simple and Plain Design Surges During September 16 Update

But I picked myself up off the ground and took some time going through the site. It didn’t take long to understand why this site ranked well. It provides granular details for a specific niche, it’s well-documented, and up-to-date. So I reviewed a number of important queries for the site and put myself in the shoes of someone searching for that information. And when I did, I was actually quite happy with the site. It was easy to find the information I wanted, it was organized well (even if it was extremely vanilla), and I found what I wanted quickly.

Remember the first bullet from above? User happiness FTW. :)

Are Bi-Weekly Tremors The New “Quality” Release Schedule?
So, we had an update on Wednesday 9/2 and then another exactly two weeks later on Wednesday 9/16. That sounds a little too perfect, doesn’t it? Based on what we’ve seen, does this mean that Google is tweaking Panda 4.2 and pushing smaller updates every two weeks? Gary says no, so it doesn’t seem like that would be Panda. But it could be Panda 4.2 continuing its rollout. It’s just interesting to see very little impact from P4.2 turn into serious turbulence in early September. And then exactly two weeks later, we saw more turbulence targeting content quality.

That’s speculation, but the updates are similar to the “tremors” I saw after Panda 4.0 (as explained earlier). And by the way, if that bi-weekly schedule is remotely correct, then we should have another tremor any day now. I guess we’ll see.

The Methode Philosophy – A Change In Panda Thinking
Before ending this post, I wanted to bring up an important point based on a video I watched with Google’s Gary Illyes. And I believe it’s a change in Panda philosophy by Google. Specifically, Gary explained that many wrongly believe Panda is about hurting or penalizing a website. He says it’s not. Instead, it’s about ensuring Google “adjusts” websites that become overly prominent for keywords they shouldn’t rank for, so that users find the best content possible for the query at hand. You can watch him explain this in a Q&A with Bruce Clay (at 8:04 in the video).

The Methode Philosophy

Now, I totally understand that, and I actually think it’s great. But that’s not how Panda worked in the past. It didn’t just demote pages on an impacted domain for keywords they shouldn’t rank for. Sure, that was part of it, but it crushed websites overall. For example, sites dropped by 60%+ overnight, and it wasn’t just overly generic keywords that a site shouldn’t rank for. It included core keywords that the site should rank for. That’s why many viewed Panda as an algorithmic penalty.

The reason I bring this up is because Gary’s comments led me to think that Panda could have changed… and maybe that’s why we are seeing different results from 4.2 than we typically see. Panda 4.2 is supposed to be a refresh. But what if it’s something different? What if it now works based on how Gary explained it?

Again, I like the approach and I think most SEOs would be on board. But I’m concerned for websites that were hit by Panda 4.1 on September 23, 2014 or by the October 24, 2014 update. Many of those websites are still struggling to recover, even after making significant changes. And if Panda has fundamentally changed, then those sites may never recover. They would be in Panda limbo with little hope of coming back, and that would be wrong. It’s hard to say if this is happening, but it’s worth noting.

As The Panda Turns – The Future For Panda and Phantom Victims
After speaking with many webmasters that have been impacted by Panda and/or Phantom, many feel as if they are currently in a horror movie (which fits since Halloween is right around the corner). You know, like being stranded in an abandoned sleepaway camp in the woods, knowing a killer could strike at any time, hoping they are spared during the next attack, and trying to send an SOS to local law enforcement. But they just can’t seem to get out of Camp Crystal Lake. Yes, it’s like Friday the 13th, but for SEOs. Just swap Jason for Panda and make it a six-month movie versus two hours. :)

That said, I’m actually hopeful given the volatile September we just experienced. Frequent updates can be good, as more sites can see impact (and hopefully some of that movement is up rather than down). Let’s face it, would you rather have 10 months of inactivity or frequent updates? I’m for the latter.

Moving forward, it’s always smart to clean up any problems your site has (both on-site and off-site). This is the same advice I’ve been giving since before Panda 4.2 rolled out and I still believe tackling all quality problems is a smart way to proceed. In a nutshell, you should be ready for more tremors, for Panda 4.3 (if that’s coming), or for the real-time Panda (which we know Google wants to roll out).

That’s all you can do at this point. And of course, you could sip a witches’ brew with a bamboo garnish while trick or treating. But watch out for that Jason Voorhees guy. I heard he’s dressing up as a panda this year. Good luck.  :)



Panda 4.2 Analysis and Findings 7 Weeks Into The Extended Rollout – A Giant Ball of Bamboo Confusion

Panda 4.2 Analysis and Findings

Note: I reached out to Google last week to learn more about the current rollout of Panda 4.2, when it would be completed, and other interesting questions I had based on my analysis. But I haven’t heard anything back directly related to those questions. I will update this post if I receive more information about P4.2.

On Wednesday, July 22nd, Barry Schwartz broke huge SEO news. Google finally started rolling out Panda 4.2, which we’ve been eagerly waiting for since 10/24/14. That was the last Panda update, which was over nine months ago at the time! That’s extremely unusual for Panda, which typically rolled out monthly (and even more frequently at certain times).

Google explained to Barry that Panda began rolling out the weekend prior (July 18th) and that this would be an extended rollout (which was also very strange). Then John Mueller explained in a webmaster hangout that the extended rollout was due to technical problems that Google was having with Panda. They didn’t want to push an extended update, but were forced to.

So according to Google, Panda 4.2 could take months to fully roll out. Here’s a tweet from Google’s Gary Illyes confirming the update.

Gary Illyes Confirms Panda 4.2

I’ll be honest. I was completely shocked when I heard about the extended rollout. Panda usually rolled out quickly and sites that were impacted could easily identify the exact date of the impact.

For example, here is a big hit from Panda 4.0 in May of 2014.

Panda 4.0 Hit

And here is a recovery during Panda 4.1 in September of 2014:

Recovery During Panda 4.1

One day, big impact, and easier to associate with a specific Panda update. Ah, those were the days.

Having a specific date makes it much easier for webmasters to understand what hit them, and then what to fix. With the extended rollout of Panda 4.2, sites could theoretically see impact right after 7/18, a few weeks from then, or even a few months out from 7/18. And with Google pushing hundreds of updates throughout the year (and over one thousand last year according to John Mueller), how are webmasters supposed to know if Panda impacted them, or if it was something else (like Phantom, Penguin, or any of the other updates Google rolls out during the year)? Short answer: they can’t.

I’ll expand on this topic later in the post, but for now just understand that you can gradually see impact from Panda 4.2 over time. Some sites will see more impact in a shorter period of time, but it’s entirely possible to see smaller movement over months. And of course, you might see nothing at all. That’s a good segue to the next section.
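If you want at least a rough screen, you can compare average daily Google organic sessions in a window before and after a suspected update date. Here’s a sketch under obvious caveats: with an extended rollout there is no single date to pin, the window size is an arbitrary choice, and a swing could just as easily be seasonality or a technical issue, so treat the number as suggestive rather than proof.

```python
from datetime import date, timedelta

def impact_around(daily_sessions, update_day, window=14):
    """Percent change in average daily Google organic sessions,
    comparing the `window` days before an update date to the
    `window` days after it. daily_sessions maps date -> sessions
    (e.g. exported from Google Analytics). A screen only: it can't
    prove a drop was algorithmic."""
    before = [daily_sessions.get(update_day - timedelta(days=i), 0)
              for i in range(1, window + 1)]
    after = [daily_sessions.get(update_day + timedelta(days=i), 0)
             for i in range(1, window + 1)]
    avg_before = sum(before) / window
    avg_after = sum(after) / window
    if avg_before == 0:
        return None  # no baseline to compare against
    return (avg_after - avg_before) / avg_before * 100.0
```

For example, a site holding at 100 sessions per day that falls to 60 per day after the update date would come back as roughly -40%.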

Analyzing Over 7 Weeks of Panda 4.2 Data
I’ve been heavily analyzing the update since we learned about Panda 4.2, and I specifically held off on publishing this post until I reviewed enough data. Due to the extended rollout, I wanted to make sure I gave our new Panda friend enough time to show his face. Now that over seven weeks have gone by, and I’ve seen a number of interesting things along my Panda travels, I decided to finally publish this post.

Warning: You might like what you read, and you might not. But it is what it is. Read on.

All Quiet on the Giant Panda Front – Many Typical Panda Players Not Impacted (Yet)
I have access to an extremely large set of Panda data globally. The data includes many sites that have dealt with Panda problems in the past (and quality problems overall). And that includes some of the biggest sites with previous Panda scars. For example, sites with tens of millions of pages indexed, that are inherently Panda-susceptible, and that have dropped and surged over time as they enter and exit the gray area of Panda.

The large Panda dataset I have access to often enables me to see when Panda rolls out (and when other quality algorithms roll out, like Phantom did in late April and early May). The sites I have access to include ecommerce retailers, news publishers, press release websites, directories, affiliate websites, large-scale blogs, song lyrics websites, coupon websites, and more.

As I’ve been analyzing the trending for these websites since 7/18, it was easy to see that many of the larger, Panda-susceptible sites have seen very little impact since Panda 4.2 rolled out. It’s almost like Google didn’t want to touch these sites during the initial rollout. I’m not saying that’s the case (since Panda is algorithmic), but it sure seemed that way.

For example, I’ve seen a lot of trending that looks like this:

Stable Trending Through Panda 4.2

and this:

More Stable Trending Through Panda 4.2

No movement. At all.

This is important to understand if you are monitoring a large-scale website that has been working hard on Panda remediation. If you have seen very little impact so far, it could be that Panda 4.2 simply hasn’t impacted your site yet (or it won’t impact your site at all).

Like many others in the industry, I fully expected more large-scale sites that have previously been impacted by Panda to see movement. But most of the sites that act as Panda barometers have seen little or no impact. It’s very, very strange to say the least. Time will tell if that changes.

Also, John Mueller explained more about the rollout in a webmaster hangout video. He said that technical problems on Google’s end are forcing them to roll out Panda very slowly. Now, we don’t know what those technical problems are, but it seems that if John is correct, then Panda could still impact your site as time goes on. I haven’t seen that happen for most sites, but I guess it’s still possible. Needless to say, this isn’t optimal for webmasters battling the mighty, I mean aging Panda. And it’s extremely out of the ordinary for Panda.

Extended Rollouts and Going Real-Time – The Easiest Way To Hide Impact From Major Algorithm Updates
Google has stated several times that they intend to incorporate Panda into its core ranking algorithm. I don’t know when that will happen, but we are clearly seeing the first signs of that happening. We had Phantom 2 in May, which was a change to Google’s core ranking algo related to how it assesses “quality”. Now we have an extended rollout of Panda, which means we can’t pin the update on a specific date.

And by the way, Google also wants to incorporate Penguin into its core algo. Yes, say goodbye to external algos that are unleashed on the web in one fell swoop.

Every time Google released Panda and/or Penguin, the web erupted. And as you can guess, many that were negatively impacted screamed bloody murder. So much so, that the mainstream media started reporting on algorithm updates. Needless to say, Google doesn’t want that type of attention. It makes them look bad (even though most people will admit they have a really hard job trying to algorithmically determine what’s high quality, what’s spam, what’s unnatural, etc.).

So what’s the easiest way to avoid the flak created by a specific algorithm update? Roll that update out over months, or even better, bake it into the core ranking algorithm. Once you do, then nobody can pin a date on a specific update, nobody can say “Panda is flawed”, “Google has lost control of Penguin”, “Phantom is scarier than The Exorcist”, etc.

Adding a drop of Penguin here, a teaspoon of Panda there…

Baking Panda and Penguin Into Core Ranking Algorithm

Well my friends, this is what we are dealing with now Panda-wise. And from a Penguin standpoint, you have a better chance of seeing Halley’s Comet than seeing another update. Both are chained, cloaked, and do not have a specific update schedule planned.

Personally, I believe it became harder and harder to release Panda and Penguin updates without causing a lot of collateral damage. Google has many algos running, and I believe it was very hard to gain accurate results when unleashing Panda and Penguin on the web. So here we are. No dates, no information, no evidence, no screaming, and no mainstream media articles about Google algo updates.

With that out of the way, let’s dig deeper with what I’ve seen across my Panda data.

Panda 4.2 Illusions
During my Panda 4.2 travels, I came across a number of examples of websites that looked like they were heavily impacted by Panda 4.2, but actually weren’t. There were other factors at play that caused the drop or gain in traffic after 7/18/15, but just happened to coincide with the rollout of Panda 4.2.

For example, check out the trending below. Wow, that looks like a severe Panda 4.2 hit. But once I dug in, there were technical SEO issues with that site that were causing traffic to go to a sister website. Basically, as one site’s traffic decreased, Google traffic to the sister site increased.

Panda 4.2 Illusion Due To SEO Technical Problems

And here’s an example of another Panda 4.2 illusion. The site began losing significant traffic the week before Panda 4.2 rolled out. Maybe it was impact from Panda 4.2 being tested in the wild or from the Phantom tremor I saw in mid-July? No, it turns out the website was in the process of migrating to another domain. And the new domain picked up that traffic.

Panda 4.2 Illusion Due To Migration

The key point here is that context matters. Don’t assume you’ve been impacted by Panda 4.2. It very well could be other factors at play. For example, technical SEO problems can cause big swings in organic search traffic, seasonality can come into play, and other factors could cause traffic changes. And this gets exponentially worse as time goes on… That’s because we don’t have a hard Panda date to base drops or gains of traffic on. Remember, there is an “extended rollout” of Panda 4.2.

Panda 4.2 Fresh Hits
So, there has not been widespread impact, but that doesn’t mean there hasn’t been some impact. And from what I can see, some websites seemed to have been heavily impacted right after Panda 4.2 rolled out. For example, sites seeing a significant drop in Google organic traffic (and a major drop in rankings) immediately following the initial rollout.

Now, it’s important to note that these types of hits were uncommon during Panda 4.2. I did not see many of them (at all). But there were some.

For example, check out the trending for this large-scale website. I have provided Searchmetrics data for the domain. Notice the initial drop when Phantom rolled out and then a bigger hit when Panda 4.2 rolled out:

Panda 4.2 Big Hit

This hit did not shock me at all. I’m confident that many people have cursed at their screens after arriving on the site. It’s filled with aggressive ads, uses pagination for monetization purposes, has technical SEO issues causing content quality problems, and simply provides a horrible user experience. Well, the site took a big hit after Panda 4.2 rolled out.

And then there were sites impacted more gradually since 7/18. For example, here’s a smaller site that has decreased gradually since 7/18. The site didn’t see full impact immediately, but did see a gradual decline in rankings and Google organic traffic since Panda 4.2 rolled out.

Smaller Site Hit By Panda 4.2 Over Time

Those are just two examples, and there are more like them (with more moderate drops than big hits). But the extended rollout is making it very hard to pin a drop in traffic on Panda 4.2. Again, I believe Google would like that to be the case. And remember, Google rolled out over 1,000 changes last year. Based on that number, there may have been 50-80 changes that have rolled out since 7/18 that weren’t Panda 4.2 related. Again, it is what it is. :)

Panda 4.2 Recovery (or at least improvement from P4.2)
I know what many of you are thinking by reading that subheading… Panda 4.2 recovery is an oxymoron! And overall, I would agree with you. As I explained earlier (and it’s important to highlight this), many sites working hard to recover from past Panda hits have not seen impact yet from P4.2. And some of the largest Panda-susceptible sites have also not seen movement. But there has been some positive impact already (to varying degrees).

Below is an example of a surge in Google organic traffic from a website that was hit hard in September of 2014 (when we had both the 9/5 Panda update and Panda 4.1). The website owner reached out to me this winter for help, but I unfortunately couldn’t help due to a lack of availability. That said, we stayed in touch throughout the year. I received an email the Saturday after Panda 4.2 rolled out (7/25/15) explaining that the website was seeing a surge in Google organic traffic. So I took a look at Google Analytics and Google Search Console to see what was going on.

Here is the sustained surge since 7/25/15. The site is up 495%. Note, sessions have been removed at the request of the business owner.

Surge After Panda 4.2

Now, since I didn’t work on the site, perform a deep audit of the site, guide the changes, etc., it’s hard for me to say this was 100% due to Panda 4.2. That said, I have reviewed sections of the original site and the list of changes that were implemented. The items addressed seemed extremely Panda-like, including handling content challenges, refining the affiliate marketing setup, and making some advertising fixes. Then after Panda 4.2 rolled out, the site surged.

Moving on, there are other websites that have seen partial recovery since P4.2 rolled out. For example, here is a site that increased after Panda 4.2 rolled out. And the site had been hit hard in the past by Panda, but shot up after July 18th.

Google Organic Increase After Panda 4.2

So I definitely agree that recovery from Panda 4.2 is rare, but there are sites that were positively impacted. Why more sites weren’t impacted is hard to say… Again, I reached out to Google for more information now that I analyzed seven weeks of data, but I haven’t received specific answers to my questions. I’ll update this post if I learn more.

Reversals, Tremors, or Other Disturbances in the Force
I know many in the industry have kept track of one of the more public recovery stories from Panda 4.2. I’m referring to Search Engine Roundtable. Barry Schwartz saw an immediate jump in Google organic traffic once Panda 4.2 rolled out, and the site had been steadily increasing over time. It wasn’t a massive surge, but the site was up ~35% since 7/18.

Although some believed the surge was from people searching for Panda 4.2 related articles, the surge was actually from a number of queries across topics. And even when Barry stripped out the Panda-related articles from his analysis, there was still an increase in Google organic traffic.

Well, the increase was reversed in mid-August. So, was this some type of Panda 4.2 tremor or something else? We know that Google can and will tweak major algorithm updates to ensure they are seeing the right results in the SERPs, so it’s entirely possible. And by the way, there were many other people claiming to see similar changes in mid-August.

Panda 4.2 Reversal at Search Engine Roundtable
Image From SER

Personally, I saw websites experience a similar change. One site reversed a downward slide and shot up on 8/13, while another change was more recent: on 8/24 the site began to drop after increasing. There are many other reports of changes starting in mid-August, so this isn’t just a few sites.

That said, and I’m sorry to bring this up again, but since we are so far out from the release of Panda 4.2, it’s hard to say the impact is from P4.2. It could be, but it also could be other changes and tweaks. Remember, we have Phantom (which focuses on “quality”), plus Google rolled out over 1,000 changes last year.

Saying Goodbye to Panda, Penguin, and Other Algos Causing Mass Hysteria
In 2013, I wrote a post about Google starting to roll out Panda over 10 days. In that post, I explained that when this happens, or when Panda goes real-time, there will be massive webmaster confusion. If sites experience serious gains or losses at random times throughout the year, without tying that to a specific algorithm update, then how in the world are those webmasters supposed to know what caused the drop? Quick answer: They won’t.

Prior to Panda 4.2, webmasters already had trouble determining what caused a Panda hit. And now that impact can happen at any time during the extended rollout of Panda 4.2, that confusion will get exponentially worse. And we know Google fully intends to incorporate Panda (and Penguin) into its core ranking algorithm. When that happens, there won’t be an extended rollout… it will be running all the time. You could technically see gains or drops at any time. Good luck trying to figure that one out.

Based on what I explained above, we may never see another Panda or Penguin update again. Read that line again. It’s entirely possible that Panda 4.2 will be the last Panda update, and that Penguin has become so hard to manage that maybe only parts of it get baked into Google’s core ranking algo. And if that happens, who knows what becomes of Panda and Penguin victims. One thing is for sure: Panda 4.2 and Penguin 3.0 were not effective. Actually, I’d go so far as to say they were a mess. It’s hard to look at it any other way.

So yes, I called this in 2013, and here we are. Fun, isn’t it? :) As more days pass from the initial Panda 4.2 rollout date (7/18/15), it will become harder and harder to determine if Panda is impacting your site, if Phantom found new ectoplasm, or if other algorithms are at play. It’s why I’m recommending the nuclear option more and more recently. Identify all quality problems riddling a site, and fix them all. That’s both on-site and off-site.

Summary – What’s Next For Panda?
Well, that’s what I’ve seen so far. Panda 4.2 started rolling out on 7/18, but it can take months to fully roll out. And Phantom is running all the time. And Penguin hasn’t rolled out in a while, and who knows when (or if) it will roll out again. And Google pushes 500-600 updates each year (with over one thousand last year). There’s a lot going on… that’s for sure.

Regardless, I will continue to analyze the impact from Panda 4.2 (and other miscellaneous disturbances in the force) and I plan to write more about my findings as time goes on. In the meantime, keep improving content quality, fix all quality problems riddling your website (on-site and off), drive stronger user engagement, and fix any technical SEO problems that can be causing issues. That’s all you can do right now from a Panda standpoint. Then just hope that Google fixes Panda or bakes it into its core ranking algorithm.

But of course, if that happens, you will never know what hit or helped you. And I think that’s just fine from Google’s standpoint. Good luck.




Challenging Murphy’s Law – 8 Immediate SEO Checks After A CMS Migration Goes Live

Murphy's Law for CMS Migrations

CMS migrations are a necessary evil for many companies. If your current technical setup is inhibiting your business from doing what it needs to be successful, then a CMS migration could be your way out. But migrations should not be taken lightly, especially for large-scale websites that are changing urls. Any time you change urls, you run the risk of destroying SEO. And if SEO is an important driver of traffic for your site, then a migration could cause serious problems for your business.

Unfortunately, SEOs know Murphy’s Law all too well. It states, “Anything that can go wrong, will go wrong.” Well, large-scale migrations have many moving parts, and the chances of one going live without some type of hiccup are slim. You might find many hiccups, gremlins, or flat-out mistakes once the button is pushed and a migration goes live. But to me, it’s how you react once those gremlins are found that can make or break your migration. Quick checks and fast refinements can end up saving your SEO. Speed and accuracy matter.

The Summer of Migrations
For whatever reason, this is the summer of migrations and redesigns for me. Several of my large-scale clients are either migrating to a new CMS or they are redesigning their websites. Therefore, I’ve been neck deep in the strategy, testing, implementation, and auditing of each move. And based on my work this summer, I decided to write a post explaining what you can do as soon as the migration goes live to ensure all is ok (or to nip SEO problems in the bud).

Specifically, I have provided eight immediate checks you should make as soon as your new site goes live. The eight checks cover what you can do immediately following a push to production and can help catch serious problems before they become bigger ones. For SEOs, large-scale CMS migrations are not for the faint of heart. They are stressful, there’s a lot on the line, and there are many areas to monitor to ensure all goes well.

The Power of a Crawlable and Testable Staging Server
Before I hop into the top eight checks you should make after a migration goes live, I wanted to touch on a very important element to a CMS migration – the staging server.

If you can have your new site up and running on a staging server that’s accessible and crawlable to your SEO consultant or agency, then you can (typically) thoroughly test that setup prior to the migration going live. For clients that have this available, I’m able to crawl the new site in staging, test redirects, analyze technical SEO elements, browse with multiple devices, and more. It’s a much easier transition for a site that has a staging server accessible and crawlable than one that doesn’t.

If you don’t thoroughly test the new site in staging, including testing the redirects, you’ll have to test like mad as soon as the new site goes live. And believe me, you will find problems. Then you’ll need to fix those problems on the fly. And those fixes could lead to other problems that will need to be fixed quickly as well… and so on. It’s a slippery slope.

Keep this in mind if you are planning a CMS migration. Don’t proceed on faith alone… Test and know. That’s how you can nip serious SEO problems in the bud.

Some Prerequisites
Before we move on, you’ll need a few things in place. First, make sure you have all variations of your site set up in Google Search Console (GSC). That includes www, non-www, and if applicable, https www and https non-www. And if specific subdomains are part of the migration, make sure you have them set up as well (for example, a blog subdomain would need its own property in GSC).

Second, you should have already collected your top landing pages from the old site. You should export all top landing pages from Google Search Console, Google Analytics, Bing Webmaster Tools, and various link analysis tools. Then you should combine them and dedupe them. That’s your core list of urls to check after the migration goes live. I’ll cover more about this soon.

Third, you’ll need a tool that can crawl your site. The two crawlers I use extensively are DeepCrawl (for large-scale crawls) and Screaming Frog (for smaller crawls or for laser-focused crawls). This is how you will check your redirects in bulk. Note, I’m on the customer advisory board for DeepCrawl. It’s one of my favorite tools for enterprise crawls and I’ve been using it for years.

OK, now that you have the necessary elements in place, it’s time to perform eight immediate checks once the migration goes live. Note, the following list is just the beginning of your testing process. You definitely want to continue analyzing the migration over time to ensure all is ok. What I’m providing below should be checked as the button is pushed and your new site goes live.

1. Robots.txt and robots.txt Tester
Yes, robots.txt is a simple text file, but one that can cause all sorts of problems. Developers will often use a staging-specific robots.txt file, which can easily get pushed to production by accident. And if it blocks important files or directories from being crawled, you could kill your SEO traffic.

So check this first after the new site goes live. Make sure it’s the version that should be in production and that the directives are free of errors. And make sure important areas of the site are not being blocked. That includes CSS and JavaScript that are necessary for Google to render the page properly. More about that soon.

And use the robots.txt Tester in Google Search Console. It’s a sandbox that enables you to test urls on your site. If you notice urls being blocked that shouldn’t be blocked, hunt down the directives causing the problems. You can edit the robots.txt file in GSC to test your changes (without impacting your actual robots.txt file).

Checking Robots.txt After CMS Migration
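To complement the robots.txt Tester, you can script the same spot-check with Python’s standard library. This is a minimal sketch: the domain, the paths, and the staging-style directives below are hypothetical examples, so substitute your own production robots.txt and top urls.

```python
from urllib.robotparser import RobotFileParser

def check_urls(robots_lines, urls, agent="Googlebot"):
    """Return (url, allowed) pairs for the given robots.txt directives."""
    rp = RobotFileParser()
    rp.parse(robots_lines)
    return [(u, rp.can_fetch(agent, u)) for u in urls]

# A staging robots.txt accidentally pushed live -- it blocks everything.
# In practice, fetch your production file (e.g. with urllib) and pass its
# lines in; the domain and paths below are hypothetical examples.
staging_robots = ["User-agent: *", "Disallow: /"]

for url, allowed in check_urls(staging_robots, [
    "https://www.example.com/",
    "https://www.example.com/category/widgets/",
    "https://www.example.com/assets/site.css",  # CSS/JS must stay crawlable
]):
    print(("OK     " if allowed else "BLOCKED"), url)
```

Running this against the real production file right after go-live gives you an instant red flag if important directories (or CSS/JavaScript) are blocked.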

2. Check For Google Analytics and/or Other Analytics Code
Before you and your client’s executive team check Google Analytics after the migration goes live (and have a heart attack), make sure the necessary analytics code has been added to the new site. If not, you will see traffic drop off a cliff, when in fact that’s not really happening. It’s a scary sight for sure.

And more importantly, you will lose visibility into how the migration is going. I continually check various sources of traffic over time after a migration goes live to ensure all pages are being handled properly. If you drop your tracking code, then you’ll be out of luck.

Check Google Analytics Code After CMS Migration

Quick Tip: Providing Analytics Air Cover
It’s not a bad idea to have multiple analytics packages running on a site. For example, I have some clients running both Google Analytics and Adobe Analytics on the same site. When a recent migration went live without GA tagging (by accident), we could check Adobe Analytics to see how the migration was going. It provided air cover as GA tracking was put in place.
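A quick scripted sanity check can also confirm the tracking snippet survived the push. The sketch below simply scans page source for common tracking markers; the marker list (classic ga.js, Universal Analytics, and an Adobe s_code include) and the sample pages are assumptions for illustration, so extend them to match your own tagging.

```python
# Markers that indicate an analytics snippet is present in the page source.
# These are illustrative assumptions -- add your own tags as needed.
TRACKING_MARKERS = [
    "google-analytics.com/analytics.js",  # Universal Analytics
    "google-analytics.com/ga.js",         # classic GA
    "s_code.js",                          # Adobe Analytics include
]

def has_tracking(html):
    """True if any known tracking marker appears in the page source."""
    return any(marker in html for marker in TRACKING_MARKERS)

# Hypothetical pages: one tagged, one that lost its snippet in the migration.
tagged = "<head><script src='//www.google-analytics.com/analytics.js'></script></head>"
untagged = "<html><body>no tags here</body></html>"
print(has_tracking(tagged))    # True
print(has_tracking(untagged))  # False
```

Fetch a handful of key templates (homepage, category, product) right after go-live and run each through a check like this before anyone panics over an empty real-time report.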

3. Canonical URL Tag Errors
Rel canonical is a good example of a single line of code that can wreak havoc on your site SEO-wise. When the migration goes live, quickly check core page types on the site to ensure canonical url tags are set up properly. If not, you can destroy rankings and subsequent traffic to pages that were performing extremely well before the migration. You can also quickly crawl a snapshot of the site to get a feel for how rel canonical is being handled in bulk.

You can check my post about rel canonical problems to learn about the various issues that can be introduced by accident. For example, all canonicals pointing to the homepage, canonicals pointing to 404s, endless canonical loops, etc. A quick check of the canonical url tag after a crawl can save your SEO.

Checking Rel Canonical After CMS Migration

4. Meta Robots Tag Problems
Along the same lines, the meta robots tag could have a serious impact on your site if the wrong content values are used. For example, you could be noindexing important pages, or on the flip side, you could be opening up pages that shouldn’t be indexed.

Again, manual checks plus a snapshot crawl will give you a view of the meta robots tag across many pages. You can start to pick up patterns and make changes before serious damage is done.

Checking Meta Robots Tag After CMS Migration
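For the two checks above (rel canonical and meta robots), a lightweight script can pull both tags out of page source so you can scan for patterns in bulk alongside your crawler. This is a minimal sketch using Python’s standard library; the sample page and url are hypothetical.

```python
from html.parser import HTMLParser

class HeadTagAudit(HTMLParser):
    """Collect the rel=canonical href and meta robots content from a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None
        self.meta_robots = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonical = attrs.get("href")
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.meta_robots = attrs.get("content")

# In practice you'd fetch each core page type; this page is a hypothetical
# example of a template that shipped with a dangerous noindex by accident.
page = """<html><head>
<link rel="canonical" href="https://www.example.com/widgets/"/>
<meta name="robots" content="noindex,nofollow"/>
</head><body></body></html>"""

audit = HeadTagAudit()
audit.feed(page)
print("canonical:  ", audit.canonical)
print("meta robots:", audit.meta_robots)  # flag noindex on pages that must rank
```

Run this over your top templates right after go-live, and flag any page where the canonical points somewhere unexpected or the meta robots tag carries noindex.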

5. Mobile-friendly Test (MFT)
Since Google announced the mobile-friendly algorithm, many companies have taken the plunge and moved to a responsive design or mobile urls. So when you make a big change, like moving to a new CMS, you definitely want to check the mobile-friendliness of your new urls.

Unfortunately, you can’t run the mobile-friendly test on a staging server that requires authentication. You can definitely run other tests while in staging to ensure the site is mobile-friendly, which you should do. But remember Murphy’s Law… don’t trust that your staging urls behave the same way as your production urls. That’s why you should absolutely run Google’s official mobile-friendly test once the new site is live.

To begin, I recommend testing key urls by category. That would include your homepage, category urls, specific products or services, and other important types of pages on the site.

Running New URLs Through Google's Mobile-Friendly Test

And then you can use a tool like Url Profiler to check mobile-friendliness in bulk. Import a list of urls on the new site and fire away. The resulting spreadsheet will provide details about mobile-friendliness. Then you can hunt down any problems and rectify them quickly.

Checking Mobile-Friendliness In Bulk Via URL Profiler

6. Fetch and Render in GSC
To ensure Google can fetch the necessary resources to accurately render the page at hand, you should run important urls through fetch and render in Google Search Console. Similar to the mobile-friendly test, you cannot run fetch and render on a staging server that requires authentication. Therefore, you’ll need to test this out as soon as the site goes live.

Google has explained repeatedly that if you block resources, including CSS and JavaScript, then that will impact how Google indexes your pages. Google wants to render the page just like a user would in a browser. So as I said on Twitter a few weeks ago, “If you want to rock, don’t block.” :)

Using Fetch and Render on New URLs

7. Check XML Sitemaps
XML sitemaps are an important supplement to a traditional web crawl. Using sitemaps, you can feed both Google and Bing all of your canonical urls. After going live with a new CMS, you will have a list of new urls and old urls. In the short term, you should submit both your old urls and your new ones via xml sitemaps. Continuing to submit your old urls for a short time will enable the engines to quickly find the 301 redirects to the new urls. That can speed up the process of getting the new urls crawled and indexed.

Based on the migration, you will definitely want to check your new xml sitemaps to ensure they are being published accurately. First, you should check the sitemaps reporting in Google Search Console for both warnings and errors. If you see anything out of the ordinary, identify the problems and send them to your developers ASAP. You want to nip problems in the bud.

Checking XML Sitemap Warnings in Google Search Console

Second, you should crawl your new sitemaps to ensure they lead to the right urls that resolve with 200 header response codes. I can’t tell you how many times I’ve crawled new sitemaps and found 404s, 500s, and redirects. Note, if your old urls are changing, then they should 301, since the old urls are redirecting to your new ones. But the new sitemap should not have any redirects or non-200 response codes. You want clean sitemaps, not “dirty” ones.

You don’t want your crawl graph to look like this:

Crawl Graph of XML Sitemap URLs

Crawling XML Sitemaps To Check Response Codes
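If you want to script the sitemap check, the sketch below extracts every loc url from a sitemap so you can then request each one and confirm a straight 200. The sample sitemap and its urls are hypothetical; in practice you’d fetch your live sitemap and issue a HEAD request per url (with urllib or a crawler).

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract all <loc> urls from an XML sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

# A tiny sample sitemap -- the urls are hypothetical. Every url in the NEW
# sitemap should resolve with a 200; redirects or 404s here mean a "dirty"
# sitemap that needs to be fixed.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/widgets/</loc></url>
</urlset>"""

for url in sitemap_urls(sample):
    print(url)
```

Feed the extracted list straight into Screaming Frog or DeepCrawl in list mode, or loop over it with HEAD requests, and treat anything other than a 200 as a problem to fix.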

And while we’re on the topic of crawling urls, let’s jump to an incredibly important check – crawling your top urls from before the migration!

8. Crawl Top Landing Pages From The Old Site (and Check Redirect Chains)
When you migrate to a new CMS, there’s a good chance you’ll be changing urls. And even if one character changes in your url, then it’s a brand new one to Google. So, in order to maintain search equity during the migration, it’s critically important to organize and then crawl your top landing pages to ensure they resolve accurately. Your top landing pages are urls that ranked well prior to the migration, the ones driving traffic, the ones that have built inbound links, and obviously the ones sustaining your business. Don’t assume your redirects are working well. There are many reasons they might not be.

The first thing you need to do is collect all important landing pages (as explained earlier). You can find these landing pages in Google Analytics, Google Search Console, Bing Webmaster Tools, and from various link analysis tools. Once you download them all, you should combine them, and then dedupe them.

Deduping URLs in Excel
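If Excel isn’t your thing, the combine-and-dedupe step can be scripted. A minimal sketch: the light normalization here (trailing slash, lowercasing) is an assumption on my part, so adjust it to match your site’s actual url rules, and the sample exports are hypothetical.

```python
# Merge landing-page exports from several sources (GSC, GA, Bing, link
# tools) and dedupe them -- a scriptable alternative to Excel's Remove
# Duplicates. Lowercasing paths is a judgment call; skip it if your urls
# are case-sensitive.

def dedupe_urls(*url_lists):
    """Merge url lists, normalize lightly, and return sorted unique urls."""
    seen = set()
    for urls in url_lists:
        for url in urls:
            seen.add(url.strip().rstrip("/").lower() + "/")
    return sorted(seen)

# Hypothetical exports from two sources with overlapping urls.
from_gsc = ["https://www.example.com/Widgets/", "https://www.example.com/about"]
from_ga  = ["https://www.example.com/widgets", "https://www.example.com/contact/"]

for url in dedupe_urls(from_gsc, from_ga):
    print(url)
```

The deduped output is your core list of old urls to feed into the redirect crawl described below in check number eight.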

I use both DeepCrawl and Screaming Frog to check the redirects list. Depending on the size of your list, you might have a few hundred urls, or you might have a few hundred thousand urls (or even more). DeepCrawl is extraordinarily good at crawling many urls (over 100K), while Screaming Frog is outstanding for laser-focused crawls (under 50K urls).

When crawling with Screaming Frog, ensure that “follow redirects” is enabled in your settings. This will allow the Frog to not only check the url you feed it, but it will also follow the redirect chain. That’s incredibly important, since just setting up a 301 isn’t enough… That 301 needs to lead to a 200 code at the true destination url.

Following Redirects in Screaming Frog

One of the biggest mistakes I’ve seen is assuming all 301s work perfectly. In other words, you crawl your top landing pages and they all 301. That’s great, but where do they lead? If they don’t lead to the new url that resolves with a 200 code, then you could be killing your SEO.

In Screaming Frog, crawl your urls and then export the redirect chains report (which is accessible from the reports dropdown). When opening that file in Excel, you will see the initial url crawled, the header response code, and the number of redirects encountered. You can follow the sheet along to the right to see the next url in the chain, along with its response code.

Analyzing Redirect Chains After A CMS Migration

Does the destination url resolve with a 200 code or does that redirect again? If it redirects again, you are now daisy-chaining redirects. That’s not great, as Google will only follow a certain number of redirects before giving up. And as you can guess, you can keep following the chain to the right to see how each url in the list resolves. You might be surprised what you find… like three, four, or even more redirects in the chain. And even worse, you might find daisy-chained redirects that lead to 404s. Not good.

In DeepCrawl, you can export the 301 redirect report, which will also provide the redirect chain. If you have many urls to check, then DeepCrawl can be extremely helpful. It doesn’t run locally and can handle hundreds of thousands, or even millions, of urls.

Once you export the urls, you’ll need to use Excel’s “text to columns” functionality to break apart the redirect chain column. Once you do, you can follow the chain to the right to see how each url resolves. Again, you might find 404s, excessive redirect chains, or redirects that lead to 500s or other errors. The core point is that you won’t know until you crawl the old urls and follow the redirect chains. So crawl away.

Checking Redirect Chains in DeepCrawl
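To make the chain analysis concrete, here’s a sketch of the logic you’d apply to each exported chain, whether it comes from Screaming Frog, DeepCrawl, or a quick script (with Python’s requests library, for example, each entry in response.history plus the final response gives you the (url, status) hops). The urls and the three-hop threshold are assumptions for illustration.

```python
MAX_HOPS = 3  # Google only follows so many redirects; keep chains short

def audit_chain(hops):
    """Classify one redirect chain.

    hops: list of (url, status) pairs from the first request through the
    final response, e.g. [("old", 301), ("new", 200)].
    """
    final_url, final_status = hops[-1]
    redirects = len(hops) - 1
    if final_status != 200:
        return "BROKEN: chain ends in %d at %s" % (final_status, final_url)
    if redirects == 0:
        return "NO REDIRECT: url resolves directly (did the mapping run?)"
    if redirects > MAX_HOPS:
        return "DAISY-CHAINED: %d redirects before the 200" % redirects
    return "OK: %d redirect(s) to %s" % (redirects, final_url)

# Hypothetical chains: one clean 301 -> 200, one daisy-chain ending in a 404.
print(audit_chain([("https://example.com/old", 301),
                   ("https://example.com/new", 200)]))
print(audit_chain([("https://example.com/old2", 301),
                   ("https://example.com/mid", 301),
                   ("https://example.com/gone", 404)]))
```

Anything flagged BROKEN or DAISY-CHAINED goes straight to your developers; those are the chains leaking search equity.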


Summary – Never Assume All Is OK With A Migration
Unfortunately, Murphy’s Law almost always comes into play with a CMS migration. And that’s especially the case for larger-scale websites with a lot of moving parts. Even if you’ve heavily checked a site in staging, make sure you perform the necessary checks once the new site is pushed to production. If not, you run the risk of having major problems severely impact SEO.

And the point of migrating to a new CMS isn’t to destroy SEO… it’s to increase the effectiveness of your website! Although the checks I provided above are just a starting point, they are important ones to run once the new site goes live. Good luck.



Phantom Tremors Continue As SEOs Wait For Panda – Latest Tremor on 7/14/15

Phantom 2 Tremors

As many of you know, Phantom 2 began rolling out on April 29, 2015, just days after the mobile-friendly update was released. Phantom 2 was a big update that Google initially denied. During my analysis of the update, I saw a lot of movement across sites dealing with content quality problems. It was clear from the beginning that the algorithm update was focused on content quality (like Panda). That’s one of the reasons many people (including myself) initially believed it was a Panda update.

Once Google confirmed the update, they explained that it was a change to its core ranking algorithm with regard to how it assessed “quality”. Even though Google explained that Phantom was part of its core ranking algorithm, I had a feeling that websites would not be able to recover quickly. The reason? I had seen this before, specifically with the original Phantom update in May of 2013. Phantom 2 was eerily similar to Phantom 1 (and rolled out almost two years to the day after Phantom 1). Interesting timing to say the least.

Both Phantom 1 and 2 Were Panda-Like
With Phantom 1, I also saw the update target content quality problems. The companies I helped with fresh hits had a dangerous recipe of user engagement issues, content quality problems, and technical issues that all led to significant drops in organic search traffic. It took nearly three months before I saw the first Phantom 1 recovery, with others following as time went on.

It took a lot of work to see recovery from Phantom 1… it was not a trivial undertaking. And I expect Phantom 2 remediation and recovery to follow suit.

Actually, we are already over two months out from the Phantom 2 update (late April/early May), and I haven’t seen any major recoveries yet. That being said, I’ve seen a lot of movement on specific dates following Phantom. That includes both additional drops in traffic and sudden jumps for Phantom 2 victims. And that’s the topic of this post – Phantom tremors.

Multiple Phantom Tremors Since 4/29
After Panda 4.0 in May of 2014, I saw what looked like near-weekly updates impacting Panda victims. I called these updates Panda tremors, and they were confirmed by Google’s John Mueller.

Panda 4.0 Tremors and John Mueller

Basically, after Google pushes a major update, it can refine and tweak an algorithm and then push smaller updates (to ensure everything is working the way they want). I saw many of those tremors after Panda 4.0. Well, Phantom 2 has been similar. I started seeing Phantom tremors soon after the initial rollout and I have seen several more over time.

And most recently, I saw the latest Phantom tremor starting on 7/14/15. I’ve seen the impact with companies I am helping now, but I’ve also had new companies reach out to me explaining what they are seeing. And when tremors roll out, Phantom victims can see more impact (mostly negative based on my analysis, but there has been some positive movement).

Negative movement makes a lot of sense if the site in question hasn’t made any efforts to improve from a quality standpoint. For example, here is the hourly trending for a site hit by a Phantom tremor on 7/14. Notice the change in trending starting around 11AM ET on 7/14:

Traffic Drop From Phantom Tremor on July 14, 2015

And here is a screenshot of a site that saw positive movement starting around that time:

Traffic Gain From Phantom Tremor on July 14, 2015

Here is another screenshot of negative impact, this time showing the drop in clicks when comparing traffic after the tremor to the timeframe prior (based on the 6/8 tremor):

Traffic Drop Based on 6/8 Phantom Tremor

Update: Here are some more screenshots of impact from the 7/14 Phantom tremor:

This example shows impact during the late June tremor and then more impact on 7/14:

Impact from two Phantom tremors

And here is an example of a quick jump at the end of June with that gain being rolled back during the 7/14 update:

Temporary Recovery During Phantom Tremor

Documenting Important Phantom Dates:
In order for Phantom victims to track their own trending based on Phantom tremors, I have included important dates below. It’s always important to understand why drops or spikes occur, so I hope this list of dates provides some clarity:

Original Phantom 2 Rollout: 4/29/15
Phantom Tremor: 5/27/15
Phantom Tremor: 6/8/15
Phantom Tremor: 6/17/15
Phantom Tremor: 6/28/15
Phantom Tremor: 7/14/15

What This Means for Panda Victims
Remember, many Phantom victims have been impacted previously by Panda. For example, some sites that were still working to recover from Panda got hit by Phantom, while others that already recovered from Panda got hit too. Yes, some of those sites saw drops after fully recovering from Panda.

For those sites still impacted by previous Panda updates that got hit by Phantom 2, it’s clearly not a great sign. We know that Phantom focuses on “quality” and so does Panda. I can’t imagine Phantom reacting negatively to your site, while Panda reacts positively. If that’s the case, then Google has an algo problem.

Now, if a site has recovered from Panda and then saw negative impact from Phantom, that does not bode well for the next Panda update. Phantom is clearly picking up quality problems, which could also contribute to a future Panda hit. We are eagerly awaiting the next Panda refresh, so recent Phantom hits on sites that have recovered from Panda should be concerning.

Panda Recovery Followed By Phantom 2 Hit

What Are Phantom Tremors?
Great question, and it’s hard to say exactly. Google could be refining the algorithm and then rolling it out again. Or they could be adding more factors to Phantom and rolling it out. Or they could be adjusting the threshold for each factor. All three are possibilities. That’s why it’s so important to get out of the gray area of Phantom.

I wrote a post recently about the SEO nuclear option and how it relates to both Panda and Phantom hits. Let’s face it, with multiple major quality algorithms running like Phantom and Panda, it’s ultra-important for websites to identify and fix all quality problems riddling their sites. If not, then they can see more impact. And in a worst-case scenario, they could get hit by both Phantom and Panda, which is what I call Phantanda. And although Phantanda sounds like a soda or rock band, it’s not cool or fun. You don’t want to experience it.

Is Phantom The Real-time Panda?
In my Phantanda post, I brought up the possibility that Phantom was actually the migration of Panda (or Panda factors) to Google’s real-time algorithm. It’s entirely possible this is the case. I’ve analyzed many sites hit by Phantom 2 and every quality problem I surfaced would have been picked up by a thorough Panda audit. The factors are extremely similar… almost identical, actually.

By the way, after Phantom 2 rolled out on 4/29, Josh Bachynski tweeted a similar thought. I said at the time that he was on to something… and I still believe that.

Josh Bachynski Tweet After Phantom 2

And with Google having all sorts of problems with Panda, this makes even more sense. For all we know, Panda might have problems with Phantom (as Phantom is part of Google’s core ranking algorithm and also focuses on quality). If that’s the case, then the “technical problems” Google has explained could be Panda and Phantom at odds with one another.

That’s total speculation, but there are now two quality cowboys in town. And to me, this SEO town might be too small for the both of them. It wouldn’t surprise me to find out that Panda is slowly being incorporated into Google’s core ranking algorithm. Remember, it’s been almost nine months since the last Panda update (10/24/14), while we’ve had multiple Phantom updates (the initial rollout and then several tremors). Hopefully we’ll find out soon what’s going on with Panda, and its relationship with Phantom.

Moving Forward: Keep Implementing (The Right) Changes
To quickly recap this post, if you were impacted by Phantom 2 in late April or early May, then you very well could have seen further impact during one or more Phantom tremors. I would check the dates I listed above and see if your site saw any drops or spikes during that timeframe.

And more importantly, continue to make the right changes to your website. Audit your site through a quality lens, identify all problems riddling your site, and move quickly to rectify those problems. That’s how you can rid your site of both bamboo and ectoplasm. Good luck.



Phantanda – Why The SEO Nuclear Option Is Important For Sites Hit By Phantom 2 and Panda

The SEO Nuclear Option for Phantanda Victims

Panda can be devastating. We all know that’s the case and it’s been documented to the nth degree since February of 2011. And now we have Phantom (AKA “The Quality Update”), which was a change to Google’s core ranking algorithm regarding how it assesses “quality”. Between the two, you can clearly see that Google is heavily focused on increasing the quality of the search results.

For webmasters, it’s bad enough when you see a big drop in rankings and traffic from Panda alone. Some sites can drop by 60%+ when Panda rolls through. But for many sites riddled with bamboo, little did they know that Panda has a close friend named Phantom who has no problem kicking a site while it’s down.

Since the end of April, I saw a number of sites that were already impacted by Panda see more negative impact from Phantom. And then there were some sites that recovered from Panda previously, only to get hit to some degree by Phantom. And as you can guess from the title of this post, I call these double-hits Phantanda.

In this post, I’ll explain more about Phantanda, how it relates to Panda, I’ll introduce the SEO nuclear option, explain why it’s important, and then I’ll end by providing some recommendations for those that want to go nuclear.

Opportunities for Recovery and Frequency of Quality Updates
As mentioned above, Google now has a one-two quality punch which I’m calling Phantanda. It’s not a soft drink or a rock band, but instead, a devastating mix of algorithms that can wreak havoc on a website’s organic search traffic.

If you’ve been hit by Phantanda, then it’s incredibly important to heavily audit your site through a quality lens. That audit should produce a thorough remediation plan for tackling any problems that were surfaced during the audit. Then you need to move quickly to execute those changes (flawlessly). And then you need Google to recrawl all of those changes and remeasure engagement. Yes, this is not a trivial process by any means…

An example of a Phantom hit on a site that has struggled with Panda:
Hit by The Quality Update (AKA Phantom 2)

In the past, Google used to roll out Panda monthly. That was great, since there were many opportunities for sites to recover as they made changes, removed bamboo, improved the user experience on their websites, and published higher quality content. But as many of you know, Panda hasn’t rolled out since 10/24/14. That’s a horrible situation for many battling the mighty Panda.

Sure, Gary Illyes said the next Panda update is coming soon (within weeks), but it has still been way too long between Panda updates. And that’s especially the case when we saw weekly Panda tremors after Panda 4.0.

The fact is we need more Panda updates, not fewer (as crazy as that sounds). Hopefully the next Panda update is right around the corner. We’ll see.

John Mueller clarifies Panda tremors after Panda 4.0:
John Mueller Clarifies Panda Tremors

Ectoplasm vs. Bamboo
From a Phantom standpoint, Google implemented changes to its core ranking algorithm with how it assessed “quality”. It was very similar to the first Phantom update, which was in May of 2013. Phantom 2 was eerily similar to Phantom 1 and I’ve done a boatload of research and work with both updates.

The good news is that websites were able to recover from Phantom 1. The bad news is that it took months of remediation work (similar to Panda remediation). I believe the first recovery I saw took approximately three months, while others took longer.

An example of recovery from Phantom 1 in 2013:
Example of Phantom 1 Recovery from 2013

Based on my analysis of Phantom 2 (and my work helping companies that have been impacted), I believe the remediation and recovery process is similar to Phantom 1 (longer-term). And that makes sense. The Quality Update (Phantom 2) rolled out on 4/29, so websites need enough time to audit, strategize, execute, and wait for Google to process those changes. Remember, Google needs to crawl and measure the changes. It’s not like going mobile-friendly, which is a binary test for now (yes or no). In my opinion, it’s much more complicated than that.

Phantom 2 victim still negatively impacted:
Sustained Negative Impact from Phantom 2

In my opinion, Phantom remediation is very Panda-like. Take a long-term approach, truly understand the various quality problems riddling a website, and then take aggressive action to rectify those problems. And that leads me to the core point of this post – the nuclear option.

The SEO Nuclear Option – It’s No Walk in the Park, But It’s Worth It
I started referring to the nuclear option in June of 2013 when Google first started talking about Panda going real-time. I explained that if that was the case, then webmasters would have no idea what hit them. And that would make it harder to understand what caused the negative impact and how to fix the problem(s).

And as other algorithms crossed paths with Panda in the wild, I brought up the nuclear option again. For example, Google once rolled out Panda during an extended Penguin rollout. Yes, they did that… To pull a quote from Jurassic Park, “Clever girl…” :)

Panda During Penguin - Clever Girl

When this happened, most people thought they were hit by Penguin, when in fact they were hit by Panda. And that’s a horrible situation with the potential for disastrous results. Imagine nuking many of your links thinking you were hit by Penguin, when you really needed to improve content quality. I had many confused webmasters reach out to me after the 10/24 Panda update.

And now we have Phantom throwing its hat in the ring. As mentioned earlier, many of the problems I surfaced with Phantom victims were extremely similar to Panda problems. For example, I would have surfaced the same problems when completing a thorough audit (whether a site was impacted by Panda or Phantom). Heck, for all we know Phantom is actually the beginning of Panda being incorporated into Google’s core ranking algorithm. It’s entirely possible.

And of course we still have Penguin, with a release schedule less frequent than Halley’s Comet passing Earth. So based on what I just explained, what can a webmaster do when all of these algorithms are running around the web? Enter the nuclear option.

What Is The SEO Nuclear Option?
Simply put, the nuclear option involves identifying all SEO problems for a particular website and forming a plan for executing all of the changes over a period of time. That includes both on-site problems (like content quality, user engagement, advertising issues, mobile-friendliness, etc.) and off-site problems (like unnatural links, syndication issues, etc.).

SEO Thermonuclear War

Yes, it’s a lot of work, but again, it’s completely worth it in the long-term (in my opinion). The companies I’ve helped that decided to go down the nuclear path simply couldn’t take it anymore… They were tired of chasing algorithms, tinkering with urls, tweaking this, and then that, only to find themselves hit again by another algorithm update. The grey area of Panda or Phantom is enough to drive a webmaster insane.

Recommendations for Going Nuclear:
As you can imagine, pushing the giant red button is a big undertaking that should not be taken lightly. So based on my experience helping companies with the nuclear option, I’ve provided some recommendations below. My hope is that if you choose to go nuclear, you can follow these tips to ensure you maximize your efforts (and avoid launching duds).

  • Long-Term Process – Understand that you are making changes to achieve long-term success, not short-term wins. The goal is to avoid quality problems (and negative impact from quality algorithms) over the long-term. Understand that you might actually drop in the short-term until Google crawls, processes, and measures the changes.
  • Extensive and Thorough Audit – Tackle on-site quality problems head-on. Have a thorough SEO audit completed through a quality lens. That should help identify problems that could be contributing to both Panda and Phantom. The audit should produce a clear remediation plan organized by priority.
  • Nuke Those Links – Deal quickly and decisively with unnatural links. Have a thorough link audit completed. Identify all unnatural links and deal with them as quickly as possible. That includes removing, nofollowing, or disavowing unnatural links.
  • Go Mobile-friendly – This is NOT just for Google. This is for users as well. Depending on your situation, you may choose to go responsive or you might choose separate mobile urls. Regardless, create a solid mobile user experience. It should pay off on several levels, including more organic search traffic, stronger engagement, more social sharing, and increased conversion.
  • Redesigns Can Be Good (just be careful) – Don’t be afraid to redesign what needs to be improved. Website designs from 1997 will not suffice anymore. Cluttered and disorganized user experiences can kill you from a Panda and Phantom standpoint. Don’t be afraid to redesign your site, or various elements of the site. Just make sure you take the necessary steps to maintain search equity during the launch.
  • Ask Humans (Yes, real people.) – I’ve helped a number of companies deal with Panda and Phantom hits that had serious engagement problems: aggressive advertising, deceptive affiliate setups, a horrible user experience, and other nasty problems. Those problems were abundantly clear to me, but not to the companies I was helping. Panda is about user happiness (and so is Phantom to an extent). Have unbiased real people go through your site. Have them provide feedback based on going through a task, or several tasks. Take their feedback to heart and then make changes.
  • Avoid SEO Band-aids – Make the necessary changes no matter how big they are. If you want the greatest chance of recovery for the long-term, then you must be decisive and then execute. Band-aids lead to future algorithm hits. Big changes lead to success.
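The “Nuke Those Links” step above ends with disavowing whatever you can’t get removed or nofollowed. Google’s disavow file is a plain-text list where a bare URL disavows a single page, a `domain:` entry disavows every link from that domain, and lines starting with `#` are comments. As a rough illustration (the helper function and the example domains below are hypothetical, not from any real link audit), here’s a minimal Python sketch that assembles one:

```python
# Hypothetical helper: builds the text of a Google disavow file from
# the output of a link audit. "domain:" entries disavow all links from
# a domain; bare URLs disavow a single page; "#" lines are comments.

def build_disavow_file(domains, urls, note="Unnatural links identified via link audit"):
    lines = ["# " + note]
    # Deduplicate and sort so the file is stable across runs.
    for d in sorted(set(domains)):
        lines.append("domain:" + d)
    for u in sorted(set(urls)):
        lines.append(u)
    return "\n".join(lines) + "\n"

# Example usage with made-up audit results:
content = build_disavow_file(
    domains=["spammy-directory.example", "link-farm.example"],
    urls=["http://blog.example/paid-post.html"],
)
print(content)
```

The resulting text would be saved as a `.txt` file and uploaded through Google’s disavow links tool in Search Console. Again, this is just a sketch of the file format, not a substitute for a careful, manual link audit.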

Summary – It’s OK To Push The Giant Red Button
With multiple quality algorithms crossing streams, it has never been more important to consider the SEO nuclear option. Pushing the giant red button can be a tough decision, but it can also rid your site of nasty quality problems that attract Pandas and Phantoms. And if there’s something worse than having bamboo on your site, it’s adding ectoplasm to the mix. Sure, choosing the SEO nuclear option is a bold move, but that might be exactly what you need to do.