Google’s Unconfirmed Algorithm Updates in 2015 and Their Connection to Panda and Phantom (Including the 11/19/15 Update)

2015 Unconfirmed Google Algorithm Updates

2015 has been an incredibly interesting year from a Google algorithm update standpoint. Although there weren't as many confirmed updates as in previous years, it was still a relatively volatile year algo-wise. We had the mobile-friendly algorithm released in April of 2015, Phantom 2 confirmed in early May, and then Panda 4.2 in July.

Although those are three confirmed updates by Google, there were absolutely other significant updates that caused major movement in the SERPs this year. And in my opinion, several of those updates seemed directly connected to "content quality", just like Panda and/or Phantom.

Note, Google can roll out hundreds of updates per year, but many of those are minor and do not cause major changes in rankings and/or traffic. Actually, John Mueller explained that they rolled out over one thousand changes in 2014. It’s important to understand that the updates I’m covering in my post today would be categorized as significant based on major gains or losses of Google organic traffic overnight.

The Importance of Identifying Unconfirmed Updates For Webmasters
For business owners and webmasters, it’s incredibly important to understand the algorithm updates that have impacted a specific domain. I mentioned that in my previous Search Engine Land column about performing a Search History Analysis. If you don’t know what hit you, then you cannot have a solid strategy for attacking those problems and designing the right remediation plan.

But if you do have a good understanding of which algorithm updates have caused increases or decreases in traffic, then you can analyze the damage through the lens of the update that impacted the site. And that’s a much stronger way to go for companies looking to recover their Google organic search traffic.

I’ll first go through the various unconfirmed updates that have rolled out in 2015 and then explain more about their apparent ties to “content quality”. Again, they definitely seemed connected to Panda and The Quality Update (AKA Phantom 2) based on analyzing many sites that were impacted throughout the year. And if you’re wondering about the November 19 update, I cover that at the end. Based on what I’m seeing, it looks very similar to the others I mention in this post. Let’s begin.

February 5, 2015
It wasn't long before we saw a significant update in 2015. Specifically, there was a lot of movement on 2/5/15. Based on the large Panda dataset I have access to, as well as new companies reaching out to me about a drop in traffic, it was clear there was some connection with "content quality". Also, many of the sites that were impacted had previously been impacted by Panda. There were many claims throughout the industry of drops or gains in traffic starting on 2/4 and 2/5.

Feb 5 2015 Google Algorithm Update

Feb 5 Update and Phantom 2 in May 2015

Feb 5, Phantom 2, and Sep 2015 Updates

Again, it’s important to note that a number of the companies that experienced a large drop on February 5, 2015 had been previously impacted by Panda updates. I saw it time and time again. And although some claimed it was entirely focused on ecommerce sites, that’s definitely not the case. I know many sites outside of ecommerce that were hit too. It’s also important to note that Barry Schwartz reached out to Google to see if it was Panda or Penguin-related and was told that wasn’t the case.

So definitely take a hard look at February 5, 2015 if you have experienced a drop in Google organic traffic this year. It very well could have been the 2/5 update, which again, seemed to be heavily focused on content quality.

Side Note: RankBrain Introduced
It's worth noting that RankBrain supposedly started rolling out in early 2015. It doesn't sound like it had fully rolled out by February, so I'm not sure if it had any impact on the 2/5 update. That said, any algorithm that helps Google handle the long tail of search (which is massive) is important to document. Personally, I think the 2/5/15 update had more to do with content quality factors being baked into Google's core ranking algorithm. More about this soon.

Phantom 2 – Early May 2015
I’m including this confirmed major algorithm update in the list since it was originally unconfirmed. Google never intended to release any information about Phantom, but ended up doing so based on the buzz it generated. If you are unfamiliar with Phantom 2, I picked up what looked to be a major algorithm update in late April and early May of 2015 that impacted many sites across the web. And I’m referring to major swings in rankings and traffic. I named the update “Phantom” based on its mysterious nature.

Based on how large the update was, I ended up being interviewed by CNBC soon after writing my post, and the CNBC article went live the next day. Google finally revealed that there was an update, and it was a change to its core ranking algorithm with how it assessed “quality”. And if you pay attention to SEO at all, then you know that’s a really big deal. By the way, the update has been called “The Quality Update”, but I’m sticking with Phantom. :)

Phantom 2 Hit

Phantom 2 Drop May 2015

Massive Phantom 2 Hit in May of 2015

You can read more about Phantom in my post containing my findings, but it was a huge update that seemed to target many factors that Panda also targeted. And that got me thinking that Phantom could actually be the beginning of Panda being incorporated into Google’s core ranking algorithm. It’s hard to say if that was the case, or if that could have started during the 2/5 update, but both looked to heavily target content quality issues.

July 2015 – A Note About Panda 4.2
The summer brought a confirmed update in Panda 4.2 on July 18, 2015. And it wasn’t pretty, since it was an extended rollout based on “technical problems” Google was having with the algorithm update. Since 7/18/15, many companies have been eagerly checking their reporting to see if P4.2 has impacted their Google organic traffic. Unfortunately, many previous Panda victims (especially those impacted by Panda 4.1 in September 2014 and the 10/24/14 update) are still waiting for recovery. And that’s after making significant changes to their site, content, advertising setup, etc.

Panda 4.2 Recovery in July 2015

Panda 4.2 Hit in July 2015

There was a heavy focus on Panda 4.2 in the industry as we headed into the fall of 2015. And that's when the algo Richter scale in my office started to jump again. Strap yourselves in: the fall of 2015 was a bumpy ride.

September 2 and 16 Updates
In early September, I saw significant movement across sites that had been impacted by Panda and Phantom in the past (notice the trend here?). The first date was 9/2/15, and many sites either surged or dropped on that day. Again, these were sites that previously had content quality problems and had dealt with Panda and/or Phantom situations in the past.

And it wasn’t long until the next update hit the scene. Exactly two weeks later on 9/16, I saw another update roll through. More sites that had dealt with content quality problems saw movement (either up or down). I wrote a post detailing my findings after analyzing a number of sites impacted by the 9/2 and 9/16 updates. Based on my analysis, it was clear that “content quality” was the focus.

So, was it Panda 4.2 continuing its rollout or more core ranking algorithm adjustments like Phantom? Or was it a mix of Panda 4.2 and core ranking algo adjustments? Only Google knows.

September 16 Google Update

Drop During September 16 Google Update

September 16 Google Update Connection

September 16 Surge After Phantom 2 Drop

And We’re Not Done Yet! – November 19, 2015
As I was finalizing this post, yet another unconfirmed update appears to have been released. There were many claims of movement starting on 11/19/15 from webmasters globally. I am also seeing movement based on the data I have access to, in addition to new companies reaching out to me about the update. So it does look like there was an update pushed last Thursday.

November 19, 2015 Google Update with Connection

November 19 Google Update Increase

November 19, 2015 Google Update Drop

Also, Google's John Mueller was asked on Twitter if there was an update, and he replied with the response below. I've heard that type of response quite a bit over the years when there was a core ranking algo update. :)

John Mueller Tweet About November 19 2015 Google Update

When analyzing the sites seeing movement, I found that many of them had been previously impacted by Panda and/or Phantom. Once again, the update looks "content quality" related to me and does not look to be connected to links. That said, it's still early. If this was another unconfirmed update focused on "content quality", then it joins the others covered in this post: the February 5 update, Phantom in May, the September updates, and now a late-November update.

The rapid release of these updates got me thinking about their connection to Panda and what Google has planned for our cute, black and white friend. I have a theory about this and I’ll cover that next.

Are The Rapid-fire “Quality Updates” The Deconstruction of Panda?
We know that Google intends to bake Panda completely into its core ranking algorithm at some point (and Penguin as well). I mentioned earlier that all of these unconfirmed updates could be Google deconstructing Panda, taking those pieces, and baking them into the core ranking algorithm. That could have started with the 2/5 update, continued with Phantom 2 in early May, picked up steam with the bi-weekly updates in the fall of 2015, and now continued with the 11/19 update. The summer simply could have been a break from the process while Panda 4.2 rolled out.

Note, that is speculation, but the data supports the theory: there were a number of algorithm updates focused on "content quality" this year, and we know Google wants to bake Panda into its real-time algorithm. It's hard to say for sure, but it's worth noting.

Dealing With Unconfirmed Quality Updates
As you can guess, this has been extremely frustrating for many business owners dealing with traffic loss from Panda, Phantom, or unconfirmed algorithm updates. Based on helping a number of companies with this situation, I’ve provided a bulleted list below with some recommendations. It is not a comprehensive list of what to do, but can definitely help you get moving in the right direction.

Recommendations for dealing with Google’s various “content quality” updates:

  • Perform a search history analysis, which can help you document the various swings in organic search traffic over time. Then line those dips and surges up with both confirmed and unconfirmed algorithm updates.
  • Understand the content and queries that took a hit. Run a Panda report (which can be used for other algo updates as well) to find the core pages that dropped after a specific date; a rough scripted version of this is sketched right after this list.
  • For quality updates (like Panda, Phantom, and the other updates listed in this post), focus on hunting down low content quality. And as I’ve said before many times, “low quality content” can mean several things. For example, thin content, scraped content, duplicate content, technical problems causing quality issues, advertising problems causing engagement issues, and more.
  • Rectify user engagement issues quickly. Once you find problems causing user engagement issues, fix them as soon as you can. Engagement barriers drive user frustration, which can send those users running back to the SERPs. And low dwell time sends horrible signals back to the mothership. Beware.
  • Avoid user deception at all costs (especially from an advertising standpoint). Don't blend ads into your content, don't surround key elements of the page with ads that can be mistakenly clicked, and don't provide ad links that look like navigation, but drive users to third party sites. Hell hath no fury like a user scorned. By the way, ad deception is covered in Google's Search Quality Rating Guidelines. You should download the PDF to learn more.
  • Understand how Google sees your site. Use Fetch and Render to truly understand how Googlebot views your content. I can't tell you how many times I've audited a site and found serious render issues. Fetch and Render is your friend. Use it.
  • Noindex what needs to be removed and focus Google’s attention on your highest quality content. Make hard decisions. Nuke what needs to be nuked. Be objective and aggressive when needed.
  • Rewrite and revamp content that needs a boost. Not all low quality content needs to be noindexed or nuked. If you feel it’s a strong topic, but the content isn’t the best it can be, then work on enhancing its quality. For example, brainstorm ways to enhance the content data-wise, visually, and based on what users are looking for. You don’t have to nuke it if you can boost it.
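To make the "Panda report" bullet above concrete, here is a minimal sketch of that kind of check in Python. It assumes you have exported organic landing-page traffic for equal periods before and after the update date; the file names and column names (page, sessions) are hypothetical placeholders, not any specific tool's format.

```python
# Hypothetical "Panda report" sketch: compare organic landing-page traffic
# exported for equal periods before and after an algorithm update date.
# File names and column names (page, sessions) are assumptions -- adjust
# them to match your analytics export.
import pandas as pd

before = pd.read_csv("organic_landing_pages_before.csv")  # columns: page, sessions
after = pd.read_csv("organic_landing_pages_after.csv")    # same columns

merged = before.merge(after, on="page", suffixes=("_before", "_after"))
merged["pct_change"] = (
    (merged["sessions_after"] - merged["sessions_before"])
    / merged["sessions_before"] * 100
)

# Surface the pages that dropped the most after the update date.
biggest_drops = merged.sort_values("pct_change").head(25)
print(biggest_drops[["page", "sessions_before", "sessions_after", "pct_change"]])
```

Pages near the top of that list are the ones to audit first through the lens of content quality.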

Summary – Know What Hit You, Respond Accordingly
As you can see, 2015 has been a volatile year from an algorithm update standpoint — yet only a few updates were actually confirmed by Google. In this post, I covered additional important updates that could have impacted your Google organic traffic, starting back in February and continuing throughout the year. I recommend reviewing your 2015 Google organic trending and identifying any swings in traffic around those important dates. Then form a strong plan of attack for fixing any problems from a content quality standpoint.

You have to know what hit you in order to take the appropriate actions. And that’s sometimes hard to do when Google doesn’t confirm specific updates. I hope this post was helpful, at least from that standpoint. Good luck.



How To Check The X-Robots-Tag For Noindex and Nofollow Directives (7 Plugins, Tools and Crawlers)

X-Robots-Tag in the Header Response

I have previously written about the power (and danger) of the meta robots tag. It's one line of code (for example, <meta name="robots" content="noindex, nofollow">) that can keep lower quality pages from being indexed, while also telling the engines not to follow any links on the page (i.e., not to pass any link signals through to the destination page).

That's helpful when needed, but the meta robots tag can also destroy your SEO if used improperly. For example, you might mistakenly add the meta robots tag using noindex to pages that should remain indexed. If that happens, and if it's widespread, your pages can start dropping from Google's index. And when that happens, you can lose rankings for those pages and the subsequent traffic. In a worst-case scenario, your organic search traffic can plummet in an almost Panda-like fashion. In other words, it can drop off a cliff.

And before you laugh off that scenario, I can tell you that I've seen it happen to companies a number of times during my career. It could be human error, CMS problems, reverting to an older version of the site, etc. That's why it's extremely important to check for the presence of the meta robots tag to ensure the right directives are being used.

But here's the rub. That's not the only way to issue noindex and nofollow directives. In addition to the meta robots tag, you can also use the x-robots-tag in the header response. With this approach, you don't need a meta tag added to each url; instead, you can supply directives via the server response.
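For illustration, here is a minimal sketch of how a server could attach the directive. It uses a tiny Flask app purely as an example; in practice, the header is usually set via your server or CMS configuration.

```python
# A minimal illustration (assumes Flask is installed): the noindex/nofollow
# directives travel in the header response instead of the html code.
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/internal-search")
def internal_search():
    resp = make_response("<html><body>Internal search results...</body></html>")
    # No meta robots tag needed -- the directive rides along in the header.
    resp.headers["X-Robots-Tag"] = "noindex, nofollow"
    return resp

if __name__ == "__main__":
    app.run()
```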

Here are two examples of the x-robots-tag in action:
Examples of the X-Robots-Tag

Again, those directives are not contained in the html code. They are in the header response, which is invisible to the naked eye. You need to specifically check the header response to see if the x-robots-tag is being used, and which directives are being used.

As you can guess, this can easily slip through the cracks unless you are specifically looking for it. Imagine checking a site for the meta robots tag, thinking all is OK when you can't see it, while the x-robots-tag is being used with "noindex, nofollow" on every url. Not good, to say the least.
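If you want to script that check for a single url, a few lines will do it. Here's a hedged sketch using the Python requests library (the url below is a placeholder):

```python
# Check a single url's header response for the x-robots-tag.
# Assumes the requests library is installed; the url is a placeholder.
import requests

# Some servers handle HEAD poorly; swap in requests.get if needed.
response = requests.head("https://example.com/some-page", allow_redirects=True)
x_robots = response.headers.get("X-Robots-Tag")

if x_robots:
    print(f"X-Robots-Tag found: {x_robots}")  # e.g. noindex, nofollow
else:
    print("No X-Robots-Tag in the header response.")
```

That said, the plugins, online tools, and crawlers below make this even easier, especially at scale.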

How To Check The X-Robots-Tag in the Header Response

Based on what I explained above, I decided to write this post to explain several ways to check for the x-robots-tag. By adding this to your checklist, you can ensure that important directives are correct and that you are noindexing and nofollowing the right pages on your site (and not important ones that drive a lot of traffic from Google and/or Bing). The list below contains browser plugins and online tools for checking single urls, as well as crawling tools for checking urls in bulk. Let's jump in.

1. Browser Plugins

Web Developer Plugin
The Web Developer plugin is one of my favorite plugins for checking a number of important items, and it's available for both Firefox and Chrome. By simply clicking the plugin in your browser, then "Information", and then selecting "Response Headers", you can view the HTTP header values for the url at hand. And if the x-robots-tag is being used, you will see the values listed.

Checking the X-Robots-Tag Using Web Developer Plugin

SEO Site Tools
I use the SEO Site Tools Chrome extension often for checking specific SEO elements for a given page. The x-robots directives are somewhat hidden in this plugin, but you can still find them pretty easily. Just click the plugin in Chrome, then select the "Page Elements" tab, and then scroll all the way down to the bottom of the window. You'll see the header response there, including the x-robots-tag directives if the tag is being used for the page at hand.

Checking the X-Robots-Tag Using SEO Site Tools

LiveHTTPHeaders
If you want to check headers on the fly, then there's no better plugin than LiveHTTPHeaders. It's available for both Chrome and Firefox, and it easily enables you to view the header response for each page as you browse the web. For example, you can check headers and track down problems as you traverse a specific website.

Since it provides the header response for each page, you will also see the x-robots-tag directives for each url. Just click the url you want in the window to view the header response. The x-robots-tag will be listed if it’s used for the url at hand.

Checking the X-Robots-Tag Using LiveHTTPHeaders

2. Online Tools For Checking The Header Response
In addition to plugins, you can use a number of online tools that take a url (or several urls) and return the header response. Like plugins, this is a good option when you are checking single urls or just testing a sample of urls.

SEO Tools Server Header Checker
There are two options when using the SEO Tools Server Header Checker. You can check a single url or you can use the bulk url option to check several urls at one time. For the single url option, just enter a url to test and click “Check Headers”. The tool will return the header response for the url at hand, including the x-robots-tag directives.

For the bulk header check, enter a series of urls (one on each line) and click "Check Headers". You will see the response for each of the urls listed, along with the x-robots-tag if it's being used.

Checking the X-Robots-Tag Using SEOTools Server Header Checker

URI Valet
URI Valet is a versatile online tool that returns a number of important pieces of information for the url at hand. For example, the header response, performance information, internal links, external links, validation information, etc. You can also select a user agent for checking the response based on various browsers, devices, and search engine bots. There's quite a bit of functionality built into this online tool, but I won't go into detail about all the reports here. That's because we are focused on the header response (to find the x-robots-tag).

Simply enter the url, select a user-agent (or just keep the default selected), click the “I’m not a robot” button, and then click submit. The header response will be listed below, along with the x-robots-tag directives (if used).

Checking the X-Robots-Tag Using URI Valet

3. Crawling Tools
Now that I’ve covered some plugins and online tools that can help you check the x-robots-tag, let’s check out some robust crawling tools. For example, if you want to crawl many urls in bulk (like 10K, 100K, or 1M+ pages) to check for the presence of the x-robots-tag, then the following tools can be extremely helpful.
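As a side note, a short script can also spot-check a modest list of urls before you fire up a full crawler. Here's a hedged sketch; urls.txt is a hypothetical file with one url per line:

```python
# Spot-check a list of urls (urls.txt, one per line) for the x-robots-tag.
# Assumes the requests library is installed.
import requests

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        resp = requests.head(url, allow_redirects=True, timeout=10)
        print(f"{url}\t{resp.headers.get('X-Robots-Tag', 'none')}")
    except requests.RequestException as exc:
        print(f"{url}\tERROR: {exc}")
```

For large-scale crawls, though, the dedicated tools below are the way to go.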

DeepCrawl
If you want a robust, enterprise-level crawling engine, then DeepCrawl is for you. Note, I've been such a big proponent of DeepCrawl that I'm now on the customer advisory board. So yes, I'm a fan. :)

After crawling a site, you can easily check the "Noindex Pages" report to view all pages that are noindexed via the meta robots tag, the x-robots-tag header response, or by using noindex in robots.txt. You can export the list and then filter in Excel (or with a quick script, as sketched below) to isolate pages noindexed via the x-robots-tag.

Checking the X-Robots-Tag Using DeepCrawl
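And if you'd rather script that filtering step instead of using Excel, here's a quick sketch. The file name and column names are assumptions, so match them to your actual export:

```python
# Filter a hypothetical "Noindex Pages" export down to urls noindexed via
# the x-robots-tag. Column names (url, noindex_source) are assumptions.
import pandas as pd

report = pd.read_csv("noindex_pages_export.csv")
mask = report["noindex_source"].str.contains("x-robots", case=False, na=False)
via_header = report[mask]

print(f"{len(via_header)} pages noindexed via the x-robots-tag header response")
print(via_header["url"].head(20))
```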

Screaming Frog
I’ve also been a big fan of Screaming Frog for a long time. It’s an essential tool in my SEO arsenal and I often use Screaming Frog in combination with DeepCrawl. For example, I might crawl a large-scale site using DeepCrawl and then isolate certain areas for surgical crawls using Screaming Frog.

Once you crawl a site using Screaming Frog, you can simply click the Directives tab and then look for the x-robots column. If any pages are using the x-robots-tag, then you will see which directives are being used per url.

Checking the X-Robots-Tag Using Screaming Frog

Summary – There’s more than one way to noindex a page…
OK, now there’s no excuse for missing the x-robots-tag during an SEO audit. :) If you notice certain pages are not being indexed, yet the meta robots tag isn’t present in the html code, then you should definitely check for the presence of the x-robots-tag. You just might find important pages being noindexed via the header response. And again, it could be a hidden problem that’s causing serious SEO issues.

Moving forward, I recommend checking out the various plugins, online tools, and crawlers I listed in this post. All can help you surface important directives that can be impacting your SEO efforts.


The Curious Case of The Disappearing and Reappearing Google Featured Snippet

The Disappearing Google Featured Snippet

As Halloween approaches this year, I'm gaining a better understanding of the phrase "Trick or Treat". Over the past month, I've watched Google display a featured snippet for one of my blog posts (the "treat"), only to change it back to a standard snippet (the "trick"). And then back to a featured snippet ("treat"). And you guessed it, back to a standard snippet ("trick"). I really don't know why that's happening, but I think it's an interesting case study.

If you are unfamiliar with featured snippets, then you should read my Search Engine Land column about gaining and losing a featured snippet. In a nutshell, Google can provide an answer at the top of the search results for a query (the featured snippet), along with a link to the third party site containing the content. A featured snippet can also contain visuals, like photos or graphs.

Here’s an example of a featured snippet (since we’ll see plenty of vampires this upcoming Saturday):

Example of a Featured Snippet - Are Vampires Real?

Based on analyzing featured snippets for clients, I know they can drive a massive amount of traffic based on the SERP treatment and the inferred credibility. For example, Google highlights your content as an answer for a query, separates that content from the rest of the SERP, sometimes provides a visual from your post in the featured snippet, and includes a link back to your content. As my case study on SEL documented, you can gain or lose a lot of traffic based on a featured snippet.

It Begins – Pre-processing of a Featured Snippet?
On September 24, 2015, I published my Search Engine Land column explaining how to find queries per url in Google Search Console (GSC). As soon as it was crawled and indexed, I noticed something interesting. It looked like Google was pre-processing a featured snippet.

For example, the description looked like a bulleted list of instructions versus the standard description used for a post. Sure, Google doesn’t always use your meta description, but it also doesn’t always provide a list of items explaining how to do something.

Here is what I saw just 28 minutes after the post was published:

The Pre-processing of a Google Featured Snippet

Needless to say, I wanted to keep an eye on the situation to see if I would receive a featured snippet.

Less Than 2 Weeks Later – It Appears!
I set reminders to check the SERPs daily to see how it was looking. It was about two weeks later that I first saw the featured snippet for a targeted query! And it included a list of six steps from my post along with a visual. It was gorgeous. :)

Featured Snippet With Bulleted List in Google SERPs

So I might have been right! It seems Google was pre-processing the featured snippet by breaking down the steps involved. I remember thinking, “This is AWESOME. I must write about this now.” So I added the topic to my blog post idea bucket and moved on. I was going to try and write the post within a week or so, until…

1 Week Later – Mountain View, We Have a Problem
About a week after noticing the featured snippet, I went to check the SERPs again. I typed in the query, and boom, I saw a standard snippet. That can't be right… So I opened another incognito window and checked again. Nope. I opened my Chromebook to check on another system. Gone.

Standard Featured Snippet Returns

So it seems Google had already removed the featured snippet! Sure, the SERPs are extremely dynamic, but what could have happened in that one week to remove the featured snippet? The post is a thorough tutorial, it clearly explains how to find queries per url in GSC, and it was published on an authority site (SEL). Come on Google!

So, I took the hit and moved on. But the story doesn’t end there.

2 Weeks Later – I’m Baacckk…
Two weeks later, I was doing some featured snippet detective work for a client and just happened to check the SERP again for my SEL column. And lo and behold… the featured snippet was back! But this time it didn't contain the bulleted list. The visual was still there, but the list was gone. Interesting… but still awesome.

Google Featured Snippet Reappears

Maybe Google realized that the post should yield a featured snippet, but just not in the form of a bulleted list. Or, maybe enough searches for that subject or question occurred to yield a featured snippet. Or maybe this was more testing by Google to see if it warranted a featured snippet at all. Regardless, my featured snippet was back.

The Featured Snippet Returns

3 Days Later – It’s Gone Again
Ugh. Just three days later, the featured snippet was gone again. To be honest, I had no idea what was going on with that featured snippet. I get that featured snippets are algorithmically triggered, so my guess is that the query and/or post is in the gray area for some reason. My post ranks #1 in the SERPs for targeted queries, but does not always yield a featured snippet.

Invisible Man and Google Featured Snippets

Going Straight To The Source: Asking Google
So I decided to go to the source to try and find out what was going on. I ended up asking both Gary Illyes and John Mueller on Twitter what they thought of the situation. I received an answer, but it was pretty vague.

Gary explained that it’s either ancient aliens (yes, I’m serious) or the fact that Google takes many signals into account when determining when to show a featured snippet. See the Twitter conversation below.

Gary Illyes Regarding Signals That Trigger Featured Snippets:

Gary Illyes Tweet About Featured Snippets

So here we are. Either Google is messing with me or the query/post is in the gray area of featured snippets. Or of course, ancient aliens are involved. :)

Regarding signals that Google takes into account, I started thinking about this more and more after the tweet. What are some of the signals that Google might consider when determining whether to show a featured snippet? And how could that list help webmasters that are trying to be featured in the SERPs for specific queries?

Based on a quick brainstorm, I have provided some possible signals below. I'm sure this isn't all of them, but it's a good start. If you are interested in driving more featured snippets for your site, then you might want to review this list.

Possible Signals Google Uses When Determining To Show a Featured Snippet:

  • Type of query (i.e., a query formed as a question using what, where, who, why, or when).
  • Query volume for the question at hand.
  • Content that matches up well with the query (i.e. an answer is clearly provided in an easy to break down format).
  • Authority of the website that the content is published on.
  • SERP engagement. Are users interacting with and clicking through the featured snippet? Or are they bypassing that snippet for listings below?
  • Dwell time while on the destination website. If users bounce back to the SERPs quickly, maybe the content should not yield a featured snippet.
  • Location of the user. Some featured snippets might be location-specific (e.g., queries pertaining to a specific country or region).
  • Device – Will some featured snippets trigger on desktop vs. mobile, or vice versa?
  • Personalization: Featured snippets based on previous searches and behavior.
  • Author Rank. Does the authority of the author in that given niche play a factor?

Again, I don't think this is a final list. If you feel there are other signals to add, let me know in the comments below.

Summary – The Gray Area of Featured Snippets
There you have it. A curious case of a disappearing featured snippet in the Google SERPs. To quickly review: I saw what seemed to be Google pre-processing a featured snippet, then received the featured snippet with bullets, then watched it revert to a standard snippet, then saw it jump back to a featured snippet without bullets, and as of now, it's standard again.

As Gary Illyes said, there are many factors that determine when a featured snippet should be displayed. I understand that, but in this situation, it seems Google cannot make up its mind. I’ll keep monitoring the situation and update the post if anything changes (and if I learn something new).

Personally, I feel as if my post is in the gray area of featured snippets. It’s either that or some Halloween prank by Google. Maybe RankBrain can help. Muahahaha. :)




Analysis and Findings From The September 2015 Google Algorithm Updates (9/2 and 9/16): Panda 4.2 Tremors, Manual Updates, The Methode Philosophy, and More

September 2015 Google Updates

In my last post, I explained what I had seen during the extended rollout of Panda 4.2, after analyzing over seven weeks of Panda data. And yes, it's still rolling out now. More about that soon. Overall, it had been very quiet leading up to that post. Sure, there were some sites that had seen impact and movement, but not like typical Panda updates. It was an underwhelming update, to say the least.

Well, something has changed and more significant tremors have been seen across the web.

It’s hard to say if what I’ve been seeing is Panda or a tweak to Google’s core ranking algorithm, but September has been an extremely volatile month algorithm-wise. Specifically, there was movement on 9/2 with some sites seeing large drops in Google organic traffic. And then there was significant movement on 9/16 where some sites surged in traffic, while others dropped heavily.

For example, here’s a site that dropped on 9/16:

Drop During September 16 Google Update

And here’s a site that surged:

Surge During September 16 Google Update

Now, Google pushes hundreds of updates per year, with over one thousand last year according to John Mueller. But the September tremors seem to impact sites with both Panda and Phantom baggage. I’ll cover more about that in this post, but to me it seems like the tremors we have seen are tied to content quality. And again, we have two quality cowboys in town now with Phantom (AKA The Quality Update) and Panda. I definitely saw some sites that got hit hard by Phantom jump during the 9/16 update.

So it’s hard to say it was one over the other. And like I said in my post about Phantom, maybe that was the beginning of Panda being baked into Google’s core ranking algorithm. In other words, they might not be so different…

In this post, I'll cover more about what I've seen in September and what Googlers have said about Panda 4.2, explain more about the connection of these latest updates to Panda and Phantom, and touch on what I'm calling "The Methode Philosophy", a possible change in how Google views Panda. Let's jump in.

Confirmed: Panda 4.2 Still Rolling Out & Will For Months
Panda 4.2 started rolling out on 7/18/15 and we knew it would be an extended rollout that could take months. But nobody knew how long that extended rollout would last. Is it still rolling out, has it completed already, and how long before the update completes? All good questions that needed to be answered.

Well, at SMX East, Google's Gary Illyes explained that Panda 4.2 is indeed still rolling out. He last checked on 9/30 and Panda 4.2 was still roaming the web. In addition, he said it will continue to roll out for the next few months. Yes, read that again. Panda 4.2 could have a six-month rollout (or longer). As you can imagine, that's a huge change from previous updates.

Panda 4.2 is still rolling out.

Panda 4.2: Manually Pushed Or Auto-Rollout?
Another question about Panda 4.2 revolves around how it's being released. Was a giant red button pushed on 7/18, after which P4.2 began its long and extended rollout? Or is Google manually pushing Panda at intervals during the extended rollout? There's an important distinction between the two.

If Google is pushing the updates manually, then they can refine and tweak the algo based on what they are seeing in the SERPs (to ensure optimal results). That's similar to what I saw with the Panda 4.0 tremors, which was clarified by John Mueller. He said that after a large update, Google can and will tweak that algo to ensure everything is working the way they need it to.

{Updated based on clarification from Gary Illyes:}
So I decided to ask Google's Gary Illyes on Twitter. At first, Gary explained that Panda 4.2 was being manually pushed at various intervals. But that's actually not correct. He misunderstood the tweet and didn't mean that Panda was being pushed manually at various intervals. Instead, Gary explained that Panda 4.2 was manually pushed on 7/18/15 and then automatically rolls out over months. From what he said, there have been no refinements to Panda 4.2 since the initial rollout. That's important to understand, since it can help us better understand what we are seeing now volatility-wise. And a big thank you to Barry Schwartz for questioning the original response. Without that, I think we would still believe Panda 4.2 was being manually pushed.

Both Twitter conversations are below. First, here's my question from yesterday.

Gary Illyes confirms manual Panda 4.2 rollout.
And here’s the clarification from Gary this morning:
Gary Illyes Clarifies Panda 4.2 Rollout

September Volatility – 9/2 and 9/16 To Be Specific
I’m sure you’re wondering what type of volatility I’ve seen in September and how that manifested itself in Google rankings and traffic. Below, I’ll provide some screenshots of what I’ve seen across websites, categories, and countries. I’ll focus on September 16 and highlight any connections I’ve seen with previous Panda and/or Phantom impact.

Google Analytics Trending For A Site Impacted By The 9/16 Update:

GA Drop During Sep 16 Google Update

Google Search Console Trending For A Site Impacted By The 9/16 Update:

GSC Drop in Clicks During 9/16 Google Update

SEMrush Trending For A Site Impacted By The 9/16 Update:

Drop During September 16 Google Update

Google Analytics trending showing a drop during the 9/2 update:

Drop From Sep 2 Google Algorithm Update

Searchmetrics showing a Phantom hit and then positive impact during the 9/16 update:

Phantom and Sep 16 Impact

Another site showing negative impact during Phantom and then an increase during the 9/16 update:

More Phantom and September 16 Movement

SEMrush Trending For A Site Positively Impacted By The 9/16 Update:

Surge In Traffic During Sep 16 Google Update

Searchmetrics Trending For A Site Positively Impacted By The 9/16 Update:

More Upward Movement During 9/16 Update

SEMRush Trending For A Site Seeing Positive and Negative Movement During The 9/2 and 9/16 Updates:

Ups and Downs During September Google Updates

The September Updates Seemed To Target Content Quality (Again)
When analyzing the drop in traffic across sites impacted by the 9/2 and 9/16 updates, I saw a number of familiar problems from a content quality standpoint. These are problems I’ve seen many times while helping companies with Panda and/or Phantom hits. Although some of the sites impacted had historical link problems, not all sites impacted had link issues. But they all had content quality problems.

User Happiness FTW
If you’ve read my previous posts about Panda or Phantom, then you know user happiness is extremely important. If you frustrate or anger users, then you can send them fleeing from your site. During my analysis of the September 2 and September 16 updates, I saw pages that dropped in rankings that provided horrible user experiences. For example, heavy ads scattered throughout the page, forced pagination (for monetization purposes), low quality supplemental content, and more. Let’s explore several of the problems below. Note, if you’ve been following Panda over the years, these problems should not surprise you.

Slideshows or Paginated Content with Keyword Content Not Visible On-Load
I saw several examples of pages that dropped out for keywords when those keywords were not visible on-load. For example, imagine searching for a topic, clicking a result in the search results, landing on a slideshow or paginated article, and not seeing the keywords or topic visible.

This could result in a number of problems SEO-wise. The most important is users getting frustrated and jumping back to the search results since they can't easily find the content they searched for. Remember, low dwell time is a giant invitation to Panda. Also, John Mueller has said many times that hidden content will either be heavily discounted or not indexed at all. So if one doesn't get you, the other will. What's interesting is that pages which used to rank highly for those queries dropped during the 9/16 update.

September 16 Update Pagination Issues

Thin Pages, Yet Hiding Content
This was a strange one, and it somewhat relates to the section above. I found pages that were inherently thin (given the niche), yet the company still chose to force users to trigger additional content (versus providing it on-load). I have no idea why they would do this on an already thin page, but they did. And I saw those pages drop. It's worth noting that several of those sites also saw drops during the Quality Update (Phantom) in early May of 2015.

September 16 Update and Thin Content

Technical Problems Causing Content Problems (or Ad Problems)
There were a few sites that saw substantial impact during the 9/16 update based on technical problems causing content and/or ad problems. For example, one site built with AngularJS that serves html snapshots ended up having a giant ad plastered across its content (in the snapshots that Google indexes).

Thousands of pages were impacted. On 9/16 they lost approximately 40% of their Google organic search traffic. I can’t say 100% that the drop was due to the ad issue, but it sure looks that way. It’s a leading site in its niche with thorough and detailed content. Although I haven’t analyzed the entire site, having your html snapshots polluted by a giant ad is not good. Not good at all actually.

September 16 Update and Ad Problems

Indexed Bulk Thin Content
A large-scale website with 60M+ pages indexed ended up having over 1M ultra-thin, lower quality pages crawled and indexed (by accident). The business owner has already nuked those pages, but they had been on the site for a while. This is also a site that typically sees movement with Panda updates based on its niche, size, and complexity. It's a solid example of "bamboo creep" that can occur without showing any major signs. Then boom, it catches up to you very quickly.

September 16 Update and Bulk Thin Content

Simple, 1997-like Design… INCREASED
I have to explain one jump I saw, since it shocked me at first. The site surged during the 9/16 update, so I dug into the increase in rankings and traffic. When I first visited the site, I almost fell out of my seat. It was so simple, so plain, and so 1997-like, that I couldn’t believe the traffic numbers and strong rankings across competitive keywords.

Simple and Plain Design Surges During September 16 Update

But I picked myself up off the ground and took some time going through the site. It didn’t take long to understand why this site ranked well. It provides granular details for a specific niche, it’s well-documented, and up-to-date. So I reviewed a number of important queries for the site and put myself in the shoes of someone searching for that information. And when I did, I was actually quite happy with the site. It was easy to find the information I wanted, it was organized well (even if it was extremely vanilla), and I found what I wanted quickly.

Remember the first point from above? User happiness FTW. :)

Are Bi-Weekly Tremors The New “Quality” Release Schedule?
So, we had an update on Wednesday 9/2 and then another exactly two weeks later on Wednesday 9/16. That sounds a little too perfect, doesn’t it? Based on what we’ve seen, does this mean that Google is tweaking Panda 4.2 and pushing smaller updates every two weeks? Gary says no, so it doesn’t seem like that would be Panda. But it could be Panda 4.2 continuing its rollout. It’s just interesting to see very little impact from P4.2 turn into serious turbulence in early September. And then exactly two weeks later, we saw more turbulence targeting content quality.

That’s speculation, but the updates are similar to the “tremors” I saw after Panda 4.0 (as explained earlier). And by the way, if that bi-weekly schedule is remotely correct, then we should have another tremor any day now. I guess we’ll see.

The Methode Philosophy – A Change In Panda Thinking
Before ending this post, I wanted to bring up an important point based on a video I watched with Google's Gary Illyes. And I believe it reflects a change in Panda philosophy by Google. Specifically, Gary explained that many wrongly believe Panda is about hurting or penalizing a website. He says it's not. Instead, it's about ensuring Google "adjusts" websites that become overly prominent for keywords they shouldn't rank for, so that users can find the best content possible for the query at hand. You can watch him explain this in a Q&A with Bruce Clay (at 8:04 in the video).

The Methode Philosophy

Now, I totally understand that, and I actually think it's great. But that's not how Panda worked in the past. It didn't just demote pages on an impacted domain for keywords they shouldn't rank for. Sure, that was part of it, but it crushed websites overall. For example, sites dropped by 60%+ overnight, and it wasn't just overly generic keywords that a site shouldn't rank for. It included core keywords that the site should rank for. That's why many viewed Panda as an algorithmic penalty.

The reason I bring this up is that Gary's comments led me to think that Panda could have changed… and maybe that's why we are seeing different results from 4.2 than we typically see. Panda 4.2 is supposed to be a refresh. But what if it's something different? What if it now works the way Gary explained it?

Again, I like the approach and I think most SEOs would be on board. But I’m concerned for websites that were hit by Panda 4.1 on September 23, 2014 or by the October 24, 2014 update. Many of those websites are still struggling to recover, even after making significant changes. And if Panda has fundamentally changed, then those sites may never recover. They would be in Panda limbo with little hope of coming back, and that would be wrong. It’s hard to say if this is happening, but it’s worth noting.

As The Panda Turns – The Future For Panda and Phantom Victims
After speaking with many webmasters that have been impacted by Panda and/or Phantom, many feel as if they are currently in a horror movie (which fits, since Halloween is right around the corner). You know, like being stranded in an abandoned sleepaway camp in the woods, knowing a killer could strike at any time, hoping they are spared during the next attack, and trying to send an SOS to local law enforcement. But they just can't seem to get out of Camp Crystal Lake. Yes, it's like Friday the 13th, but for SEOs. Just swap Jason for Panda and make it a six-month movie versus two hours. :)

That said, I'm actually hopeful given the volatile September we just experienced. Frequent updates can be good, as more sites can see impact (and hopefully more up than down). Let's face it, would you rather have 10 months of inactivity or frequent updates? I'm for the latter.

Moving forward, it’s always smart to clean up any problems your site has (both on-site and off-site). This is the same advice I’ve been giving since before Panda 4.2 rolled out and I still believe tackling all quality problems is a smart way to proceed. In a nutshell, you should be ready for more tremors, for Panda 4.3 (if that’s coming), or for the real-time Panda (which we know Google wants to roll out).

That’s all you can do at this point. And of course, you could sip a witches’ brew with a bamboo garnish while trick or treating. But watch out for that Jason Voorhees guy. I heard he’s dressing up as a panda this year. Good luck.  :)



Panda 4.2 Analysis and Findings 7 Weeks Into The Extended Rollout – A Giant Ball of Bamboo Confusion

Panda 4.2 Analysis and Findings

Note: I reached out to Google last week to learn more about the current rollout of Panda 4.2, when it would be completed, and other interesting questions I had based on my analysis. But I haven’t heard anything back directly related to those questions. I will update this post if I receive more information about P4.2.

On Wednesday, July 22nd, Barry Schwartz broke huge SEO news. Google finally started rolling out Panda 4.2, which we'd been eagerly waiting for since 10/24/14. That was the last Panda update, over nine months earlier! That's extremely unusual for Panda, which typically rolled out monthly (and even more frequently at certain times).

Google explained to Barry that Panda began rolling out the weekend prior (July 18th) and that this would be an extended rollout (which was also very strange). Then John Mueller explained in a webmaster hangout that the extended rollout was due to technical problems that Google was having with Panda. They didn’t want to push an extended update, but were forced to.

So according to Google, Panda 4.2 could take months to fully roll out. Here’s a tweet from Google’s Gary Illyes confirming the update.

Gary Illyes Confirms Panda 4.2

I’ll be honest. I was completely shocked when I heard about the extended rollout. Panda usually rolled out quickly and sites that were impacted could easily identify the exact date of the impact.

For example, here is a big hit from Panda 4.0 in May of 2014.

Panda 4.0 Hit

And here is a recovery during Panda 4.1 in September of 2014:

Recovery During Panda 4.1

One day, big impact, and easier to associate with a specific Panda update. Ah, those were the days.

Having a specific date makes it much easier for webmasters to understand what hit them, and then what to fix. With the extended rollout of Panda 4.2, sites could theoretically see impact right after 7/18, a few weeks from then, or even a few months out from 7/18. And with Google pushing hundreds of updates throughout the year (and over one thousand last year, according to John Mueller), how are webmasters supposed to know if Panda impacted them, or if it was something else (like Phantom, Penguin, or any of the other updates Google rolls out during the year)? Short answer: they can't.

I’ll expand on this topic later in the post, but for now just understand that you can gradually see impact from Panda 4.2 over time. Some sites will see more impact in a shorter period of time, but it’s entirely possible to see smaller movement over months. And of course, you might see nothing at all. That’s a good segue to the next section.

Analyzing Over 7 Weeks of Panda 4.2 Data
I’ve been heavily analyzing the update since we learned about Panda 4.2, and I specifically held off on publishing this post until I reviewed enough data. Due to the extended rollout, I wanted to make sure I gave our new Panda friend enough time to show his face. Now that over seven weeks have gone by, and I’ve seen a number of interesting things along my Panda travels, I decided to finally publish this post.

Warning: You might like what you read, and you might not. But it is what it is. Read on.

All Quiet on the Giant Panda Front – Many Typical Panda Players Not Impacted (Yet)
I have access to an extremely large set of Panda data globally. The data includes many sites that have dealt with Panda problems in the past (and quality problems overall). And that includes some of the biggest sites with previous Panda scars. For example, sites with tens of millions of pages indexed, that are inherently Panda-susceptible, and that have dropped and surged over time as they enter and exit the gray area of Panda.

The large Panda dataset I have access to often enables me to see when Panda rolls out (and when other quality algorithms roll out like Phantom did in late April and early May.) The sites I have access to include ecommerce retailers, news publishers, press release websites, directories, affiliate websites, large-scale blogs, song lyrics websites, coupon websites, and more.

As I’ve been analyzing the trending for these websites since 7/18, it was easy to see that many of the larger, Panda-susceptible sites have seen very little impact since Panda 4.2 rolled out. It’s almost like Google didn’t want to touch these sites during the initial rollout. I’m not saying that’s the case (since Panda is algorithmic), but it sure seemed that way.

For example, I’ve seen a lot of trending that looks like this:

Stable Trending Through Panda 4.2

and this:

More Stable Trending Through Panda 4.2

No movement. At all.

This is important to understand if you are monitoring a large-scale website that has been working hard on Panda remediation. If you have seen very little impact so far, it could be that Panda 4.2 simply hasn’t impacted your site yet (or it won’t impact your site at all).

Like many others in the industry, I fully expected more large-scale sites that have previously been impacted by Panda to see movement. But most of the sites that act as Panda barometers have seen little or no impact. It’s very, very strange to say the least. Time will tell if that changes.

Also, John Mueller explained more about the rollout in a webmaster hangout video. He said that technical problems on Google’s end are forcing them to roll out Panda very slowly. Now, we don’t know what those technical problems are, but it seems that if John is correct, then Panda could still impact your site as time goes on. I haven’t seen that happen for most sites, but I guess it’s still possible. Needless to say, this isn’t optimal for webmasters battling the mighty, I mean aging Panda. And it’s extremely out of the ordinary for Panda.

Extended Rollouts and Going Real-Time – The Easiest Way To Hide Impact From Major Algorithm Updates
Google has stated several times that they intend to incorporate Panda into its core ranking algorithm. I don’t know when that will happen, but we are clearly seeing the first signs of that happening. We had Phantom 2 in May, which was a change to Google’s core ranking algo related to how it assesses “quality”. Now we have an extended rollout of Panda, which means we can’t pin the update on a specific date.

And by the way, Google also wants to incorporate Penguin into its core algo. Yes, say goodbye to external algos that are unleashed on the web in one fell swoop.

Every time Google released Panda and/or Penguin, the web erupted. And as you can guess, many that were negatively impacted screamed bloody murder. So much so, that the mainstream media started reporting on algorithm updates. Needless to say, Google doesn’t want that type of attention. It makes them look bad (even when most people will admit they have a really hard job trying to algorithmically determine what’s high quality, what’s spam, what’s unnatural, etc.)

So what’s the easiest way to avoid the flak created by a specific algorithm update? Roll that update out over months, or even better, bake it into the core ranking algorithm. Once you do, then nobody can pin a date on a specific update, nobody can say “Panda is flawed”, “Google has lost control of Penguin”, “Phantom is scarier than the exorcist”, etc.

Adding a drop of Penguin here, a teaspoon of Panda there…

Baking Panda and Penguin Into Core Ranking Algorithm

Well, my friends, this is what we are dealing with now Panda-wise. And from a Penguin standpoint, you have a better chance of seeing Halley's Comet than seeing another update. Both are chained, cloaked, and do not have a specific update schedule planned.

Personally, I believe it became harder and harder to release Panda and Penguin updates without causing a lot of collateral damage. Google has many algos running, and I believe it was very hard to gain accurate results when unleashing Panda and Penguin on the web. So here we are. No dates, no information, no evidence, no screaming, and no mainstream media articles about Google algo updates.

With that out of the way, let's dig deeper into what I've seen across my Panda data.

Panda 4.2 Illusions
During my Panda 4.2 travels, I came across a number of examples of websites that looked like they were heavily impacted by Panda 4.2, but actually weren’t. There were other factors at play that caused the drop or gain in traffic after 7/18/15, but just happened to coincide with the rollout of Panda 4.2.

For example, check out the trending below. Wow, that looks like a severe Panda 4.2 hit. But once I dug in, there were technical SEO issues with that site that were causing traffic to go to a sister website. Basically, as one site's traffic decreased, Google traffic to the sister site increased.

Panda 4.2 Illusion Due To SEO Technical Problems

And here's an example of another Panda 4.2 illusion. The site began losing significant traffic the week before Panda 4.2 rolled out. Maybe it was impact from Panda 4.2 being tested in the wild or from the Phantom tremor I saw in mid-July? No, it turns out the website was in the process of migrating to another domain. And the new domain picked up that traffic.

Panda 4.2 Illusion Due To Migration

The key point here is that context matters. Don’t assume you’ve been impacted by Panda 4.2. It very well could be other factors at play. For example, technical SEO problems can cause big swings in organic search traffic, seasonality can come into play, and other factors could cause traffic changes. And this gets exponentially worse as time goes on… That’s because we don’t have a hard Panda date to base drops or gains of traffic on. Remember, there is an “extended rollout” of Panda 4.2.

Panda 4.2 Fresh Hits
So, there has not been widespread impact, but that doesn’t mean there hasn’t been some impact. And from what I can see, some websites seemed to have been heavily impacted right after Panda 4.2 rolled out. For example, sites seeing a significant drop in Google organic traffic (and a major drop in rankings) immediately following the initial rollout.

Now, it’s important to note that these types of hits were uncommon during Panda 4.2. I did not see many of them (at all). But there were some.

For example, check out the trending for this large-scale website. I have provided Searchmetrics data for the domain. Notice the initial drop when Phantom rolled out and then a bigger hit when Panda 4.2 rolled out:

Panda 4.2 Big Hit

This hit did not shock me at all. I’m confident that many people have cursed at their screens after arriving on the site. It’s filled with aggressive ads, uses pagination for monetization purposes, has technical SEO issues causing content quality problems, and simply provides a horrible user experience. Well, the site took a big hit after Panda 4.2 rolled out.

And then there were sites impacted more gradually. For example, here’s a smaller site that didn’t see the full impact immediately, but has experienced a gradual decline in rankings and Google organic traffic since Panda 4.2 rolled out on 7/18.

Smaller Site Hit By Panda 4.2 Over Time

Those are just two examples, and there are more like them (with more moderate drops than big hits). But the extended rollout is making it very hard to pin a drop in traffic on Panda 4.2. Again, I believe Google would like that to be the case. And remember, Google rolled out over 1,000 changes last year, which works out to roughly 20 per week. That means dozens of changes unrelated to Panda 4.2 could easily have rolled out since 7/18. Again, it is what it is. :)

Panda 4.2 Recovery (or at least improvement from P4.2)
I know what many of you are thinking after reading that subheading… Panda 4.2 recovery is an oxymoron! And overall, I would agree with you. As I explained earlier (and it’s important to highlight this), many sites working hard to recover from past Panda hits have not seen any impact yet from P4.2. And some of the largest Panda-susceptible sites have also not seen movement. But there has been some positive impact already (to varying degrees).

Below is an example of a surge in Google organic traffic for a website that was hit hard in September of 2014 (when we had both the 9/5 Panda update and Panda 4.1). The website owner reached out to me this winter for help, but unfortunately I couldn’t take on the project due to a lack of availability. That said, we stayed in touch throughout the year. I received an email the Saturday after Panda 4.2 rolled out (7/25/15) explaining that the website was seeing a surge in Google organic traffic. So I took a look at Google Analytics and Google Search Console to see what was going on.

Here is the sustained surge since 7/25/15. The site is up 495%. Note: the session numbers have been removed from the chart at the request of the business owner.

Surge After Panda 4.2

Now, since I didn’t work on the site, perform a deep audit of the site, guide the changes, etc., it’s hard for me to say this was 100% due to Panda 4.2. That said, I have reviewed sections of the original site and the list of changes that were implemented. The items addressed seemed extremely Panda-like, including tackling content quality challenges, refining the affiliate marketing setup, and making some advertising fixes. Then after Panda 4.2 rolled out, the site surged.

Moving on, other websites have seen partial recovery since P4.2 rolled out. For example, here is a site that had been hit hard by Panda in the past, but shot up after July 18:

Google Organic Increase After Panda 4.2

So I definitely agree that recovery from Panda 4.2 is rare, but there are sites that were positively impacted. Why more sites weren’t impacted is hard to say… Again, I reached out to Google for more information now that I’ve analyzed seven weeks of data, but I haven’t received specific answers to my questions. I’ll update this post if I learn more.

Reversals, Tremors, or Other Disturbances in the Force
I know many in the industry have kept track of one of the more public recovery stories from Panda 4.2. I’m referring to Search Engine Roundtable. Barry Schwartz saw an immediate jump in Google organic traffic once Panda 4.2 rolled out, and the site’s traffic steadily increased over time. It wasn’t a massive surge, but the site was up ~35% since 7/18.

Although some believed the surge came from people searching for Panda 4.2-related articles, it was actually driven by a number of queries across topics. And even when Barry stripped the Panda-related articles out of his analysis, there was still an increase in Google organic traffic.
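If you want to run a similar check on your own data, here’s a minimal sketch that strips update-related queries from a Search Console query export. The CSV layout and the keyword list are assumptions, not a standard format:

```python
# Minimal sketch: measure how many clicks remain after excluding queries
# related to the update itself. Assumes a CSV with "query" and "clicks"
# columns (hypothetical names).
import pandas as pd

queries = pd.read_csv("search_console_queries.csv")

# Hypothetical filter list; expand based on the content you publish.
panda_terms = ["panda", "4.2", "algorithm update"]
is_update_query = queries["query"].str.lower().str.contains("|".join(panda_terms))

total = queries["clicks"].sum()
remaining = queries.loc[~is_update_query, "clicks"].sum()
print(f"{remaining} of {total} clicks remain after stripping update-related queries")
```

If the lift survives the filter, as it did in Barry’s analysis, the surge isn’t just news-driven search interest.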

Well, the increase was reversed in mid-August. So, was this some type of Panda 4.2 tremor or something else? We know that Google can and will tweak major algorithm updates to ensure they are seeing the right results in the SERPs, so it’s entirely possible. And by the way, there were many other people claiming to see similar changes in mid-August.

Panda 4.2 Reversal at Search Engine Roundtable
Image from SER

Personally, I saw websites experience similar changes. One site reversed a downward slide and shot up on 8/13, while another site’s change came more recently, on 8/24 (the site began to drop after increasing). There are many other reports of changes starting in mid-August, so this isn’t just a few sites.

That said, and I’m sorry to bring this up again, but since we are so far out from the release of Panda 4.2, it’s hard to say the impact is from P4.2. It could be, but it could also be other changes and tweaks. Remember, we have Phantom (which focuses on “quality”), plus Google rolled out over 1,000 changes last year.

Saying Goodbye to Panda, Penguin, and Other Algos Causing Mass Hysteria
In 2013, I wrote a post about Google starting to roll out Panda over 10 days. In that post, I explained that when this happens, or when Panda goes real-time, there will be massive webmaster confusion. If sites experience serious gains or losses at random times throughout the year, without being able to tie that movement to a specific algorithm update, how in the world are those webmasters supposed to know what caused the drop? Quick answer: they won’t.

Prior to Panda 4.2, webmasters already had trouble determining what caused a Panda hit. And now that impact can happen any time during the extended rollout of Panda 4.2, that confusion will get exponentially worse. We also know Google fully intends to incorporate Panda (and Penguin) into its core ranking algorithm. When that happens, there won’t be an extended rollout… it will be running all the time. You could technically see gains or decreases at any time. Good luck trying to figure that one out.

Based on what I explained above, we may never see another Panda or Penguin update again. Read that line again. It’s entirely possible that Panda 4.2 will be the last Panda update, and that Penguin has become so hard to manage that maybe just parts of it get baked into Google’s core ranking algo. And if that happens, who knows what will happen to Panda and Penguin victims. One thing is for sure: Panda 4.2 and Penguin 3.0 were not effective. Actually, I’d go so far as to say they were a mess. It’s hard to look at it any other way.

So yes, I called this in 2013, and here we are. Fun, isn’t it? :) As more days pass from the initial Panda 4.2 rollout date (7/18/15), it will become harder and harder to determine whether Panda is impacting your site, whether Phantom found new ectoplasm, or whether other algorithms are at play. That’s why I’m recommending the nuclear option more and more lately: identify all quality problems riddling a site, and fix them all, both on-site and off-site. One small slice of that audit is sketched below.
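To make that one slice concrete, here’s a minimal sketch that flags potentially thin or noindexed pages from a URL list. The URLs and the 300-word threshold are placeholders, and a real audit goes far beyond this (aggressive ads, UX, duplicate content, and off-site factors):

```python
# Minimal sketch: flag pages with very little visible text or a meta
# robots noindex tag. Placeholder URLs and threshold; adapt as needed.
import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/page-1",
    "https://www.example.com/page-2",
]

for url in urls:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")

    # Rough visible word count as a thin-content signal.
    word_count = len(soup.get_text(separator=" ").split())

    # Check for a meta robots noindex, a common technical culprit.
    robots = soup.find("meta", attrs={"name": "robots"})
    noindex = robots is not None and "noindex" in robots.get("content", "").lower()

    if word_count < 300 or noindex:
        print(f"Review {url}: {word_count} words, noindex={noindex}")
```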

Summary – What’s Next For Panda?
Well, that’s what I’ve seen so far. Panda 4.2 started rolling out on 7/18, but it can take months to fully roll out. Phantom is running all the time. Penguin hasn’t rolled out in a while, and who knows when (or if) it will roll out again. And Google pushes 500-600 updates each year (with over one thousand last year). There’s a lot going on… that’s for sure.

Regardless, I will continue to analyze the impact from Panda 4.2 (and other miscellaneous disturbances in the force), and I plan to write more about my findings as time goes on. In the meantime, keep improving content quality, fix all quality problems riddling your website (on-site and off), drive stronger user engagement, and fix any technical SEO problems that could be causing issues. That’s all you can do right now from a Panda standpoint. Then just hope that Google fixes Panda or bakes it into its core ranking algorithm.

But of course, if that happens, you will never know what hit or helped you. And I think that’s just fine from Google’s standpoint. Good luck.