Phantom Tremors Continue As SEOs Wait For Panda – Latest Tremor on 7/14/15

Phantom 2 Tremors

As many of you know, Phantom 2 began rolling out on April 29, 2015, just days after the mobile-friendly update was released. Phantom 2 was a big update that Google initially denied. During my analysis of the update, I saw a lot of movement across sites dealing with content quality problems. It was clear from the beginning that the algorithm update was focused on content quality (like Panda). That’s one of the reasons many people (including myself) initially believed it was a Panda update.

Once Google confirmed the update, it explained that Phantom was a change to its core ranking algorithm, specifically in how it assesses “quality”. Even though Google explained that Phantom was part of its core ranking algorithm, I had a feeling that websites would not be able to recover quickly. The reason? I had seen this before, specifically with the original Phantom update in May of 2013. Phantom 2 was eerily similar to Phantom 1 (and rolled out almost two years to the day after Phantom 1 did). Interesting timing to say the least.

Both Phantom 1 and 2 Were Panda-Like
With Phantom 1, I also saw the update target content quality problems. The companies I helped with fresh hits had a dangerous recipe of user engagement issues, content quality problems, and technical flaws that all led to significant drops in organic search traffic. It took nearly three months before I saw the first Phantom 1 recovery, with others following as time went on.

It took a lot of work to see recovery from Phantom 1… it was not a trivial undertaking. And I expect Phantom 2 remediation and recovery to follow suit.

We are already over two months out from the Phantom 2 update (late April/early May), and I haven’t seen any major recoveries yet. That being said, I’ve seen a lot of movement on specific dates following Phantom. That includes both additional drops in traffic and sudden jumps for Phantom 2 victims. And that’s the topic of this post – Phantom tremors.

Multiple Phantom Tremors Since 4/29
After Panda 4.0 in May of 2014, I saw what looked like near-weekly updates impacting Panda victims. I called these updates Panda tremors, and they were confirmed by Google’s John Mueller.

Panda 4.0 Tremors and John Mueller

Basically, after Google pushes a major update, it can refine and tweak the algorithm and then push smaller updates (to ensure everything is working the way it wants). I saw many of those tremors after Panda 4.0. Well, Phantom 2 has been similar. I started seeing Phantom tremors soon after the initial rollout and I have seen several more over time.

And most recently, I saw the latest Phantom tremor starting on 7/14/15. I’ve seen the impact with companies I am helping now, but I’ve also had new companies reach out to me explaining what they are seeing. And when tremors roll out, Phantom victims can see more impact (mostly negative based on my analysis, but there has been some positive movement).

Negative movement makes a lot of sense if the site in question hasn’t made any efforts to improve from a quality standpoint. For example, here is the hourly trending for a site hit by a Phantom tremor on 7/14. Notice the change in trending starting around 11AM ET on 7/14:

Traffic Drop From Phantom Tremor on July 14, 2015

And here is a screenshot of a site that saw positive movement starting around that time:

Traffic Gain From Phantom Tremor on July 14, 2015

Here is another screenshot of negative impact, this time showing the drop in clicks when comparing traffic after the tremor to the timeframe prior (based on the 6/8 tremor):

Traffic Drop Based on 6/8 Phantom Tremor

(Update: Here are some more screenshots of impact from the 7/14 Phantom tremor):

This example shows impact during the late June tremor and then more impact on 7/14:

Impact from two Phantom tremors

And here is an example of a quick jump at the end of June with that gain being rolled back during the 7/14 update:

Temporary Recovery During Phantom Tremor

Documenting Important Phantom Dates:
In order for Phantom victims to track their own trending based on Phantom tremors, I have included important dates below. It’s always important to understand why drops or spikes occur, so I hope this list of dates provides some clarity:

Original Phantom 2 Rollout: 4/29/15
Phantom Tremor: 5/27/15
Phantom Tremor: 6/8/15
Phantom Tremor: 6/17/15
Phantom Tremor: 6/28/15
Phantom Tremor: 7/14/15

What This Means for Panda Victims
Remember, many Phantom victims have been impacted previously by Panda. For example, some sites that were still working to recover from Panda got hit by Phantom, while others that already recovered from Panda got hit too. Yes, some of those sites saw drops after fully recovering from Panda.

For those sites still impacted by previous Panda updates that got hit by Phantom 2, it’s clearly not a great sign. We know that Phantom focuses on “quality” and so does Panda. I can’t imagine Phantom reacting negatively to your site, while Panda reacts positively. If that’s the case, then Google has an algo problem.

Now, if a site recovered from Panda and then saw negative impact from Phantom, that does not bode well for the next Panda update. Phantom is clearly picking up quality problems, which could also contribute to a future Panda hit. We are eagerly awaiting the next Panda refresh, so recent Phantom hits on sites that had recovered from Panda should be concerning.

Panda Recovery Followed By Phantom 2 Hit

What Are Phantom Tremors?
Great question, and it’s hard to say exactly. Google could be refining the algorithm and then rolling it out again. Or they could be adding more factors to Phantom and rolling it out. Or they could be adjusting the threshold for each factor. All three are possibilities. That’s why it’s so important to get out of the gray area of Phantom.

I wrote a post recently about the SEO nuclear option and how it relates to both Panda and Phantom hits. Let’s face it, with multiple major quality algorithms running like Phantom and Panda, it’s ultra-important for websites to identify and fix all quality problems riddling their sites. If not, then they can see more impact. And in a worst-case scenario, they could get hit by both Phantom and Panda, which is what I call Phantanda. And although Phantanda sounds like a soda or rock band, it’s not cool or fun. You don’t want to experience it.

Is Phantom The Real-time Panda?
In my Phantanda post, I brought up the possibility that Phantom was actually the migration of Panda (or Panda factors) to Google’s real-time algorithm. It’s entirely possible this is the case. I’ve analyzed many sites hit by Phantom 2, and every quality problem I surfaced would have been picked up by a thorough Panda audit. The factors are extremely similar… almost identical, actually.

By the way, after Phantom 2 rolled out on 4/29, Josh Bachynski tweeted a similar thought. I said at the time that he was on to something… and I still believe that.

Josh Bachynski Tweet After Phantom 2

And with Google having all sorts of problems with Panda, this makes even more sense. For all we know, Panda might have problems with Phantom (as Phantom is part of Google’s core ranking algorithm and also focuses on quality). If that’s the case, then the “technical problems” Google has explained could be Panda and Phantom at odds with one another.

That’s total speculation, but there are now two quality cowboys in town. And to me, this SEO town might be too small for the both of them. It wouldn’t surprise me to find out that Panda is slowly being incorporated into Google’s core ranking algorithm. Remember, it’s been almost nine months since the last Panda update (10/24/14), while we’ve had multiple Phantom updates (the initial rollout and then several tremors). Hopefully we’ll find out soon what’s going on with Panda, and its relationship with Phantom.

Moving Forward: Keep Implementing (The Right) Changes
To quickly recap this post, if you were impacted by Phantom 2 in late April or early May, then you very well could have seen further impact during one or more Phantom tremors. I would check the dates I listed above and see if your site saw any drops or spikes during that timeframe.

And more importantly, continue to make the right changes to your website. Audit your site through a quality lens, identify all problems riddling your site, and move quickly to rectify those problems. That’s how you can rid your site of both bamboo and ectoplasm. Good luck.



Phantanda – Why The SEO Nuclear Option Is Important For Sites Hit By Phantom 2 and Panda

The SEO Nuclear Option for Phantanda Victims

Panda can be devastating. We all know that’s the case and it’s been documented to the nth degree since February of 2011. And now we have Phantom (AKA “The Quality Update”), which was a change to Google’s core ranking algorithm regarding how it assesses “quality”. Between the two, you can clearly see that Google is heavily focused on increasing the quality of the search results.

For webmasters, it’s bad enough when you see a big drop in rankings and traffic from Panda alone. Some sites can drop by 60%+ when Panda rolls through. But for many sites riddled with bamboo, little did they know that Panda has a close friend named Phantom who has no problem kicking a site while it’s down.

Since the end of April, I saw a number of sites that were already impacted by Panda see more negative impact from Phantom. And then there were some sites that recovered from Panda previously, only to get hit to some degree by Phantom. And as you can guess from the title of this post, I call these double-hits Phantanda.

In this post, I’ll explain more about Phantanda and how it relates to Panda, introduce the SEO nuclear option, explain why it’s important, and end by providing some recommendations for those that want to go nuclear.

Opportunities for Recovery and Frequency of Quality Updates
As mentioned above, Google now has a one-two quality punch which I’m calling Phantanda. It’s not a soft drink or a rock band, but instead, a devastating mix of algorithms that can wreak havoc on a website’s organic search traffic.

If you’ve been hit by Phantanda, then it’s incredibly important to heavily audit your site through a quality lens. That audit should produce a thorough remediation plan for tackling any problems that were surfaced during the audit. Then you need to move quickly to execute those changes (flawlessly). And then you need Google to recrawl all of those changes and remeasure engagement. Yes, this is not a trivial process by any means…

An example of a Phantom hit on a site that has struggled with Panda:
Hit by The Quality Update (AKA Phantom 2)

In the past, Google used to roll out Panda monthly. That was great, since there were many opportunities for sites to recover as they made changes, removed bamboo, improved the user experience on their websites, and published higher quality content. But as many of you know, Panda hasn’t rolled out since 10/24/14. That’s a horrible situation for many battling the mighty Panda.

Sure, Gary Illyes said the next Panda update is coming soon (within weeks), but it has still been way too long between Panda updates. And that’s especially the case when we saw weekly Panda tremors after Panda 4.0.

The fact is we need more Panda updates, not fewer (as crazy as that sounds). Hopefully the next Panda update is right around the corner. We’ll see.

John Mueller clarifies Panda tremors after Panda 4.0:
John Mueller Clarifies Panda Tremors

Ectoplasm vs. Bamboo
From a Phantom standpoint, Google implemented changes to its core ranking algorithm regarding how it assesses “quality”. Phantom 2 was eerily similar to the first Phantom update from May of 2013, and I’ve done a boatload of research and work with both updates.

The good news is that websites were able to recover from Phantom 1. The bad news is that it took months of remediation work (similar to Panda remediation). I believe the first recovery I saw took approximately three months, while others took longer.

An example of recovery from Phantom 1 in 2013:
Example of Phantom 1 Recovery from 2013

Based on my analysis of Phantom 2 (and my work helping companies that have been impacted), I believe the remediation and recovery process is similar to Phantom 1 (longer-term). And that makes sense. The Quality Update (Phantom 2) rolled out on 4/29, so websites need enough time to audit, strategize, execute, and wait for Google to process those changes. Remember, Google needs to crawl and measure the changes. It’s not like going mobile-friendly, which is a binary test for now (yes or no). In my opinion, it’s much more complicated than that.

Phantom 2 victim still negatively impacted:
Sustained Negative Impact from Phantom 2

In my opinion, Phantom remediation is very Panda-like. Take a long-term approach, truly understand the various quality problems riddling a website, and then take aggressive action to rectify those problems. And that leads me to the core point of this post – the nuclear option.

The SEO Nuclear Option – It’s No Walk in the Park, But It’s Worth It
I started referring to the nuclear option in June of 2013 when Google first started talking about Panda going real-time. I explained that if that was the case, then webmasters would have no idea what hit them. And that would make it harder to understand what caused the negative impact and how to fix the problem(s).

And as other algorithms crossed Panda in the wild, I brought up the nuclear option again. For example, when Google rolled out Panda during an extended Penguin rollout. Yes, they did that… To pull a quote from Jurassic Park, “Clever girl…” :)

Panda During Penguin - Clever Girl

When this happened, most people thought they were hit by Penguin, when in fact, they were hit by Panda. And that’s a horrible situation with the potential of disastrous results. Imagine nuking many of your links thinking you were hit by Penguin, when you really needed to improve content quality. I had many confused webmasters reach out to me after the 10/24 Panda update.

And now we have Phantom throwing its hat in the ring. As mentioned earlier, many of the problems I surfaced with Phantom victims were extremely similar to Panda problems. For example, I would have surfaced the same problems when completing a thorough audit (whether a site was impacted by Panda or Phantom). Heck, for all we know Phantom is actually the beginning of Panda being incorporated into Google’s core ranking algorithm. It’s entirely possible.

And of course we still have Penguin, with a release schedule less frequent than Halley’s Comet passing Earth. So based on what I just explained, what can a webmaster do when all of these algorithms are running around the web? Enter the nuclear option.

What Is The SEO Nuclear Option?
Simply put, the nuclear option involves identifying all SEO problems for a particular website and forming a plan for executing all of the changes over a period of time. That includes both on-site problems (like content quality, user engagement, advertising issues, mobile-friendliness, etc.) and off-site problems (like unnatural links, syndication issues, etc.).

SEO Thermonuclear War

Yes, it’s a lot of work, but again, it’s completely worth it in the long-term (in my opinion). The companies I’ve helped that decided to go down the nuclear path simply couldn’t take it anymore… They were tired of chasing algorithms, tinkering with urls, tweaking this, and then that, only to find themselves hit again by another algorithm update. The gray area of Panda or Phantom is enough to drive a webmaster insane.

Recommendations for Going Nuclear:
As you can imagine, pushing the giant red button is a big undertaking that should not be taken lightly. So based on my experience helping companies with the nuclear option, I’ve provided some recommendations below. My hope is that if you choose to go nuclear, you can follow these tips to ensure you maximize your efforts (and avoid launching duds).

  • Long-Term Process – Understand that you are making changes to achieve long-term success. You are not looking for short-term wins. You want to avoid quality problems (and negative impact from quality algorithms) over the long-term. Understand that you might actually drop in the short-term until Google crawls, processes, and measures the changes.
  • Extensive and Thorough Audit – Tackle on-site quality problems head-on. Have a thorough SEO audit completed through a quality lens. That should help identify problems that could be contributing to both Panda and Phantom. The audit should produce a clear remediation plan organized by priority.
  • Nuke Those Links – Deal quickly and decisively with unnatural links. Have a thorough link audit completed. Identify all unnatural links and deal with them as quickly as possible. That includes removing, nofollowing, or disavowing unnatural links.
  • Go Mobile-friendly – This is NOT just for Google. This is for users as well. Depending on your situation, you may choose to go responsive or you might choose separate mobile urls. Regardless, create a solid mobile user experience. It should pay off on several levels, including more organic search traffic, stronger engagement, more social sharing, and increased conversion.
  • Redesigns Can Be Good (just be careful) – Don’t be afraid to redesign what needs to be improved. Website designs from 1997 will not suffice anymore. Cluttered and disorganized user experiences can kill you from a Panda and Phantom standpoint. Don’t be afraid to redesign your site, or various elements of the site. Just make sure you take the necessary steps to maintain search equity during the launch.
  • Ask Humans (Yes, real people.) – I’ve helped a number of companies deal with Panda and Phantom hits that had serious engagement problems. For example: aggressive advertising, deceptive affiliate issues, horrible user experiences, and other nasty problems. Those problems were abundantly clear to me, but not to the companies I was helping. Panda is about user happiness (and so is Phantom to an extent). Have unbiased real people go through your site. Have them provide feedback based on going through a task, or several tasks. Take their feedback to heart and then make changes.
  • Avoid SEO Band-aids – Make the necessary changes no matter how big they are. If you want the greatest chance of recovery for the long-term, then you must be decisive and then execute. Band-aids lead to future algorithm hits. Big changes lead to success.

Summary – It’s OK To Push The Giant Red Button
With multiple quality algorithms crossing streams, it has never been more important to consider the SEO nuclear option. Pushing the giant red button can be a tough decision, but it can also rid your site of nasty quality problems that attract Pandas and Phantoms. And if there’s something worse than having bamboo on your site, it’s adding ectoplasm to the mix. Sure, choosing the SEO nuclear option is a bold move, but that might be exactly what you need to do.



How To Identify and Avoid Technical SEO Optical Illusions

Technical SEO Optical Illusions

Without a clean and crawlable website structure, you’re dead in the water SEO-wise. For example, if you don’t have a solid SEO foundation, you can end up creating serious obstacles for both users and search engines. And that’s never a good idea. And even if you have a clean and crawlable structure, problems with various SEO directives can throw a wrench into the situation. And those problems can lie beneath the surface just waiting to kill your SEO efforts. That’s one of the reasons I’ve always believed that a thorough technical audit is one of the most powerful deliverables in all of SEO.

The Power of Technical SEO Audits: Crawls + Manual Audits = Win
“What lies beneath” can be scary. Really scary… The reality for SEO is that what looks fine on the surface may have serious flaws. And finding those hidden problems and rectifying them as quickly as possible can help turn a site around SEO-wise.

When performing an SEO audit, it’s incredibly important to manually dig through a site to see what’s going on. That’s a given. But it’s also important to crawl the site to pick up potential land mines. In my opinion, the combination of both a manual audit and extensive crawl analysis can help you uncover problems that may be inhibiting the performance of the site SEO-wise. And both might help you surface dangerous optical illusions, which is the core topic of my post today.

Uncovering Optical SEO Illusions
Optical illusions can be fun to check out, but they aren’t so amusing when they negatively impact your business. When your eyes play tricks on you and your website takes a Google hit due to that illusion, that’s a serious problem.

The word “technical” in technical SEO is important to highlight. If your code is even one character off, it could have a big impact on your site SEO-wise. For example, if you implement the meta robots tag on a site with 500,000 pages, then the wrong directives could wreak havoc on your site. Or maybe you are providing pages in multiple languages using hreflang, and those alternate versions add 30,000 urls to your site. You would definitely want to make sure those hreflang tags are set up correctly.

But what if you thought those directives and tags were set up perfectly when, in fact, they weren’t? They look right at first glance, but there’s something just not right…

That’s the focus of this post today, and it can happen easier than you think. I’ll walk through several examples of SEO optical illusions, and then explain how to avoid or pick up those illusions.

Abracadabra, let’s begin. :)

Three Examples of Technical SEO Optical Illusions
First, take a quick look at this code:

Technical SEO Problem with hreflang

Did you catch the problem? The code uses “alternative” versus “alternate”. And that was on a site with 2.3M pages indexed, many of which had hreflang implemented pointing to various language pages.

Hreflang Using Alternative vs. Alternate
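Based on the error described above, the flawed markup likely looked something like the snippet below (a reconstruction for illustration; example.com and the Spanish-language url are placeholders, not the actual site):

```html
<!-- Broken: "alternative" is not a valid rel value, so search engines ignore the annotation -->
<link rel="alternative" hreflang="es" href="https://example.com/es/page/" />

<!-- Correct: hreflang annotations require rel="alternate" -->
<link rel="alternate" hreflang="es" href="https://example.com/es/page/" />
```

A tiny difference in one attribute value, but it means the hreflang annotations on every affected page are effectively invisible to Google.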

Now take a look at this code:

SEO Technical Problem with rel canonical

All looks ok, right? At first glance you might miss it, but the code uses “content” versus “href”. If rolled out to a website, it means rel canonical won’t be set up correctly for any pages using the flawed directive. And on sites where rel canonical is extremely important, like sites with urls resolving multiple ways, this can be a huge problem.

Technical SEO problem with rel canonical
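Here is a reconstruction of that type of mistake (illustrative only; example.com is a placeholder):

```html
<!-- Broken: "content" is not a valid attribute for a link tag, so rel canonical is ignored -->
<link rel="canonical" content="https://example.com/products/widget/" />

<!-- Correct: the canonical url belongs in the href attribute -->
<link rel="canonical" href="https://example.com/products/widget/" />
```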

Now how about this one?
Technical SEO problem with meta robots

OK, so you are probably getting better at this already. The correct value should be “noindex” and not “no index”. So if you thought you were keeping those 75,000 pages out of Google’s index, you were wrong. Not a good thing to happen while Pandas and Phantoms roam the web.

Meta Robots problem using no index vs. noindex
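For reference, here is what that mistake typically looks like in the code (a reconstructed example, not the actual site's markup):

```html
<!-- Broken: "no index" with a space is not a recognized directive, so the page remains indexable -->
<meta name="robots" content="no index, follow" />

<!-- Correct: the directive is a single token, "noindex" -->
<meta name="robots" content="noindex, follow" />
```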

I think you get the point.

How To Avoid Falling Victim To Optical Illusions?
As mentioned earlier, using an approach that leverages manual audits, site-wide crawls, and then surgical crawls (when needed) can help you nip problems in the bud. And leveraging reporting in Google Search Console (formerly Google Webmaster Tools) is obviously a smart way to proceed as well. Below, I’ll cover several things you can do to identify SEO optical illusions while auditing a site.

SEO Plugins
From a manual audit standpoint, using plugins like Mozbar, SEO Site Tools, and others can help you quickly identify key elements on the page. For example, you can easily check rel canonical and the meta robots tag via these plugins.

Using Mozbar to identify technical seo problems.

From a crawl perspective, you can use DeepCrawl for larger crawls and Screaming Frog for small to medium size crawls. I often use both DeepCrawl and Screaming Frog on the same site (using “The Frog” for surgical crawls once I identify issues through manual audits or the site-wide crawl).

Each tool provides data about key technical SEO components like rel canonical, meta robots, rel next/prev, and hreflang. Note, DeepCrawl has built-in support for checking hreflang, while Screaming Frog requires a custom search.

Using DeepCrawl to identify SEO technical problems.

Once the crawl is completed, you can double-check the technical implementation of each directive by comparing what you are seeing during the manual audit to the crawl data you have collected. It’s a great way to ensure each element is ok and won’t cause serious problems SEO-wise. And that’s especially the case on larger-scale websites that may have thousands, hundreds of thousands, or millions of pages on the site.

Google Search Console Reports
I mentioned earlier that Google Search Console reports can help identify and avoid optical illusions. Below, I’ll touch on several reports that are important from a technical SEO standpoint.

Index Status
Using the Index Status report, you can identify how many pages Google has indexed for the site at hand. And by the way, this can be checked at the directory level (which is a smart way to go). Index Status reporting will not identify specific directives or technical problems, but it can help you understand if Google is over- or under-indexing your site content.

For example, if you have 100,000 pages on your site, but Google has indexed just 35,000, then you probably have an issue…

Using Index Status in Google Search Console to identify indexation problems.

International Targeting
Using the international targeting reporting, you can troubleshoot hreflang implementations. The reporting will identify hreflang errors on specific pages of your site. Hreflang is a confusing topic for many webmasters and the reporting in GSC can get you moving in the right direction troubleshooting-wise.

Using International Targeting reporting in GSC to troubleshoot hreflang.

Fetch as Google

Using Fetch as Google, you can see exactly what Googlebot is crawling and the response code it is receiving. This includes viewing the meta robots tag, rel canonical tags, rel next/prev, and hreflang tags. You can also use fetch and render to see how Googlebot is rendering the page (and compare that to what users are seeing).

Using Fetch as Google to troubleshoot technical SEO problems.

Robots.txt and Blocked Resources
Using the new robots.txt Tester in Google Search Console enables you to test the current set of robots.txt directives against your actual urls (to see what’s blocked and what’s allowed). You can also use the Tester as a sandbox to change directives and test urls. It’s a great way to identify current problems with your robots.txt file and see if future changes will cause issues.

Using robots.txt Tester to troubleshoot technical SEO problems.
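As a simple illustration of the type of check the Tester supports (the directives and urls below are hypothetical):

```
User-agent: *
Disallow: /search/
Allow: /search/help/
```

Testing https://example.com/search/results?q=seo against these directives should show it blocked by the Disallow: /search/ rule, while https://example.com/search/help/ should show as allowed, since Google honors the most specific (longest) matching rule.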

Summary – Don’t Let Optical Illusions Trick You, and Google…
If there’s one thing you take away from this post, it’s that technical SEO problems can be easy to miss. Your eyes can absolutely play tricks on you when directives are even just a few characters off in your code. And those flawed directives can cause serious problems SEO-wise if not caught and refined.

The good news is that you can start checking your own site today. Using the techniques and reports I listed above, you can dig through your own site to ensure all is coded properly. So keep your eyes peeled, and catch those illusions before they cause any damage. Good luck.



More Findings From Google’s Phantom 2 Update (“The Quality Update”) – Panda Overlap, Long-Term Approach, URL Tinkering, and More


Google Phantom 2 Update

My Google Richter Scale was insanely active the week of April 27, 2015. That’s when Google rolled out a change to its core ranking algorithm to better assess quality signals. I called the update Phantom 2, based on the mysterious nature of the update (and since Google would not confirm it at the time). It reminded me a lot of the original Phantom update, which rolled out on May 8, 2013. Both were focused on content quality and both had a big impact on websites across the web.

I heavily analyzed Phantom 2 soon after the update rolled out, and during my analysis, I saw many sites swing 10-20% either up or down, while also seeing some huge hits and major surges. One site I analyzed lost ~80% of its Google organic traffic overnight, while another surged by 375%. And no, I haven’t seen any crazy Phantom 2 recoveries yet (which I never thought would happen so quickly). More about that soon.

Phantom 2 Drop Sustained

Also, based on my first post, I was interviewed by CNBC about the update. Between my original post and the CNBC article, the response across the web was amazing. The CNBC article has now been shared over 4,500 times, which underscores the level of impact Phantom had across the web. Just like Phantom 1 in 2013, Phantom 2 was significant.

Phantom and Panda Overlap
While analyzing the impact of Phantom 2, it wasn’t long before I could clearly see that the update heavily targeted content quality. I was coming across serious low quality content, user engagement problems, advertising issues, etc. Many (including myself) initially thought it was a Panda update based on seeing the heavy targeting of content quality problems.

And that made sense timing-wise, since the last Panda update was over seven months ago (10/24/14). Many have been eagerly awaiting an update or refresh since they have been working heavily to fix any Panda-related problems. It’s not cool that we’ve gone over seven months when Panda used to roll out frequently (usually about once per month).

But we have good news out of SMX Advanced. Gary Illyes explained during his AMA with Danny Sullivan that the next Panda refresh would be within the next 2-4 weeks. That’s excellent news for many webmasters that have been working hard on Panda remediation. So definitely follow me on Twitter since I’ll be covering the next Panda refresh/update in great detail. Now back to Phantom.

After Phantom rolled out, Google denied Panda (and Penguin) and simply said this was a “normal update”. And then a few weeks later, they explained a little more about our ghostly friend. Gary Illyes said it was a change to Google’s core ranking algorithm with how it assesses “quality signals”. Ah, that made a lot of sense based on what I was seeing…

So, it wasn’t Panda, but it focuses on quality. And as I’ve said many times before, “content quality” can mean several things. You have thin content, low quality affiliate content, poor user experience, advertising obstacles, scraped content, and more. Almost every problem I came across during my Phantom analysis would have been something I would have targeted from a Panda standpoint. So both Phantom and Panda seem to chew on similar problems. More about this soon.

The Ultimate Panda and Phantom Cocktail

Phantom Recovery and A Long-Term Approach
So, Panda targets low quality content and can drag an entire site down in the search results (it’s a domain-level demotion). And now we have Phantom, which changed how Google’s core ranking algorithm assesses “quality signals”. The change can boost urls with higher quality content (which of course means urls with lower quality content can drop).

It’s not necessarily a filter like Panda, but can act the same way for low quality content on your site. Google says it’s a page-level algorithm, so it presumably won’t drag an entire domain down. The jury is still out on that… In other words, if you have a lot of low quality content, it can sure feel like a domain-level demotion. For example, I’ve seen some websites get obliterated by the Phantom update, losing 70%+ of their traffic starting the week of April 27th. Go tell the owner of that website that Phantom isn’t domain-level. :)

Huge Phantom 2 Hit

URL Tinkering – Bad Idea
So, since Phantom is page-level, many started thinking they could tinker with a page and recover the next time Google crawled the url. I never thought that could happen, and I still don’t. I believe it’s way more complex than that… In addition, that approach could lead to a lot of spinning wheels SEO-wise. Imagine webmasters tinkering endlessly with a url trying to bounce back in the search results. That approach can drive most webmasters to the brink of insanity.

I believe it’s a complex algorithm that also takes other factors into account (like user engagement and some domain-level aspects). For example, there’s a reason that some sites can post new content, have that content crawled and indexed in minutes, and even start ranking for competitive keywords quickly. That’s because they have built up trust in Google’s eyes. And that’s not page-level… it’s domain-level. And from an engagement standpoint, Google cannot remeasure engagement quickly. It needs time, users hitting the page, dwell time data, etc., before it can make a decision about recovery.

It’s not binary like the mobile-friendly algorithm. Phantom is more complex than that (in my opinion).

John Mueller’s Phantom Advice – Take a Long-Term Approach To Increasing Quality
Back to “url tinkering” for a minute. Back when Phantom 1 hit in May of 2013, I helped a number of companies deal with the aftermath. Some had lost 60%+ of their Google organic traffic overnight. My approach was very similar to Panda remediation. I heavily analyzed the site from a content quality standpoint and then produced a serious remediation plan for tackling those problems.

I didn’t take a short-term approach, I didn’t believe it was page-level, and I made sure any changes clients would implement would be the best changes for the long-term success of the site. And that worked. A number of those companies recovered from Phantom within a six month period, with some recovering within four months. I am taking the same approach with Phantom 2.

And Google’s John Mueller feels the same way. He was asked about Phantom in a recent webmaster hangout video and explained a few key points. First, he said there is no magic bullet for recovery from a core ranking change like Phantom. He also said to focus on increasing the quality on your site over the long-term. And that one sentence has two key points. John used “site” and not “page”. And he also said “long-term”.

So, if you are tinkering with urls, fixing minor tech issues on a page, etc., then you might drive yourself insane trying to recover from Phantom. Instead, I would heavily tackle major content quality and engagement problems on the site. Make big improvements content-wise, improve the user experience on the site, decrease aggressive advertising tactics, etc. That’s how you can exorcise Phantom.

More Phantom Findings
Over the past month, I’ve had the opportunity to dig into a number of Phantom hits. Those hits range from moderate drops to severe drops. And I’ve analyzed some sites that absolutely surged after the 4/29 update. I already shared several findings from a content quality standpoint in my first post about Phantom 2, but I wanted to share more now that I have additional data.

Just like with Panda, “low quality content” can mean several things. There is never just one type of quality problem on a website. It’s usually a combination of problems that yields a drop. Here are just a few more problems I have come across while analyzing sites negatively impacted by Phantom 2. Note, this is not a full list, but just additional examples of what “low quality content” can look like.

Thin Content and Technical Problems = Ghostly Impact
An ecommerce site that was heavily impacted by Phantom reached out to me for help. Once I dug into the site, the content quality problems were clear. First, the site had a boatload of thin content. Pages were extremely visual with no supporting content. The only additional content was triggered via a tab (and there wasn’t much added to the page once triggered). The site had about 25-30K pages indexed.

The site also used infinite scroll to display the category content. If set up properly SEO-wise, this isn’t a huge deal (although I typically recommend not using infinite scroll). But this setup had problems. When I dug into Google’s cache to see how the page looked, the page would continually refresh every second or so. Clearly there was some type of issue with how the pages were coded. In addition, when checking the text-only version, the little content that was provided on the page wasn’t even showing up. Thin content became even thinner…

So, you had extremely thin content that was being cut down to no content due to how the site was coded. And this problem was present across many urls on the site.
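As a quick sanity check for this kind of setup, you can compare the copy users see in the rendered page against what actually survives in the raw HTML a crawler fetches. A rough sketch (the product blurb and markup below are hypothetical):

```python
import re

def visible_text(html):
    """Crude extraction of the text a crawler sees in the raw HTML:
    drop script/style blocks, then strip the remaining tags."""
    html = re.sub(r"(?is)<(script|style)[^>]*>.*?</\1>", " ", html)
    return re.sub(r"<[^>]+>", " ", html)

# Copy users see in the rendered page (hypothetical product blurb)
rendered_copy = "Hand-stitched leather wallet with RFID lining"

# Raw server response: the tab content is injected by JavaScript,
# so the copy never appears in the HTML itself
raw_html = '<div id="details"></div><script>loadTab("details");</script>'

print(rendered_copy in visible_text(raw_html))  # False
```

If the check comes back False for your key copy, that content is being injected client-side, which matches the "thin content becoming even thinner" problem described above.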

Ecommerce Site with Phantom Problems

Directory With Search Results Indexed + Followed Links
Another Phantom hit involved a large directory and forum. There are hundreds of thousands of pages indexed and many traditional directory and forum problems are present. For example, the directory listings were thin, there was a serious advertising issue across those pages (including the heavy blending of ads with content), and search results were indexed.

In addition, and this was a big problem, all of the directory links were followed. Since those links are added by business owners, and aren’t natural, they should absolutely be nofollowed. I estimate that there are ~80-90K listings in the directory, and all have followed links to the external websites that set up the listings. Not good.
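One rough way to audit this at scale is to scan listing pages for external links that are missing rel="nofollow". A minimal sketch using Python's standard library (the directory host and listing markup are hypothetical):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class FollowedExternalLinks(HTMLParser):
    """Collect external hrefs that are missing rel="nofollow"."""
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        host = urlparse(href).netloc
        rel = (attrs.get("rel") or "").lower().split()
        # External, business-supplied link without nofollow -> flag for review
        if host and host != self.site_host and "nofollow" not in rel:
            self.flagged.append(href)

listing_html = """
<a href="http://acme-plumbing.example/">Acme Plumbing</a>
<a href="http://other-biz.example/" rel="nofollow">Other Biz</a>
<a href="/category/plumbers">Plumbers</a>
"""
finder = FollowedExternalLinks("mydirectory.example")
finder.feed(listing_html)
print(finder.flagged)  # ['http://acme-plumbing.example/']
```

Run something like this across a crawl of the listing templates and you quickly get a list of pages where business-supplied links still pass PageRank.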

An example of a low quality directory page with followed links to businesses:

Directory with Phantom 2 Problems

Rogue Low Quality Content Not Removed During Initial Panda Work
One business owner who had been heavily impacted by Panda and Penguin in the past reached out to me. They worked hard to revamp the site, fix content quality problems, etc., and ended up recovering (and surging) during Panda 4.1. Then they were negatively impacted by Phantom (not a huge drop, but about 10%).

Quickly checking the site after Phantom 2 rolled out revealed some strange legacy problems (from the pre-recovery days). For example, I found some horrible low quality content that contained followed links to a number of third party websites. The page was part of a strategy employed by a previous SEO company (years ago). It’s a great example of rogue low quality content that can sit on a site and cause problems down the line.

Note, I’m not saying that one piece of content is going to cause massive problems, but it surely doesn’t help. Beyond that, there were still usability problems on the site, mobile problems, pockets of thin content, and unnatural-looking exact match anchor text links woven into certain pages. All of this together could be causing Phantom to haunt the site.

Low Quality Content With Followed Links

No, Phantoms Don’t Eat Tracking Codes
I’ll be quick with this last one, but I think it’s important to highlight. I received an email from a business owner claiming their site’s traffic dropped 90%+ after Phantom rolled out. That’s a huge Phantom hit, so I was eager to review the situation.

Quickly checking third party tools revealed no drop at all. All keywords that were leading to the site pre-Phantom 2 were still ranking well. Trending was strong as well. And checking the site didn’t reveal any crazy content quality problems either. Very interesting…

Phantom 2 No Impact

So I reached out to the business owner and asked if it was just Google organic traffic that dropped, or if they were seeing drops across all traffic sources. I explained I wasn’t seeing any drops via third party tools, I wasn’t seeing any crazy content quality problems, etc. The business owner quickly got back to me and said it was a tracking code issue! Google Analytics wasn’t firing, so it looked like there was a serious drop in traffic. Bullet dodged.

Important note: When you believe you’ve been hit by an algo update, don’t simply rely on Google Analytics. Check Google Search Console (formerly Google Webmaster Tools), third party tools, etc. Make sure it’s not a tracking code issue before you freak out.
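A quick first check along those lines is simply confirming the analytics snippet (and its property ID) still appears in your page source. A minimal sketch, using a classic analytics.js call with a hypothetical UA property ID:

```python
import re

def find_tracking_ids(html):
    """Return any Universal Analytics property IDs (UA-XXXXXXX-X)
    found in the raw page source."""
    return re.findall(r"UA-\d{4,10}-\d{1,4}", html)

page_with_tag = "<script>ga('create', 'UA-1234567-1', 'auto');</script>"
page_without_tag = "<html><body>No analytics snippet here</body></html>"

print(find_tracking_ids(page_with_tag))     # ['UA-1234567-1']
print(find_tracking_ids(page_without_tag))  # []
```

An empty result on your key templates points to a tracking problem, not an algorithm hit, which is exactly the scenario described above.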

Quality Boost – URLs That Jumped
Since Google explained that Phantom 2 is actually a change to its core ranking algorithm with how it assesses quality, it’s important to understand where pages fall flat, as well as where other pages excel. For example, some pages will score higher, while others will score lower. I’ve spent a lot of time analyzing the problems Phantom victims have content quality-wise, but I also wanted to dig into the urls that jumped ahead rankings-wise. My goal was to see if the Quality Update truly surfaced higher quality pages based on the query at hand.

Note, I plan to write more about this soon, and will not cover it extensively in this post. But I did think it was important to start looking at the urls that surged in greater detail. I began by analyzing a number of queries where Phantom victim urls dropped in rankings. Then I dug into which urls used to rank on page one and two, and which ones jumped up the rankings.

A Mixed Bag of Quality
Depending on the query, there were times urls did jump up that were higher quality, covered the subject in greater detail, and provided an overall better user experience. For example, a search for an in-depth resource on a specific subject yielded some new urls in the top ten that provided a rich amount of well-organized information without any barriers from a user engagement standpoint. It was a good example of higher quality content overtaking lower quality, thinner content.

Questions and Forums
For some queries I analyzed that were questions, Google seemed to be providing forum urls that contained strong responses from people that understood the subject matter well. I’m not saying all forums shot up the rankings, but I analyzed a number of queries that yielded high quality forum urls (with a lot of good answers from knowledgeable people). And that’s interesting since many forums have had a hard time with Panda over the years.

Navigational Queries and Relevant Information
I saw some navigational queries yield urls that provided more thorough information than just profile data. For example, I saw some queries where local directories all dropped to page two and beyond, while urls containing richer content surfaced on page one.

Local Example
From a pure local standpoint (someone searching for a local business), I saw some ultra-thin local listings drop, while other listings with richer information increased in rank. For example, pages with just a thumbnail and business name dropped, while other local listings with store locations, images, company background information, hours, reviews, etc. hit page one. Note, these examples do not represent the entire category… They are simply examples based on Phantom victims I am helping now.

In Some Cases, The “Lower Quality Update”?
The examples listed above show higher quality urls rising in the ranks, but that wasn’t always the case. I came across several queries where some of the top listings yielded lower quality pages. They did not cover the subject matter in detail, had popups immediately on load, the pages weren’t organized particularly well, etc.

Now, every algorithm will contain problems, yield some inconsistent results, etc., but I just found it ironic that the “quality update” sometimes surfaced lower quality urls on page one. Again, I plan to dig deeper into the “quality boost” from Phantom 2 in future posts, so stay tuned.

Next Steps with Phantom:
As mentioned earlier, I recommend taking a long-term approach to Phantom remediation. You need to identify and then fix the problems riddling your site from a content quality and engagement standpoint. Don’t tinker with urls. Fix big problems. And if Phantom 2 is similar to Phantom 1 from 2013, then that’s exactly what you need to focus on.

Here is what I recommend:

  • Have a thorough audit conducted through a quality lens. This is not necessarily a full-blown SEO audit. Instead, it’s an audit focused on identifying content quality problems, engagement issues, and other pieces of bamboo that both Panda and Phantom like to chew on.
  • Take the remediation plan and run with it. Don’t put band-aids on your website, and don’t implement 20% of the changes. Fix as many quality problems as you can, and as quickly as you can. Not only will Google need to recrawl all of the changes, but I’m confident that it will need to remeasure user engagement (similar to Panda). This is one reason I don’t think you can bounce back immediately from a Phantom hit.
  • Have humans provide feedback. I’ve brought this up before in previous Panda posts, and it can absolutely help with Phantom too. Ask unbiased users to complete an action (or set of actions) on your site and let them loose. Have them answer questions about their visit, obstacles they came across, things they hated, things they loved, etc. You might be surprised by what they say. And don’t forget about mobile… have them go through the site on their mobile phones too.
  • Continue producing high quality content on your site. Do not stop the process of publishing killer content that can help build links, social shares, brand mentions, etc. Remember that “quality” can be represented in several ways. Similar to Panda victims, you must keep driving forward like you aren’t being impacted.
  • What’s good for Phantom should be good for Panda. Since there seems to be heavy overlap between Phantom and Panda, many of the changes you implement should help you on both levels.


Summary – Exorcise Phantom and Exile Panda
If you take one point away from this post, I hope you maintain a long-term view of increasing quality on your website. If you’ve been hit by Phantom, then don’t tinker with urls and expect immediate recovery. Instead, thoroughly audit your site from a quality standpoint, make serious changes to your content, enhance the user experience, and set realistic expectations with regard to recovery. That’s how you can exorcise Phantom and exile Panda.

Now go ahead and get started. Phantoms won’t show themselves the door. You need to open it for them. :)



Phantom 2 – Analyzing The Google Update That Started On April 29, 2015

Phantom 2 Google Update

Two years ago on May 8, 2013, I began receiving emails from webmasters that saw significant drops in Google organic traffic overnight.  I’m not talking about small drops… I’m referring to huge drops like 60%+. As more emails came in, and I checked more of the data I have access to across websites, it was apparent that Google had pushed a big update.

I called it the Phantom update, since it initially flew under the radar (which gave it a mysterious feel). Google would not confirm Phantom, but I didn’t really need them to. I had a boatload of data that already confirmed that a massive change occurred. Again, some sites reaching out to me saw a 60%+ decrease in Google organic traffic overnight.

Also, Phantom rolled out while the SEO community was waiting for Penguin 2.0, so all attention was on unnatural links. But, after digging into the Phantom update from 5/8/13, it was clear that it was all about content quality and not links.

Phantom 2 – The Sequel Might Be Scarier Than The Original
Almost two years later to the day, we have what I’m calling Phantom 2. There was definitely a lot of chatter the week of April 27 that some type of update was going on. Barry Schwartz was the first to document the chatter on Search Engine Roundtable as more and more webmasters explained what they were seeing.

Now, I have access to a lot of Panda data, but I didn’t initially see much movement. And the movement I saw wasn’t Panda-like. For example, a 10-20% increase or decrease for some sites without any spikes or huge drops doesn’t set off any Panda alarms at G-Squared Interactive. With typical Panda updates, there are always some big swings with either recoveries or fresh hits from new companies reaching out to me. I didn’t initially see movement like that.

But that weekend (5/1 through 5/3), the movement seemed to increase. And on Monday, after having a few days of data to sift through, I saw the first real signs of the update. For example, check out the screenshot below of a huge hit:

Phantom 2 Google Update Fresh Hit

And as more chatter hit the Twitterverse, more emails from companies started hitting my inbox. Some websites had experienced significant changes in Google organic traffic starting on 4/29 (or even earlier). Here’s an example of a huge surge starting the week of 4/27:

Phantom 2 Google Update Surge

I dug into my Panda data, and now that I had almost a full week of Google organic traffic to analyze, I saw a lot of moderate movement across the sites I have access to. Many swung 10-20% either up or down starting around 4/29. As of today, I have an entire sheet of domains that were impacted by Phantom 2. So yes, there was an update. But was it Panda? How about Penguin? Or was this some other type of ranking adjustment that Google implemented? It was hard to tell, so I decided to dig into websites impacted by Phantom 2 to learn more.

Google Denies Panda and Penguin:
With significant swings in traffic, many webmasters automatically think about Panda and Penguin. And that’s for good reason. There aren’t many updates that can rock a website like those two characters. Google came out and explained that it definitely wasn’t Panda or Penguin, and that they push changes all the time (and that this was “normal”).

OK, I get that Google pushes ~500 updates a year, but most do not cause significant impact. Actually, many of those updates get pushed and nobody even picks them up (at all). Whatever happened starting on 4/29 was bigger than a normal “change”.

A Note About Mobile-Friendly:
So, was this part of the mobile algorithm update from 4/21? No, it doesn’t look that way. Many of the sites impacted are mobile-friendly and the impact was to both desktop and mobile rankings. I don’t believe this had anything to do with the mobile-friendly update. You can read more about some of the mobile rankings changes I’ve seen due to that update in a recent post of mine.

Phantom Was Not Part Of Google's Mobile-Friendly Update

Understanding The Signature of Phantom 2:
If you know me at all, then you know I tend to dig into algorithm updates. If there’s enough data to warrant heavy analysis, then I’m in. So I collected many domains impacted by the 4/29 update and started to analyze the decrease or increase in Google organic traffic. I analyzed lost keywords, landing pages from organic search, link profiles, link acquisition or loss over the past several months, etc. My hope was that I would surface findings that could help those impacted. Below, I have documented what I found.

Phantom 2 Findings – Content Quality Problems *Galore*
It didn’t take long to see a trend. Just like with Phantom 1 in 2013, the latest update seemed to focus on content quality problems. I found many examples of serious quality problems across sites heavily impacted by Phantom 2. Checking the lost queries and the destination landing pages that dropped out revealed problems that were extremely Panda-like.

Note, I tend to heavily check pages that used to receive a lot of traffic from Google organic. That’s because Google has a ton of engagement data for those urls and it’s smart to analyze pages that Google was driving a lot of traffic to. You can read my post about running a Panda report to learn more about that.
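That kind of report boils down to comparing Google organic sessions per landing page for a window before and after the update date, then sorting by percentage change. A minimal sketch with hypothetical numbers:

```python
# Hypothetical weekly Google organic sessions per landing page,
# exported from analytics for a week before and after 4/29.
pre  = {"/tag/widgets": 12000, "/guide/widgets": 8000, "/about": 300}
post = {"/tag/widgets": 2400, "/guide/widgets": 7600, "/about": 310}

def biggest_drops(pre, post, min_sessions=500):
    """Rank pages by percentage change, focusing on urls Google
    was already sending meaningful traffic to."""
    rows = []
    for url, before in pre.items():
        if before < min_sessions:
            continue  # skip pages with little engagement data
        after = post.get(url, 0)
        rows.append((url, before, after, (after - before) / before))
    return sorted(rows, key=lambda r: r[3])

for url, before, after, change in biggest_drops(pre, post):
    print(f"{url}: {before} -> {after} ({change:+.0%})")
```

The pages that top this list are the ones Google had the most engagement data for, so they are the first ones to audit for quality problems.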

Did Panda Miss These Sites?
If there were serious content quality problems, then you might be wondering why Panda hadn’t picked up on these sites in the past. Great question. Well, Panda did notice these sites in the past. Many of the sites impacted by Phantom 2 have battled Panda before. Again, I saw a number of sites I’m tracking swing 10-20% either up or down (based on the large amount of Panda data I have access to). And the big hits or surges during Phantom 2 also revealed previous Panda problems.

Below, I’ll take you through some of the issues I encountered while analyzing the latest update. I can’t take you through all of the problems I found, or this post would be massive. But, I will cover some of the most important content quality problems I came across. I think you’ll get the picture pretty quickly. Oh, and I’ll touch on links as well later in the post. I wanted to see if new(er) link problems or gains could be causing the ranking changes I was witnessing.

Content Quality Problems and Phantom 2

Tag Pages Ranking – Horrible Bamboo
One of the biggest hits I saw revealed many tag pages that were ranking well for competitive keywords prior to the update. The pages were horrible. Like many tag pages, they simply provided a large list of links to other content on the site. And when there were many links on the page, infinite scroll was used to automatically supply more and more links. This literally made me dizzy as I scrolled down the page.

And to make matters worse, there were many related tags on the page. So you essentially had the perfect spider trap. Send bots from one horrible page to another, then to another, and another. I’m shocked these pages were ranking well to begin with. User happiness had to be rock-bottom with these pages (and they were receiving a boatload of traffic too). And if Phantom is like Panda, then poor user engagement is killer (in a bad way).

So how bad of a problem was this on the site I was analyzing? Bad, really bad. I found over twelve million tag pages on the site that were indexed by Google. Yes, twelve million.

Phantom and Content Quality - Tag Pages

Also, the site was triggering popups as I hit new landing pages from organic search. So if the horrible tag pages weren’t bad enough, now you had horrible popups in your face. I guess Phantoms don’t like that. I know I don’t. :)

Thin, Click-Bait Articles, Low Quality Supplementary Content
Another major hit I analyzed revealed serious content quality problems. Many of the top landing pages from organic search that dropped revealed horrible click-bait articles. The pages were thin, the articles were only a few paragraphs, and the primary content was surrounded by a ton of low quality supplementary content.

If you’ve read some of my previous Panda posts, then you know Google understands and measures the level of supplementary content on the page. You don’t want a lot of low quality supplementary content that can detract from the user experience. Well on this site, the supplementary content was enough to have me running and screaming from the site. Seriously, it was horrible.

I checked many pages that had dropped out of the search results and there weren’t many I would ever want to visit. Thin content, stacked videos (which I’ve mentioned before in Panda posts), poor quality supplementary content, etc.

Low quality pages with many stacked videos can have a strong negative impact on user experience:

Phantom and Content Quality - Stacked Videos

I also saw this site had a potential syndication issue. It often referenced third party sites from its own pages, and when checking those third party pages, you could see that some of the content had been pulled from them. I covered syndication after Panda 4.0 rolled out and this situation fit perfectly into some of the scenarios I explained.

Phantom and Syndication

Navigational Queries, Poor Design, and Low Quality User Generated Content
Another big hit I analyzed revealed even more content quality problems, plus the first signs of impact based on Google SERP changes. First, the site design was straight out of 1998. It was really tough to get through the content. The font was small, there was a ton of content on each page, there were many links on each page, etc. I’m sure all of this was negatively impacting the user experience.

When checking lost rankings, it was clear to see that many queries were navigational. For example, users entering domain names or company names in Google. This site used to rank well for those, but checking the SERPs revealed truncated results. For example, there were only five listings now for some of those queries. There were times that the site in question dropped to page two, but there were times it dropped much more. And for some queries, there were only three pages listed in the SERPs.

An example of just five listings for a navigational query:

Phantom and Truncated SERPs
So when you combine giant sitelinks, truncated SERPs, limited SERP listings, and then some type of major ranking adjustment, you can see why a site like this would get hammered.

There were also user-generated content problems on the site. Each page had various levels of user comments, but they were either worthless or just old. I found comments from years ago that had nothing to do with the current situation. And then you had comments that simply provided no value at all (from the beginning). John Mueller explained that comments help make up the content on the page, so you definitely don’t want a boatload of low quality comments. You can check 8:37 in the video to learn more. So when you add low quality comments to low quality content you get… a Phantom hit, apparently. :)

Content Farms, Thin Content, Popups, and Knowledge Graph
Another interesting example of a domain heavily impacted by the 4/29 update involved a traditional content farm. If you’re familiar with the model, then you already know the problems I’m about to explain. The pages are relatively thin, don’t cover the topic at hand in any depth, and have ads all over the place.

In addition, the user experience gets interrupted by horrible popups, there’s low quality supplementary content, ads that blend with the results, and low quality user-generated content. Yes, all of this together on one site.

Also, when checking the drop in rankings across keywords, I often came across queries that yielded knowledge graph answers. It’s an interesting side note. The site has over 100K pages with content targeting “what is” queries. And many of those queries now yield KG answers. When you combine a ranking shift with a knowledge graph result taking up a large portion of the SERP, you’ve got a big problem for sure. Just ask lyrics websites how that works.

Phantom and Knowledge Graph Answers

Driving Users To Heavy Ad Pages, Spider Trap
One thing I saw several times while analyzing sites negatively impacted by the 4/29 update related to ad-heavy pages. For example, the landing page that used to rank well had prominent links to pages that simply provided a boatload of text ads (they contained sponsored ads galore). And often, those pages linked to more ad-heavy pages (like a spider trap). Those pages are low quality and negatively impact the user experience. That’s a dangerous recipe for sure.

Directories – The Same Old Problems
I reviewed some directory sites that were impacted by the 4/29 update and saw some of the classic problems that directories face. For example, disorganized content, thin content, and low quality supplementary content. I also saw deceptive ads that blended way too much with the content, which could cause users to mistakenly click those ads and be driven off the site. And then there were pages indexed that should never be indexed (search results-like pages). Many of them…

An example of ads blending with content (deceiving users):

Phantom and Ad Deception

It’s also worth noting the truncated SERP situation I mentioned earlier. For example, SERPs with only five or seven listings for navigational queries, and again some SERPs with only three pages of listings.

I can keep going here, but I’ll stop due to the length of the post. I hope you see the enormous content quality problems riddling sites impacted by Phantom 2. To be thorough, though, I wanted to check links as well. I cover that next.

The Impact of Links – Possible, But Inconclusive
Now what about links? We know that many sites impacted had serious content quality problems, but did links factor into the update? It’s extremely hard to say if that was the case… I dug into the link profiles for a number of the sites both positively and negatively impacted and came out with mixed findings.

First, a number of the sites I analyzed have huge link profiles. I’m not talking about a few thousand links. I’m talking about millions and tens of millions of links per domain. That makes it much harder to nail down a link problem that could have contributed to the recent impact. There were definitely red flags for some domains, but not across every site I analyzed.

For example, some sites I analyzed definitely had a surge of inbound links since January of 2015, and you could see a certain percentage seemed unnatural. Those included strange inbound links from low quality sites, partner links (followed), and company-owned domains (also followed links). But again, the profiles were so large that it’s hard to say if those new(er) links caused enough of a problem to cause a huge drop in rankings during Phantom 2.

On the flip side, I saw some sites that were positively impacted gain many powerful inbound links over the past six to twelve months. Those included links from large publishers, larger brands, and other powerful domains. But again, there’s a lot of noise in each link profile. It’s very hard to say how much those links impacted the situation for this specific update.

Example of domains impacted by Phantom 2, but had relatively stable link profiles over the past year:

Phantom and Links

Phantom and Links - Stable Profile

And to make matters even more complex, there were some sites that gained during the 4/29 update that had lower quality link profiles overall. So if links were a driving force here, then the sites with lower quality profiles should not have gained like they did.

My money is on content quality, not links. But hey, anything is possible. :)

Next Steps for Phantom 2 Victims:
If you have been impacted by the 4/29 update, here is what I recommend doing:

  • I would take a hard look at content quality problems riddling your website. Just like Phantom 1 in 5/2013, I would audit your site through a content quality lens. Once you thoroughly analyze your content, then you should form a remediation plan for tackling those problems as quickly as possible.
  • Understand the queries that dropped and the landing pages from Google organic that used to receive a lot of traffic. Then find engagement problems on the site and address them. Try to improve content quality across the site and then hope you can recover like previous Phantom victims did.
  • From a links standpoint, truly understand the links you’ve built over the past six to twelve months. Were they manually built, naturally received, etc? Even though my money is on content quality, I still think it’s smart to tackle any link problems you can surface. That includes removing or nofollowing unnatural links, and disavowing what you can’t get to.
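For the disavow piece, the disavow file Google accepts is plain text: a '#' starts a comment, 'domain:' entries disavow a whole domain, and bare URLs disavow single pages. A small sketch that assembles one from hypothetical audit output:

```python
# Hypothetical unnatural links surfaced during the audit that we
# could not get removed or nofollowed.
unnatural_domains = ["paid-directory.example", "spammy-links.example"]
unnatural_urls = ["http://old-partner.example/links.html"]

# Disavow file format: one entry per line, '#' for comments,
# 'domain:' to disavow an entire domain, bare URLs for single pages.
lines = ["# Links we could not get removed or nofollowed"]
lines += [f"domain:{d}" for d in sorted(unnatural_domains)]
lines += sorted(unnatural_urls)

print("\n".join(lines))
```

Upload the resulting text file via the disavow tool in Google Search Console; only disavow what you genuinely could not get removed at the source.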


Summary – The Phantom Lives
It was fascinating to analyze Phantom 2 starting on 4/29 and to see the similarities with the original Phantom from 5/8/13. After digging into a number of sites impacted by the latest update, it was clear to see major content quality problems across the domains. I don’t know if Phantom is cleaning up where Panda missed out, or if it’s something completely separate, but there’s a lot of crossover for sure.

And remember, Penguin 2.0 rolled out just a few weeks after Phantom 1. It’s going to be very interesting to see if the next Penguin update follows that model. I guess we’ll find out soon enough. :)