Two years ago on May 8, 2013, I began receiving emails from webmasters who saw significant drops in Google organic traffic overnight. I’m not talking about small drops… I’m referring to huge drops like 60%+. As more emails came in, and I checked more of the data I have access to across websites, it was apparent that Google had pushed a big update.
I called it the Phantom update, since it initially flew under the radar (which gave it a mysterious feel). Google would not confirm Phantom, but I didn’t really need them to. I had a boatload of data that already confirmed that a massive change occurred. Again, some sites reaching out to me saw a 60%+ decrease in Google organic traffic overnight.
Also, Phantom rolled out while the SEO community was waiting for Penguin 2.0, so all attention was on unnatural links. But, after digging into the Phantom update from 5/8/13, it was clear that it was all about content quality and not links.
Phantom 2 – The Sequel Might Be Scarier Than The Original
Almost two years later to the day, we have what I’m calling Phantom 2. There was definitely a lot of chatter the week of April 27 that some type of update was going on. Barry Schwartz was the first to document the chatter on Search Engine Roundtable as more and more webmasters explained what they were seeing.
Now, I have access to a lot of Panda data, but I didn’t initially see much movement. And the movement I saw wasn’t Panda-like. For example, a 10-20% increase or decrease for some sites without any spikes or huge drops doesn’t set off any Panda alarms at G-Squared Interactive. With typical Panda updates, there are always some big swings, either recoveries or fresh hits reported by new companies reaching out to me. I didn’t initially see movement like that.
But that weekend (5/1 through 5/3), the movement seemed to increase. And on Monday, after having a few days of data to sift through, I saw the first real signs of the update. For example, check out the screenshot below of a huge hit:
And as more chatter hit the Twitterverse, more emails from companies started hitting my inbox. Some websites had experienced significant changes in Google organic traffic starting on 4/29 (or even earlier). For example, here’s a huge surge starting the week of 4/27:
I dug into my Panda data, and now that I had almost a full week of Google organic traffic to analyze, I saw a lot of moderate movement across the sites I have access to. Many swung 10-20% either up or down starting around 4/29. As of today, I have an entire sheet of domains that were impacted by Phantom 2. So yes, there was an update. But was it Panda? How about Penguin? Or was this some other type of ranking adjustment that Google implemented? It was hard to tell, so I decided to dig into websites impacted by Phantom 2 to learn more.
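Flagging that kind of moderate movement is straightforward to script. Here’s a minimal sketch of the idea, comparing weekly Google organic sessions per domain and surfacing anything that swung 10% or more in either direction. The domain names and session counts are invented for illustration, not the actual sites from my analysis.

```python
# Hypothetical weekly Google organic sessions per domain (names and
# numbers are made up for illustration).
weekly_sessions = {
    "example-site-a.com": {"week_of_4_20": 52000, "week_of_4_27": 41500},
    "example-site-b.com": {"week_of_4_20": 18000, "week_of_4_27": 21300},
    "example-site-c.com": {"week_of_4_20": 9800,  "week_of_4_27": 9600},
}

def pct_change(before, after):
    """Percent change in organic traffic, week over week."""
    return (after - before) / before * 100

# Flag domains whose swing is at least 10% in either direction.
flagged = {
    domain: round(pct_change(w["week_of_4_20"], w["week_of_4_27"]), 1)
    for domain, w in weekly_sessions.items()
    if abs(pct_change(w["week_of_4_20"], w["week_of_4_27"])) >= 10
}

for domain, change in sorted(flagged.items()):
    print(f"{domain}: {change:+.1f}%")
```

With the sample numbers above, site A (down ~20%) and site B (up ~18%) get flagged, while site C’s 2% dip is ignored as normal noise.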
Google Denies Panda and Penguin:
With significant swings in traffic, many webmasters automatically think about Panda and Penguin. And that’s for good reason. There aren’t many updates that can rock a website like those two characters. Google came out and explained that it definitely wasn’t Panda or Penguin, and that they push changes all the time (and that this was “normal”).
OK, I get that Google pushes ~500 updates a year, but most do not cause significant impact. Actually, many of those updates get pushed and nobody even picks them up (at all). Whatever happened starting on 4/29 was bigger than a normal “change”.
A Note About Mobile-Friendly:
So, was this part of the mobile algorithm update from 4/21? No, it doesn’t look that way. Many of the sites impacted are mobile-friendly and the impact was to both desktop and mobile rankings. I don’t believe this had anything to do with the mobile-friendly update. You can read more about some of the mobile rankings changes I’ve seen due to that update in a recent post of mine.
Understanding The Signature of Phantom 2:
If you know me at all, then you know I tend to dig into algorithm updates. If there’s enough data to warrant heavy analysis, then I’m in. So I collected many domains impacted by the 4/29 update and started to analyze the decrease or increase in Google organic traffic. I analyzed lost keywords, landing pages from organic search, link profiles, link acquisition or loss over the past several months, etc. My hope was that I would surface findings that could help those impacted. Below, I have documented what I found.
Phantom 2 Findings – Content Quality Problems *Galore*
It didn’t take long to see a trend. Just like with Phantom 1 in 2013, the latest update seemed to focus on content quality problems. I found many examples of serious quality problems across sites heavily impacted by Phantom 2. Checking the lost queries and the destination landing pages that dropped out revealed problems that were extremely Panda-like.
Note, I tend to heavily check pages that used to receive a lot of traffic from Google organic. That’s because Google has a ton of engagement data for those urls and it’s smart to analyze pages that Google was driving a lot of traffic to. You can read my post about running a Panda report to learn more about that.
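If you want to approximate that approach yourself, here’s a minimal sketch: take per-URL organic sessions before and after the update date and rank the previously high-traffic pages by the size of their drop. The URLs and numbers are hypothetical, purely for illustration.

```python
# Hypothetical per-URL Google organic sessions before and after 4/29
# (URLs and numbers are invented for illustration).
before = {
    "/tag/widgets": 12000,
    "/articles/thin-piece": 8000,
    "/guides/solid-guide": 5000,
}
after = {
    "/tag/widgets": 2400,
    "/articles/thin-piece": 3100,
    "/guides/solid-guide": 5200,
}

# Rank pages by sessions lost -- the pages that used to receive the
# most traffic are the ones Google has the most engagement data for,
# so they deserve the closest content-quality review.
drops = sorted(
    ((url, before[url] - after.get(url, 0)) for url in before),
    key=lambda pair: pair[1],
    reverse=True,
)

for url, lost in drops:
    print(f"{url}: {lost:+d} sessions lost")
```

Pages at the top of that list (here, the tag page) are where a content-quality audit should start; pages with a negative “lost” value actually gained traffic.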
Did Panda Miss These Sites?
If there were serious content quality problems, then you might be wondering why Panda hadn’t picked up on these sites in the past. Great question. Well, Panda did notice these sites in the past. Many of the sites impacted by Phantom 2 have battled Panda in the past. Again, I saw a number of sites I’m tracking swing 10-20% either up or down (based on the large amount of Panda data I have access to). And the big hits or surges during Phantom 2 also reveal previous Panda problems.
Below, I’ll take you through some of the issues I encountered while analyzing the latest update. I can’t take you through all of the problems I found, or this post would be massive. But, I will cover some of the most important content quality problems I came across. I think you’ll get the picture pretty quickly. Oh, and I’ll touch on links as well later in the post. I wanted to see if new(er) link problems or gains could be causing the ranking changes I was witnessing.
Content Quality Problems and Phantom 2
Tag Pages Ranking – Horrible Bamboo
One of the biggest hits I saw revealed many tag pages that were ranking well for competitive keywords prior to the update. The pages were horrible. Like many tag pages, they simply provided a large list of links to other content on the site. And when there were many links on the page, infinite scroll was used to automatically supply more and more links. This literally made me dizzy as I scrolled down the page.
And to make matters worse, there were many related tags on the page. So you essentially had the perfect spider trap. Send bots from one horrible page to another, then to another, and another. I’m shocked these pages were ranking well to begin with. User happiness had to be rock-bottom with these pages (and they were receiving a boatload of traffic too). And if Phantom is like Panda, then poor user engagement is killer (in a bad way).
So how bad of a problem was this on the site I was analyzing? Bad, really bad. I found over twelve million tag pages on the site that were indexed by Google. Yes, twelve million.
Also, the site was triggering popups as I hit new landing pages from organic search. So if the horrible tag pages weren’t bad enough, now you had horrible popups in your face. I guess Phantoms don’t like that. I know I don’t. :)
Thin, Click-Bait Articles, Low Quality Supplementary Content
Another major hit I analyzed revealed serious content quality problems. Many of the top landing pages from organic search that dropped revealed horrible click-bait articles. The pages were thin, the articles were only a few paragraphs, and the primary content was surrounded by a ton of low quality supplementary content.
If you’ve read some of my previous Panda posts, then you know Google understands and measures the level of supplementary content on the page. You don’t want a lot of low quality supplementary content that can detract from the user experience. Well on this site, the supplementary content was enough to have me running and screaming from the site. Seriously, it was horrible.
I checked many pages that had dropped out of the search results and there weren’t many I would ever want to visit. Thin content, stacked videos (which I’ve mentioned before in Panda posts), poor quality supplementary content, etc.
Low quality pages with many stacked videos can have a strong negative impact on user experience:
I also saw this site had a potential syndication issue. It was referencing third party sites often from its own pages. When checking those third party pages, you can see some of the content was pulled from those sites. I covered syndication after Panda 4.0 rolled out and this situation fit perfectly into some of the scenarios I explained.
Navigational Queries, Poor Design, and Low Quality User Generated Content
Another big hit I analyzed revealed even more content quality problems, plus the first signs of impact based on Google SERP changes. First, the site design was straight out of 1998. It was really tough to get through the content. The font was small, there was a ton of content on each page, there were many links on each page, etc. I’m sure all of this was negatively impacting the user experience.
When checking lost rankings, it was clear to see that many queries were navigational. For example, users entering domain names or company names in Google. This site used to rank well for those, but checking the SERPs revealed truncated results. For example, there were only five listings now for some of those queries. There were times that the site in question dropped to page two, but there were times it dropped much more. And for some queries, there were only three pages listed in the SERPs.
An example of just five listings for a navigational query:
So when you combine giant sitelinks, truncated SERPs, limited SERP listings, and then some type of major ranking adjustment, you can see why a site like this would get hammered.
There were also user-generated content problems on the site. Each page had various levels of user comments, but they were either worthless or just old. I found comments from years ago that had nothing to do with the current situation. And then you had comments that simply provided no value at all (from the beginning). John Mueller explained that comments help make up the content on the page, so you definitely don’t want a boatload of low quality comments. You can check 8:37 in the video to learn more. So when you add low quality comments to low quality content you get… a Phantom hit, apparently. :)
Content Farms, Thin Content, Popups, and Knowledge Graph
Another interesting example of a domain heavily impacted by the 4/29 update involved a traditional content farm. If you’re familiar with the model, then you already know the problems I’m about to explain. The pages are relatively thin, don’t heavily cover the content at hand, and have ads all over the place.
In addition, the user experience gets interrupted by horrible popups, there’s low quality supplementary content, ads that blend with the results, and low quality user-generated content. Yes, all of this together on one site.
Also, when checking the drop in rankings across keywords, I often came across queries that yielded knowledge graph answers. It’s an interesting side note. The site has over 100K pages with content targeting “what is” queries. And many of those queries now yield KG answers. When you combine a ranking shift with a knowledge graph result taking up a large portion of the SERP, you’ve got a big problem for sure. Just ask lyrics websites how that works.
Driving Users To Heavy Ad Pages, Spider Trap
One thing I saw several times while analyzing sites negatively impacted by the 4/29 update related to ad-heavy pages. For example, the landing page that used to rank well had prominent links to pages that simply provided a boatload of text ads (they contained sponsored ads galore). And often, those pages linked to more ad-heavy pages (like a spider trap). Those pages are low quality and negatively impact the user experience. That’s a dangerous recipe for sure.
Directories – The Same Old Problems
I reviewed some directory sites that were impacted by the 4/29 update and saw some of the classic problems that directories face. For example, disorganized content, thin content, and low quality supplementary content. I also saw deceiving ads that blended way too much with the content, which could cause users to mistakenly click those ads and be driven off the site (deception). And then there were pages indexed that should never be indexed (search results-like pages). Many of them…
An example of ads blending with content (deceiving users):
It’s also worth noting the truncated SERP situation I mentioned earlier. For example, SERPs of only five or seven listings for navigational queries and then there were some SERPs with only three pages of listings again.
I could keep going here, but I’ll stop due to the length of the post. I hope you see the enormous content quality problems riddling sites impacted by Phantom 2. But to be thorough, I wanted to check links as well. I cover that next.
The Impact of Links – Possible, But Inconclusive
Now what about links? We know that many sites impacted had serious content quality problems, but did links factor into the update? It’s extremely hard to say if that was the case… I dug into the link profiles for a number of the sites both positively and negatively impacted and came out with mixed findings.
First, a number of the sites I analyzed have huge link profiles. I’m not talking about a few thousand links. I’m talking about millions and tens of millions of links per domain. That makes it much harder to nail down a link problem that could have contributed to the recent impact. There were definitely red flags for some domains, but not across every site I analyzed.
For example, some sites I analyzed definitely had a surge of inbound links since January of 2015, and you could see a certain percentage seemed unnatural. Those included strange inbound links from low quality sites, partner links (followed), and company-owned domains (also followed links). But again, the profiles were so large that it’s hard to say if those new(er) links caused enough of a problem to cause a huge drop in rankings during Phantom 2.
On the flip side, I saw some sites that were positively impacted gain many powerful inbound links over the past six to twelve months. Those included links from large publishers, larger brands, and other powerful domains. But again, there’s a lot of noise in each link profile. It’s very hard to say how much those links impacted the situation for this specific update.
Example of domains impacted by Phantom 2, but had relatively stable link profiles over the past year:
And to make matters even more complex, there were some sites that gained during the 4/29 update that had lower quality link profiles overall. So if links were a driving force here, then the sites with lower quality profiles should not have gained like they did.
My money is on content quality, not links. But hey, anything is possible. :)
Next Steps for Phantom 2 Victims:
If you have been impacted by the 4/29 update, here is what I recommend doing:
- I would take a hard look at content quality problems riddling your website. Just like Phantom 1 in 5/2013, I would audit your site through a content quality lens. Once you thoroughly analyze your content, then you should form a remediation plan for tackling those problems as quickly as possible.
- Understand the queries that dropped, the landing pages from Google organic that used to receive a lot of traffic, find engagement problems on the site, and address those problems. Try to improve content quality across the site and then hope you can recover like previous Phantom victims did.
- From a links standpoint, truly understand the links you’ve built over the past six to twelve months. Were they manually built, naturally received, etc? Even though my money is on content quality, I still think it’s smart to tackle any link problems you can surface. That includes removing or nofollowing unnatural links, and disavowing what you can’t get to.
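For that last step, a simple first pass is to separate followed links from partners and company-owned domains out of your newer link data, since those are the ones to nofollow, remove, or disavow. Here’s a minimal sketch; the fields, domains, and relationship labels are all invented for illustration, not from any real link index.

```python
# Hypothetical link records gathered over the past twelve months
# (source domains and relationship labels are made up for illustration).
new_links = [
    {"source": "big-publisher.com", "followed": True,  "relationship": "editorial"},
    {"source": "partner-site.com",  "followed": True,  "relationship": "partner"},
    {"source": "owned-domain.com",  "followed": True,  "relationship": "company-owned"},
    {"source": "random-blog.net",   "followed": False, "relationship": "editorial"},
]

# Followed links from partners or company-owned domains are unnatural
# red flags -- candidates for nofollowing, removal, or the disavow file.
needs_action = [
    link["source"]
    for link in new_links
    if link["followed"] and link["relationship"] in {"partner", "company-owned"}
]

print(needs_action)
```

With the sample data, only the followed partner and company-owned links are surfaced; the nofollowed link and the natural editorial link pass the check.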
Summary – The Phantom Lives
It was fascinating to analyze Phantom 2 starting on 4/29 and to see the similarities with the original Phantom from 5/8/13. After digging into a number of sites impacted by the latest update, it was clear to see major content quality problems across the domains. I don’t know if Phantom is cleaning up where Panda missed out, or if it’s something completely separate, but there’s a lot of crossover for sure.
And remember, Penguin 2.0 rolled out just a few weeks after Phantom 1. It’s going to be very interesting to see if the next Penguin update follows that model. I guess we’ll find out soon enough. :)