Phantom 2 – Analyzing The Google Update That Started On April 29, 2015

Phantom 2 Google Update

Two years ago on May 8, 2013, I began receiving emails from webmasters who had seen significant drops in Google organic traffic overnight. I'm not talking about small drops… I'm referring to huge drops like 60%+. As more emails came in, and I checked more of the data I have access to across websites, it was apparent that Google had pushed a big update.

I called it the Phantom update, since it initially flew under the radar (which gave it a mysterious feel). Google would not confirm Phantom, but I didn’t really need them to. I had a boatload of data that already confirmed that a massive change occurred. Again, some sites reaching out to me saw a 60%+ decrease in Google organic traffic overnight.

Also, Phantom rolled out while the SEO community was waiting for Penguin 2.0, so all attention was on unnatural links. But, after digging into the Phantom update from 5/8/13, it was clear that it was all about content quality and not links.

Phantom 2 – The Sequel Might Be Scarier Than The Original
Almost two years later to the day, we have what I’m calling Phantom 2. There was definitely a lot of chatter the week of April 27 that some type of update was going on. Barry Schwartz was the first to document the chatter on Search Engine Roundtable as more and more webmasters explained what they were seeing.

Now, I have access to a lot of Panda data, but I didn’t initially see much movement. And the movement I saw wasn’t Panda-like. For example, a 10-20% increase or decrease for some sites without any spikes or huge drops doesn’t set off any Panda alarms at G-Squared Interactive. With typical Panda updates, there are always some big swings with either recoveries or fresh hits from new companies reaching out to me. I didn’t initially see movement like that.

But that weekend (5/1 through 5/3), the movement seemed to increase. And on Monday, after having a few days of data to sift through, I saw the first real signs of the update. For example, check out the screenshot below of a huge hit:

Phantom 2 Google Update Fresh Hit

And as more chatter hit the Twitterverse, more emails from companies started hitting my inbox. Some websites had experienced significant changes in Google organic traffic starting on 4/29 (or even earlier). Here's an example of a huge surge starting the week of 4/27:

Phantom 2 Google Update Surge

I dug into my Panda data, and now that I had almost a full week of Google organic traffic to analyze, I saw a lot of moderate movement across the sites I have access to. Many swung 10-20% either up or down starting around 4/29. As of today, I have an entire sheet of domains that were impacted by Phantom 2. So yes, there was an update. But was it Panda? How about Penguin? Or was this some other type of ranking adjustment that Google implemented? It was hard to tell, so I decided to dig into websites impacted by Phantom 2 to learn more.

Google Denies Panda and Penguin:
With significant swings in traffic, many webmasters automatically think about Panda and Penguin. And that’s for good reason. There aren’t many updates that can rock a website like those two characters. Google came out and explained that it definitely wasn’t Panda or Penguin, and that they push changes all the time (and that this was “normal”).

OK, I get that Google pushes ~500 updates a year, but most do not cause significant impact. Actually, many of those updates get pushed and nobody even picks them up (at all). Whatever happened starting on 4/29 was bigger than a normal “change”.

A Note About Mobile-Friendly:
So, was this part of the mobile algorithm update from 4/21? No, it doesn’t look that way. Many of the sites impacted are mobile-friendly and the impact was to both desktop and mobile rankings. I don’t believe this had anything to do with the mobile-friendly update. You can read more about some of the mobile rankings changes I’ve seen due to that update in a recent post of mine.

Phantom Was Not Part Of Google's Mobile-Friendly Update

Understanding The Signature of Phantom 2:
If you know me at all, then you know I tend to dig into algorithm updates. If there's enough data to warrant heavy analysis, then I'm in. So I collected many domains impacted by the 4/29 update and started to analyze the decrease or increase in Google organic traffic. I analyzed lost keywords, landing pages from organic search, link profiles, link acquisition or loss over the past several months, etc. My hope was that I would surface findings that could help those impacted. Below, I have documented what I found.

Phantom 2 Findings – Content Quality Problems *Galore*
It didn’t take long to see a trend. Just like with Phantom 1 in 2013, the latest update seemed to focus on content quality problems. I found many examples of serious quality problems across sites heavily impacted by Phantom 2. Checking the lost queries and the destination landing pages that dropped out revealed problems that were extremely Panda-like.

Note, I tend to heavily check pages that used to receive a lot of traffic from Google organic. That’s because Google has a ton of engagement data for those urls and it’s smart to analyze pages that Google was driving a lot of traffic to. You can read my post about running a Panda report to learn more about that.

Did Panda Miss These Sites?
If there were serious content quality problems, then you might be wondering why Panda hadn't picked up on these sites in the past. Great question. Well, Panda did notice these sites: many of the sites impacted by Phantom 2 have battled Panda before. Again, I saw a number of sites I'm tracking swing 10-20% either up or down (based on the large amount of Panda data I have access to). And the big hits or surges during Phantom 2 also revealed previous Panda problems.

Below, I’ll take you through some of the issues I encountered while analyzing the latest update. I can’t take you through all of the problems I found, or this post would be massive. But, I will cover some of the most important content quality problems I came across. I think you’ll get the picture pretty quickly. Oh, and I’ll touch on links as well later in the post. I wanted to see if new(er) link problems or gains could be causing the ranking changes I was witnessing.

Content Quality Problems and Phantom 2

Tag Pages Ranking – Horrible Bamboo
One of the biggest hits I saw revealed many tag pages that were ranking well for competitive keywords prior to the update. The pages were horrible. Like many tag pages, they simply provided a large list of links to other content on the site. And when there were many links on the page, infinite scroll was used to automatically supply more and more links. This literally made me dizzy as I scrolled down the page.

And to make matters worse, there were many related tags on the page. So you essentially had the perfect spider trap. Send bots from one horrible page to another, then to another, and another. I’m shocked these pages were ranking well to begin with. User happiness had to be rock-bottom with these pages (and they were receiving a boatload of traffic too). And if Phantom is like Panda, then poor user engagement is killer (in a bad way).

So how bad of a problem was this on the site I was analyzing? Bad, really bad. I found over twelve million tag pages on the site that were indexed by Google. Yes, twelve million.

Phantom and Content Quality - Tag Pages

Also, the site was triggering popups as I hit new landing pages from organic search. So if the horrible tag pages weren’t bad enough, now you had horrible popups in your face. I guess Phantoms don’t like that. I know I don’t. :)

Thin, Click-Bait Articles, Low Quality Supplementary Content
Another major hit I analyzed revealed serious content quality problems. Many of the top landing pages from organic search that dropped revealed horrible click-bait articles. The pages were thin, the articles were only a few paragraphs, and the primary content was surrounded by a ton of low quality supplementary content.

If you’ve read some of my previous Panda posts, then you know Google understands and measures the level of supplementary content on the page. You don’t want a lot of low quality supplementary content that can detract from the user experience. Well on this site, the supplementary content was enough to have me running and screaming from the site. Seriously, it was horrible.

I checked many pages that had dropped out of the search results and there weren’t many I would ever want to visit. Thin content, stacked videos (which I’ve mentioned before in Panda posts), poor quality supplementary content, etc.

Low quality pages with many stacked videos can have a strong negative impact on user experience:

Phantom and Content Quality - Stacked Videos

I also saw that this site had a potential syndication issue. It often referenced third-party sites from its own pages, and when checking those third-party pages, you could see that some of the content had been pulled from them. I covered syndication after Panda 4.0 rolled out and this situation fit perfectly into some of the scenarios I explained.

Phantom and Syndication


Navigational Queries, Poor Design, and Low Quality User Generated Content
Another big hit I analyzed revealed even more content quality problems, plus the first signs of impact based on Google SERP changes. First, the site design was straight out of 1998. It was really tough to get through the content. The font was small, there was a ton of content on each page, there were many links on each page, etc. I'm sure all of this was negatively impacting the user experience.

When checking lost rankings, it was clear to see that many queries were navigational. For example, users entering domain names or company names in Google. This site used to rank well for those, but checking the SERPs revealed truncated results. For example, there were only five listings now for some of those queries. There were times that the site in question dropped to page two, but there were times it dropped much more. And for some queries, there were only three pages listed in the SERPs.

An example of just five listings for a navigational query:

Phantom and Truncated SERPs
So when you combine giant sitelinks, truncated SERPs, limited SERP listings, and then some type of major ranking adjustment, you can see why a site like this would get hammered.

There were also user-generated content problems on the site. Each page had various levels of user comments, but they were either worthless or just old. I found comments from years ago that had nothing to do with the current situation. And then you had comments that simply provided no value at all (from the beginning). John Mueller explained that comments help make up the content on the page, so you definitely don't want a boatload of low quality comments. You can check 8:37 in the video to learn more. So when you add low quality comments to low quality content you get… a Phantom hit, apparently. :)

Content Farms, Thin Content, Popups, and Knowledge Graph
Another interesting example of a domain heavily impacted by the 4/29 update involved a traditional content farm. If you’re familiar with the model, then you already know the problems I’m about to explain. The pages are relatively thin, don’t heavily cover the content at hand, and have ads all over the place.

In addition, the user experience gets interrupted by horrible popups, there’s low quality supplementary content, ads that blend with the results, and low quality user-generated content. Yes, all of this together on one site.

Also, when checking the drop in rankings across keywords, I often came across queries that yielded knowledge graph answers. It’s an interesting side note. The site has over 100K pages with content targeting “what is” queries. And many of those queries now yield KG answers. When you combine a ranking shift with a knowledge graph result taking up a large portion of the SERP, you’ve got a big problem for sure. Just ask lyrics websites how that works.

Phantom and Knowledge Graph Answers


Driving Users To Heavy Ad Pages, Spider Trap
One thing I saw several times while analyzing sites negatively impacted by the 4/29 update related to ad-heavy pages. For example, the landing page that used to rank well had prominent links to pages that simply provided a boatload of text ads (they contained sponsored ads galore). And often, those pages linked to more ad-heavy pages (like a spider trap). Those pages are low quality and negatively impact the user experience. That’s a dangerous recipe for sure.

Directories – The Same Old Problems
I reviewed some directory sites that were impacted by the 4/29 update and saw some of the classic problems that directories face. For example, disorganized content, thin content, and low quality supplementary content. I also saw deceptive ads that blended way too much with the content, which could cause users to mistakenly click those ads and be driven off the site. And then there were pages indexed that should never be indexed (search results-like pages). Many of them…

An example of ads blending with content (deceiving users):

Phantom and Ad Deception

It's also worth noting the truncated SERP situation I mentioned earlier. For example, there were SERPs with only five or seven listings for navigational queries, and again some SERPs with only three pages of listings.

I could keep going here, but I'll stop due to the length of the post. I hope you can see the enormous content quality problems riddling sites impacted by Phantom 2. But to be thorough, I wanted to check links as well. I cover that next.

The Impact of Links – Possible, But Inconclusive
Now what about links? We know that many sites impacted had serious content quality problems, but did links factor into the update? It’s extremely hard to say if that was the case… I dug into the link profiles for a number of the sites both positively and negatively impacted and came out with mixed findings.

First, a number of the sites I analyzed have huge link profiles. I’m not talking about a few thousand links. I’m talking about millions and tens of millions of links per domain. That makes it much harder to nail down a link problem that could have contributed to the recent impact. There were definitely red flags for some domains, but not across every site I analyzed.

For example, some sites I analyzed definitely had a surge of inbound links since January of 2015, and you could see a certain percentage seemed unnatural. Those included strange inbound links from low quality sites, partner links (followed), and company-owned domains (also followed links). But again, the profiles were so large that it’s hard to say if those new(er) links caused enough of a problem to cause a huge drop in rankings during Phantom 2.

On the flip side, I saw some sites that were positively impacted gain many powerful inbound links over the past six to twelve months. Those included links from large publishers, larger brands, and other powerful domains. But again, there’s a lot of noise in each link profile. It’s very hard to say how much those links impacted the situation for this specific update.

Example of domains impacted by Phantom 2 that had relatively stable link profiles over the past year:

Phantom and Links

Phantom and Links - Stable Profile

And to make matters even more complex, there were some sites that gained during the 4/29 update that had lower quality link profiles overall. So if links were a driving force here, then the sites with lower quality profiles should not have gained like they did.

My money is on content quality, not links. But hey, anything is possible. :)

Next Steps for Phantom 2 Victims:
If you have been impacted by the 4/29 update, here is what I recommend doing:

  • I would take a hard look at content quality problems riddling your website. Just like Phantom 1 in 5/2013, I would audit your site through a content quality lens. Once you thoroughly analyze your content, then you should form a remediation plan for tackling those problems as quickly as possible.
  • Understand the queries that dropped, the landing pages from Google organic that used to receive a lot of traffic, find engagement problems on the site, and address those problems. Try to improve content quality across the site and then hope you can recover like previous Phantom victims did.
  • From a links standpoint, truly understand the links you’ve built over the past six to twelve months. Were they manually built, naturally received, etc? Even though my money is on content quality, I still think it’s smart to tackle any link problems you can surface. That includes removing or nofollowing unnatural links, and disavowing what you can’t get to.

 

Summary – The Phantom Lives
It was fascinating to analyze Phantom 2 starting on 4/29 and to see the similarities with the original Phantom from 5/8/13. After digging into a number of sites impacted by the latest update, it was clear to see major content quality problems across the domains. I don’t know if Phantom is cleaning up where Panda missed out, or if it’s something completely separate, but there’s a lot of crossover for sure.

And remember, Penguin 2.0 rolled out just a few weeks after Phantom 1. It’s going to be very interesting to see if the next Penguin update follows that model. I guess we’ll find out soon enough. :)

GG

 

From SEO Tools To Emulation To Devices, How To Check Smartphone Rankings As Google’s Mobile-Friendly Algorithm Rolls Out

How To Check Mobile Rankings

We are now ten days into the mobile-friendly algorithm rollout, and to be honest, the impact has been somewhat underwhelming. I've been tracking many websites across categories and countries as the algorithm rolled out and it has been interesting to see how some verticals were impacted, while others experienced no change. I didn't personally see any fluctuations until last Thursday, but then started to pick up more examples as time went on. I've documented several examples in my last post in case you want to check them out, and I have been updating that post as I come across new examples.

As I mentioned above, there are still some categories that remain completely unaffected. For example, there are websites that aren’t mobile-friendly still ranking extremely well with no impact at all. Google’s Gary Illyes explained this morning that the rollout is complete, but that some urls have not been reindexed yet. That means those urls don’t have the new scores yet (so rankings could change when that’s completed). Although the impact seems light right now, it would be smart to give it a little more time before coming to any conclusions. I definitely plan to write a post with my analysis once enough time goes by, so stay tuned.

Checking Smartphone Rankings
Now that the mobile-friendly algorithm has rolled out, I have received a lot of questions from business owners about how to best check their mobile performance over time. For example, how to identify mobile rankings fluctuations, how to view trending for mobile search traffic, which tools can help track those changes, etc. I’ve decided to focus on smartphone rankings in this post.

Below, I have provided five ways you can identify and track changes to your mobile rankings over time. And to clarify, I am referring to smartphone rankings, and not tablet. Tablet rankings are not being impacted by the mobile friendly algorithm, which I know has confused some people. By using the steps below, you should be able to gauge the impact of the mobile-friendly algorithm on your website(s). Let’s jump in.

1. Search Analytics Reporting in Google Webmaster Tools
I have been testing the new Search Analytics reporting (now in beta) in Google Webmaster Tools since early March. It used to be called the “Search Impact” report, but that changed during the alpha. There is some outstanding functionality in the new search analytics reporting and I expect Google to roll it out soon to everyone. One reason I think they should roll it out is based on how you can track mobile versus desktop rankings. Using the Devices dimension, you can compare rankings across both desktop and mobile, which quickly enables you to identify a mobile rankings demotion.

Mobile Rankings in Search Analytics Reporting (beta)

I already wrote a post explaining how to do this, and I highly recommend you check out that post for more information. If you have access to the new reporting, then follow my tutorial and compare your rankings. If you don't yet, then hang in there. Again, I expect it to roll out to everyone sooner than later. In a nutshell, you can view desktop and mobile rankings side by side. You can also compare timeframes for mobile rankings, and then compare mobile impressions and clicks to previous timeframes. Below, I've compared clicks after the mobile-friendly algorithm rolled out to the prior timeframe.

Comparing Mobile Traffic in Search Analytics Reporting (beta)


2. SEMrush Mobile Reporting (New!)
On 4/21 I fired up SEMrush to check the desktop rankings for a company I was analyzing when I noticed something very interesting. There was now a desktop/mobile toggle on the overview page. Clicking "mobile" brought up some very interesting mobile reports! It turns out SEMrush launched their new mobile reporting right on 4/21. Awesome.

SEMrush Mobile Reporting - Overview

On the overview page, you can view a graph showing the number of pages in the top 20 results from that domain that are mobile-friendly versus non mobile-friendly. It’s a great way to get a lay of the land. You can also view a search performance trend for mobile keywords, the top keywords from a mobile standpoint, and the position distribution for those keywords.

SEMrush Mobile Reporting - Trending

Then you can access the “Positions” report to view all keyword data for mobile, including rank. You can click the toggle up top to switch from mobile to desktop. And you can export the results to Excel where you can use vlookup to compare desktop and mobile rankings for each keyword. If you notice a significant discrepancy between the two, then you could be negatively (or positively) impacted by the mobile-friendly algorithm.

SEMrush Mobile Reporting - Positions
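If spreadsheets aren't your thing, you can run the same desktop-versus-mobile comparison with a few lines of Python instead of VLOOKUP. This is just a rough sketch under a few assumptions: it assumes you exported two CSV files (one per device) and that the columns are named "Keyword" and "Position", so adjust those names to match your actual SEMrush export.

```python
# Hedged sketch: merge desktop and mobile keyword exports and flag large gaps.
# File names and column names are assumptions; adjust to your own exports.
import pandas as pd

desktop = pd.read_csv("semrush_desktop.csv")   # hypothetical export file
mobile = pd.read_csv("semrush_mobile.csv")     # hypothetical export file

merged = desktop.merge(mobile, on="Keyword", suffixes=("_desktop", "_mobile"))
merged["gap"] = merged["Position_mobile"] - merged["Position_desktop"]

# Keywords where the mobile ranking is five or more spots worse than desktop
flagged = merged[merged["gap"] >= 5].sort_values("gap", ascending=False)
print(flagged[["Keyword", "Position_desktop", "Position_mobile", "gap"]])
```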


3. Searchmetrics Mobile Reporting
Searchmetrics also launched a mobile reporting beta. On the overview page for a domain, you can quickly view the search visibility across desktop and mobile.

Searchmetrics Mobile Reporting - Overview

And clicking the “mobile” tab brings up a report showing both the desktop and mobile rankings for the keyword at hand. This clearly makes it easy to identify a mobile rankings demotion. You will see icons for desktop versus mobile for each keyword, along with the rank for each.

Searchmetrics Mobile Reporting - Compare Rankings


4. Manually Via Mobile Devices
Yes, you can still check rankings manually via your mobile phone. For example, fire up Chrome on your mobile phone, go incognito, and test searches. Just keep in mind that your results can be impacted by your location. But you can easily turn off location services to see the difference when it’s on versus off. It’s not perfect, but can supplement other methods for checking mobile rankings.

Checking Mobile Rankings Via Mobile Device
Also, if you are checking sites targeting other countries, make sure to use the Google property for that country. For example, Google UK, Canada, Australia, etc. If not, you can obviously see different results. Again, not perfect, but can work. And definitely try and get your hands on multiple mobile devices. I have several I use to test sites during audits, including both Android and iOS devices.

5. Use Chrome Developer Tools To Emulate Mobile Devices
Many people still don’t realize that Chrome can do this… and it’s awesome. Right from Chrome developer tools, you can emulate any mobile device you want. This enables you to quickly check if a site is mobile-friendly or not. And as you have probably guessed by now, you can check Google rankings too.

Access Chrome Developer Tools by clicking the menu icon in Chrome, then Tools, and then Developer Tools. Or just press Ctrl+Shift+I to bring up dev tools. Then click the icon for "Toggle Device Mode" (the mobile phone icon).
Checking Mobile Rankings Using Chrome Developer Tools

Once you do, you can choose the device you want to test and then refresh the page. Boom, you're now emulating that device. You can see I'm emulating an iPhone 6 in the screenshot below.

Emulating An iPhone 6 in Chrome Developer Tools

Also, when you hover your mouse over the screen, the cursor changes to a circle to signify tapping and swiping (like a person would do when using the device). Now access Google and search away. You will see the smartphone search results and you can check the rankings of target queries right from Chrome.
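One related check I sometimes run alongside emulation (especially for sites using dynamic serving) is to request a page with a smartphone user-agent and compare it to the desktop response. This is only a rough sketch; the example URL and the iPhone user-agent string below are placeholders, so swap in your own.

```python
# Hedged sketch: compare the HTML served to a desktop request versus a
# smartphone user-agent. The URL and UA string are illustrative placeholders.
import requests

url = "http://www.example.com/"  # replace with a page from your own site
mobile_ua = ("Mozilla/5.0 (iPhone; CPU iPhone OS 8_0 like Mac OS X) "
             "AppleWebKit/600.1.4 (KHTML, like Gecko) Version/8.0 "
             "Mobile/12A365 Safari/600.1.4")

desktop_html = requests.get(url).text
mobile_html = requests.get(url, headers={"User-Agent": mobile_ua}).text

print("Desktop HTML length:", len(desktop_html))
print("Mobile HTML length:", len(mobile_html))
print("Same markup served to both?", desktop_html == mobile_html)
```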


Summary – Check Mobile Rankings To Help Gauge *Your* Impact
Now that the mobile-friendly algorithm has rolled out, it’s important to check your mobile rankings for queries leading to your site. Using the methods listed above, you can quickly identify mobile rankings changes across keywords. And if you do find ranking differences, dig into the situation to find out why. Ensure all of your pages are mobile-friendly, implement any necessary fixes, and regain lost rankings.

Again, I plan to write a post detailing the impact of the mobile-friendly algorithm (once a little more time goes by). So stay tuned. :)

GG

 

Now We Have Liftoff: The First Examples of Google’s Mobile Friendly Algorithm In Action

Mobile-Friendly Algorithm Liftoff

4/21/15 was a date marked down in every SEO's calendar with a giant red star next to it. That's the day Google planned to roll out its mobile-friendly algorithm. Google preannounced the update, which is rare, and explained it would have a significant impact on the smartphone search results.

But then 4/21 arrived and nothing changed. And I mean nothing… I’ve been checking a wide range of sites across categories and countries, and nothing changed at all.

That’s until noon today. :)

Then I started to notice differences in the desktop rankings versus smartphone rankings across a number of sites I’m tracking. Google did say it could take a week to roll out, so I’m sure this is just the beginning.

I’ve provided screenshots below of the initial examples. I’ll provide the domain, query, desktop ranking, smartphone ranking, and screenshots for both mobile and desktop. Note, I’ll try and update this post with more examples as time goes on.

Here we go:

Example 1:
Domain: potterybarnkids.com
Notes:
Most of potterybarnkids.com is mobile-friendly, but there are still pages that aren’t. I ran across this example today while researching the niche. The page ranks #5 in desktop, but slips to #11 in the smartphone results.
Query: girls bedroom ideas
Desktop Ranking: 5
Smartphone Ranking: 11

Desktop: #5
Mobile Friendly Algorithm Pottery Barn Kids

Smartphone: #11

Mobile Friendly Algorithm Pottery Barn Kids Smartphone

Example 2:
Domain: Bargainist.com
Query: Eddie Bauer Sale
Desktop Ranking: 7
Smartphone Ranking: 16

Desktop: #7
Bargainist Mobile Friendly Algorithm

Smartphone: #16
Bargainist Smartphone Rankings Mobile Friendly Algo

Example 3:
Domain: Moz.com

Query: redirect
Desktop Ranking: 3
Smartphone Ranking: 13

Desktop: #3
Moz Desktop Ranking Mobile Friendly Algo

Smartphone: #13
Moz Smartphone Ranking Mobile Friendly Algo


Example 4:
Domain: atu2.com

Query: with or without you
Desktop Ranking: 11
Smartphone Ranking: 23

Desktop: #11
atu2 Desktop Ranking Mobile Friendly Algo

Smartphone: #23
atU2 Smartphone Rankings Mobile Friendly Algo

Example 5:
{Update 4/26:} I picked up another example of the mobile friendly algorithm in action. It’s an interesting one, as one url moved from #6 to #10, another jumped from page 2 to page 1, while another url slid to page 3 from page 2. Needless to say, moving from page 1 to 2 (or beyond), or vice versa, can have a big impact on visibility and traffic.

For the query “all of me lyrics”, MusixMatch.com (mobile-friendly) moved from #11 in the desktop SERPs to #8 in the smartphone rankings. Songlyrics.com (not mobile-friendly) moved from #12 in the desktop SERPs to #22 in the smartphone rankings (page 3), and lyricsmode.com (not mobile-friendly) moved from #6 in the desktop SERPs to #10 in the smartphone rankings.

Domains: MusixMatch.com, Songlyrics.com, Lyricsmode.com, Songtexte.com
Query: all of me lyrics
MusixMatch.com Desktop Ranking: 11
MusixMatch.com Smartphone Ranking: 8

Songlyrics.com Desktop Ranking: 12
Songlyrics.com Smartphone Ranking: 22 (page 3)

Lyricsmode.com Desktop Ranking: 6
Lyricsmode.com Smartphone Ranking: 10

Desktop SERPs (Page 1):

Mobile Friendly Algorithm All Of Me Lyrics

Desktop SERPs (Page 2):
Mobile Friendly Algo MusixMatch Songlyrics

Smartphone Rankings (Page 1):
Mobile Friendly Algorithm All Of Me Smartphone SERPs

Smartphone Rankings (Page 3): Songlyrics.com moved to #22 from #12 in desktop SERPs:
Mobile Friendly Algorithm Songlyrics.com Mobile SERPs

Example 6:
{Update: 4/27} – I’ve added another example based on my research. The screenshots below show the results for boxofficemojo.com. The site ranks #8 in the desktop SERPs for the query “kingsman”, but slides to #19 in the smartphone rankings.

Domain: boxofficemojo.com
Query: kingsman
Desktop Ranking: 8
Smartphone Ranking: 19

Desktop: #8
Mobile Friendly Algorithm Box Office Mojo

Smartphone: #19
Mobile Friendly Algorithm Box Office Mojo Smartphone SERPs


Example 7:
{Update: 4/28} – I’ve added another example based on my research. It’s an interesting one, since the domain (tomsguide.com) seems to be dropping significantly in the smartphone search results, while ranking on page one in the desktop SERPs.

The screenshots below show tomsguide.com ranking #9 in the desktop SERPs for the query "amazon black friday ad", but sliding to #45 in the smartphone rankings (page 5). Yes, that's a drop of 36 spots. And the page on tomsguide.com is actually mobile-friendly. Very strange, but definitely worth noting. I'm not sure if this is a mistake on Google's part or if something else is going on.

Domain: tomsguide.com
Query: amazon black friday ad
Desktop Ranking: 9
Smartphone Ranking: 45 (page 5)

Desktop: #9
Mobile Friendly Algorithm Tom's Guide

Smartphone: #45 (page 5)

Mobile Friendly Algorithm Tom's Guide

It was interesting to see the changes per query. For each of the sites, some rankings remained untouched, while others dropped. And then the rankings drop was sometimes only a few spots, while others were more extreme. This is similar to the smartphone rankings demotion case study I wrote about last year. Anyway, there's a lot more data to go through before coming to any conclusions.

And these aren’t the only examples I’m seeing. Again, I’ll try and update this post with more examples soon.

Now back to testing. :)

GG

 

Sinister 404s – The Hidden SEO Danger of Returning The Wrong Header Response Code [Case Study]

Hidden SEO Danger 404 Response Code

A few weeks ago, I was contacted by a small business owner about my SEO services. And what started out as a simple check of a website turned into an interesting case study about hidden SEO dangers. The company has been in business for a long time (30+ years), and the owner was looking to boost the site’s SEO performance over the long-term. From the email and voicemail I received, it sounded like they were struggling to rank well across important target queries and wanted to address that ASAP. I also knew they were running AdWords to provide air cover for SEO (which is smart, but definitely not a long-term plan for their business).

Unfortunately, my schedule has been crazy and I knew I couldn’t take them on as a longer-term client. But, I still wanted to quickly check out their website to get a better feel for what was going on. And it took me about three minutes to notice a massive problem (one that is killing their efforts to rank for many queries). And that’s a shame because they probably should rank for those keywords based on their history, services, content, etc.

Surfacing a Giant SEO Problem
As I browsed the site, I noticed they had a good amount of content for a small business. The site had a professional design, it was relatively clean from a layout perspective, and provided strong content about their business, their history, news about the organization, the services they provided, and more.

But then it hit me. Actually, it was staring me right in the face. I noticed a small 404 icon when hitting one of their service pages (via the Redirect Path Chrome extension). OK, so that's odd… The page renders fine, the content and design show up perfectly, but the page 404s (returning a Page Not Found error). It's like the opposite of a soft 404. That's where the page looks like a 404, but actually returns a 200 code. Well in this situation, the page looks like a 200, but returns a 404 instead. I guess you can call it a "soft 200".

404 Header Response Code in Redirect Path Chrome Extension

So I started to visit other pages on the site and more 404 header response codes followed. Actually, almost every single page on the site was throwing a 404 header response code. Holy cow, the initial 404 was just the tip of the iceberg.

After seeing 404s pop up all over the site, I quickly decided to crawl the website via Screaming Frog. I wanted to see how widespread of a problem it was. And it turns out that my initial assessment was spot on. Almost every page on the site returned a 404 header response code. The only pages that didn't were the homepage and some pdfs. But every other page, including the services pages, news pages, about page, contact page, etc. returned a 404.

Header Response Codes in Screaming Frog

If you're familiar with SEO, then you know how this problem can impact a website. But for those of you unfamiliar with 404s and how they impact SEO, I'll provide a quick rundown next. Then I'll jump back to the story.

What is a 404 Header Response Code?
Every time a webpage is requested, the server will return a header response code. There are many that can be returned, but there are some standard codes you'll come across. For example, 200 means the page returned OK, 301 means permanent redirect, 302 is a temporary redirect, 500 is an internal server error, 403 is forbidden, and 404 means page not found.

Header response codes are extremely important to understand for SEO. If you want a webpage indexed, then you definitely want it to return a 200 response code (which again, means OK, the request has succeeded). But if the page returns a 404, then that tells the engines that the page was not found and that it should be removed from the index. Yes, read that last line again. 404s basically inform Google and Bing that the page is gone and that it can be removed from each respective index. That means it will have no shot of ranking for target keywords.

And from an inbound links perspective, 404s are a killer. If a page 404s, then it cannot benefit from any inbound links pointing at the url. And the domain itself cannot benefit either (at an aggregate level). So 404s will get urls removed from Google’s index and can hamper your link equity (at the url level and at the domain level). Not good, to say the least.
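To make this concrete, here's a minimal sketch (using Python's requests library, not any tool mentioned in this post) that shows the header response code is completely separate from the HTML you see in the browser. The URL is a placeholder.

```python
# Minimal sketch: the status code lives in the response headers, not the HTML.
import requests

url = "http://www.example.com/services/"  # hypothetical URL; use your own page
response = requests.get(url)

print("Status code:", response.status_code)      # 200 = OK, 404 = Not Found
print("Bytes of HTML returned:", len(response.text))

# The situation described in this post: the page renders plenty of HTML,
# but the server returns a 404, so the engines treat it as Page Not Found.
if response.status_code == 404 and len(response.text) > 0:
    print("Content is rendering on a URL that returns a 404 (a 'soft 200').")
```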

Side Note: Checking Response Codes
Based on what I’ve explained, some of you reading this post might be wondering how to easily check your header response codes. And you definitely should. I won’t cover the process in detail in this post, but I will point you in the right direction. There are several tools to choose from and I’ll include a few below.

You could use Fetch as Google in Google Webmaster Tools to check the response sent to Googlebot (which includes the header response code). You can also use a browser plugin like Web Developer Tools or Redirect Path to quickly check header response codes on a url by url basis.

Web Developer Plugin Header Response Code

Fetch as Google and browser plugins are great, but they only let you process one url at a time. But what if you wanted to check your entire site in one shot? For situations like that, you could use a tool that crawls an entire website (or sections of a site). For example, you could use Xenu or Screaming Frog for small to medium sized sites and then a tool like Deep Crawl for larger-scale sites. All three will return a boatload of information about your pages, including the header response codes.
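And if you want a quick scripted pass before firing up a full crawler, something like the rough sketch below can check a list of urls in one shot. It assumes a plain text file with one url per line (the file name is a placeholder), and it's nowhere near as thorough as the crawlers mentioned above.

```python
# Hedged sketch: report the header response code for each URL in a text file.
import requests

def check_status_codes(url_file):
    with open(url_file) as f:
        urls = [line.strip() for line in f if line.strip()]
    for url in urls:
        try:
            # HEAD keeps the check lightweight; some servers mishandle HEAD,
            # so switch to GET if the codes look off.
            resp = requests.head(url, allow_redirects=False, timeout=10)
            print(resp.status_code, url)
        except requests.RequestException as exc:
            print("ERR", url, exc)

check_status_codes("urls.txt")  # hypothetical file name
```

Now back to the case study.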

Dangerous, But Invisible to the Naked Eye
Remember, the entire site was returning 404 header response codes, other than the homepage and a few pdfs. But this 404 situation was sinister since the webpages looked like they resolved OK. You didn't see a standard 404 page; instead, you saw the actual page and content. Yet the pages were actually 404ing and not being indexed. Like I said, it was a sinister problem.

Based on what I just explained, you could tell why an SMB owner would be baffled and simply not understand why their website wasn’t ranking well. They could see their site, their content, the various pages resolving, but they couldn’t see the underlying problem. Header response codes are hidden to the naked eye, and most people don’t even realize they are being returned at all. But the response code returned is critically important for how the search engines process your webpages.

Swingers Find Hidden 404s

My Response – “You’re At SEO Defcon 2”
This was a tough situation for me. I wanted to help the business longer-term, but couldn't based on my schedule. But I absolutely wanted to make sure they understood the problem I came across while quickly checking out their website.

So I crafted a quick email explaining that I couldn’t help them at this time, but that I found a big problem on their site. As quickly and concisely as I could, I explained the 404 situation, provided a few screenshots, and explained they should get in touch with their designer, developer, or hosting provider to rectify the situation ASAP. That means ensuring their webpages return the proper header response codes. Basically, I told them that if their webpages should be indexed, then they should return a 200 header response code and not the 404s being returned now.

I hit “Send” and the ball was in their court.

Their Response – “We hear you and we’re on the right track – we think.”
I heard back from the business owner who explained they started working with someone to rectify the problem. They clearly didn’t know this was going on and they were hoping to have the situation fixed soon.

But as of today, the problem is still there. The site still returns 404 header response codes on almost every page. That’s unfortunate, since again, the pages returning a 404 have no chance at all of ranking in search and cannot help them from a link equity standpoint. The pages aren’t indexed and the site is basically telling Google and Bing to not index any of the core pages on the site.

I’m going to keep an eye on the situation to see when the changes take hold. And I hope that’s soon. It’s a great example of how hidden technical dangers can destroy SEO.

Opening Up The Site – How Will The Engines Respond?
My hope is that when the pages return the proper response codes that Google and Bing will begin indexing the pages and ranking them appropriately. And that will help on several levels. The website can drive more prospective customers via organic search, while the business can probably pull back on AdWords spend. And the site can grow its power from an inbound link standpoint as well, now that the pages are being indexed properly.

But as I often say about SEO, it's all about the execution. If they don't implement the necessary changes, then their situation will remain as-is. I'll try and update this post if the situation improves.

Summary – Know Your Header Response Codes
Although hidden to the naked eye, header response codes are critically important for SEO. The right codes will enable the engines to properly crawl and index your webpages, while the wrong codes could lead to SEO disaster. I recommend checking your site today (via both manual checks and a crawl). You might find you’re in the clear with 200s, but you also might find some sinister 404s. So check now.

GG

 

How To Identify A Mobile Rankings Demotion Using The New Search Analytics Report in Google Webmaster Tools

Search Impact Reporting in Google Webmaster Tools

{Update: The Search Impact report was renamed to "Search Analytics" during the beta. The screenshots below still show "Search Impact" even though the new report in Google Webmaster Tools is now labeled "Search Analytics".}

April 21, 2015 is an important date. That’s the day when Google will begin using mobile friendliness as a ranking signal. There’s been a lot of talk about how that’s actually going to work, how much of an impact it will have, etc. Well, more and more information has been surfacing over the past few days about the changes.

For example, Gary Illyes spoke at SMX West heavily about the new mobile UX algo and provided some outstanding information. Jennifer Slegg wrote up a recap of that session, which I highly recommend reading. She provided some amazing nuggets of information, including information about mobile friendly techniques, how the algo will handle specific urls, whether 4/21 is a hard date for the rollout, whether Google is building a mobile index (which they are), and more.

So, as 4/21 quickly approaches, many webmasters are working hard to get their sites in order from a mobile UX standpoint. As documented by John Mueller and Gary Illyes (and really Google itself), you can use any of the three options for providing a mobile-friendly version of your website. For example, you can use responsive design, dynamic delivery, or even a separate mobile site. I've seen all three techniques work well for clients, so the path you choose should be based on your own site and business. But definitely move fast… April 21 will roll up quickly.

 

The *Current* Smartphone Rankings Demotion – A Glimpse Into the Future
Many people don’t realize this, but Google already has a smartphone rankings demotion in place for specific situations. For example, when there are faulty redirects from the desktop version of the content to the mobile version, or if there are other mobile-only errors.

I caught one of those situations in the wild and wrote a two-part case study about it. I first detailed the problems I saw on Electronista.com and then documented the improvements in rankings and traffic once the problems were fixed. Based on what Gary Illyes and John Mueller have both said about the mobile UX algo, it sounds like the new algorithm will work in a very similar fashion to the current smartphone rankings demotion. Therefore, I definitely recommend you review the two-part case study.

Checking For Faulty Mobile Redirects

For example, the current smartphone rankings demotion is on a url by url basis. Just because you have faulty redirects or mobile-only errors does not mean the entire domain should suffer (algorithmically). In addition, the desktop urls are unaffected (which makes absolute sense). Also, and this is important, the algorithm is running in real-time and will impact urls during the normal crawling process.

That means urls can be demoted as Google comes across mobile problems, but the demotion can also be lifted as Google crawls the urls and notices that the problems are fixed. And that’s exactly what I saw with the smartphone rankings demotion situations I have helped with.

 

Checking Mobile Rankings and The (New) Search Analytics Report
Google is currently testing a new search queries report in Google Webmaster Tools (called the Search Analytics report). Note, the report used to be called “Search Impact”, but was changed during the alpha. I have been testing the new version of the Search Analytics reporting and it provides some great functionality beyond what the current Search Queries reporting provides. I plan to write more about that soon, but for now, let’s focus on the mobile friendliness algorithm rolling out on 4/21.

There are six dimensions you can segment your data by in the new Search Analytics reporting. One of those dimensions is “Devices”. Using this report, you can filter data by desktop, mobile, and tablet. See below:

The Devices Dimension in The Search Impact Reporting

But don’t get fooled by the simplicity of the default report. By combining dimensions, you can view some elaborate reports that tell you a lot in a short amount of time.

When working on a smartphone rankings demotion (the current algo in place), I had to identify queries where a site ranked well in the desktop results, and then jump to the search queries reporting using the "mobile" filter for search property. When doing this for a large number of queries, it could easily get monotonous.

But the new Search Analytics report comes to the rescue and provides a nifty way to see side by side rankings when comparing desktop to mobile. Below, I’m going to show you how to quickly run this report to see a side by side comparison of clicks and average position by query. By doing so, you can quickly identify a smartphone rankings demotion. That’s for the current smartphone rankings demotion, and should work for the new mobile UX algo rolling out on 4/21/15. Let’s jump into the report.

 

How To Check Rankings By Device
First, if you’re not part of the alpha testing program, then you won’t be able to access the Search Analytics report. But don’t fear, I can only imagine that Google wants to roll it out prior to 4/21/15 (based on the device reporting I’m showing you in this post).

To access the reporting, click “Search Traffic” and then “Search Analytics” in the left-side menu:

Accessing The Search Impact Reporting in Google Webmaster Tools

The default view will show you clicks for the past 30 days. The first thing you need to do is click the “Queries” dimension. That will present all of the queries your site ranks for during the timeframe you selected.

Using The Queries Dimension In The Search Impact Reporting

Next, click the filter dropdown underneath "Devices", which should say "No filter" (since there isn't a filter in place yet). Click the dropdown and then select "Compare devices".

Filtering By Device In The Search Impact Reporting

Keep “Desktop VS. Mobile” as the selection and then click “Compare”.

Comparing By Device In The Search Impact Reporting

You should now see a comparison of clicks per query for both desktop and mobile. That’s great, but we need to know how the site ranks for each query across both desktop and mobile. To see that, click the checkbox for the “Avg. Position” metric.  This will add average position for each query to the report.

Adding The Average Position Metric In The Search Impact Reporting

To view more queries than the default ten, you can use the dropdown at the top of the report. For example, you can show up to 500 rows in the report in Google Webmaster Tools.

Now you can start checking rankings for queries across both desktop and mobile. Don’t expect them to be exactly the same for every query… But they should be close. For example, the first three listed below are very close (two are identical and one is off by just .1).

Comparing Average Position by Query In The Search Impact Reporting

In my experience, when you have a smartphone rankings demotion, there will be a clear difference. For example, some smartphone rankings will be 10+ positions lower (or even non-existent in certain situations). So, if you see rows like the following, then you might have a problem.

Identifying a Rankings Difference In The Search Impact Reporting
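If you end up with a lot of rows to scan, a few lines of code can surface the suspicious queries for you. This is just a sketch under some assumptions: it assumes you've gotten the comparison table out of the report and into a CSV, and the column names below are made up, so rename them to match your file.

```python
# Hedged sketch: flag queries where the mobile position is 10+ spots worse
# than desktop. The file name and column names are assumptions.
import pandas as pd

df = pd.read_csv("queries_desktop_vs_mobile.csv")
df["position_gap"] = df["avg_position_mobile"] - df["avg_position_desktop"]

# Per the pattern described above, a gap of 10+ positions (or a missing
# mobile ranking) is worth a closer look.
suspects = df[df["position_gap"] >= 10].sort_values("position_gap", ascending=False)
print(suspects[["query", "avg_position_desktop", "avg_position_mobile", "position_gap"]])
```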

 

How To Identify Problems and Lift The Smartphone Rankings Demotion
If you find that there is a smartphone rankings demotion in place, then you should run to the “Mobile Usability” reporting in Google Webmaster Tools. Google will provide the problems it encountered while crawling your site. I highly recommend fixing those mobile usability issues asap.

Mobile Usability Reporting in Google Webmaster Tools

You can also use the mobile friendly test via the Google Developers site. That will also highlight problems on a url by url basis.
https://www.google.com/webmasters/tools/mobile-friendly/

Using Google's Mobile Friendly Test

You can also check the crawl errors reporting in Google Webmaster Tools to see if there are smartphone errors or faulty redirects.

Smartphone Crawl Errors in Google Webmaster Tools

And you can crawl your site as Googlebot for Smartphones to check how your site is handling requests for the desktop pages (if you have mobile redirects in place). Doing so can surface problems sitting below the surface that are sometimes hard to pick up manually.

Crawl As Googlebot for Smartphones
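For a quick spot check of a single url (rather than a full crawl), you can also request a desktop page with a Googlebot-for-smartphones style user-agent and look at the response code and any redirect target. This is only a sketch: the URL is a placeholder and the user-agent string is illustrative, so grab the current string from Google's crawler documentation.

```python
# Hedged sketch: see how a desktop URL responds to a smartphone crawler.
# The URL and user-agent string are illustrative placeholders.
import requests

url = "http://www.example.com/some-desktop-page/"
googlebot_smartphone_ua = (
    "Mozilla/5.0 (iPhone; CPU iPhone OS 8_3 like Mac OS X) "
    "AppleWebKit/600.1.4 (KHTML, like Gecko) Version/8.0 Mobile/12F70 "
    "Safari/600.1.4 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)

resp = requests.get(url, headers={"User-Agent": googlebot_smartphone_ua},
                    allow_redirects=False)
print("Status:", resp.status_code)
# A redirect to the mobile homepage (instead of the equivalent mobile page)
# is the classic "faulty redirect" pattern mentioned earlier in this post.
print("Redirects to:", resp.headers.get("Location", "no redirect"))
```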

 

Summary – The Search Analytics Report Can Make An Impact
We all knew that mobile UX would become a ranking signal at some point, but now we have a specific date from Google for the rollout (4/21/15). When the new mobile algo launches, many will be wondering if they have been impacted, if their website dropped in rankings, and which urls are causing problems. As I demonstrated above, the new Search Analytics reporting can help webmasters identify problems by comparing rankings across desktop and mobile (quickly and efficiently).

If you don’t have access to the Search Analytics reporting yet, don’t worry. Again, I believe Google is going to roll this out before the 4/21 deadline. That would make complete sense, since the “Devices” dimension could prove to be extremely helpful when a smartphone rankings demotion is in place. One thing is for sure. The changes rolling out on (or around) April 21 will be fascinating to analyze. Google said this change will have a “significant impact” on the smartphone search results. And that impact can translate into many lost visitors, conversions, and revenue. Good luck.

GG