Challenging Murphy’s Law – 8 Immediate SEO Checks After A CMS Migration Goes Live

Murphy's Law for CMS Migrations

CMS migrations are a necessary evil for many companies. If your current technical setup is inhibiting your business from doing what it needs to be successful, then a CMS migration could be your way out. But migrations should not be taken lightly, especially for large-scale websites that are changing urls. Any time you change urls, you run the risk of destroying SEO. And if SEO is an important driver of traffic for your site, then a migration could cause serious problems for your business.

Unfortunately, SEOs know Murphy’s Law all too well. It states, “Anything that can go wrong, will go wrong”. Well, large-scale migrations have many moving parts, and the chances of a migration going live without some type of hiccup are slim. And you might find many hiccups, gremlins, or flat-out mistakes once the button is pushed and a migration goes live. But to me, it’s how you react once those gremlins are found that can make or break your migration. Quick checks and fast refinements can end up saving your SEO. Speed and accuracy matter.

The Summer of Migrations
For whatever reason, this is the summer of migrations and redesigns for me. Several of my large-scale clients are either migrating to a new CMS or they are redesigning their websites. Therefore, I’ve been neck deep in the strategy, testing, implementation, and auditing of each move. And based on my work this summer, I decided to write a post explaining what you can do as soon as the migration goes live to ensure all is ok (or to nip SEO problems in the bud).

Specifically, I have provided eight immediate checks you should make as soon as your new site goes live. The eight checks cover what you can do immediately following a push to production and can help catch serious problems before they become bigger ones. For SEOs, large-scale CMS migrations are not for the faint of heart. They are stressful, there’s a lot on the line, and there are many areas to monitor to ensure all goes well.

The Power of a Crawlable and Testable Staging Server
Before I hop into the top eight checks you should make after a migration goes live, I wanted to touch on a very important element of a CMS migration – the staging server.

If you can have your new site up and running on a staging server that’s accessible and crawlable to your SEO consultant or agency, then you can (typically) thoroughly test that setup prior to the migration going live. For clients that have this available, I’m able to crawl the new site in staging, test redirects, analyze technical SEO elements, browse with multiple devices, and more. It’s a much easier transition for a site with an accessible and crawlable staging server than for one without.

If you don’t thoroughly test the new site in staging, including testing the redirects, you’ll have to test like mad as soon as the new site goes live. And believe me, you will find problems. Then you’ll need to fix those problems on the fly. And those fixes could lead to other problems that will need to be fixed quickly as well… and so on. It’s a slippery slope.

Keep this in mind if you are planning a CMS migration. Don’t proceed on faith alone… Test and know. That’s how you can nip serious SEO problems in the bud.

Some Prerequisites
Before we move on, you’ll need a few things in place. First, make sure you have all variations of your site set up in Google Search Console (GSC). That includes www, non-www, and if applicable, https www and https non-www. And if specific subdomains are part of the migration, make sure you have them set up as well.

Second, you should have already collected your top landing pages from the old site. You should export all top landing pages from Google Search Console, Google Analytics, Bing Webmaster Tools, and various link analysis tools. Then you should combine them and dedupe them. That’s your core list of urls to check after the migration goes live. I’ll cover more about this soon.

Third, you’ll need a tool that can crawl your site. The two crawlers I use extensively are DeepCrawl (for large-scale crawls) and Screaming Frog (for smaller crawls or for laser-focused crawls). This is how you will check your redirects in bulk. Note, I’m on the customer advisory board for DeepCrawl. It’s one of my favorite tools for enterprise crawls and I’ve been using it for years.

OK, now that you have the necessary elements in place, it’s time to perform eight immediate checks once the migration goes live. Note, the following list is just the beginning of your testing process. You definitely want to continue analyzing the migration over time to ensure all is ok. What I’m providing below should be checked as the button is pushed and your new site goes live.

1. Robots.txt and robots.txt Tester
Yes, robots.txt is a simple text file, but one that can cause all sorts of problems. Developers will often use a staging-specific robots.txt file which can easily get pushed to production by accident. And if it blocks important files or directories from being crawled, you could kill your SEO traffic.

So check this first after the new site goes live. Make sure it’s the version that should be in production and that the directives are free of errors. And make sure important areas of the site are not being blocked. That includes CSS and JavaScript that are necessary for Google to render the page properly. More about that soon.

And use the robots.txt Tester in Google Search Console. It’s a sandbox that enables you to test urls on your site. If you notice urls being blocked that shouldn’t be blocked, hunt down the directives causing the problems. You can edit the robots.txt file in GSC to test your changes (without impacting your actual robots.txt file).

Checking Robots.txt After CMS Migration
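
If you’re comfortable with a little scripting, here’s a minimal sketch (using Python’s built-in robotparser) for spot-checking a handful of important urls against your live robots.txt. The hostname and paths below are placeholders, so swap in your own. And note that Python’s parser doesn’t match Google’s wildcard handling exactly, which means the robots.txt Tester in GSC remains the final word.

```python
# Minimal sketch: spot-check important urls against the live robots.txt using
# Python's built-in robotparser. Hostname and paths are placeholders.
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"  # assumption: your production hostname
PATHS_TO_TEST = [
    "/",
    "/category/widgets/",
    "/products/blue-widget/",
    "/assets/css/main.css",  # CSS and JavaScript should be crawlable too
    "/assets/js/site.js",
]

parser = RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()

for path in PATHS_TO_TEST:
    allowed = parser.can_fetch("Googlebot", SITE + path)
    print(f"{'ALLOWED' if allowed else 'BLOCKED':8} {path}")
```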

2. Check For Google Analytics and/or Other Analytics Code
Before you and your client’s executive team check Google Analytics after the migration goes live, and have a heart attack, make sure you add the necessary analytics code to your new site. If not, you will see traffic drop off a cliff, when in fact, that’s not really happening. It’s a scary sight for sure.

And more importantly, you will lose visibility into how the migration is going. I continually check various sources of traffic over time after a migration goes live to ensure all pages are being handled properly. If you drop your tracking code, then you’ll be out of luck.

Check Google Analytics Code After CMS Migration
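
To double-check the tracking code in bulk, here’s a rough sketch that fetches a list of urls and flags any that are missing the analytics snippet. The property ID and urls are placeholders, so swap in your own.

```python
# Rough sketch: fetch a list of new urls and flag any that are missing the
# analytics snippet. The property ID and urls are placeholders.
import requests

GA_ID = "UA-XXXXXXXX-X"  # assumption: your Google Analytics property ID

urls = [
    "https://www.example.com/",
    "https://www.example.com/category/widgets/",
]

for url in urls:
    html = requests.get(url, timeout=10).text
    status = "OK" if GA_ID in html else "MISSING GA CODE"
    print(f"{status:16} {url}")
```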

Quick Tip: Providing Analytics Air Cover
It’s not a bad idea to have multiple analytics packages running on a site. For example, I have some clients running both Google Analytics and Adobe Analytics on the same site. When a recent migration went live without GA tagging (by accident), we could check Adobe Analytics to see how the migration was going. It provided air cover as GA tracking was put in place.

3. Canonical URL Tag Errors
Rel canonical is a good example of a single line of code that can wreak havoc on your site SEO-wise. When the migration goes live, quickly check core page types on the site to ensure canonical url tags are set up properly. If they aren’t, you can destroy rankings and subsequent traffic to pages that were performing extremely well before the migration. You can also quickly crawl a snapshot of the site to get a feel for how rel canonical is being handled in bulk.

You can check my post about rel canonical problems to learn about the various issues that can be introduced by accident. For example, all canonicals pointing to a homepage, canonicals pointing to 404s, endless canonical loops, etc. A quick check of the canonical url tag after a crawl can save your SEO.

Checking Rel Canonical After CMS Migration
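
For a quick scripted spot-check (in addition to your crawl), here’s a minimal sketch that prints the canonical url tag for a handful of pages so you can eyeball where each one points. The urls are placeholders, and the comparison is intentionally naive. Treat any flag as a prompt to look closer, not as a verdict.

```python
# Minimal sketch: print the rel canonical target for a handful of pages.
# The urls are placeholders.
import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/",
    "https://www.example.com/category/widgets/",
    "https://www.example.com/products/blue-widget/",
]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("link", rel="canonical")
    canonical = tag.get("href") if tag else "MISSING"
    # naive comparison: trailing slashes or protocol differences can flag
    # perfectly fine pages, so treat this as a prompt to look closer
    flag = "" if canonical == url else "  <-- check this"
    print(f"{url}\n  canonical: {canonical}{flag}")
```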

4. Meta Robots Tag Problems
Along the same lines, the meta robots tag could have a serious impact on your site if the wrong content values are used. For example, you could be noindexing important pages, or on the flip side, you could be opening up pages that shouldn’t be indexed.

Again, manual checks plus a snapshot crawl will give you a view of the meta robots tag across many pages. You can start to pick up patterns and make changes before serious damage can be done.

Checking Meta Robots Tag After CMS Migration
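
And here’s a similar sketch for the meta robots tag. It tallies the content values it finds across a list of urls, which makes a sitewide noindex pattern jump out quickly. Again, the urls are placeholders.

```python
# Quick sketch: tally meta robots values across a list of urls so patterns
# (like an accidental sitewide noindex) jump out. The urls are placeholders.
from collections import Counter

import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/",
    "https://www.example.com/category/widgets/",
    "https://www.example.com/products/blue-widget/",
]

counts = Counter()
for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "robots"})
    value = tag.get("content", "").lower() if tag else "(no meta robots tag)"
    counts[value] += 1
    if "noindex" in value:
        print(f"NOINDEX  {url}")

print(counts.most_common())
```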

5. Mobile-friendly Test (MFT)
Since Google announced the mobile-friendly algorithm, many companies have taken the plunge and moved to a responsive design or mobile urls. So when you make a big change, like moving to a new CMS, you definitely want to check the mobile-friendliness of your new urls.

Unfortunately, you can’t run the mobile-friendly test on a staging server that requires authentication. You can definitely run other tests while in staging to ensure the site is mobile-friendly, which you should do. But remember Murphy’s Law… don’t trust that your staging urls are behaving the same way as your production urls. That’s why you should absolutely run Google’s official mobile-friendly test once the new site is live.

To begin, I recommend testing key urls by category. That would include your homepage, category urls, specific products or services, and other important types of pages on the site.

Running New URLs Through Google's Mobile-Friendly Test

And then you can use a tool like Url Profiler to check mobile-friendliness in bulk. Import a list of urls on the new site and fire away. The resulting spreadsheet will provide details about mobile-friendliness. Then you can hunt down any problems and rectify them quickly.

Checking Mobile-Friendliness In Bulk Via URL Profiler
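
If you want a quick scripted sanity check while the bulk tools are running, here’s a very rough proxy that flags pages missing a viewport meta tag. It is not a substitute for Google’s mobile-friendly test, but it can surface templates that obviously aren’t responsive. The urls are placeholders.

```python
# Very rough proxy (not a substitute for Google's mobile-friendly test):
# flag pages that are missing a viewport meta tag. The urls are placeholders.
import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/",
    "https://www.example.com/category/widgets/",
]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    viewport = soup.find("meta", attrs={"name": "viewport"})
    print(f"{'has viewport' if viewport else 'NO VIEWPORT':14} {url}")
```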

6. Fetch and Render in GSC
To ensure Google can fetch the necessary resources to accurately render the page at hand, you should run important urls through fetch and render in Google Search Console. Similar to the mobile-friendly test, you cannot run fetch and render on a staging server that requires authentication. Therefore, you’ll need to test this out as soon as the site goes live.

Google has explained repeatedly that if you block resources, including CSS and JavaScript, then that will impact how Google indexes your pages. Google wants to render the page just like a user would in a browser. So as I said on Twitter a few weeks ago, “If you want to rock, don’t block.” :)

Using Fetch and Render on New URLs
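
If you want to check the “don’t block” rule programmatically, here’s a sketch that pulls the CSS and JavaScript references from a page and runs each one through your robots.txt. Fetch and render in GSC is still the authoritative check, but this can help you cover more urls quickly. The page url is a placeholder.

```python
# Sketch of the "don't block" check: pull CSS and JavaScript references from
# a page and run each one through robots.txt to confirm Googlebot can fetch
# them. The page url is a placeholder.
from urllib.parse import urljoin
from urllib.robotparser import RobotFileParser

import requests
from bs4 import BeautifulSoup

page = "https://www.example.com/"  # assumption: a key url on the new site

parser = RobotFileParser()
parser.set_url(urljoin(page, "/robots.txt"))
parser.read()

soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
resources = [urljoin(page, s["src"]) for s in soup.find_all("script", src=True)]
resources += [urljoin(page, l["href"]) for l in soup.find_all("link", rel="stylesheet", href=True)]

for resource in resources:
    if not parser.can_fetch("Googlebot", resource):
        print(f"BLOCKED  {resource}")
```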

7. Check XML Sitemaps
XML sitemaps are an important supplement to a traditional web crawl. Using sitemaps, you can feed both Google and Bing all of your canonical urls. After going live with a new CMS, you will have a list of new urls and old urls. In the short-term, you should submit both your old urls and your new ones via xml sitemaps. Continuing to submit your old urls for a short time will enable the engines to quickly find the 301 redirects to the new urls. That can speed up the process of getting the new urls crawled and indexed.

Based on the migration, you will definitely want to check your new xml sitemaps to ensure they are being published accurately. First, you should check the sitemaps reporting in Google Search Console for both warnings and errors. If you see anything out of the ordinary, identify the problems and send to your developers ASAP. You want to nip problems in the bud.

Checking XML Sitemap Warnings in Google Search Console

Second, you should crawl your new sitemaps to ensure they lead to the right urls that resolve with 200 header response codes. I can’t tell you how many times I’ve crawled new sitemaps and found 404s, 500s, and redirects. Note, your old urls should 301, since they are redirecting to your new urls. But your new sitemap should not contain any redirects or non-200 response codes. You want clean sitemaps, not “dirty” ones.

You don’t want your crawl graph to look like this:

Crawl Graph of XML Sitemap URLs

Crawling XML Sitemaps To Check Response Codes
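
For a quick scripted version of that sitemap check, here’s a minimal sketch that parses the sitemap and checks the response code for every url it finds. The sitemap location is a placeholder, and note that it doesn’t handle sitemap index files (you would crawl each child sitemap separately).

```python
# Minimal sketch: crawl an xml sitemap and report any url that does not
# resolve with a 200. The sitemap location is a placeholder, and this does
# not handle sitemap index files.
import requests
import xml.etree.ElementTree as ET

SITEMAP = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    # some servers mishandle HEAD requests; switch to requests.get if needed
    code = requests.head(url, allow_redirects=False, timeout=10).status_code
    if code != 200:
        print(f"{code}  {url}")  # redirects, 404s, and 500s all need attention
```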

And while we’re on the topic of crawling urls, let’s jump to an incredibly important check – crawling your top urls from before the migration!

8. Crawl Top Landing Pages From The Old Site (and Check Redirect Chains)
When you migrate to a new CMS, there’s a good chance you’ll be changing urls. And even if one character changes in your url, then it’s a brand new one to Google. So, in order to maintain search equity during the migration, it’s critically important to organize and then crawl your top landing pages to ensure they resolve accurately. Your top landing pages are urls that ranked well prior to the migration, the ones driving traffic, the ones that have built inbound links, and obviously the ones sustaining your business. Don’t assume your redirects are working well. There are many reasons they might not be.

The first thing you need to do is collect all important landing pages (as explained earlier). You can find these landing pages in Google Analytics, Google Search Console, Bing Webmaster Tools, and from various link analysis tools. Once you download them all, you should combine them, and then dedupe them.

Deduping URLs in Excel
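
If Excel starts to choke on larger lists, here’s a rough sketch of the same combine-and-dedupe step using pandas. The filenames and the column name are assumptions, so match them to your actual exports.

```python
# Rough sketch of the combine-and-dedupe step using pandas instead of Excel.
# The filenames and the "url" column name are assumptions; match them to
# your own GSC, GA, Bing, and link-tool exports.
import pandas as pd

exports = ["gsc_pages.csv", "ga_landing_pages.csv", "bing_pages.csv", "link_targets.csv"]
frames = [pd.read_csv(f, usecols=["url"]) for f in exports]

urls = pd.concat(frames)["url"].str.strip().drop_duplicates()
urls.to_csv("top_landing_pages.csv", index=False, header=False)
print(f"{len(urls)} unique urls to check")
```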

I use both DeepCrawl and Screaming Frog to check the redirects list. Depending on the size of your list, you might have a few hundred urls, or you might have a few hundred thousand urls (or even more). DeepCrawl is extraordinarily good at crawling many urls (over 100K), while Screaming Frog is outstanding for laser-focused crawls (under 50K urls).

When crawling with Screaming Frog, ensure that “follow redirects” is enabled in your settings. This will allow the Frog to not only check the url you feed it, but also follow the redirect chain. That’s incredibly important, since just setting up a 301 isn’t enough… That 301 needs to lead to a 200 code at the true destination url.

Following Redirects in Screaming Frog

One of the biggest mistakes I’ve seen is assuming all 301s work perfectly. In other words, you crawl your top landing pages and they all 301. That’s great, but where do they lead? If they don’t lead to the new url that resolves with a 200 code, then you could be killing your SEO.

In Screaming Frog, crawl your urls and then export the redirect chains report (which is accessible from the reports dropdown). When opening that file in Excel, you will see the initial url crawled, the header response code, and the number of redirects encountered. You can follow the sheet along to the right to see the next url in the chain, along with its response code.

Analyzing Redirect Chains After A CMS Migration

Does the destination url resolve with a 200 code or does it redirect again? If it redirects again, you are now daisy-chaining redirects. That’s not great, as Google will only follow a certain number of redirects before giving up. And as you can guess, you can keep following the chain to the right to see how each url in the list resolves. You might be surprised what you find… like three, four, or even more redirects in the chain. And even worse, you might find daisy-chained redirects that lead to 404s. Not good.

In DeepCrawl, you can export the 301 redirect report, which will also provide the redirect chain. If you have many urls to check, then DeepCrawl can be extremely helpful. It doesn’t run locally and can handle hundreds of thousands, or even millions, of urls.

Once you export the urls, you’ll need to use Excel’s “text to columns” functionality to break apart the redirect chain column. Once you do, you can follow the chain to the right to see how each url resolves. Again, you might find 404s, excessive redirect chains, or redirects that lead to 500s or other errors. The core point is that you won’t know until you crawl the old urls and follow the redirect chain. So crawl away.

Checking Redirect Chains in DeepCrawl
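
And if you want to sanity-check a sample of redirect chains outside of DeepCrawl or Screaming Frog, here’s a minimal sketch that follows each hop manually and prints any chain that is longer than one redirect or that doesn’t end in a 200. The input filename is an assumption (the deduped list from earlier).

```python
# Minimal sketch of a redirect chain checker: feed it the old top landing
# pages and it reports every hop until it hits a non-redirect response.
# The input filename is an assumption (the deduped list from earlier).
from urllib.parse import urljoin

import requests

MAX_HOPS = 10  # Google gives up after a handful of hops, so should you

with open("top_landing_pages.csv") as f:
    old_urls = [line.strip() for line in f if line.strip()]

for url in old_urls:
    chain, current = [], url
    for _ in range(MAX_HOPS):
        resp = requests.head(current, allow_redirects=False, timeout=10)
        chain.append(f"{resp.status_code} {current}")
        if resp.status_code in (301, 302, 307, 308):
            current = urljoin(current, resp.headers["Location"])
        else:
            break
    final = chain[-1]
    if len(chain) > 2 or not final.startswith("200"):
        print(" -> ".join(chain))  # daisy-chains and non-200 endings need fixing
```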


Summary – Never Assume All Is OK With A Migration
Unfortunately, Murphy’s Law almost always comes into play with a CMS migration. And that’s especially the case for larger-scale websites with a lot of moving parts. Even if you’ve heavily checked a site in staging, make sure you perform the necessary checks once the new site is pushed to production. If not, you run the risk of having major problems severely impact SEO.

And the point of migrating to a new CMS isn’t to destroy SEO… it’s to increase the effectiveness of your website! Although the list I provided above is just a starting point, these are important checks to perform once the new site goes live. Good luck.



How To Identify and Avoid Technical SEO Optical Illusions

Technical SEO Optical Illusions

Without a clean and crawlable website structure, you’re dead in the water SEO-wise. For example, if you don’t have a solid SEO foundation, you can end up providing serious obstacles for both users and search engines. And that’s never a good idea. And even if you have a clean and crawlable structure, problems with various SEO directives can throw a wrench into the situation. And those problems can lie beneath the surface just waiting to kill your SEO efforts. That’s one of the reasons I’ve always believed that a thorough technical audit is one of the most powerful deliverables in all of SEO.

The Power of Technical SEO Audits: Crawls + Manual Audits = Win
“What lies beneath” can be scary. Really scary… The reality for SEO is that what looks fine on the surface may have serious flaws. And finding those hidden problems and rectifying them as quickly as possible can help turn a site around SEO-wise.

When performing an SEO audit, it’s incredibly important to manually dig through a site to see what’s going on. That’s a given. But it’s also important to crawl the site to pick up potential land mines. In my opinion, the combination of both a manual audit and extensive crawl analysis can help you uncover problems that may be inhibiting the performance of the site SEO-wise. And both might help you surface dangerous optical illusions, which is the core topic of my post today.

Uncovering Optical SEO Illusions
Optical illusions can be fun to check out, but they aren’t so fun when they negatively impact your business. When your eyes play tricks on you, and your website takes a Google hit due to that illusion, it’s a serious problem.

The word “technical” in technical SEO is important to highlight. If your code is even one character off, it could have a big impact on your site SEO-wise. For example, if you implement the meta robots tag on a site with 500,000 pages, then the wrong directives could wreak havoc on your site. Or maybe you are providing urls in multiple languages using hreflang, and those additional urls are adding 30,000 urls to your site. You would definitely want to make sure those hreflang tags are set up correctly.

But what if you thought those directives and tags were set up perfectly when, in fact, they aren’t set up correctly? They look right at first glance, but there’s something just not right…

That’s the focus of this post today, and it can happen easier than you think. I’ll walk through several examples of SEO optical illusions, and then explain how to avoid or pick up those illusions.

Abracadabra, let’s begin. :)

Three Examples of Technical SEO Optical Illusions
First, take a quick look at this code:

Technical SEO Problem with hreflang

Did you catch the problem? The code uses “alternative” versus “alternate”. And that was on a site with 2.3M pages indexed, many of which had hreflang implemented pointing to various language pages.

Hreflang Using Alternative vs. Alternate

Now take a look at this code:

SEO Technical Problem with rel canonical

All looks ok, right? At first glance you might miss it, but the code uses “content” versus “href”. If rolled out to a website, it means rel canonical won’t be set up correctly for any pages using the flawed directive. And on sites where rel canonical is extremely important, like sites with urls resolving multiple ways, this can be a huge problem.

Technical SEO problem with rel canonical

Now how about this one?
Technical SEO problem with meta robots

OK, so you are probably getting better at this already. The correct value should be “noindex” and not “no index”. So if you thought you were keeping those 75,000 pages out of Google’s index, you were wrong. Not a good thing to happen while Pandas and Phantoms roam the web.

Meta Robots problem using no index vs. noindex

I think you get the point.
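
If you want a quick scripted sanity check for the three illusions above, here’s a rough sketch that scans a page’s source for the telltale strings. The patterns are intentionally simple, so treat any hit as a prompt to review the actual markup, not as proof of a problem. The url is a placeholder.

```python
# Rough sketch: scan a page's source for the three illusions described above.
# The patterns are simple string checks, so treat any hit as a prompt to
# review the actual markup. The url is a placeholder.
import re

import requests

url = "https://www.example.com/"  # assumption: any template page on your site
html = requests.get(url, timeout=10).text.lower()

checks = [
    (r'rel=["\']alternative["\']', 'link tag uses "alternative" instead of "alternate"'),
    (r'rel=["\']canonical["\'][^>]*\bcontent=', 'rel canonical uses content= instead of href='),
    (r'name=["\']robots["\'][^>]*no index', 'meta robots uses "no index" instead of "noindex"'),
]

for pattern, message in checks:
    if re.search(pattern, html):
        print(f"POSSIBLE ILLUSION: {message}")
```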

How To Avoid Falling Victim To Optical Illusions?
As mentioned earlier, using an approach that leverages manual audits, site-wide crawls, and then surgical crawls (when needed) can help you nip problems in the bud. And leveraging reporting in Google Search Console (formerly Google Webmaster Tools) is obviously a smart way to proceed as well. Below, I’ll cover several things you can do to identify SEO optical illusions while auditing a site.

SEO Plugins
From a manual audit standpoint, using plugins like Mozbar, SEO Site Tools, and others can help you quickly identify key elements on the page. For example, you can easily check rel canonical and the meta robots tag via these plugins.

Using Mozbar to identify technical seo problems.

From a crawl perspective, you can use DeepCrawl for larger crawls and Screaming Frog for small to medium size crawls. I often use both DeepCrawl and Screaming Frog on the same site (using “The Frog” for surgical crawls once I identify issues through manual audits or the site-wide crawl).

Each tool provides data about key technical SEO components like rel canonical, meta robots, rel next/prev, and hreflang. Note, DeepCrawl has built-in support for checking hreflang, while Screaming Frog requires a custom search.

Using DeepCrawl to identify SEO technical problems.

Once the crawl is completed, you can double-check the technical implementation of each directive by comparing what you are seeing during the manual audit to the crawl data you have collected. It’s a great way to ensure each element is ok and won’t cause serious problems SEO-wise. And that’s especially the case on larger-scale websites that may have thousands, hundreds of thousands, or millions of pages on the site.

Google Search Console Reports
I mentioned earlier that Google Search Console reports can help identify and avoid optical illusions. Below, I’ll touch on several reports that are important from a technical SEO standpoint.

Index Status
Using Index Status, you can identify how many pages Google has indexed for the site at hand. And by the way, this can be checked at the directory level (which is a smart way to go). Index Status reporting will not identify specific directives or technical problems, but it can help you understand if Google is over- or under-indexing your site content.

For example, if you have 100,000 pages on your site, but Google has indexed just 35,000, then you probably have an issue…

Using Index Status in Google Search Console to identify indexation problems.

International Targeting
Using the international targeting reporting, you can troubleshoot hreflang implementations. The reporting will identify hreflang errors on specific pages of your site. Hreflang is a confusing topic for many webmasters and the reporting in GSC can get you moving in the right direction troubleshooting-wise.

Using International Targeting reporting in GSC to troubleshoot hreflang.

Fetch as Google

Using Fetch as Google, you can see exactly what Googlebot is crawling and the response code it is receiving. This includes viewing the meta robots tag, rel canonical tags, rel next/prev, and hreflang tags. You can also use fetch and render to see how Googlebot is rendering the page (and compare that to what users are seeing).

Using fetch as google to troubleshoot technical SEO problems.

Robots.txt and Blocked Resources
Using the new robots.txt Tester in Google Search Console enables you to test the current set of robots.txt directives against your actual urls (to see what’s blocked and what’s allowed). You can also use the Tester as a sandbox to change directives and test urls. It’s a great way to identify current problems with your robots.txt file and see if future changes will cause issues.

Using robots.txt Tester to troubleshoot technical SEO problems.

Summary – Don’t Let Optical Illusions Trick You, and Google…
If there’s one thing you take away from this post, it’s that technical SEO problems can be easy to miss. Your eyes can absolutely play tricks on you when directives are even just a few characters off in your code. And those flawed directives can cause serious problems SEO-wise if not caught and refined.

The good news is that you can start checking your own site today. Using the techniques and reports I listed above, you can dig through your own site to ensure all is coded properly. So keep your eyes peeled, and catch those illusions before they cause any damage. Good luck.



From SEO Tools To Emulation To Devices, How To Check Smartphone Rankings As Google’s Mobile-Friendly Algorithm Rolls Out

How To Check Mobile Rankings

We are now ten days into the mobile-friendly algorithm rollout, and to be honest, the impact has been somewhat underwhelming. I’ve been tracking many websites across categories and countries as the algorithm rolled out, and it has been interesting to see how some verticals were impacted while others experienced no change. I didn’t personally see any fluctuations until last Thursday, but then started to pick up more examples as time went on. I’ve documented several examples in my last post in case you want to check them out, and have been updating that post when I come across new examples.

As I mentioned above, there are still some categories that remain completely unaffected. For example, there are websites that aren’t mobile-friendly still ranking extremely well with no impact at all. Google’s Gary Illyes explained this morning that the rollout is complete, but that some urls have not been reindexed yet. That means those urls don’t have the new scores yet (so rankings could change when that’s completed). Although the impact seems light right now, it would be smart to give it a little more time before coming to any conclusions. I definitely plan to write a post with my analysis once enough time goes by, so stay tuned.

Checking Smartphone Rankings
Now that the mobile-friendly algorithm has rolled out, I have received a lot of questions from business owners about how to best check their mobile performance over time. For example, how to identify mobile rankings fluctuations, how to view trending for mobile search traffic, which tools can help track those changes, etc. I’ve decided to focus on smartphone rankings in this post.

Below, I have provided five ways you can identify and track changes to your mobile rankings over time. And to clarify, I am referring to smartphone rankings, and not tablet. Tablet rankings are not being impacted by the mobile friendly algorithm, which I know has confused some people. By using the steps below, you should be able to gauge the impact of the mobile-friendly algorithm on your website(s). Let’s jump in.

1. Search Analytics Reporting in Google Webmaster Tools
I have been testing the new Search Analytics reporting (now in beta) in Google Webmaster Tools since early March. It used to be called the “Search Impact” report, but that changed during the alpha. There is some outstanding functionality in the new search analytics reporting and I expect Google to roll it out soon to everyone. One reason I think they should roll it out is based on how you can track mobile versus desktop rankings. Using the Devices dimension, you can compare rankings across both desktop and mobile, which quickly enables you to identify a mobile rankings demotion.

Mobile Rankings in Search Analytics Reporting (beta)

I already wrote a post explaining how to do this, and I highly recommend you check out that post for more information. If you have access to the new reporting, then follow my tutorial and compare your rankings. If you don’t yet, then hang in there. Again, I expect it to roll out to everyone sooner than later. In a nutshell, you can view desktop and mobile rankings side by side. You can also compare timeframes for mobile rankings, and then compare mobile impressions and clicks to previous timeframes. Below, I’ve compared clicks after the mobile-friendly algorithm rolled out to the period prior.

Comparing Mobile Traffic in Search Analytics Reporting (beta)

2. SEMrush Mobile Reporting (New!)
On 4/21 I fired up SEMrush to check the desktop rankings for a company I was analyzing when I noticed something very interesting. There was now a desktop/mobile toggle on the overview page. Clicking “mobile” brought up some very interesting mobile reports! It turns out SEMrush launched their new mobile reporting right on 4/21. Awesome.

SEMrush Mobile Reporting - Overview

On the overview page, you can view a graph showing the number of pages in the top 20 results from that domain that are mobile-friendly versus non mobile-friendly. It’s a great way to get a lay of the land. You can also view a search performance trend for mobile keywords, the top keywords from a mobile standpoint, and the position distribution for those keywords.

SEMrush Mobile Reporting - Trending

Then you can access the “Positions” report to view all keyword data for mobile, including rank. You can click the toggle up top to switch from mobile to desktop. And you can export the results to Excel where you can use vlookup to compare desktop and mobile rankings for each keyword. If you notice a significant discrepancy between the two, then you could be negatively (or positively) impacted by the mobile-friendly algorithm.

SEMrush Mobile Reporting - Positions
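
If you’d rather skip vlookup, here’s a rough sketch of the same comparison in pandas: merge the desktop and mobile exports on keyword and flag large position gaps. The filenames and column names are assumptions, so match them to your actual export.

```python
# Rough sketch of the vlookup step in pandas: merge the desktop and mobile
# exports on keyword and flag big position gaps. The filenames and the
# "Keyword" / "Position" column names are assumptions; match your export.
import pandas as pd

desktop = pd.read_csv("semrush_desktop.csv")[["Keyword", "Position"]]
mobile = pd.read_csv("semrush_mobile.csv")[["Keyword", "Position"]]

merged = desktop.merge(mobile, on="Keyword", suffixes=("_desktop", "_mobile"))
merged["gap"] = merged["Position_mobile"] - merged["Position_desktop"]

# keywords where mobile ranks five or more spots worse than desktop
print(merged[merged["gap"] >= 5].sort_values("gap", ascending=False))
```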

3. Searchmetrics Mobile Reporting
Searchmetrics also launched a mobile reporting beta. On the overview page for a domain, you can quickly view the search visibility across desktop and mobile.

Searchmetrics Mobile Reporting - Overview

And clicking the “mobile” tab brings up a report showing both the desktop and mobile rankings for the keyword at hand. This clearly makes it easy to identify a mobile rankings demotion. You will see icons for desktop versus mobile for each keyword, along with the rank for each.

Searchmetrics Mobile Reporting - Compare Rankings

4. Manually Via Mobile Devices
Yes, you can still check rankings manually via your mobile phone. For example, fire up Chrome on your mobile phone, go incognito, and test searches. Just keep in mind that your results can be impacted by your location. But you can easily turn off location services to see the difference when it’s on versus off. It’s not perfect, but can supplement other methods for checking mobile rankings.

Checking Mobile Rankings Via Mobile Device

Also, if you are checking sites targeting other countries, make sure to use the Google property for that country. For example, Google UK, Canada, Australia, etc. If not, you can obviously see different results. Again, not perfect, but can work. And definitely try and get your hands on multiple mobile devices. I have several I use to test sites during audits, including both Android and iOS devices.

5. Use Chrome Developer Tools To Emulate Mobile Devices
Many people still don’t realize that Chrome can do this… and it’s awesome. Right from Chrome developer tools, you can emulate any mobile device you want. This enables you to quickly check if a site is mobile-friendly or not. And as you have probably guessed by now, you can check Google rankings too.

Access Chrome Developer Tools by clicking the menu icon in Chrome, then Tools, and then Developer Tools. Or just click control->shift->i to bring up dev tools. Then click the icon for “Toggle Device Mode” (the mobile phone icon).

Checking Mobile Rankings Using Chrome Developer Tools

Once you do, you can choose the device you want to test and then refresh the page. Boom, you’re now emulating that device. You can see I’m emulating an iPhone 6 in the screenshot below.

Emulating An iPhone 6 in Chrome Developer Tools

Also, when you hover your mouse over the screen, the cursor changes to a circle to signify tapping and swiping (like a person would do when using the device). Now access Google and search away. You will see the smartphone search results and you can check the rankings of target queries right from Chrome.

Summary – Check Mobile Rankings To Help Gauge *Your* Impact
Now that the mobile-friendly algorithm has rolled out, it’s important to check your mobile rankings for queries leading to your site. Using the methods listed above, you can quickly identify mobile rankings changes across keywords. And if you do find ranking differences, dig into the situation to find out why. Ensure all of your pages are mobile-friendly, implement any necessary fixes, and regain lost rankings.

Again, I plan to write a post detailing the impact of the mobile-friendly algorithm (once a little more time goes by). So stay tuned. :)



Sinister 404s – The Hidden SEO Danger of Returning The Wrong Header Response Code [Case Study]

Hidden SEO Danger 404 Response Code

A few weeks ago, I was contacted by a small business owner about my SEO services. And what started out as a simple check of a website turned into an interesting case study about hidden SEO dangers. The company has been in business for a long time (30+ years), and the owner was looking to boost the site’s SEO performance over the long-term. From the email and voicemail I received, it sounded like they were struggling to rank well across important target queries and wanted to address that ASAP. I also knew they were running AdWords to provide air cover for SEO (which is smart, but definitely not a long-term plan for their business).

Unfortunately, my schedule has been crazy and I knew I couldn’t take them on as a longer-term client. But, I still wanted to quickly check out their website to get a better feel for what was going on. And it took me about three minutes to notice a massive problem (one that is killing their efforts to rank for many queries). And that’s a shame because they probably should rank for those keywords based on their history, services, content, etc.

Surfacing a Giant SEO Problem
As I browsed the site, I noticed they had a good amount of content for a small business. The site had a professional design, it was relatively clean from a layout perspective, and provided strong content about their business, their history, news about the organization, the services they provided, and more.

But then it hit me. Actually, it was staring me right in the face. I noticed a small 404 icon when hitting one of their service pages (via the Redirect Path Chrome extension). OK, so that’s odd… The page renders fine, the content and design show up perfectly, but the page 404s (returning a Page Not Found error). It’s like the opposite of a soft 404. That’s where the page looks like a 404, but actually returns a 200 code. Well in this situation, the page looks like a 200, but returns a 404 instead. I guess you can call it a “soft 200”.

404 Header Response Code in Redirect Path Chrome Extension

So I started to visit other pages on the site and more 404 header response codes followed. Actually, almost every single page on the site was throwing a 404 header response code. Holy cow, the initial 404 was just the tip of the iceberg.

After seeing 404s pop up all over the site, I quickly decided to crawl the website via Screaming Frog. I wanted to see how widespread of a problem it was. And it turns out that my initial assessment was spot on. Almost every page on the site returned a 404 header response code. The only pages that didn’t were the homepage and some pdfs. But every other page, including the services pages, news pages, about page, contact, etc. returned a 404.

Header Response Codes in Screaming Frog

For those of you familiar with SEO, then you know how this problem can impact a website. But for those of you unfamiliar with 404s and how they impact SEO, I’ll provide a quick rundown next. Then I’ll jump back to the story.

What is a 404 Header Response Code?
Every time a webpage is requested, the server will return a header response code. There are many that can be returned, but there are some standard codes you’ll come across. For example, 200 means the page returned OK, 301 means permanent redirect, 302 is a temporary redirect, 500 is an internal server error, 403 is forbidden, and 404 means page not found.

Header response codes are extremely important to understand for SEO. If you want a webpage indexed, then you definitely want it to return a 200 response code (which again, means OK, the request has succeeded). But if the page returns a 404, then that tells the engines that the page was not found and that it should be removed from the index. Yes, read that last line again. 404s basically inform Google and Bing that the page is gone and that it can be removed from each respective index. That means it will have no shot of ranking for target keywords.

And from an inbound links perspective, 404s are a killer. If a page 404s, then it cannot benefit from any inbound links pointing at the url. And the domain itself cannot benefit either (at an aggregate level). So 404s will get urls removed from Google’s index and can hamper your link equity (at the url level and at the domain level). Not good, to say the least.
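
If you want to see this for yourself, here’s a minimal sketch that prints the header response code for a few pages. The urls are placeholders. The point is that a page can render perfectly in the browser and still return a 404.

```python
# Minimal sketch: print the header response code for a few pages. A page can
# render perfectly in the browser and still return a 404, which is exactly
# the trap described above. The urls are placeholders.
import requests

urls = [
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/about/",
]

for url in urls:
    code = requests.get(url, allow_redirects=False, timeout=10).status_code
    print(f"{code}  {url}")  # pages you want indexed should return 200
```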

Side Note: Checking Response Codes
Based on what I’ve explained, some of you reading this post might be wondering how to easily check your header response codes. And you definitely should. I won’t cover the process in detail in this post, but I will point you in the right direction. There are several tools to choose from and I’ll include a few below.

You could use Fetch as Google in Google Webmaster Tools to check the response sent to Googlebot (which includes the header response code). You can also use a browser plugin like Web Developer Tools or Redirect Path to quickly check header response codes on a url by url basis.

Web Developer Plugin Header Response Code

Fetch as Google and browser plugins are great, but they only let you process one url at a time. But what if you wanted to check your entire site in one shot? For situations like that, you could use a tool that crawls an entire website (or sections of a site). For example, you could use Xenu or Screaming Frog for small to medium sized sites and then a tool like DeepCrawl for larger-scale sites. All three will return a boatload of information about your pages, including the header response codes. Now back to the case study.

Dangerous, But Invisible to the Naked Eye
Remember, the entire site was returning 404 header response codes, other than the homepage and a few pdfs. But this 404 situation was sinister since the webpages looked like they resolved ok. You didn’t see a standard 404 page, but instead, you saw the actual page and content. But, the pages were actually 404ing and not being indexed. Like I said, it was a sinister problem.

Based on what I just explained, you could tell why an SMB owner would be baffled and simply not understand why their website wasn’t ranking well. They could see their site, their content, the various pages resolving, but they couldn’t see the underlying problem. Header response codes are hidden to the naked eye, and most people don’t even realize they are being returned at all. But the response code returned is critically important for how the search engines process your webpages.

Swingers Find Hidden 404s

My Response – “You’re At SEO Defcon 2”
This was a tough situation for me. I absolutely wanted to help the business longer-term, but couldn’t based on my schedule. But I did want to make sure they understood the problem I came across while quickly checking out their website.

So I crafted a quick email explaining that I couldn’t help them at this time, but that I found a big problem on their site. As quickly and concisely as I could, I explained the 404 situation, provided a few screenshots, and explained they should get in touch with their designer, developer, or hosting provider to rectify the situation ASAP. That means ensuring their webpages return the proper header response codes. Basically, I told them that if their webpages should be indexed, then they should return a 200 header response code and not the 404s being returned now.

I hit “Send” and the ball was in their court.

Their Response – “We hear you and we’re on the right track – we think.”
I heard back from the business owner who explained they started working with someone to rectify the problem. They clearly didn’t know this was going on and they were hoping to have the situation fixed soon.

But as of today, the problem is still there. The site still returns 404 header response codes on almost every page. That’s unfortunate, since again, the pages returning a 404 have no chance at all of ranking in search and cannot help them from a link equity standpoint. The pages aren’t indexed and the site is basically telling Google and Bing to not index any of the core pages on the site.

I’m going to keep an eye on the situation to see when the changes take hold. And I hope that’s soon. It’s a great example of how hidden technical dangers can destroy SEO.

Opening Up The Site – How Will The Engines Respond?
My hope is that when the pages return the proper response codes that Google and Bing will begin indexing the pages and ranking them appropriately. And that will help on several levels. The website can drive more prospective customers via organic search, while the business can probably pull back on AdWords spend. And the site can grow its power from an inbound link standpoint as well, now that the pages are being indexed properly.

But as I often say about SEO, it’s all about the execution. If they don’t implement the necessary changes, then their situation will remain as-is. I’ll try to update this post if the situation improves.

Summary – Know Your Header Response Codes
Although hidden to the naked eye, header response codes are critically important for SEO. The right codes will enable the engines to properly crawl and index your webpages, while the wrong codes could lead to SEO disaster. I recommend checking your site today (via both manual checks and a crawl). You might find you’re in the clear with 200s, but you also might find some sinister 404s. So check now.



How To Identify A Mobile Rankings Demotion Using The New Search Analytics Report in Google Webmaster Tools

Search Impact Reporting in Google Webmaster Tools

{Update: The Search Impact report was renamed to “Search Analytics” during the beta. The screenshots below still show “Search Impact” even though the new report in Google Webmaster Tools is now labeled “Search Analytics”.}

April 21, 2015 is an important date. That’s the day when Google will begin using mobile friendliness as a ranking signal. There’s been a lot of talk about how that’s actually going to work, how much of an impact it will have, etc. Well, more and more information has been surfacing over the past few days about the changes.

For example, Gary Illyes spoke at SMX West heavily about the new mobile UX algo and provided some outstanding information. Jennifer Slegg wrote up a recap of that session, which I highly recommend reading. She provided some amazing nuggets of information, including information about mobile friendly techniques, how the algo will handle specific urls, if 4/21 is hard date for the rollout, if Google is building a mobile index (which they are), and more.

So, as 4/21 quickly approaches, many webmasters are working hard to get their sites in order from a mobile UX standpoint. As documented by John Mueller and Gary Illyes (and really Google itself), you can use any of the three options for providing a mobile-friendly version of your website. For example, you can use responsive design, dynamic delivery, or even a separate mobile site. I’ve seen all three techniques work well for clients, so the path you choose should be based on your own site and business. But definitely move quickly… April 21 will roll up quickly.


The *Current* Smartphone Rankings Demotion – A Glimpse Into the Future
Many people don’t realize this, but Google already has a smartphone rankings demotion in place for specific situations. For example, when there are faulty redirects from the desktop version of the content to the mobile version, or if there are other mobile-only errors.

I caught one of those situations in the wild and wrote a two-part case study about it. I first detailed the problems I saw, and then documented the improvements in rankings and traffic once the problems were fixed. Based on what Gary Illyes and John Mueller have both said about the mobile UX algo, it sounds like the new algorithm will work in a very similar fashion to the current smartphone rankings demotion. Therefore, I definitely recommend you review the two-part case study.

Checking For Faulty Mobile Redirects

For example, the current smartphone rankings demotion is on a url by url basis. Just because you have faulty redirects or mobile-only errors does not mean the entire domain should suffer (algorithmically). Also, the desktop urls are unaffected (which makes absolute sense). And, importantly, the algorithm runs in real-time and will impact urls during the normal crawling process.

That means urls can be demoted as Google comes across mobile problems, but the demotion can also be lifted as Google crawls the urls and notices that the problems are fixed. And that’s exactly what I saw with the smartphone rankings demotion situations I have helped with.


Checking Mobile Rankings and The (New) Search Analytics Report
Google is currently testing a new search queries report in Google Webmaster Tools (called the Search Analytics report). Note, the report used to be called “Search Impact”, but was changed during the alpha. I have been testing the new version of the Search Analytics reporting and it provides some great functionality beyond what the current Search Queries reporting provides. I plan to write more about that soon, but for now, let’s focus on the mobile friendliness algorithm rolling out on 4/21.

There are six dimensions you can segment your data by in the new Search Analytics reporting. One of those dimensions is “Devices”. Using this report, you can filter data by desktop, mobile, and tablet. See below:

The Devices Dimension in The Search Impact Reporting

But don’t get fooled by the simplicity of the default report. By combining dimensions, you can view some elaborate reports that tell you a lot in a short amount of time.

When working on a smartphone rankings demotion (the current algo in place), I had to identify queries where a site ranked well in the desktop results, and then jump to the search queries reporting using the “mobile” filter for search property. When doing this for a large number of queries, it could easily get monotonous.

But the new Search Analytics report comes to the rescue and provides a nifty way to see side by side rankings when comparing desktop to mobile. Below, I’m going to show you how to quickly run this report to see a side by side comparison of clicks and average position by query. By doing so, you can quickly identify a smartphone rankings demotion. That works for the current smartphone rankings demotion, and it should work for the new mobile UX algo rolling out on 4/21/15 as well. Let’s jump into the report.


How To Check Rankings By Device
First, if you’re not part of the alpha testing program, then you won’t be able to access the Search Analytics report. But don’t fear, I can only imagine that Google wants to roll it out prior to 4/21/15 (based on the device reporting I’m showing you in this post).

To access the reporting, click “Search Traffic” and then “Search Analytics” in the left-side menu:

Accessing The Search Impact Reporting in Google Webmaster Tools

The default view will show you clicks for the past 30 days. The first thing you need to do is click the “Queries” dimension. That will present all of the queries your site ranks for during the timeframe you selected.

Using The Queries Dimension In The Search Impact Reporting

Next, click the filter dropdown underneath “Devices”, which should say “No filter” (since there isn’t a filter in place yet). Click the dropdown and then select “Compare devices”.

Filtering By Device In The Search Impact Reporting

Keep “Desktop VS. Mobile” as the selection and then click “Compare”.

Comparing By Device In The Search Impact Reporting

You should now see a comparison of clicks per query for both desktop and mobile. That’s great, but we need to know how the site ranks for each query across both desktop and mobile. To see that, click the checkbox for the “Avg. Position” metric.  This will add average position for each query to the report.

Adding The Average Position Metric In The Search Impact Reporting

To view more queries than the default ten, you can use the dropdown at the top of the report. For example, you can show up to 500 rows in the report in Google Webmaster Tools.

Now you can start checking rankings for queries across both desktop and mobile. Don’t expect them to be exactly the same for every query… But they should be close. For example, the first three listed below are very close (two are identical and one is off by just .1).

Comparing Average Position by Query In The Search Impact Reporting

In my experience, when you have a smartphone rankings demotion, there will be a clear difference. For example, some smartphone rankings will be 10+ positions lower (or even non-existent in certain situations). So, if you see rows like the following, then you might have a problem.

Identifying a Rankings Difference In The Search Impact Reporting


How To Identify Problems and Lift The Smartphone Rankings Demotion
If you find that there is a smartphone rankings demotion in place, then you should run to the “Mobile Usability” reporting in Google Webmaster Tools. Google will provide the problems it encountered while crawling your site. I highly recommend fixing those mobile usability issues asap.

Mobile Usability Reporting in Google Webmaster Tools

You can also use the mobile friendly test via the Google Developers site. That will also highlight problems on a url by url basis.

Using Google's Mobile Friendly Test

You can also check the crawl errors reporting in Google Webmaster Tools to see if there are smartphone errors or faulty redirects.

Smartphone Crawl Errors in Google Webmaster Tools

And you can crawl your site as Googlebot for Smartphones to check how your site is handling requests for the desktop pages (if you have mobile redirects in place). Doing so can surface problems sitting below the surface that are sometimes hard to pick up manually.

Crawl As Googlebot for Smartphones


Summary – The Search Analytics Report Can Make An Impact
We all knew that mobile UX would become a ranking signal at some point, but now we have a specific date from Google for the rollout (4/21/15). When the new mobile algo launches, many will be wondering if they have been impacted, if their website dropped in rankings, and which urls are causing problems. As I demonstrated above, the new Search Analytics reporting can help webmasters quickly identify problems by comparing the rankings across desktop and mobile (quickly and efficiently).

If you don’t have access to the Search Analytics reporting yet, don’t worry. Again, I believe Google is going to roll this out before the 4/21 deadline. That would make complete sense, since the “Devices” dimension could prove to be extremely helpful when a smartphone rankings demotion is in place. One thing is for sure. The changes rolling out on (or around) April 21 will be fascinating to analyze. Google said this change will have a “significant impact” on the smartphone search results. And that impact can translate into many lost visitors, conversions, and revenue. Good luck.