Panda 4.2 Analysis and Findings 7 Weeks Into The Extended Rollout – A Giant Ball of Bamboo Confusion

Panda 4.2 Analysis and Findings

Note: I reached out to Google last week to learn more about the current rollout of Panda 4.2, when it would be completed, and other interesting questions I had based on my analysis. But I haven’t heard anything back directly related to those questions. I will update this post if I receive more information about P4.2.

On Wednesday, July 22nd, Barry Schwartz broke huge SEO news: Google had finally started rolling out Panda 4.2, which we had been eagerly awaiting since 10/24/14, the date of the previous Panda update. That was over nine months earlier, which is extremely unusual for Panda, an algorithm that typically rolled out monthly (and even more frequently at certain times).

Google explained to Barry that Panda began rolling out the weekend prior (July 18th) and that this would be an extended rollout (which was also very strange). Then John Mueller explained in a webmaster hangout that the extended rollout was due to technical problems that Google was having with Panda. They didn’t want to push an extended update, but were forced to.

So according to Google, Panda 4.2 could take months to fully roll out. Here’s a tweet from Google’s Gary Illyes confirming the update.

Gary Illyes Confirms Panda 4.2

I’ll be honest. I was completely shocked when I heard about the extended rollout. Panda usually rolled out quickly and sites that were impacted could easily identify the exact date of the impact.

For example, here is a big hit from Panda 4.0 in May of 2014.

Panda 4.0 Hit

And here is a recovery during Panda 4.1 in September of 2014:

Recovery During Panda 4.1

One day, big impact, and easier to associate with a specific Panda update. Ah, those were the days.

Having a specific date makes it much easier for webmasters to understand what hit them, and then what to fix. With the extended rollout of Panda 4.2, sites could theoretically see impact right after 7/18, a few weeks later, or even a few months out from 7/18. And with Google pushing hundreds of updates throughout the year (over one thousand last year, according to John Mueller), how are webmasters supposed to know whether Panda impacted them or it was something else (like Phantom, Penguin, or any of the other updates Google rolls out during the year)? Short answer: they can’t.

I’ll expand on this topic later in the post, but for now just understand that you can gradually see impact from Panda 4.2 over time. Some sites will see more impact in a shorter period of time, but it’s entirely possible to see smaller movement over months. And of course, you might see nothing at all. That’s a good segue to the next section.

Analyzing Over 7 Weeks of Panda 4.2 Data
I’ve been heavily analyzing the update since we learned about Panda 4.2, and I specifically held off on publishing this post until I reviewed enough data. Due to the extended rollout, I wanted to make sure I gave our new Panda friend enough time to show his face. Now that over seven weeks have gone by, and I’ve seen a number of interesting things along my Panda travels, I decided to finally publish this post.

Warning: You might like what you read, and you might not. But it is what it is. That’s unfortunately the case. Read on.

All Quiet on the Giant Panda Front – Many Typical Panda Players Not Impacted (Yet)
I have access to an extremely large set of Panda data globally. The data includes many sites that have dealt with Panda problems in the past (and quality problems overall). And that includes some of the biggest sites with previous Panda scars. For example, sites with tens of millions of pages indexed, that are inherently Panda-susceptible, and that have dropped and surged over time as they enter and exit the gray area of Panda.

The large Panda dataset I have access to often enables me to see when Panda rolls out (and when other quality algorithms roll out, like Phantom did in late April and early May). The sites I have access to include ecommerce retailers, news publishers, press release websites, directories, affiliate websites, large-scale blogs, song lyrics websites, coupon websites, and more.

As I’ve analyzed the trending for these websites since 7/18, it has been easy to see that many of the larger, Panda-susceptible sites have seen very little impact since Panda 4.2 rolled out. It’s almost like Google didn’t want to touch these sites during the initial rollout. I’m not saying that’s the case (since Panda is algorithmic), but it sure seemed that way.

For example, I’ve seen a lot of trending that looks like this:

Stable Trending Through Panda 4.2

and this:

More Stable Trending Through Panda 4.2

No movement. At all.

This is important to understand if you are monitoring a large-scale website that has been working hard on Panda remediation. If you have seen very little impact so far, it could be that Panda 4.2 simply hasn’t impacted your site yet (or it won’t impact your site at all).

Like many others in the industry, I fully expected more large-scale sites that have previously been impacted by Panda to see movement. But most of the sites that act as Panda barometers have seen little or no impact. It’s very, very strange to say the least. Time will tell if that changes.

Also, John Mueller explained more about the rollout in a webmaster hangout video. He said that technical problems on Google’s end are forcing them to roll out Panda very slowly. Now, we don’t know what those technical problems are, but it seems that if John is correct, then Panda could still impact your site as time goes on. I haven’t seen that happen for most sites, but I guess it’s still possible. Needless to say, this isn’t optimal for webmasters battling the mighty, I mean aging Panda. And it’s extremely out of the ordinary for Panda.

Extended Rollouts and Going Real-Time – The Easiest Way To Hide Impact From Major Algorithm Updates
Google has stated several times that it intends to incorporate Panda into its core ranking algorithm. I don’t know when that will happen, but we are clearly seeing the first signs of it. We had Phantom 2 in May, which was a change to Google’s core ranking algo related to how it assesses “quality”. Now we have an extended rollout of Panda, which means we can’t pin the update on a specific date.

And by the way, Google also wants to incorporate Penguin into its core algo. Yes, say goodbye to external algos that are unleashed on the web in one fell swoop.

Every time Google released Panda and/or Penguin, the web erupted. And as you can guess, many that were negatively impacted screamed bloody murder. So much so, that the mainstream media started reporting on algorithm updates. Needless to say, Google doesn’t want that type of attention. It makes them look bad (even though most people will admit they have a really hard job trying to algorithmically determine what’s high quality, what’s spam, what’s unnatural, etc.).

So what’s the easiest way to avoid the flak created by a specific algorithm update? Roll that update out over months, or even better, bake it into the core ranking algorithm. Once you do, nobody can pin a date on a specific update, nobody can say “Panda is flawed”, “Google has lost control of Penguin”, “Phantom is scarier than The Exorcist”, etc.

Adding a drop of Penguin here, a teaspoon of Panda there…

Baking Panda and Penguin Into Core Ranking Algorithm

Well my friends, this is what we are dealing with now Panda-wise. And from a Penguin standpoint, you have a better chance of seeing Halley’s Comet than seeing another update. Both are chained, cloaked, and do not have a specific update schedule planned.

Personally, I believe it became harder and harder to release Panda and Penguin updates without causing a lot of collateral damage. Google has many algos running, and I believe it was very hard to gain accurate results when unleashing Panda and Penguin on the web. So here we are. No dates, no information, no evidence, no screaming, and no mainstream media articles about Google algo updates.

With that out of the way, let’s dig deeper with what I’ve seen across my Panda data.

Panda 4.2 Illusions
During my Panda 4.2 travels, I came across a number of websites that looked like they were heavily impacted by Panda 4.2, but actually weren’t. Other factors caused the drop or gain in traffic after 7/18/15; those changes just happened to coincide with the rollout of Panda 4.2.

For example, check out the trending below. Wow, that looks like a severe Panda 4.2 hit. But once I dug in, there were technical SEO issues that were causing traffic to go to a sister website. Basically, as one site’s traffic decreased, Google traffic to the sister site increased.

Panda 4.2 Illusion Due To SEO Technical Problems

And here’s an example of another Panda 4.2 illusion. The site began losing significant traffic the week before Panda 4.2 rolled out. Maybe it was impact from Panda 4.2 being tested in the wild, or from the Phantom tremor I saw in mid-July? No, it turns out the website was in the process of migrating to another domain, and the new domain picked up that traffic.

Panda 4.2 Illusion Due To Migration

The key point here is that context matters. Don’t assume you’ve been impacted by Panda 4.2. There very well could be other factors at play. For example, technical SEO problems can cause big swings in organic search traffic, seasonality can come into play, and other factors could cause traffic changes. And this gets exponentially worse as time goes on… That’s because we don’t have a hard Panda date to base drops or gains in traffic on. Remember, there is an “extended rollout” of Panda 4.2.

Panda 4.2 Fresh Hits
So, there has not been widespread impact, but that doesn’t mean there has been no impact at all. From what I can see, some websites were heavily impacted right after Panda 4.2 rolled out. For example, some sites saw a significant drop in Google organic traffic (and a major drop in rankings) immediately following the initial rollout.

Now, it’s important to note that these types of hits were uncommon during Panda 4.2. I did not see many of them (at all). But there were some.

For example, check out the trending for this large-scale website. I have provided Searchmetrics data for the domain. Notice the initial drop when Phantom rolled out and then a bigger hit when Panda 4.2 rolled out:

Panda 4.2 Big Hit

This hit did not shock me at all. I’m confident that many people have cursed at their screens after arriving on the site. It’s filled with aggressive ads, uses pagination purely for monetization purposes, has technical SEO issues causing content quality problems, and simply provides a horrible user experience. Well, the site took a big hit after Panda 4.2 rolled out.

And then there were sites impacted more gradually since 7/18. For example, here’s a smaller site that didn’t see full impact immediately, but has seen a gradual decline in rankings and Google organic traffic since Panda 4.2 rolled out.

Smaller Site Hit By Panda 4.2 Over Time

Those are just two examples, and there are more like them (with more moderate drops than big hits). But the extended rollout is making it very hard to pin a drop in traffic on Panda 4.2. Again, I believe Google would like that to be the case. And remember, Google rolled out over 1,000 changes last year. Based on that number, there may have been 50-80 changes that have rolled out since 7/18 that weren’t Panda 4.2 related. Again, it is what it is. :)

Panda 4.2 Recovery (or at least improvement from P4.2)
I know what many of you are thinking after reading that subheading… Panda 4.2 recovery is an oxymoron! And overall, I would agree with you. As I explained earlier (and it’s important to highlight this), many sites working hard to recover from past Panda hits have not seen impact yet from P4.2. And some of the largest Panda-susceptible sites have also not seen movement. But there has been some positive impact already (to varying degrees).

Below is an example of a surge in Google organic traffic from a website that was hit hard in September of 2014 (when we had both the 9/5 Panda update and Panda 4.1). The website owner reached out to me this winter for help, but I unfortunately couldn’t help due to a lack of availability. That said, we stayed in touch throughout the year. I received an email the Saturday after Panda 4.2 rolled out (7/25/15) explaining that the website was seeing a surge in Google organic traffic. So I took a look at Google Analytics and Google Search Console to see what was going on.

Here is the sustained surge since 7/25/15. The site is up 495%. Note, sessions have been removed at the request of the business owner.

Surge After Panda 4.2

Now, since I didn’t work on the site, perform a deep audit of the site, guide the changes, etc., it’s hard for me to say this was 100% due to Panda 4.2. That said, I have reviewed sections of the original site and the list of changes that were implemented. The items addressed seemed extremely Panda-like, including handling content challenges, refining the affiliate marketing setup, and making some advertising fixes. Then after Panda 4.2 rolled out, the site surged.

Moving on, there are other websites that have seen partial recovery since P4.2 rolled out. For example, here is a site that had been hit hard by Panda in the past, but shot up after July 18th.

Google Organic Increase After Panda 4.2

So I definitely agree that recovery from Panda 4.2 is rare, but there are sites that were positively impacted. Why more sites weren’t impacted is hard to say… Again, I reached out to Google for more information now that I’ve analyzed seven weeks of data, but I haven’t received specific answers to my questions. I’ll update this post if I learn more.

Reversals, Tremors, or Other Disturbances in the Force
I know many in the industry have kept track of one of the more public recovery stories from Panda 4.2. I’m referring to Search Engine Roundtable. Barry Schwartz saw an immediate jump in Google organic traffic once Panda 4.2 rolled out, and the site had been steadily increasing over time. It wasn’t a massive surge, but the site was up ~35% since 7/18.

Although some believed the surge was from people searching for Panda 4.2 related articles, the surge was actually from a number of queries across topics. And even when Barry stripped out the Panda-related articles from his analysis, there was still an increase in Google organic traffic.

Well, the increase was reversed in mid-August. So, was this some type of Panda 4.2 tremor or something else? We know that Google can and will tweak major algorithm updates to ensure they are seeing the right results in the SERPs, so it’s entirely possible. And by the way, there were many other people claiming to see similar changes in mid-August.

Panda 4.2 Reversal at Search Engine Roundtable
Image From SER

Personally, I saw websites experience similar changes. One site reversed a downward slide and shot up on 8/13, while another change was more recent, on 8/24 (that site began to drop after increasing). There are many other reports of changes starting in mid-August, so this isn’t just a few sites.

That said, and I’m sorry to bring this up again, but since we are so far out from the release of Panda 4.2, it’s hard to say the impact is from P4.2. It could be, but it also could be other changes and tweaks. Remember, we have Phantom (which focuses on “quality”), plus Google rolled out over 1,000 changes last year.

Saying Goodbye to Panda, Penguin, and Other Algos Causing Mass Hysteria
In 2013, I wrote a post about Google starting to roll out Panda over 10 days. In that post, I explained that when this happens, or when Panda goes real-time, there will be massive webmaster confusion. If sites experience serious gains or losses at random times throughout the year, without tying that to a specific algorithm update, then how in the world are those webmasters supposed to know what caused the drop? Quick answer: They won’t.

Prior to Panda 4.2, webmasters already had trouble determining what caused a Panda hit. And now that impact can happen at any time during the extended rollout of Panda 4.2, that confusion will get exponentially worse. And we know Google fully intends to incorporate Panda (and Penguin) into its core ranking algorithm. When that happens, there won’t be an extended rollout… it will be running all the time. You could technically see gains or decreases at any time. Good luck trying to figure that one out.

Based on what I explained above, we may never see another Panda or Penguin update again. Read that line again. It’s entirely possible that Panda 4.2 will be the last Panda update and that Penguin has become so hard to manage that maybe just parts of it get baked into Google’s core ranking algo. And if that happens, who knows what happens to Panda and Penguin victims. One thing is for sure, Panda 4.2 and Penguin 3.0 were not effective. Actually, I’d go so far as to say they were a mess. It’s hard to look at it any other way.

So yes, I called this in 2013, and here we are. Fun, isn’t it? :) As more days pass from the initial Panda 4.2 rollout date (7/18/15), it will become harder and harder to determine if Panda is impacting your site, if Phantom found new ectoplasm, or if other algorithms are at play. It’s why I’ve been recommending the nuclear option more and more recently. Identify all quality problems riddling a site, and fix them all. That’s both on-site and off-site.

Summary – What’s Next For Panda?
Well, that’s what I’ve seen so far. Panda 4.2 started rolling out on 7/18, but it can take months to fully roll out. And Phantom is running all the time. And Penguin hasn’t rolled out in a while, and who knows when (or if) it will roll out again. And Google pushes 500-600 updates each year (with over one thousand last year). There’s a lot going on… that’s for sure.

Regardless, I will continue to analyze the impact from Panda 4.2 (and other miscellaneous disturbances in the force) and I plan to write more about my findings as time goes on. In the meantime, keep improving content quality, fix all quality problems riddling your website (on-site and off), drive stronger user engagement, and fix any technical SEO problems that could be causing issues. That’s all you can do right now from a Panda standpoint. Then just hope that Google fixes Panda or bakes it into its core ranking algorithm.

But of course, if that happens, you will never know what hit or helped you. And I think that’s just fine from Google’s standpoint. Good luck.




Challenging Murphy’s Law – 8 Immediate SEO Checks After A CMS Migration Goes Live

Murphy's Law for CMS Migrations

CMS migrations are a necessary evil for many companies. If your current technical setup is inhibiting your business from doing what it needs to be successful, then a CMS migration could be your way out. But migrations should not be taken lightly, especially for large-scale websites that are changing urls. Any time you change urls, you run the risk of destroying SEO. And if SEO is an important driver of traffic for your site, then a migration could cause serious problems for your business.

Unfortunately, SEOs know Murphy’s Law all too well. It states, “Anything that can go wrong, will go wrong”. Well, large-scale migrations have many moving parts, and the chances of a migration going live without some type of hiccup are slim. And you might find many hiccups, gremlins, or flat-out mistakes once the button is pushed and a migration goes live. But to me, it’s how you react once those gremlins are found that can make or break your migration. Quick checks and fast refinements can end up saving your SEO. Speed and accuracy matter.

The Summer of Migrations
For whatever reason, this is the summer of migrations and redesigns for me. Several of my large-scale clients are either migrating to a new CMS or they are redesigning their websites. Therefore, I’ve been neck deep in the strategy, testing, implementation, and auditing of each move. And based on my work this summer, I decided to write a post explaining what you can do as soon as the migration goes live to ensure all is ok (or to nip SEO problems in the bud).

Specifically, I have provided eight immediate checks you should make as soon as your new site goes live. The eight checks cover what you can do immediately following a push to production and can help catch serious problems before they become bigger ones. For SEOs, large-scale CMS migrations are not for the faint of heart. They are stressful, there’s a lot on the line, and there are many areas to monitor to ensure all goes well.

The Power of a Crawlable and Testable Staging Server
Before I hop into the top eight checks you should make after a migration goes live, I wanted to touch on a very important element of a CMS migration – the staging server.

If you can have your new site up and running on a staging server that’s accessible and crawlable to your SEO consultant or agency, then you can (typically) thoroughly test that setup prior to the migration going live. For clients that have this available, I’m able to crawl the new site in staging, test redirects, analyze technical SEO elements, browse with multiple devices, and more. It’s a much easier transition for a site that has an accessible and crawlable staging server than for one that doesn’t.

If you don’t thoroughly test the new site in staging, including testing the redirects, you’ll have to test like mad as soon as the new site goes live. And believe me, you will find problems. Then you’ll need to fix those problems on the fly. And those fixes could lead to other problems that will need to be fixed quickly as well… and so on. It’s a slippery slope.

Keep this in mind if you are planning a CMS migration. Don’t proceed on faith alone… Test and know. That’s how you can nip serious SEO problems in the bud.

Some Prerequisites
Before we move on, you’ll need a few things in place. First, make sure you have all variations of your site set up in Google Search Console (GSC). That includes www, non-www, and if applicable, https www and https non-www. And if specific subdomains are part of the migration, make sure you have them set up as well.

Second, you should have already collected your top landing pages from the old site. You should export all top landing pages from Google Search Console, Google Analytics, Bing Webmaster Tools, and various link analysis tools. Then you should combine them and dedupe them. That’s your core list of urls to check after the migration goes live. I’ll cover more about this soon.

Third, you’ll need a tool that can crawl your site. The two crawlers I use extensively are DeepCrawl (for large-scale crawls) and Screaming Frog (for smaller crawls or for laser-focused crawls). This is how you will check your redirects in bulk. Note, I’m on the customer advisory board for DeepCrawl. It’s one of my favorite tools for enterprise crawls and I’ve been using it for years.

OK, now that you have the necessary elements in place, it’s time to perform eight immediate checks once the migration goes live. Note, the following list is just the beginning of your testing process. You definitely want to continue analyzing the migration over time to ensure all is ok. What I’m providing below should be checked as the button is pushed and your new site goes live.

1. Robots.txt and robots.txt Tester
Yes, robots.txt is a simple text file, but one that can cause all sorts of problems. Developers will often use a staging-specific robots.txt file which can easily get pushed to production by accident. And if it blocks important files or directories from being crawled, you could kill your SEO traffic.

So check this first after the new site goes live. Make sure it’s the version that should be in production and that the directives are free of errors. And make sure important areas of the site are not being blocked. That includes CSS and JavaScript that are necessary for Google to render the page properly. More about that soon.

And use the robots.txt Tester in Google Search Console. It’s a sandbox that enables you to test urls on your site. If you notice urls being blocked that shouldn’t be blocked, hunt down the directives causing the problems. You can edit the robots.txt file in GSC to test your changes (without impacting your actual robots.txt file).

Checking Robots.txt After CMS Migration
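
If you want to script a quick spot check in addition to the GSC Tester, here’s a minimal sketch using Python’s built-in robotparser. The urls below are placeholders for your own key pages and assets, and keep in mind that the standard-library parser doesn’t fully mirror Google’s wildcard handling, so treat it as a rough sanity check and confirm anything important in the robots.txt Tester itself.

    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")  # placeholder domain
    rp.read()

    # Key pages plus CSS/JS assets that Google needs in order to render pages.
    urls_to_check = [
        "https://www.example.com/",
        "https://www.example.com/category/widgets/",
        "https://www.example.com/assets/css/main.css",
        "https://www.example.com/assets/js/app.js",
    ]

    for url in urls_to_check:
        allowed = rp.can_fetch("Googlebot", url)
        print(("OK     " if allowed else "BLOCKED"), url)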

2. Check For Google Analytics and/or Other Analytics Code
Before you and your client’s executive team check Google Analytics after the migration goes live, and have a heart attack, make sure you’ve added the necessary analytics code to your new site. If not, you will see traffic drop off a cliff when, in fact, that’s not really what’s happening. It’s a scary sight for sure.

And more importantly, you will lose visibility into how the migration is going. I continually check various sources of traffic over time after a migration goes live to ensure all pages are being handled properly. If you drop your tracking code, then you’ll be out of luck.

Check Google Analytics Code After CMS Migration

Quick Tip: Providing Analytics Air Cover
It’s not a bad idea to have multiple analytics packages running on a site. For example, I have some clients running both Google Analytics and Adobe Analytics on the same site. When a recent migration went live without GA tagging (by accident), we could check Adobe Analytics to see how the migration was going. It provided air cover as GA tracking was put in place.
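
If you’d like to script this check across a handful of key pages, here’s a rough sketch using the third-party requests library. The urls are placeholders, and the signature strings are just examples; adjust them to match however your analytics tags are actually implemented (Tag Manager setups, for instance, will look different).

    import requests

    # Placeholder urls representing key templates on the new site.
    pages = [
        "https://www.example.com/",
        "https://www.example.com/category/widgets/",
        "https://www.example.com/products/blue-widget/",
    ]

    # Example strings that indicate a tracking snippet is present; adjust for your setup.
    signatures = {
        "Google Analytics": ["google-analytics.com/analytics.js", "ga('create'"],
        "Adobe Analytics": ["AppMeasurement", "s_code.js"],
    }

    for url in pages:
        html = requests.get(url, timeout=10).text
        for vendor, needles in signatures.items():
            found = any(needle in html for needle in needles)
            print(f"{url} - {vendor}: {'found' if found else 'MISSING'}")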

3. Canonical URL Tag Errors
Rel canonical is a good example of a single line of code that can wreak havoc on your site SEO-wise. When the migration goes live, quickly check core page types on the site to ensure canonical url tags are set up properly. You can also quickly crawl a snapshot of the site to get a feel for how rel canonical is being handled in bulk. If rel canonical is set up incorrectly, you can destroy rankings and subsequent traffic to pages that were performing extremely well before the migration.

You can check my post about rel canonical problems to learn about the various issues that can be introduced by accident. For example, all canonicals pointing to the homepage, canonicals pointing to 404s, endless canonical loops, etc. A quick check of the canonical url tag after a crawl can save your SEO.

Checking Rel Canonical After CMS Migration

4. Meta Robots Tag Problems
Along the same lines, the meta robots tag could have a serious impact on your site if the wrong content values are used. For example, you could be noindexing important pages, or on the flip side, you could be opening up pages that shouldn’t be indexed.

Again, manual checks plus a snapshot crawl will give you a view of the meta robots tag across many pages. You can start to pick up patterns and make changes before serious damage is done.

Checking Meta Robots Tag After CMS Migration
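
A crawler will give you this view in bulk, but for a quick spot check of both rel canonical and the meta robots tag on core page types, a short script works too. This is a minimal sketch assuming the third-party requests and beautifulsoup4 packages; the urls are placeholders for your own key templates.

    import requests
    from bs4 import BeautifulSoup

    # Placeholder urls covering the core page types on the new site.
    urls = [
        "https://www.example.com/",
        "https://www.example.com/category/widgets/",
        "https://www.example.com/products/blue-widget/",
    ]

    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

        canonical = soup.find("link", rel="canonical")
        robots = soup.find("meta", attrs={"name": "robots"})

        print(url)
        print("  rel canonical:", canonical.get("href") if canonical else "none found")
        print("  meta robots:  ", robots.get("content") if robots else "none found (defaults to index,follow)")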

5. Mobile-friendly Test (MFT)
Since Google announced the mobile-friendly algorithm, many companies have taken the plunge and moved to a responsive design or mobile urls. So when you make a big change, like moving to a new CMS, you definitely want to check the mobile-friendliness of your new urls.

Unfortunately, you can’t run the mobile-friendly test on a staging server that requires authentication. You can definitely run other tests while in staging to ensure the site is mobile-friendly, which you should do. But remember Murphy’s Law… don’t trust that your staging urls are behaving the same way as your production urls. That’s why you should absolutely run Google’s official mobile-friendly test once the new site is live.

To begin, I recommend testing key urls by category. That would include your homepage, category urls, specific products or services, and other important types of pages on the site.

Running New URLs Through Google's Mobile-Friendly Test

And then you can use a tool like Url Profiler to check mobile-friendliness in bulk. Import a list of urls on the new site and fire away. The resulting spreadsheet will provide details about mobile-friendliness. Then you can hunt down any problems and rectify them quickly.

Checking Mobile-Friendliness In Bulk Via URL Profiler
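
And if you want one more quick sanity check while you work through the official tools, here’s a rough sketch that simply confirms a responsive viewport meta tag is present on each new url. To be clear, this is only a crude proxy and not a substitute for Google’s mobile-friendly test or URL Profiler; the urls are placeholders, and it assumes the requests and beautifulsoup4 packages.

    import requests
    from bs4 import BeautifulSoup

    # Placeholder urls from the new site.
    urls = [
        "https://www.example.com/",
        "https://www.example.com/category/widgets/",
    ]

    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        viewport = soup.find("meta", attrs={"name": "viewport"})
        status = viewport.get("content") if viewport else "NO VIEWPORT META TAG FOUND"
        print(url, "-", status)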

6. Fetch and Render in GSC
To ensure Google can fetch the necessary resources to accurately render the page at hand, you should run important urls through fetch and render in Google Search Console. Similar to the mobile-friendly test, you cannot run fetch and render on a staging server that requires authentication. Therefore, you’ll need to test this out as soon as the site goes live.

Google has explained repeatedly that if you block resources, including CSS and JavaScript, then that will impact how Google indexes your pages. Google wants to render the page just like a user would in a browser. So as I said on Twitter a few weeks ago, “If you want to rock, don’t block.” :)

Using Fetch and Render on New URLs

7. Check XML Sitemaps
XML sitemaps are an important supplement to a traditional web crawl. Using sitemaps, you can feed both Google and Bing all of your canonical urls. After going live with a new CMS, you will have a list of new urls and old urls. In the short-term, you should submit both your old urls and your new ones via xml sitemaps. Continuing to submit your old urls for a short time will enable the engines to quickly find the 301 redirects to the new urls. That can speed up the process of getting the new urls crawled and indexed.

Based on the migration, you will definitely want to check your new xml sitemaps to ensure they are being published accurately. First, you should check the sitemaps reporting in Google Search Console for both warnings and errors. If you see anything out of the ordinary, identify the problems and send them to your developers ASAP. You want to nip problems in the bud.

Checking XML Sitemap Warnings in Google Search Console

Second, you should crawl your new sitemaps to ensure they lead to the right urls that resolve with 200 header response codes. I can’t tell you how many times I’ve crawled new sitemaps and found 404s, 500s, and redirects. Note, if your old urls are changing, then the urls in your old sitemaps should 301, since they redirect to your new urls. But the new sitemaps should not contain any redirects or non-200 response codes. You want clean sitemaps, not “dirty” ones.

You don’t want your crawl graph to look like this:

Crawl Graph of XML Sitemap URLs

Crawling XML Sitemaps To Check Response Codes
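
If you want to script this check, here’s a minimal sketch that pulls every url out of a new XML sitemap and flags anything that doesn’t return a 200. The sitemap location is a placeholder, it assumes a standard urlset sitemap (not a sitemap index), and it uses the third-party requests library. Some servers mishandle HEAD requests, so switch to GET if you see odd results.

    import requests
    import xml.etree.ElementTree as ET

    SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
    urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

    problems = []
    for url in urls:
        # allow_redirects=False so redirects in the new sitemap are flagged, not followed.
        status = requests.head(url, allow_redirects=False, timeout=10).status_code
        if status != 200:
            problems.append((status, url))

    print(f"Checked {len(urls)} urls, found {len(problems)} problems")
    for status, url in problems:
        print(status, url)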

And while we’re on the topic of crawling urls, let’s jump to an incredibly important check – crawling your top urls from before the migration!

8. Crawl Top Landing Pages From The Old Site (and Check Redirect Chains)
When you migrate to a new CMS, there’s a good chance you’ll be changing urls. And even if only one character changes in a url, it’s a brand new url to Google. So, in order to maintain search equity during the migration, it’s critically important to organize and then crawl your top landing pages to ensure they resolve accurately. Your top landing pages are the urls that ranked well prior to the migration, the ones driving traffic, the ones that have built inbound links, and obviously the ones sustaining your business. Don’t assume your redirects are working well. There are many reasons they might not be.

The first thing you need to do is collect all important landing pages (as explained earlier). You can find these landing pages in Google Analytics, Google Search Console, Bing Webmaster Tools, and from various link analysis tools. Once you download them all, you should combine them, and then dedupe them.

Deduping URLs in Excel
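
Excel works fine for this, but if your exports are large, a few lines of Python can combine and dedupe the lists just as easily. The filenames below are placeholders for your own exports (one url per line in each file).

    # Placeholder filenames for your landing page exports.
    files = [
        "gsc_landing_pages.txt",
        "ga_landing_pages.txt",
        "bing_landing_pages.txt",
        "link_tool_urls.txt",
    ]

    urls = set()
    for path in files:
        with open(path, encoding="utf-8") as f:
            for line in f:
                url = line.strip()
                if url:
                    urls.add(url)

    # Write the deduped master list, ready to feed to your crawler.
    with open("top_landing_pages_deduped.txt", "w", encoding="utf-8") as out:
        out.write("\n".join(sorted(urls)))

    print(len(urls), "unique urls ready to crawl")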

I use both DeepCrawl and Screaming Frog to check the redirects list. Depending on the size of your list, you might have a few hundred urls, or you might have a few hundred thousand urls (or even more). DeepCrawl is extraordinarily good at crawling many urls (over 100K), while Screaming Frog is outstanding for laser-focused crawls (under 50K urls).

When crawling with Screaming Frog, ensure that “follow redirects” is enabled in your settings. This allows the Frog not only to check the url you feed it, but also to follow the redirect chain. That’s incredibly important, since just setting up a 301 isn’t enough… That 301 needs to lead to a 200 code at the true destination url.

Following Redirects in Screaming Frog

One of the biggest mistakes I’ve seen is assuming all 301s work perfectly. In other words, you crawl your top landing pages and they all 301. That’s great, but where do they lead? If they don’t lead to the new url that resolves with a 200 code, then you could be killing your SEO.

In Screaming Frog, crawl your urls and then export the redirect chains report (which is accessible from the reports dropdown). When opening that file in Excel, you will see the initial url crawled, the header response code, and the number of redirects encountered. You can follow the sheet along to the right to see the next url in the chain, along with its response code.

Analyzing Redirect Chains After A CMS Migration

Does the destination url resolve with a 200 code, or does it redirect again? If it redirects again, you are now daisy-chaining redirects. That’s not great, as Google will only follow a certain number of redirects before giving up. And as you can guess, you can keep following the chain to the right to see how each url in the list resolves. You might be surprised what you find… like three, four, or even more redirects in the chain. And even worse, you might find daisy-chained redirects that lead to 404s. Not good.

In DeepCrawl, you can export the 301 redirect report, which will also provide the redirect chain. If you have many urls to check, then DeepCrawl can be extremely helpful. It doesn’t run locally and can handle hundreds of thousands, or even millions, of urls.

Once you export the urls, you’ll need to use Excel’s “text to columns” functionality to break apart the redirect chain column. Once you do, you can follow the chain to the right to see how each url resolves. Again, you might find 404s, excessive redirect chains, or redirects that lead to 500s or other errors. The core point is that you won’t know until you crawl the old urls and follow each redirect chain. So crawl away.

Checking Redirect Chains in DeepCrawl
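
And if you want to double-check a sample of old urls outside of a crawler, a short script can follow each redirect chain and print every hop with its status code. This is just a sketch using the third-party requests library; the old urls are placeholders.

    import requests

    # Placeholder list of top landing pages from the old site.
    old_urls = [
        "https://www.example.com/old-category/old-page/",
        "https://www.example.com/old-products/blue-widget.html",
    ]

    for url in old_urls:
        resp = requests.get(url, allow_redirects=True, timeout=10)
        hops = resp.history + [resp]  # every response in the chain, final one last
        print(url)
        for hop in hops:
            print(f"  {hop.status_code}  {hop.url}")
        if len(resp.history) > 1:
            print(f"  WARNING: daisy-chained redirects ({len(resp.history)} hops)")
        if resp.status_code != 200:
            print(f"  WARNING: chain ends in a {resp.status_code}, not a 200")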


Summary – Never Assume All Is OK With A Migration
Unfortunately, Murphy’s Law almost always comes into play with a CMS migration. And that’s especially the case for larger-scale websites with a lot of moving parts. Even if you’ve heavily checked a site in staging, make sure you perform the necessary checks once the new site is pushed to production. If not, you run the risk of having major problems severely impact SEO.

And the point of migrating to a new CMS isn’t to destroy SEO… it’s to increase the effectiveness of your website! Although the checks I provided above are just a starting point, they are important items to review once the new site goes live. Good luck.



Phantom Tremors Continue As SEOs Wait For Panda – Latest Tremor on 7/14/15

Phantom 2 Tremors

As many of you know, Phantom 2 began rolling out on April 29, 2015, just days after the mobile-friendly update was released. Phantom 2 was a big update that Google initially denied. During my analysis of the update, I saw a lot of movement across sites dealing with content quality problems. It was clear from the beginning that the algorithm update was focused on content quality (like Panda). That’s one of the reasons many people (including myself) initially believed it was a Panda update.

Once Google confirmed the update, they explained that it was a change to Google’s core ranking algorithm in how it assesses “quality”. Even though Google explained that Phantom was part of its core ranking algorithm, I had a feeling that websites would not be able to recover quickly. The reason? I had seen this before, specifically with the original Phantom update in May of 2013. Phantom 2 was eerily similar to Phantom 1 (and rolled out almost two years to the day after Phantom 1 did). Interesting timing to say the least.

Both Phantom 1 and 2 Were Panda-Like
With Phantom 1, I also saw the update target content quality problems. The companies I helped with fresh hits had a dangerous recipe of user engagement issues, content quality problems, and technical issues that all led to significant drops in organic search traffic. It took nearly three months before I saw the first Phantom 1 recovery, with others following as time went on.

It took a lot of work to see recovery from Phantom 1… it was not a trivial undertaking. And I expect Phantom 2 remediation and recovery to follow suit.

Actually, we are already over two months out from the Phantom 2 update (late April/early May), and I haven’t seen any major recoveries yet. That being said, I’ve seen a lot of movement on specific dates following Phantom. That includes both additional drops in traffic and sudden jumps for Phantom 2 victims. And that’s the topic of this post – Phantom tremors.

Multiple Phantom Tremors Since 4/29
After Panda 4.0 in May of 2014, I saw what looked like near-weekly updates impacting Panda victims. I called these updates Panda tremors, and they were confirmed by Google’s John Mueller.

Panda 4.0 Tremors and John Mueller

Basically, after Google pushes a major update, it can refine and tweak the algorithm and then push smaller updates (to ensure everything is working the way it wants). I saw many of those tremors after Panda 4.0. Well, Phantom 2 has been similar. I started seeing Phantom tremors soon after the initial rollout and I have seen several more over time.

And most recently, I saw the latest Phantom tremor starting on 7/14/15. I’ve seen the impact with companies I am helping now, but I’ve also had new companies reach out to me explaining what they are seeing. And when tremors roll out, Phantom victims can see more impact (mostly negative based on my analysis, but there has been some positive movement).

Negative movement makes a lot of sense if the site in question hasn’t made any efforts to improve from a quality standpoint. For example, here is the hourly trending for a site hit by a Phantom tremor on 7/14. Notice the change in trending starting around 11AM ET on 7/14:

Traffic Drop From Phantom Tremor on July 14, 2015

And here is a screenshot of a site that saw positive movement starting around that time:

Traffic Gain From Phantom Tremor on July 14, 2015

Here is another screenshot of negative impact, this time showing the drop in clicks when comparing traffic after the tremor to the timeframe prior (based on the 6/8 tremor):

Traffic Drop Based on 6/8 Phantom Tremor

(Update: Here are some more screenshots of impact from the 7/14 Phantom tremor):

This example shows impact during the late June tremor and then more impact on 7/14:

Impact from two Phantom tremors

And here is an example of a quick jump at the end of June with that gain being rolled back during the 7/14 update:

Temporary Recovery During Phantom Tremor

Documenting Important Phantom Dates:
In order for Phantom victims to track their own trending based on Phantom tremors, I have included important dates below. It’s always important to understand why drops or spikes occur, so I hope this list of dates provides some clarity:

Original Phantom 2 Rollout: 4/29/15
Phantom Tremor: 5/27/15
Phantom Tremor: 6/8/15
Phantom Tremor: 6/17/15
Phantom Tremor: 6/28/15
Phantom Tremor: 7/14/15

What This Means for Panda Victims
Remember, many Phantom victims have been impacted previously by Panda. For example, some sites that were still working to recover from Panda got hit by Phantom, while others that had already recovered from Panda got hit too. Yes, some of those sites saw drops after fully recovering from Panda.

For those sites still impacted by previous Panda updates that got hit by Phantom 2, it’s clearly not a great sign. We know that Phantom focuses on “quality” and so does Panda. I can’t imagine Phantom reacting negatively to your site, while Panda reacts positively. If that’s the case, then Google has an algo problem.

Now, if a site has recovered from Panda and then saw negative impact from Phantom, that does not bode well for the next Panda update. Phantom is clearly picking up quality problems, which could also contribute to a future Panda hit. We are eagerly awaiting the next Panda refresh, so recent Phantom hits on sites that have recovered from Panda should be concerning.

Panda Recovery Followed By Phantom 2 Hit

What Are Phantom Tremors?
Great question, and it’s hard to say exactly. Google could be refining the algorithm and then rolling it out again. Or they could be adding more factors to Phantom and rolling it out. Or they could be adjusting the threshold for each factor. All three are possibilities. That’s why it’s so important to get out of the gray area of Phantom.

I wrote a post recently about the SEO nuclear option and how it relates to both Panda and Phantom hits. Let’s face it, with multiple major quality algorithms running like Phantom and Panda, it’s ultra-important for websites to identify and fix all quality problems riddling their sites. If not, then they can see more impact. And in a worst-case scenario, they could get hit by both Phantom and Panda, which is what I call Phantanda. And although Phantanda sounds like a soda or rock band, it’s not cool or fun. You don’t want to experience it.

Is Phantom The Real-time Panda?
In my Phantanda post, I brought up the possibility that Phantom was actually the migration of Panda (or Panda factors) to Google’s real-time algorithm. It’s entirely possible this is the case. I’ve analyzed many sites hit by Phantom 2, and every quality problem I surfaced would have been picked up by a thorough Panda audit. The factors are extremely similar… almost identical, actually.

By the way, after Phantom 2 rolled out on 4/29, Josh Bachynski tweeted a similar thought. I said at the time that he was on to something… and I still believe that.

Josh Bachynski Tweet After Phantom 2

And with Google having all sorts of problems with Panda, this makes even more sense. For all we know, Panda might have problems with Phantom (as Phantom is part of Google’s core ranking algorithm and also focuses on quality). If that’s the case, then the “technical problems” Google has explained could be Panda and Phantom at odds with one another.

That’s total speculation, but there are now two quality cowboys in town. And to me, this SEO town might be too small for the both of them. It wouldn’t surprise me to find out that Panda is slowly being incorporated into Google’s core ranking algorithm. Remember, it’s been almost nine months since the last Panda update (10/24/14), while we’ve had multiple Phantom updates (the initial rollout and then several tremors). Hopefully we’ll find out soon what’s going on with Panda, and its relationship with Phantom.

Moving Forward: Keep Implementing (The Right) Changes
To quickly recap this post, if you were impacted by Phantom 2 in late April or early May, then you very well could have seen further impact during one or more Phantom tremors. I would check the dates I listed above and see if your site saw any drops or spikes during that timeframe.

And more importantly, continue to make the right changes to your website. Audit your site through a quality lens, identify all problems riddling your site, and move quickly to rectify those problems. That’s how you can rid your site of both bamboo and ectoplasm. Good luck.



Phantanda – Why The SEO Nuclear Option Is Important For Sites Hit By Phantom 2 and Panda

The SEO Nuclear Option for Phantanda Victims

Panda can be devastating. We all know that’s the case and it’s been documented to the nth degree since February of 2011. And now we have Phantom (AKA “The Quality Update”), which was a change to Google’s core ranking algorithm regarding how it assesses “quality”. Between the two, you can clearly see that Google is heavily focused on increasing the quality of the search results.

For webmasters, it’s bad enough when you see a big drop in rankings and traffic from Panda alone. Some sites can drop by 60%+ when Panda rolls through. But for many sites riddled with bamboo, little did they know that Panda has a close friend named Phantom who has no problem kicking a site while it’s down.

Since the end of April, I saw a number of sites that were already impacted by Panda see more negative impact from Phantom. And then there were some sites that recovered from Panda previously, only to get hit to some degree by Phantom. And as you can guess from the title of this post, I call these double-hits Phantanda.

In this post, I’ll explain more about Phantanda and how it relates to Panda, introduce the SEO nuclear option, explain why it’s important, and then end by providing some recommendations for those that want to go nuclear.

Opportunities for Recovery and Frequency of Quality Updates
As mentioned above, Google now has a one-two quality punch which I’m calling Phantanda. It’s not a soft drink or a rock band, but instead, a devastating mix of algorithms that can wreak havoc on a website’s organic search traffic.

If you’ve been hit by Phantanda, then it’s incredibly important to heavily audit your site through a quality lens. That audit should produce a thorough remediation plan for tackling any problems that were surfaced during the audit. Then you need to move quickly to execute those changes (flawlessly). And then you need Google to recrawl all of those changes and remeasure engagement. Yes, this is not a trivial process by any means…

An example of a Phantom hit on a site that has struggled with Panda:
Hit by The Quality Update (AKA Phantom 2)

In the past, Google used to roll out Panda monthly. That was great, since there were many opportunities for sites to recover as they made changes, removed bamboo, improved the user experience on their websites, and published higher quality content. But as many of you know, Panda hasn’t rolled out since 10/24/14. That’s a horrible situation for many battling the mighty Panda.

Sure, Gary Illyes said the next Panda update is coming soon (within weeks), but it has still been way too long between Panda updates. And that’s especially the case when we saw weekly Panda tremors after Panda 4.0.

The fact is we need more Panda updates, not fewer (as crazy as that sounds). Hopefully the next Panda update is right around the corner. We’ll see.

John Mueller clarifies Panda tremors after Panda 4.0:
John Mueller Clarifies Panda Tremors

Ectoplasm vs. Bamboo
From a Phantom standpoint, Google implemented changes to its core ranking algorithm in how it assesses “quality”. Phantom 2 was eerily similar to the first Phantom update from May of 2013, and I’ve done a boatload of research and work on both updates.

The good news is that websites were able to recover from Phantom 1. The bad news is that it took months of remediation work (similar to Panda remediation). I believe the first recovery I saw took approximately three months, while others took longer.

An example of recovery from Phantom 1 in 2013:
Example of Phantom 1 Recovery from 2013

Based on my analysis of Phantom 2 (and my work helping companies that have been impacted), I believe the remediation and recovery process is similar to Phantom 1 (longer-term). And that makes sense. The Quality Update (Phantom 2) rolled out on 4/29, so websites need enough time to audit, strategize, execute, and wait for Google to process those changes. Remember, Google needs to crawl and measure the changes. It’s not like going mobile-friendly, which is a binary test for now (yes or no). In my opinion, it’s much more complicated than that.

Phantom 2 victim still negatively impacted:
Sustained Negative Impact from Phantom 2

In my opinion, Phantom remediation is very Panda-like. Take a long-term approach, truly understand the various quality problems riddling a website, and then take aggressive action to rectify those problems. And that leads me to the core point of this post – the nuclear option.

The SEO Nuclear Option – It’s No Walk in the Park, But It’s Worth It
I started referring to the nuclear option in June of 2013 when Google first started talking about Panda going real-time. I explained that if that was the case, then webmasters would have no idea what hit them. And that would make it harder to understand what caused the negative impact and how to fix the problem(s).

And as other algorithms crossed Panda in the wild, I brought up the nuclear option again. For example, when Google rolled out Panda during an extended Penguin rollout. Yes, they did that… To pull a quote from Jurassic Park, “Clever girl…” :)

Panda During Penguin - Clever Girl

When this happened, most people thought they were hit by Penguin, when in fact, they were hit by Panda. And that’s a horrible situation with the potential for disastrous results. Imagine nuking many of your links thinking you were hit by Penguin, when you really needed to improve content quality. I had many confused webmasters reach out to me after the 10/24 Panda update.

And now we have Phantom throwing its hat in the ring. As mentioned earlier, many of the problems I surfaced with Phantom victims were extremely similar to Panda problems. For example, I would have surfaced the same problems when completing a thorough audit (whether a site was impacted by Panda or Phantom). Heck, for all we know Phantom is actually the beginning of Panda being incorporated into Google’s core ranking algorithm. It’s entirely possible.

And of course we still have Penguin, with a release schedule less frequent than Halley’s Comet passing Earth. So based on what I just explained, what can a webmaster do when all of these algorithms are running around the web? Enter the nuclear option.

What Is The SEO Nuclear Option?
Simply put, the nuclear option involves identifying all SEO problems for a particular website and forming a plan for executing all of the changes over a period of time. That includes both on-site problems (like content quality, user engagement, advertising issues, mobile-friendliness, etc.) and off-site problems (like unnatural links, syndication issues, etc.)

SEO Thermonuclear War

Yes, it’s a lot of work, but again, it’s completely worth it in the long-term (in my opinion). The companies I’ve helped that decided to go down the nuclear path simply couldn’t take it anymore… They were tired of chasing algorithms, tinkering with urls, tweaking this, and then that, only to find themselves hit again by another algorithm update. The gray area of Panda or Phantom is enough to drive a webmaster insane.

Recommendations for Going Nuclear:
As you can imagine, pushing the giant red button is a big undertaking that should not be taken lightly. So based on my experience helping companies with the nuclear option, I’ve provided some recommendations below. My hope is that if you choose to go nuclear, you can follow these tips to ensure you maximize your efforts (and avoid launching duds).

  • Long-Term Process – Understand that you are making changes to achieve long-term success. You are not looking for short-term wins. You want to avoid quality problems (and negative impact from quality algorithms) over the long-term. Understand that you might actually drop in the short-term until Google crawls, processes, and measures the changes.
  • Extensive and Thorough Audit – Tackle on-site quality problems head-on. Have a thorough SEO audit completed through a quality lens. That should help identify problems that could be contributing to both Panda and Phantom. The audit should produce a clear remediation plan organized by priority.
  • Nuke Those Links – Deal quickly and decisively with unnatural links. Have a thorough link audit completed. Identify all unnatural links and deal with them as quickly as possible. That includes removing, nofollowing, or disavowing unnatural links.
  • Go Mobile-friendly – This is NOT just for Google. This is for users as well. Depending on your situation, you may choose to go responsive or you might choose separate mobile urls. Regardless, create a solid mobile user experience. It should pay off on several levels, including more organic search traffic, stronger engagement, more social sharing, and increased conversion.
  • Redesigns Can Be Good (just be careful) – Don’t be afraid to redesign what needs to be improved. Website designs from 1997 will not suffice anymore. Cluttered and disorganized user experiences can kill you from a Panda and Phantom standpoint. Don’t be afraid to redesign your site, or various elements of the site. Just make sure you take the necessary steps to maintain search equity during the launch.
  • Ask Humans (Yes, real people.) – I’ve helped a number of companies deal with Panda and Phantom hits that had serious engagement problems (for example, aggressive advertising, deceptive affiliate setups, a horrible user experience, and other nasty problems). Those problems were abundantly clear to me, but not to the companies I was helping. Panda is about user happiness (and so is Phantom to an extent). Have unbiased real people go through your site. Have them provide feedback based on going through a task, or several tasks. Take their feedback to heart and then make changes.
  • Avoid SEO Band-aids – Make the necessary changes no matter how big they are. If you want the greatest chance of recovery for the long-term, then you must be decisive and then execute. Band-aids lead to future algorithm hits. Big changes lead to success.

Summary – It’s OK To Push The Giant Red Button
With multiple quality algorithms crossing streams, it has never been more important to consider the SEO nuclear option. Pushing the giant red button can be a tough decision, but it can also rid your site of nasty quality problems that attract Pandas and Phantoms. And if there’s something worse than having bamboo on your site, it’s adding ectoplasm to the mix. Sure, choosing the SEO nuclear option is a bold move, but that might be exactly what you need to do.



How To Identify and Avoid Technical SEO Optical Illusions

Technical SEO Optical Illusions

Without a clean and crawlable website structure, you’re dead in the water SEO-wise. For example, if you don’t have a solid SEO foundation, you can end up providing serious obstacles for both users and search engines. And that’s never a good idea. And even if you have a clean and crawlable structure, problems with various SEO directives can throw a wrench into the situation. And those problems can lie beneath the surface just waiting to kill your SEO efforts. That’s one of the reasons I’ve always believed that a thorough technical audit is one of the most powerful deliverables in all of SEO.

The Power of Technical SEO Audits: Crawls + Manual Audits = Win
“What lies beneath” can be scary. Really scary… The reality for SEO is that what looks fine on the surface may have serious flaws. And finding those hidden problems and rectifying them as quickly as possible can help turn a site around SEO-wise.

When performing an SEO audit, it’s incredibly important to manually dig through a site to see what’s going on. That’s a given. But it’s also important to crawl the site to pick up potential land mines. In my opinion, the combination of both a manual audit and extensive crawl analysis can help you uncover problems that may be inhibiting the performance of the site SEO-wise. And both might help you surface dangerous optical illusions, which is the core topic of my post today.

Uncovering Optical SEO Illusions
Optical illusions can be fun to check out, but they aren’t so fun when they negatively impact your business. When your eyes play tricks on you, and your website takes a Google hit because of that illusion, it’s a serious problem.

The word “technical” in technical SEO is important to highlight. If your code is even one character off, it could have a big impact on your site SEO-wise. For example, if you implement the meta robots tag on a site with 500,000 pages, then the wrong directives could wreak havoc on your site. Or maybe you are providing urls in multiple languages using hreflang, and those additional urls are adding 30,000 urls to your site. You would definitely want to make sure those hreflang tags are set up correctly.

But what if you thought those directives and tags were set up perfectly when, in fact, they weren’t? They look right at first glance, but something is just not right…

That’s the focus of this post today, and it can happen easier than you think. I’ll walk through several examples of SEO optical illusions, and then explain how to avoid or pick up those illusions.

Abracadabra, let’s begin. :)

Three Examples of Technical SEO Optical Illusions
First, take a quick look at this code:

Technical SEO Problem with hreflang

Did you catch the problem? The code uses “alternative” versus “alternate”. And that was on a site with 2.3M pages indexed, many of which had hreflang implemented pointing to various language pages.

Hreflang Using Alternative vs. Alternate

Now take a look at this code:

SEO Technical Problem with rel canonical

All looks ok, right? At first glance you might miss it, but the code uses “content” versus “href”. If rolled out to a website, it means rel canonical won’t be set up correctly for any pages using the flawed directive. And on sites where rel canonical is extremely important, like sites with urls resolving multiple ways, this can be a huge problem.

Technical SEO problem with rel canonical

Now how about this one?
Technical SEO problem with meta robots

OK, so you are probably getting better at this already. The correct value should be “noindex” and not “no index”. So if you thought you were keeping those 75,000 pages out of Google’s index, you were wrong. Not a good thing to happen while Pandas and Phantoms roam the web.

Meta Robots problem using no index vs. noindex

I think you get the point.
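
If you want to script a quick check for these exact illusions, here’s a small sketch assuming the third-party requests and beautifulsoup4 packages. The url is a placeholder, and the hreflang check is a crude string match for this specific typo, so treat it as a starting point rather than a full validator.

    import requests
    from bs4 import BeautifulSoup

    url = "https://www.example.com/"  # placeholder
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    issues = []

    # 1. hreflang links must use rel="alternate" (crude string match for the typo above).
    if 'rel="alternative"' in html:
        issues.append('found rel="alternative" (should be rel="alternate")')

    # 2. rel canonical must point to the target url via href, not content.
    for link in soup.find_all("link", rel="canonical"):
        if not link.get("href"):
            issues.append("rel canonical tag present but missing an href attribute")

    # 3. meta robots must use "noindex", not "no index".
    robots = soup.find("meta", attrs={"name": "robots"})
    if robots and "no index" in (robots.get("content") or "").lower():
        issues.append('meta robots uses "no index" instead of "noindex"')

    print(url)
    print("\n".join(issues) if issues else "no illusions detected")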

How To Avoid Falling Victim To Optical Illusions?
As mentioned earlier, using an approach that leverages manual audits, site-wide crawls, and then surgical crawls (when needed) can help you nip problems in the bud. And leveraging reporting in Google Search Console (formerly Google Webmaster Tools) is obviously a smart way to proceed as well. Below, I’ll cover several things you can do to identify SEO optical illusions while auditing a site.

SEO Plugins
From a manual audit standpoint, using plugins like Mozbar, SEO Site Tools, and others can help you quickly identify key elements on the page. For example, you can easily check rel canonical and the meta robots tag via these plugins.

Using Mozbar to identify technical SEO problems.

From a crawl perspective, you can use DeepCrawl for larger crawls and Screaming Frog for small to medium size crawls. I often use both DeepCrawl and Screaming Frog on the same site (using “The Frog” for surgical crawls once I identify issues through manual audits or the site-wide crawl).

Each tool provides data about key technical SEO components like rel canonical, meta robots, rel next/prev, and hreflang. Note, DeepCrawl has built-in support for checking hreflang, while Screaming Frog requires a custom search.

Using DeepCrawl to identify SEO technical problems.

Once the crawl is completed, you can double-check the technical implementation of each directive by comparing what you are seeing during the manual audit to the crawl data you have collected. It’s a great way to ensure each element is ok and won’t cause serious problems SEO-wise. And that’s especially the case on larger-scale websites that may have thousands, hundreds of thousands, or millions of pages on the site.

Google Search Console Reports
I mentioned earlier that Google Search Console reports can help identify and avoid optical illusions. Below, I’ll touch on several reports that are important from a technical SEO standpoint.

Index Status
Using Index Status, you can identify how many pages Google has indexed for the site at hand. And by the way, this can be checked at the directory level (which is a smart way to go). Index Status reporting will not identify specific directives or technical problems, but it can help you understand if Google is over- or under-indexing your site content.

For example, if you have 100,000 pages on your site, but Google has indexed just 35,000, then you probably have an issue…

Using Index Status in Google Search Console to identify indexation problems.

International Targeting
Using the international targeting reporting, you can troubleshoot hreflang implementations. The reporting will identify hreflang errors on specific pages of your site. Hreflang is a confusing topic for many webmasters and the reporting in GSC can get you moving in the right direction troubleshooting-wise.

Using International Targeting reporting in GSC to troubleshoot hreflang.

Fetch as Google
Using Fetch as Google, you can see exactly what Googlebot is crawling and the response code it is receiving. This includes viewing the meta robots tag, rel canonical tags, rel next/prev, and hreflang tags. You can also use fetch and render to see how Googlebot is rendering the page (and compare that to what users are seeing).

Using Fetch as Google to troubleshoot technical SEO problems.

Robots.txt and Blocked Resources
The new robots.txt Tester in Google Search Console enables you to test the current set of robots.txt directives against your actual urls (to see what’s blocked and what’s allowed). You can also use the Tester as a sandbox to change directives and test urls. It’s a great way to identify current problems with your robots.txt file and see if future changes will cause issues.

Using robots.txt Tester to troubleshoot technical SEO problems.

Summary – Don’t Let Optical Illusions Trick You, and Google…
If there’s one thing you take away from this post, it’s that technical SEO problems can be easy to miss. Your eyes can absolutely play tricks on you when directives are even just a few characters off in your code. And those flawed directives can cause serious problems SEO-wise if not caught and refined.

The good news is that you can start checking your own site today. Using the techniques and reports I listed above, you can dig through your own site to ensure all is coded properly. So keep your eyes peeled, and catch those illusions before they cause any damage. Good luck.