Thursday, April 5th, 2012

How to Use Fetch as Googlebot to Submit a Recently Changed Page to Google’s Index [Tutorial]

by Glenn Gabe

Fetch as Googlebot in Google Webmaster Tools

Any marketer focused on SEO will tell you that it’s sometimes frustrating to wait for Googlebot to recrawl an updated page.  The reason is simple.  Until Googlebot recrawls the page, the old content will show up in the search results.  For example, imagine someone added content that shouldn’t be on a page, and that new content was already indexed by Google.  Since it’s an important page, you don’t want to take the entire page down.  In a situation like this, you would typically update the page, resubmit your XML sitemap, and hope Googlebot stops by soon.  As you can guess, that doesn’t make anyone involved with the update very happy.
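As a side note on the sitemap step: the sitemap’s lastmod field is what signals to crawlers that a URL changed.  Here’s a minimal sketch of generating a one-entry sitemap in Python (the URL and date below are placeholders, not real values):

```python
import xml.etree.ElementTree as ET

# Build a minimal sitemap with a single entry for the updated page.
# The URL and date are placeholders for illustration only.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
url = ET.SubElement(urlset, "url")
ET.SubElement(url, "loc").text = "http://www.example.com/updated-page.html"
ET.SubElement(url, "lastmod").text = "2012-04-05"  # date the page was changed

sitemap_xml = ET.tostring(urlset, encoding="unicode")
print(sitemap_xml)
```

Resubmitting a sitemap with a fresh lastmod is the passive version of what Fetch as Googlebot does actively.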

For some companies, Googlebot is visiting their website multiple times per day.  But for others, it could take weeks to get recrawled.  So, how can you make sure that a recently updated page gets into Google’s index as quickly as possible?  Well, Google has you covered.  There’s a tool called Fetch as Googlebot that can be accessed within Google Webmaster Tools that you can use for this purpose.  Let’s explore Fetch as Googlebot in greater detail below.

Fetch as Googlebot and Submit to Index
If you aren’t using Google Webmaster Tools, you should be.  It’s an incredible resource offered by Google that enables webmasters to receive data directly from Google about their verified websites.  Google Webmaster Tools also includes a number of valuable tools for diagnosing website issues.  One of those tools is Fetch as Googlebot.  The primary purpose of Fetch as Googlebot is to submit a URL and then view how the page looks to Googlebot (Google’s web crawler).  This can help you diagnose issues with the URL at hand.  For example, is Googlebot seeing the right content?  Is the server returning the wrong header response code?
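You can do a rough version of that header-response check yourself before fetching.  Here’s a minimal sketch that requests a page the way a crawler would, sending Googlebot’s user agent, and reads the raw response code without following redirects.  The tiny local server and its paths are stand-ins so the example runs anywhere; point the connection at your own site in practice:

```python
import threading
from http.client import HTTPConnection
from http.server import BaseHTTPRequestHandler, HTTPServer

# A stand-in local server (hypothetical paths): /old-page answers with a
# 301 redirect, everything else answers 200.
class DemoHandler(BaseHTTPRequestHandler):
    def do_HEAD(self):
        if self.path == "/old-page":
            self.send_response(301)
            self.send_header("Location", "/new-page")
        else:
            self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), DemoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# http.client does not follow redirects, so we see the raw response code,
# just like a crawler inspecting the header response.
conn = HTTPConnection("127.0.0.1", port)
conn.request("HEAD", "/old-page",
             headers={"User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"})
resp = conn.getresponse()
status, location = resp.status, resp.getheader("Location")
print(status, location)  # 301 /new-page
server.shutdown()
```

If your page is silently returning a 301 or a 404 to crawlers, that’s exactly the kind of problem Fetch as Googlebot surfaces.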

But that’s not the only use for Fetch as Googlebot.  In 2011, Google added functionality for submitting that URL to its index, right from the tool itself.  You can submit up to 500 URLs per week via Fetch as Googlebot, which should be sufficient for most websites.  This can be a great solution for times when you’ve updated a webpage and want that page refreshed in Google’s index as quickly as possible.  In addition, Google provides an option for submitting the URL and linked pages to the index.  This enables you to have the page at hand submitted to the index, along with other pages that are linked to from that URL.  You can do this up to 10 times per month, so make sure you need it if you use it!

Let’s go through the process of using Fetch as Googlebot to submit a recently updated page to Google’s index.  I’ll walk you step by step through the process below.

How to Use Fetch as Googlebot to Submit a Recently Updated Page to Google’s Index
1. Access Google Webmaster Tools and Find “Fetch as Googlebot”
You need a verified website in Google Webmaster Tools in order to use Fetch as Googlebot.  Sign into Google Webmaster Tools, select the website you want to work on, and expand the left-side navigation section for “Diagnostics”.  Then click the link for “Fetch as Googlebot”.

Accessing Fetch as Googlebot in Google Webmaster Tools

2. Enter the URL to Fetch
You will see a text field that begins with your domain name.  This is where you add the path of the page you want submitted to Google’s index.  Enter the URL and leave the default option for Googlebot type as “Web”, which will use Google’s standard web crawler (versus one of its mobile crawlers).  Then click “Fetch”.

Enter a URL to Fetch

3.  Submit to Index
Once you click Fetch, Google will fetch the page and provide the results below.  At this point, you can view the status of the fetch and click through that status to learn more.  But, you’ll notice another option next to the status field that says, “Submit to Index”.  Clicking that link brings up a dialog box asking if you want just the URL submitted or the URL and Linked Pages.  Select the option you want and then click Submit.

A Successful Fetch:
Submit a URL to Fetch as Googlebot

The Submit to Index Dialog Box:
The Submit to Index Dialog Box

4. Submit to Index Documented with Date and Time:
Once you click “Submit”, Google will present a message that your URL has been submitted to the index, along with the date and time it was submitted.

Submit to Index Date and Time

That’s it!  You just successfully submitted an updated page to Google’s index.
Note, this doesn’t mean the page will automatically be updated in the index.  It can take a little time, but I’ve seen pages refresh pretty quickly (sometimes in just a few hours).  The update might not happen as quickly for every website, but again, it should be quicker than waiting for Googlebot to recrawl your site.

Expedite Updates Using Fetch as Googlebot
Let’s face it, nobody likes waiting.  And that’s especially the case when you have updated content that you want indexed by Google!  If you have a page that’s been recently updated, then I recommend using Fetch as Googlebot to make sure the page gets updated as quickly as possible.  It’s easy to use, fast, and can also be used to submit all linked URLs from the page at hand.  Go ahead, try it out today.

GG

 

  • http://www.mikewilton.com/ Mike Wilton

    Great post, as this is a method I have used to update content/sites in Google’s index a number of times. That being said, I learned the hard way that there is one instance where this cannot help you.  A couple months back I was migrating my website to a new domain and initially I had a number of pages disallowed via robots.txt file. When I finally got everything situated on the new domain I removed the disallow and tried to fetch as Googlebot.  Unfortunately it appears that Google uses the last version of your robots.txt file it crawled for the “Fetch as Googlebot” feature, so if the pages were disallowed during the last crawl it will not be able to fetch the page.  In my case I had to wait until the next crawl to run it, but by then Google was already seeing the new page.  Not a big deal for minor changes, but it sucks when you are trying to get migrated content indexed.

    • http://www.hmtweb.com/imd/ Glenn Gabe

      Thanks for your comment Mike, and that’s a great point.  I’m going to look into the robots.txt issue.  I’m wondering if you can use Fetch as Googlebot on robots.txt.  I’ll post an update if I find out anything on that front.  Thanks again.
      GG
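A quick way to sanity-check robots.txt rules like the ones Mike describes is Python’s standard-library robots.txt parser, which evaluates the rules locally so you can confirm whether a given URL would be blocked before you try to fetch it.  The rules and URLs below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, mirroring a disallow left over
# from a site migration.
rules = """\
User-agent: *
Disallow: /private/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Check whether Googlebot may fetch each URL under these rules.
blocked = rp.can_fetch("Googlebot", "http://example.com/private/page.html")
allowed = rp.can_fetch("Googlebot", "http://example.com/blog/post.html")
print(blocked, allowed)  # False True
```

Note this only tells you what your current rules say; as Mike found, Google may still be acting on the last version of robots.txt it crawled.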

  • Ravi Kashyap

    Dear sir, I am facing a problem with Google Webmaster Tools. I see the fetch status is pending and I am not able to resolve this problem. If you have any solution, please send it to me at nishantrana100@gmail.com… and my blog is http://www.seoworldindia.in

  • Anu

    Hi, thanks for your step-by-step instructions. I have actually used Fetch earlier and found that it did not really update. However, I tried it again today. Also, found in my Webmaster Tools it shows just as ‘Fetch’ and under ‘Health’ and not ‘Diagnostics.’ Has the interface changed since you wrote this post?

    • http://www.hmtweb.com/marketing-blog/ Glenn Gabe

      Hi Anu. I’m glad you found my post helpful. You’re right, the interface was updated after I wrote this post. You can now find “Fetch as Google” under “Health” versus under “Diagnostics”.

      GG

  • Velu

    Nice post. I have a doubt: in the final step there is a message saying “URL submitted to index”, and for me that message has been there for more than 10 days. How can I know whether my site is indexed or not?
    Waiting for your reply ASAP…
    Thank you

    • http://www.hmtweb.com/marketing-blog/ Glenn Gabe

      Is the page still not indexed? Submitting via fetch as googlebot doesn’t guarantee the page will be indexed quickly. Is it a new website?

      • velu

        Yes, it has not been indexed, and it is an old website. Is there any other way to get indexed by Google soon, other than “Fetch as Google” in Webmaster Tools? I have also submitted a sitemap.

        • http://www.hmtweb.com/marketing-blog/ Glenn Gabe

          The site should get recrawled naturally… Are there any inbound links to the site? Has the site been penalized at all??

  • http://www.wisestep.com/ WiseStep

    I have personally tested this tool and it is very useful. Great job Google Webmasters :)

  • ram ch. Dash

    Really interesting article. But I want to know how to remove cached pages using Google Webmaster Tools.

  • http://twitter.com/KimJunHong Kim Jun Hong

    Good instructions! But how long after the fetch shows Success will Google actually update my site? Because after 24 hours, I googled my site and it still shows my old description and page title.

  • http://thetrickslab.com/ Zeeshan Ahmed

    Googlebot blocked my website’s post URLs… And when I open my site from Google Cache, it shows my oldest post, not updated even after a week.

    Anyone help me :(

    • http://www.hmtweb.com/marketing-blog/ Glenn Gabe

      Zeeshan, remove the allow directive. It’s not necessary. Also, make sure you are using the right format for your xml sitemap. I haven’t checked your specific file out, but make sure you are pointing to a valid xml sitemap file. How does your sitemap reporting look in Google Webmaster Tools?

      Regarding the URL’s blocked, you are only blocking one directory, so those are probably the files being blocked. If your posts are not in that directory, then they shouldn’t be blocked.

      You can test this out in Google Webmaster Tools. When you test posts, what results are you getting?

  • http://YouTube.com/pushkacom Pushka Ben

    I was doing this and it was coming up with errors, but retrying helped eventually ~ ~

    • http://www.hmtweb.com/marketing-blog/ Glenn Gabe

      Right, that can happen sometimes. Trying again usually results in the page getting submitted to the index. Glad it worked for you.

  • Mohammad Nasir

    What does the Fetch remaining number mean? It’s showing around 500 in my Webmaster Tools.

    • http://www.hmtweb.com/marketing-blog/ Glenn Gabe

      You can submit up to 500 URLs per week, or 10 per week for a URL + all pages linked from that URL. That’s the total you are seeing at the top (your remaining fetches). I hope that helps.