How to Use Fetch as Googlebot to Submit a Recently Changed Page to Google’s Index [Tutorial]

Fetch as Googlebot in Google Webmaster Tools

Any marketer focused on SEO will tell you that it’s sometimes frustrating to wait for Googlebot to recrawl an updated page.  The reason is simple.  Until Googlebot recrawls the page, the old content will show up in the search results.  For example, imagine someone added content that shouldn’t be on a page, and that new content was already indexed by Google.  Since it’s an important page, you don’t want to take the entire page down.  In a situation like this, you would typically update the page, resubmit your xml sitemap, and hope Googlebot stops by soon.  As you can guess, that doesn’t make anyone involved with the update very happy.

For some companies, Googlebot is visiting their website multiple times per day.  But for others, it could take weeks to get recrawled.  So, how can you make sure that a recently updated page gets into Google’s index as quickly as possible?  Well, Google has you covered.  There’s a tool called Fetch as Googlebot that can be accessed within Google Webmaster Tools that you can use for this purpose.  Let’s explore Fetch as Googlebot in greater detail below.

Fetch as Googlebot and Submit to Index
If you aren’t using Google Webmaster Tools, you should be.  It’s an incredible resource offered by Google that enables webmasters to receive data directly from Google about their verified websites.  Google Webmaster Tools also includes a number of valuable tools for diagnosing website issues.  One of those tools is called Fetch as Googlebot.  The primary purpose of Fetch as Googlebot is to submit a URL, and then view how the page looks to Googlebot (Google’s web crawler).  This can help you diagnose issues with the URL at hand.  For example, is Googlebot seeing the right content?  Is the wrong header response code being returned?
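For the header response code part, you can spot-check a URL yourself before fetching it in the tool.  Here is a minimal sketch in Python using only the standard library; the local test server and URL below are just stand-ins for a real page:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def fetch_status(url, user_agent="Googlebot/2.1 (+http://www.google.com/bot.html)"):
    """Request a URL with a crawler User-Agent and return the HTTP status code."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Tiny local demo server so the sketch is self-contained; in practice you
# would point fetch_status at your live page instead.
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html>updated content</html>")

    def log_message(self, *args):  # keep the demo output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
status = fetch_status("http://127.0.0.1:%d/" % server.server_address[1])
server.shutdown()
print(status)  # 200
```

Anything other than a 200 for a normal page is worth investigating before you submit it to the index.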

But, that’s not the only use for Fetch as Googlebot.  In 2011, Google added functionality for submitting that URL to its index, right from the tool itself.  You can submit up to 500 URLs per week via Fetch as Googlebot, which should be sufficient for most websites.  This can be a great solution for times when you’ve updated a webpage and want that page refreshed in Google’s index as quickly as possible.  In addition, Google provides an option for submitting the URL and linked pages to the index.  This enables you to have the page at hand submitted to the index, along with other pages that are linked to from that URL.  You can do this up to 10 times per month, so make sure you need it if you use it!

Let’s go through the process of using Fetch as Googlebot to submit a recently updated page to Google’s index.  I’ll walk you step by step through the process below.

How to Use Fetch as Googlebot to Submit a Recently Updated Page to Google’s Index
1. Access Google Webmaster Tools and Find “Fetch as Googlebot”
You need a verified website in Google Webmaster Tools in order to use Fetch as Googlebot.  Sign into Google Webmaster Tools, select the website you want to work on, and expand the left side navigation link for “Diagnostics”.  Then click the link for “Fetch as Googlebot”.

Accessing Fetch as Googlebot in Google Webmaster Tools

2. Enter the URL to Fetch
You will see a text field that begins with your domain name.  This is where you want to add the URL of the page you want submitted to Google’s index.  Enter the URL and leave the default option for Googlebot type as “Web”, which will use Google’s standard web crawler (versus one of its mobile crawlers).  Then click “Fetch”.

Enter a URL to Fetch

3.  Submit to Index
Once you click Fetch, Google will fetch the page and provide the results below.  At this point, you can view the status of the fetch and click through that status to learn more.  But, you’ll notice another option next to the status field that says, “Submit to Index”.  Clicking that link brings up a dialog box asking if you want just the URL submitted or the URL and Linked Pages.  Select the option you want and then click Submit.

A Successful Fetch:
Submit a URL to Fetch as Googlebot

The Submit to Index Dialog Box:
The Submit to Index Dialog Box

4. Submit to Index Documented with Date and Time
Once you click “Submit”, Google will present a message that your URL has been submitted to the index, along with the date and time it was submitted.

Submit to Index Date and Time

That’s it!  You just successfully added an updated page to Google’s index.
Note, this doesn’t mean the page will automatically be updated in the index.  It can take a little time, but I’ve seen updates show up pretty quickly (sometimes in just a few hours).  The update might not happen as quickly for every website, but again, it should be quicker than waiting for Googlebot to recrawl your site.

Expedite Updates Using Fetch as Googlebot
Let’s face it, nobody likes waiting.  And that’s especially the case when you have updated content that you want indexed by Google!  If you have a page that’s been recently updated, then I recommend using Fetch as Googlebot to make sure the page gets updated as quickly as possible.  It’s easy to use, fast, and can also be used to submit all linked URLs from the page at hand.  Go ahead, try it out today.



  • Mike Wilton

    Great post, as this is a method I have used to update content/sites in Google’s index a number of times. That being said, I learned the hard way that there is one instance where this cannot help you.  A couple months back I was migrating my website to a new domain and initially I had a number of pages disallowed via robots.txt file. When I finally got everything situated on the new domain I removed the disallow and tried to fetch as Googlebot.  Unfortunately it appears that Google uses the last version of your robots.txt file it crawled for the “Fetch as Googlebot” feature, so if the pages were disallowed during the last crawl it will not be able to fetch the page.  In my case I had to wait until the next crawl to run it, but by then Google was already seeing the new page.  Not a big deal for minor changes, but it sucks when you are trying to get migrated content indexed.

    • Glenn Gabe

      Thanks for your comment Mike, and that’s a great point.  I’m going to look into the robots.txt issue.  I’m wondering if you can use Fetch as Googlebot on robots.txt.  I’ll post an update if I find out anything on that front.  Thanks again.
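As a quick local check on the robots.txt question, you can test whether a given set of rules would block Googlebot from a URL using Python’s standard library.  A minimal sketch; the rules and URLs below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents, parsed from a string so no network
# access is needed.
robots_txt = """\
User-agent: *
Disallow: /old-site/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "http://example.com/old-site/page.html"))  # False
print(parser.can_fetch("Googlebot", "http://example.com/new-page.html"))       # True
```

Note that this only tells you what a given robots.txt would allow; per Mike’s point, Googlebot may still be working from the last version of the file it crawled.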

  • Ravi Kashyap

    Dear sir, I am facing a problem with Google Webmaster Tools.  I see the fetch status is pending and I am not able to resolve this problem.  If you have any solution then please send it to me on my id… and my blog is

  • Anu

    Hi, thanks for your step-by-step instructions. I have actually used Fetch earlier and found that it did not really update. However, I tried it again today. Also, found in my Webmaster Tools it shows just as ‘Fetch’ and under ‘Health’ and not ‘Diagnostics.’ Has the interface changed since you wrote this post?

    • Glenn Gabe

      Hi Anu. I’m glad you found my post helpful. You’re right, the interface was updated after I wrote this post. You can now find “Fetch as Google” under “Health” versus under “Diagnostics”.


  • Velu

    Nice post. I have a doubt: in the final step there is a message, “URL submitted to index”.  For me, that message has been there for more than 10 days.  How can I know whether my site is indexed or not?
    Waiting for your reply ASAP…
    Thank you

    • Glenn Gabe

      Is the page still not indexed? Submitting via fetch as googlebot doesn’t guarantee the page will be indexed quickly. Is it a new website?
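For anyone checking whether a specific URL made it into the index, a quick site: query in Google will show it (the URL below is a placeholder):

```
site:example.com/your-updated-page/
```

If the page shows up in the results for that query, it’s indexed.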

      • velu

        Yes, it has not been indexed, and it is an old website.  Is there any other way to get indexed by Google soon, other than “Fetch as Google” in Webmaster Tools?  I have also submitted a sitemap.

        • Glenn Gabe

          The site should get recrawled naturally… Are there any inbound links to the site? Has the site been penalized at all??

  • WiseStep

    I have personally tested this tool and it is very useful. Great job Google Webmasters :)

  • ram ch. Dash

    Really interesting article, but I want to know how to remove cached pages using Google Webmaster Tools.

  • Kim Jun Hong

    Good instructions! But just how long after the fetch shows Success will Google update my site? Because after 24 hours, I googled my site and it still shows my old description and page title.

  • Zeeshan Ahmed

    Googlebot blocked my website’s post URLs… And when I open my site from Google Cache, it shows my oldest post, not updated even after a week…

    Anyone Help me :(

    • Glenn Gabe

      Zeeshan, remove the allow directive. It’s not necessary. Also, make sure you are using the right format for your xml sitemap. I haven’t checked your specific file out, but make sure you are pointing to a valid xml sitemap file. How does your sitemap reporting look in Google Webmaster Tools?

      Regarding the URL’s blocked, you are only blocking one directory, so those are probably the files being blocked. If your posts are not in that directory, then they shouldn’t be blocked.

      You can test this out in google webmaster tools. When you test posts, what results are you getting?
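For reference on the sitemap format question, a valid xml sitemap follows the sitemaps.org protocol.  A minimal sketch (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/blog/my-post/</loc>
    <lastmod>2013-05-10</lastmod>
  </url>
</urlset>
```

The file should be UTF-8 encoded and use fully qualified URLs in each loc element.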

  • Pushka Ben

    I was doing this and it was coming up with errors, but retrying helped eventually.

    • Glenn Gabe

      Right, that can happen sometimes. Trying again usually results in the page getting submitted to the index. Glad it worked for you.

  • Mohammad Nasir

    What does the Fetch remaining number mean? It’s showing around 500 to me in my Webmaster Tools…

    • Glenn Gabe

      You can submit up to 500 urls per week, or 10 per month for URL + all pages linked from that URL. That’s the total you are seeing at the top (your remaining fetches). I hope that helps.

  • Valerie Whitmore

    Hi Glenn, I know this post is a couple years old now but I have a question that I haven’t been able to find an answer to elsewhere about using the Fetch tool. Do you recommend using it just for pages that have been changed, or would you also recommend it to speed up Google finding pages that have a 301 redirect in place? The “status” when a page with a 301 is submitted shows “Redirected” but it does also have the “submit to index” option.

    • Glenn Gabe

      Hi Valerie. I wouldn’t worry about doing that for 301 redirects. Just let Google find the 301 during the normal crawling process. Is there a specific reason you would want Google to find the 301 faster than usual?

      • Valerie Whitmore

        Well, just trying to recover from Panda as quickly as possible! The changes I have to do are very slow going – but seem to be working (thankfully!) I submit updated pages using the Fetch tool, but the “bad” pages that I am adding a 301 redirect to are only very slowly spidered (for some it’s been over 6 months since Googlebot last stopped by). We’re talking a substantial number of pages that have the 301 that I assume won’t help the site recover more until Google’s re-evaluated them (and probably spidered multiple times).

        • Glenn Gabe

          Valerie, definitely a topic that should be covered elsewhere, but I wouldn’t 301 low quality pages. I would noindex or 404 them. Why are you 301 redirecting the low quality pages? Again, definitely for another post! :)
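For anyone following along, noindexing a page just takes a robots meta tag in the page’s head (a 404 or 410, by contrast, is returned at the server level):

```html
<meta name="robots" content="noindex">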

  • styzzz

    Webmaster Tools is showing me mobile errors in the Mobile Usability report. When I check the live version, it says the page is 100% user friendly.

    I don’t think robots.txt is blocking anything, because when I “Fetch as Google” using smartphone – it loads completely.

    What gives? Why is the Usability report not consistent with the live tools?

    • Glenn Gabe

      I’ve seen this a few times. If the mobile friendly tool is saying you’re ok, then it’s probably ok. But, it’s hard for me to tell without seeing the actual urls. Are you using a responsive design, dynamic delivery, or separate mobile urls? I’ve seen the problem more on separate mobile urls.

  • Ken Thomas

    Is there any way to remove an item from the index that has been submitted to Google’s index via this process?

    • Glenn Gabe

      Hi Ken. Why would you want to remove a url that you just tried to have indexed?

      If you updated the url (content-wise), you can use this process to ensure Google has the latest version indexed. If you want a specific url removed for some reason, then you can 410 the url. Google will drop the url from its index after crawling it again. And if you really need it removed quickly, then use the url removal tool in GSC. That’s under “Google Index -> Remove URLs”. I hope that helps.
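For reference, returning a 410 is typically handled in server config.  A sketch for Apache (assuming mod_alias is available; the path is a placeholder):

```apache
# Return 410 Gone for a removed page so Google drops it after recrawling
Redirect gone /old-page.html
```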

  • Ken Thomas

    Suppose you want Google to recrawl your whole site, because it seems to be taking days or weeks for new pages to be picked up and also for old removed pages to be removed from Google’s search results.  Can the sitemap be submitted to be indexed via this process: Crawl > Fetch as Google > Add to Index > URL and all linked pages?  Or will this mess everything up?

    • Glenn Gabe

      Great question. You could use an html sitemap that links to each page and then submit that via fetch as google (and crawl all linked urls). I would also make sure your xml sitemaps are updated and resubmit them in GSC.

      Regarding crawling all links in an xml sitemap, I’m not sure if that would work well in GSC. It theoretically should work, but I haven’t used it that way. You can definitely try it and see how it works. I hope that helps.
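For anyone trying this, an html sitemap is just a plain page linking to the URLs you want crawled.  A minimal sketch (the URLs are placeholders):

```html
<html>
  <body>
    <h1>Sitemap</h1>
    <ul>
      <li><a href="/about/">About</a></li>
      <li><a href="/blog/first-post/">First Post</a></li>
      <li><a href="/blog/second-post/">Second Post</a></li>
    </ul>
  </body>
</html>
```

Submitting that page with the “URL and all linked pages” option asks Google to crawl everything it links to.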

      • Glenn Gabe

        It seems John Mueller explained you can submit an xml file or text file with a list of urls. So go ahead! :)

      • Ken Thomas

        I tried it by accident and shortly after that, my sitemap started showing up in search results.  Not good!  So far I haven’t been able to resolve that problem.  I just recently got the X-Robots-Tag added to the HTTP header, but it is still showing the sitemap in some search results.
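For reference, the X-Robots-Tag header Ken mentions is typically set in server config.  A sketch for Apache (assuming mod_headers is enabled):

```apache
# Keep the xml sitemap file itself out of the search results
<FilesMatch "sitemap\.xml$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```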

        • Glenn Gabe

          Ken, it’s common for Google to index an xml sitemap. It just shouldn’t surface that in the results for unrelated queries. What queries are you seeing your sitemap actually rank for? If you can provide your domain name, I can take a quick look.

          • Glenn Gabe

            Ken, your xml sitemap was last indexed on 8/16.  That was before you submitted yesterday.  Again, that’s normal, but Google shouldn’t be surfacing it in the SERP for queries.  If you want to email me, you can via my contact page.  Would be interesting to learn more about your situation.