[openspending-dev] Fwd: [Webmaster Tools] http://openspending.org/: Googlebot can't access your site

Rufus Pollock rufus.pollock at okfn.org
Mon Dec 23 08:21:12 UTC 2013


There seem to be some issues with Googlebot accessing
openspending.org/robots.txt ...


---------- Forwarded message ----------
From: <wmt-noreply at google.com>
Date: 22 December 2013 08:45
Subject: [Webmaster Tools] http://openspending.org/: Googlebot can't access
your site

http://openspending.org/: Googlebot can't access your site

Over the last 24 hours, Googlebot encountered 2591 errors while attempting
to access your robots.txt. To ensure that we didn't crawl any pages listed
in that file, we postponed our crawl. Your site's overall robots.txt error
rate is 100.0%.

You can see more details about these errors in Webmaster Tools:
<https://www.google.com/webmasters/tools/crawl-errors?siteUrl=http://openspending.org/&utm_source=wnc_94051&utm_term=link_2&utm_content=uns_d70924de84000000&utm_campaign=t_1387612798849000&utm_medium=email#t1=2>

------------------------------

*Recommended action*
If the site error rate is 100%:

   - Using a web browser, attempt to access
   http://openspending.org/robots.txt. If you are able to access it from
   your browser, then your site may be configured to deny access to Googlebot.
   Check the configuration of your firewall and site to ensure that you are
   not denying access to Googlebot.
   - If your robots.txt is a static page, verify that your web service has
   proper permissions to access the file.
   - If your robots.txt is dynamically generated, verify that the scripts
   that generate the robots.txt are properly configured and have permission to
   run. Check the logs for your website to see if your scripts are failing,
   and if so attempt to diagnose the cause of the failure.
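The static-file checks above can be sketched as a small shell helper. This is a minimal sketch, assuming a Unix host serving robots.txt as a static file; the `/var/www` path is hypothetical, not OpenSpending's actual layout:

```shell
# check_robots: report whether a robots.txt file exists and is readable
# by the current user. A missing or unreadable file served statically is
# one way to end up with a 100% robots.txt error rate.
check_robots() {
    f="$1"
    if [ ! -e "$f" ]; then
        echo "MISSING"
    elif [ ! -r "$f" ]; then
        echo "UNREADABLE"
    else
        echo "OK"
    fi
}

# Example (path is hypothetical; run as the web-server user so the
# permission check reflects what the server actually sees):
check_robots /var/www/robots.txt   # prints OK, MISSING, or UNREADABLE
```

Note that running this as root can mask permission problems, since root bypasses file-mode checks; su to the web-server user first.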

If the site error rate is less than 100%:

   - Using Webmaster Tools
   <https://www.google.com/webmasters/tools/crawl-errors?siteUrl=http://openspending.org/&utm_source=wnc_94051&utm_term=link_4&utm_content=uns_d70924de84000000&utm_campaign=t_1387612798849000&utm_medium=email#t1=2>,
   find a day with a high error rate and examine the logs for your web server
   for that day. Look for errors accessing robots.txt in the logs for that day
   and fix the causes of those errors.
   - The most likely explanation is that your site is overloaded. Contact
   your hosting provider and discuss reconfiguring your web server or adding
   more resources to your website.
   - If your site redirects to another hostname, another possible
   explanation is that a URL on your site is redirecting to a hostname whose
   serving of its robots.txt file is exhibiting one or more of these issues.
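For the log-examination step above, something like the following can pull failing robots.txt fetches out of a server log. This is a sketch assuming an Apache-style combined access log; the log path and field positions are assumptions, so adjust them for whatever server openspending.org actually runs:

```shell
# failed_robots_fetches: count client IPs that received an error status
# (>= 400) when requesting robots.txt, most frequent first. Assumes the
# Apache combined log format, where field 1 is the client IP and field 9
# is the HTTP status code.
failed_robots_fetches() {
    grep 'robots.txt' "$1" \
        | awk '$9 >= 400 {print $1}' \
        | sort | uniq -c | sort -rn
}

# Example (hypothetical log path):
# failed_robots_fetches /var/log/apache2/access.log
```

A burst of 5xx responses clustered on one day would point at overload; 403s would point at a firewall or access-control rule blocking the crawler.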

After you think you've fixed the problem, use Fetch as Google
<https://www.google.com/webmasters/tools/googlebot-fetch?hl=en_GB&siteUrl=http://openspending.org/&utm_source=wnc_94051&utm_term=link_4&utm_content=uns_d70924de84000000&utm_campaign=t_1387612798849000&utm_medium=email> to fetch
http://openspending.org/robots.txt to verify that Googlebot can properly
access your site.

Learn more in our Help Center:
<http://support.google.com/webmasters/bin/answer.py?answer=2409682&hl=en_GB&utm_source=wnc_94051&utm_term=link_4&utm_content=uns_d70924de84000000&utm_campaign=t_1387612798849000&utm_medium=email>.

