How to Fix this specific Google "Fetch as Googlebot" error appearing on my Webmaster Tools?

5,030

Solution 1

Thanks for all your help. The answers were very helpful and gave me information I didn't have. I ended up checking the .htaccess file and found some "Deny" commands with IP addresses after them.

One of the IP ranges was blocking all connections from Google.

So Google was being blocked, maybe by a bogus WordPress plugin, or maybe someone hacked into my server and changed the settings. I find that unlikely, because if they could access my account they could easily have erased the website or done something worse. Then again, it doesn't seem too far-fetched, since my website gets very high traffic and is about politics in my country.

So, the solution was to remove the following line from the .htaccess file:

deny from 66.249
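To spot rules like this quickly, you can grep the file for Deny directives. This is a minimal check, assuming an Apache-style .htaccess in the current directory; 66.249.64.0/19 is Googlebot's well-known address range, so any Deny covering 66.249.* will block the crawler:

```shell
# List any Deny directives in .htaccess, with line numbers,
# so entries covering Google's 66.249.* range stand out.
grep -in "deny from" .htaccess || echo "no Deny directives found"
```

The `-i` flag catches both `Deny` and `deny`, and `-n` prints the line number so you know exactly what to delete.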

Thank you all.

Solution 2

Install the User Agent Switcher extension for Firefox and create a user agent for Googlebot. Then view your site and see what happens. If you get the error above, your site is misconfigured somewhere. If you don't, the issue lies with Google and you should go to their forums for support.
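If you prefer the command line to a browser extension, the same check can be done with curl. Here `example.com` is a placeholder for your own domain; the user-agent string is Googlebot's published one:

```shell
# Fetch just the response headers while identifying as Googlebot.
# A "403 Forbidden" here reproduces the Webmaster Tools error.
UA="Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
curl -A "$UA" -I "https://example.com/" || echo "request failed"
```

Comparing this against a plain `curl -I` of the same URL tells you whether the server is treating Googlebot differently from ordinary visitors.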

Solution 3

Ran into the same issue and the problem stemmed from a line of code in my .htaccess file.

Specifically, my "DirectoryIndex" directive pointed to a page that did not exist, so requests for the site root could never be served and Google was turned away.
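For reference, a working DirectoryIndex line simply lists index files that actually exist in the document root. The filenames below are common defaults, not necessarily yours:

```apache
# .htaccess: serve index.php, falling back to index.html.
# If none of the listed files exist (and directory listings are
# disabled), Apache returns 403 Forbidden, like the error above.
DirectoryIndex index.php index.html
```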

Be sure to do a careful review of this file if you have problems.


Author: UXdesigner

User Experience Designer, Information Architect, Usability Preacher. Good with xhtml, css, jquery and really crappy with .NET.

Updated on September 18, 2022

Comments

  • UXdesigner
    UXdesigner over 1 year

    Good day,

    I'm currently trying to find out why my website has lost all of its rank in Google. I don't even appear in Google results when searching for the domain, yet other sites that link to me do appear in the results.

    I think it's because I left my site alone for two months and came back to find 20k spam comments, which I completely deleted and fixed with filters and by adding the Disqus comment service.

    Thing is, I added my site to Google Webmaster Tools and I'm finding several awful things. For example, when I click Fetch as Googlebot, I receive the error message below in response to my request. I don't even know what the real problem is or how to fix it. I simply don't get it. This is what appears:

    Date: Wednesday, July 20, 2011 9:43:35 AM PDT

    Googlebot Type: Web

    Download Time (in milliseconds): 55

    HTTP/1.1 403 Forbidden
    Date: Wed, 20 Jul 2011 16:43:36 GMT
    Server: Apache
    Vary: Accept-Encoding
    Content-Encoding: gzip
    Content-Length: 248
    Keep-Alive: timeout=2, max=100
    Connection: Keep-Alive
    Content-Type: text/html; charset=iso-8859-1

    403 Forbidden

    Forbidden

    You don't have permission to access / on this server.

    Additionally, a 403 Forbidden error was encountered while trying to use an ErrorDocument to handle the request.

    Do you guys know anything about this problem? I need Google to crawl my site again. I've had really good Google results for the past three years; now there's nothing.

    thanks,

  • UXdesigner
    UXdesigner almost 13 years
    Very good approach, clever! I just followed your instructions and saw the website as Googlebot would: no errors whatsoever. I have no clue now what is going on over there. I will check the WordPress configuration and check again with DreamHost to see if anything changed. I have to say, not even the sitemap.xml is being crawled by Google. Something's really wrong.