I am running the Ghost application on Ubuntu 14.04 and have my site up and running.
I am having problems with Googlebot being unable to crawl my site or access my robots.txt file, even though the file is accessible via a web browser. This is the error I am getting:
Your server returned a 5xx (unreachable) error when we tried to retrieve your robots.txt file.
I have correctly set permissions for all files and checked nginx for any issues I could find, with no luck.
Any ideas?
Edit: I just ran a Pingdom DNS check and I am getting errors with PTR records?
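For reference, a PTR (reverse DNS) record can be checked directly with dig against the server's public IP (the address below is a placeholder, use your own):

    dig -x 203.0.113.10 +short    # reverse (PTR) lookup for the server's IP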
Hello, I have the same problem on my website shubhh.com (running on an Ubuntu 14.04 LAMP stack), but my site has 15-16 server errors in Google's crawl errors report. Can the solution given here ("Turns out Googlebot requires TLSv1 to be turned on") resolve my errors without affecting my website? If yes, please explain how to turn TLSv1 on.
Please, can anyone help me? Thank you.
I solved this problem a week ago. Turns out Googlebot requires TLSv1 to be turned on.
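In nginx that comes down to including TLSv1 in the ssl_protocols line of the SSL server block; a minimal sketch (the file path, domain, and certificate paths below are placeholders, adjust them to your own setup):

    # /etc/nginx/sites-available/ghost   (placeholder path)
    server {
        listen 443 ssl;
        server_name example.com;                               # placeholder domain

        ssl_certificate     /etc/ssl/certs/example.com.crt;    # placeholder
        ssl_certificate_key /etc/ssl/private/example.com.key;  # placeholder

        # Allow TLSv1 alongside the newer protocols so Googlebot can complete the handshake
        ssl_protocols TLSv1 TLSv1.1 TLSv1.2;

        # ... rest of the Ghost proxy configuration ...
    }

Then check the syntax and reload nginx:

    sudo nginx -t && sudo service nginx reload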
What's the ls -al output of your robots.txt? If your robots.txt is located at /var/www/directory/robots.txt, please cd /var/www/directory/ and run ls -al.
Perhaps the file permissions are making it inaccessible.
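For reference, a world-readable robots.txt would look roughly like this (the directory is the example path from above, and the owner will vary with your setup):

    cd /var/www/directory/
    ls -al robots.txt
    # -rw-r--r-- 1 www-data www-data  67 Jan  1 12:00 robots.txt

    # If the file is not readable by the nginx worker user, this makes it world-readable:
    sudo chmod 644 robots.txt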
I installed using a pre-configured image. Apart from the nginx config, nothing was changed.
@neilrooney
Did you install a pre-configured image or did you set up your own stack? If you did your own stack, let me know how you went about installing / configuring it (e.g. from source, using apt-get, "I changed this and that").