By Neil Rooney
I am running the Ghost application on Ubuntu 14.04 and have my site up and running.
The problem is that Googlebot is unable to crawl my site or retrieve my robots.txt, even though the file is accessible via a web browser. This is the error I am getting:
Your server returned a 5xx (unreachable) error when we tried to retrieve your robots.txt file.
I have set the correct permissions on all the files and checked nginx for any issues I could find, with no luck.
Any ideas?
Edit: I just ran a Pingdom DNS check and I am getting errors with the PTR records. Could that be related?
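For what it's worth, assuming your Droplet's public IP is 203.0.113.10 (a placeholder), you can check the PTR record yourself rather than relying on Pingdom:

```bash
# Reverse-lookup the placeholder IP; substitute your Droplet's address.
# The answer should be the hostname you expect the record to point at.
dig -x 203.0.113.10 +short
```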
I solved this problem a week ago. It turns out Googlebot requires TLSv1 to be enabled.
I installed Ghost using a pre-configured image; apart from the nginx config, nothing was changed.
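For anyone hitting the same 5xx error, here is a minimal sketch of what the relevant nginx server block might look like, assuming a standard Ghost setup proxying to port 2368; the domain, certificate paths, and filenames are placeholders, not my actual config:

```nginx
server {
    listen 443 ssl;
    server_name example.com;  # placeholder domain

    # Placeholder certificate paths
    ssl_certificate     /etc/nginx/ssl/example.com.crt;
    ssl_certificate_key /etc/nginx/ssl/example.com.key;

    # Include TLSv1 alongside the newer protocols so older clients
    # (such as Googlebot at the time) can complete the handshake.
    ssl_protocols TLSv1 TLSv1.1 TLSv1.2;

    location / {
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $http_host;
        proxy_pass http://127.0.0.1:2368;  # Ghost's default port
    }
}
```

After editing, check the syntax and reload nginx with `sudo nginx -t && sudo service nginx reload`.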
What's the ls -al output for your robots.txt? If your robots.txt is located at /var/www/directory/robots.txt, cd to /var/www/directory/ and run ls -al.
Perhaps the file permissions are making it inaccessible.
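For example, assuming the site lives under /var/www/ghost (a placeholder path), the check and a possible fix would look like this:

```bash
# Placeholder path -- substitute your actual Ghost directory.
cd /var/www/ghost
ls -al robots.txt

# The file must be readable by the nginx worker user (www-data on Ubuntu).
# If it is not, grant world-read permission:
sudo chmod 644 robots.txt
```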