I am running the Ghost blogging application on Ubuntu 14.04 and have my site up and running.
My problem is that Googlebot cannot crawl my site: it fails to retrieve my robots.txt, even though the file is accessible from a web browser. This is the error I am getting:
> Your server returned a 5xx (unreachable) error when we tried to retrieve your robots.txt file.
I have set the correct permissions on all the files and gone through the nginx configuration, but I could not find anything wrong.
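For what it's worth, Ghost serves robots.txt through the Node app itself, so a 5xx on that URL can come from the app even when nginx looks healthy. One common workaround is to have nginx serve a static robots.txt directly, bypassing Ghost. A minimal sketch, assuming a hypothetical site root of `/var/www/ghost` (adjust the path to your install), added inside the `server` block:

```nginx
# Serve robots.txt statically so the request never reaches the Ghost process.
# /var/www/ghost is an assumed path -- change it to wherever your static
# robots.txt actually lives.
location = /robots.txt {
    root /var/www/ghost;
    access_log off;
}
```

After editing, check the configuration with `sudo nginx -t` and reload with `sudo service nginx reload`.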
Edit: I just ran a Pingdom DNS check and it is reporting errors with my PTR records. Could that be related?