Question
Googlebot unable to crawl my site or fetch robots.txt
I am running the Ghost application on Ubuntu 14.04 and the site itself is up and working.
The problem is that Googlebot is unable to crawl my site or access my robots.txt, even though the robots.txt file is accessible from a web browser. This is the error I am getting:
Your server returned a 5xx (unreachable) error when we tried to retrieve your robots.txt file.
I have set the correct permissions on all the files and checked nginx for any issues I could find, with no luck.
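In case it helps with diagnosis, one quick check is to request robots.txt with Googlebot's user agent and compare the status code to what the browser gets. Here is a minimal standard-library sketch; the domain is a placeholder, not my actual site:

```python
# Rough sketch: fetch robots.txt the way Googlebot would, using only the
# Python standard library. "example.com" is a placeholder for the real domain.
import urllib.error
import urllib.request

URL = "https://example.com/robots.txt"  # placeholder domain
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

req = urllib.request.Request(URL, headers={"User-Agent": GOOGLEBOT_UA})
try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        # A healthy server should answer 200 here, just like in the browser
        print(resp.status, resp.reason)
        print(resp.read().decode("utf-8", "replace"))
except urllib.error.HTTPError as e:
    # A 5xx here would match what Search Console is reporting
    print("HTTP error:", e.code, e.reason)
except urllib.error.URLError as e:
    # DNS or connection failures end up here instead
    print("Could not connect:", e.reason)
```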
Any ideas?
Edit: I just ran a Pingdom DNS check and it is reporting errors with the PTR records.
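A quick way to double-check the PTR record locally is a reverse lookup with the standard library; a minimal sketch follows, with a placeholder IP in place of the server's real public address:

```python
# Rough sketch: reverse (PTR) lookup using only the standard library.
# 203.0.113.10 is a documentation/placeholder address; substitute the
# server's actual public IP.
import socket

ip = "203.0.113.10"  # placeholder IP

try:
    hostname, _aliases, _addresses = socket.gethostbyaddr(ip)
    print("PTR record points to:", hostname)
except socket.herror as err:
    # A missing or broken PTR record lands here, which is roughly what a
    # failing Pingdom DNS check flags
    print("Reverse lookup failed:", err)
```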
@neilrooney
Did you install a pre-configured image or did you set up your own stack? If it was your own stack, let me know how you went about installing and configuring it (e.g. from source, using apt-get, and anything you changed along the way).