Question

Nginx: Same root folder for multiple websites, different robots.txt

Posted August 6, 2021
Nginx

Hello,

I have a first website with this configuration:

server {
        server_name mywebsite.com;

        root /var/www/website/prod;
...
}

and a second one like this with the same root folder:

server {
        server_name pro.mywebsite.com;

        root /var/www/website/prod;
...
}

The only problem is that they share the same robots.txt file.

Is it possible to tell the second server to map robots.txt to another file, such as robots-pro.txt?

That way, the URL http://pro.mywebsite.com/robots.txt would serve the file robots-pro.txt.

Thanks,
Vincent.


1 answer

Hello,

Yes, this is doable with Nginx aliases.

In each server block you could specify the following:

    location = /robots.txt { alias /var/www/html/site1-robots.txt; }

Then just change the site1 part to match the robots file name for each site.
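
Applied to the two server blocks from the question, the result would look something like the sketch below (the file name robots-pro.txt is the one suggested in the question; adjust it to whatever you actually use):

```nginx
# First site: serves the default robots.txt from the shared root.
server {
        server_name mywebsite.com;

        root /var/www/website/prod;
        # ...
}

# Second site: same root, but robots.txt is aliased to its own file.
server {
        server_name pro.mywebsite.com;

        root /var/www/website/prod;

        # Exact-match location so only /robots.txt is affected.
        location = /robots.txt {
                alias /var/www/website/prod/robots-pro.txt;
        }
        # ...
}
```

The `alias` directive replaces the matched part of the URI with the given path, so the request is answered from robots-pro.txt while every other URL still resolves against the shared root.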

Hope that this helps!
Regards,
Bobby

  • Yes, so I did:

    For the main domain, I block direct access to robots-denyall.txt:

    location = /robots-denyall.txt {
            return 404;
    }
    

    And for the subdomain, which I don't want to be indexed:

    location = /robots.txt {
            alias /var/www/mywebsite/prod/robots-denyall.txt;
    }