
Django serving robots.txt efficiently


Yes, if the file is static there is no reason to serve robots.txt through Django. Try something like this in your Nginx config file:

location /robots.txt {
    alias /path/to/static/robots.txt;
}

See here for more info: http://wiki.nginx.org/HttpCoreModule#alias

Same thing applies to the favicon.ico file if you have one.
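For example, a similar location block works for the favicon (assuming favicon.ico sits in the same static directory; adjust the path to match your setup):

location /favicon.ico {
    alias /path/to/static/favicon.ico;
}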

The equivalent directive in an Apache config is:

Alias /robots.txt /path/to/static/robots.txt
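Note that with Apache 2.4, if the aliased file lives outside a directory Apache is already allowed to serve, you may also need to grant access to it (a sketch, assuming the file is under /path/to/static):

<Directory /path/to/static>
    Require all granted
</Directory>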


I know this is a late reply, but I was looking for a similar solution when I didn't have access to the web server config. So for anyone else in the same situation, I found this page: http://www.techstricks.com/adding-robots-txt-to-your-django-project/

which suggests adding this to your project's urls.py:

from django.conf.urls import url
from django.http import HttpResponse

urlpatterns = [
    # ... your project urls
    url(r'^robots\.txt$', lambda request: HttpResponse("User-Agent: *\nDisallow:", content_type="text/plain"), name="robots_file"),
]

which I think should be slightly more efficient than using a template file, although it could make your URL rules untidy if you need multiple 'Disallow:' options.
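If you do need several 'Disallow:' rules, one way to keep urlpatterns tidy (a sketch of my own, not from the linked page; the rule paths below are placeholders) is to move the response into a small named view:

from django.conf.urls import url
from django.http import HttpResponse

# Placeholder rules; substitute the paths you actually want to block.
ROBOTS_TXT = "\n".join([
    "User-Agent: *",
    "Disallow: /admin/",
    "Disallow: /private/",
])

def robots_txt(request):
    # Same plain-text response as the lambda above, just factored out.
    return HttpResponse(ROBOTS_TXT, content_type="text/plain")

urlpatterns = [
    # ... your project urls
    url(r'^robots\.txt$', robots_txt, name="robots_file"),
]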