One of the problems I ran into early with The Beer Stash was Google not indexing the site because /robots.txt was 503-ing. Researching solutions led me to django-robots, which led me back to Django's sitemaps framework. The generated sitemaps do not appear to be cached, which could be a problem. I am throwing Knuth's caution to the wind and optimizing now. This is a little view I wrote that caches the sitemap for 15 minutes.

{% codeblock Cache the generated sitemap.xml for 15 minutes lang:python %}
from django.views.decorators.cache import cache_page

...

@cache_page(60 * 15)  # cache_page(seconds)
def sitemap(request, sitemaps, section=None, template_name='sitemap.xml', mimetype='application/xml'):
    from django.contrib.sitemaps.views import sitemap
    return sitemap(request, sitemaps, section, template_name, mimetype)
{% endcodeblock %}
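To actually use the wrapper, the URLconf has to point at it instead of Django's built-in view. A sketch of what that wiring might look like, assuming the view above lives in a hypothetical `myproject/views.py` and the usual `sitemaps` dict is already defined:

```python
# urls.py -- route /sitemap.xml through the cached wrapper
# ("myproject.views" and the Sitemap classes are assumptions; use your own paths)
from django.conf.urls.defaults import *
from django.contrib.sitemaps import FlatPageSitemap

from myproject.views import sitemap  # the @cache_page-wrapped view above

sitemaps = {
    'flatpages': FlatPageSitemap,
    # ... your other Sitemap classes ...
}

urlpatterns = patterns('',
    # Same signature as django.contrib.sitemaps.views.sitemap,
    # so only the import changes.
    (r'^sitemap\.xml$', sitemap, {'sitemaps': sitemaps}),
)
```

Because the wrapper keeps the same arguments as the stock view, swapping it in is a one-line change in the URLconf.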

I get the feeling that this may not be the best approach, but it's the best quick solution I could come up with. I am using the filesystem cache backend; nothing else is being cached at the moment.
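For reference, enabling the filesystem cache backend is just a settings change. A minimal sketch, assuming a Django version with the `CACHES` setting (older versions used the single `CACHE_BACKEND` string instead) and a cache directory the web server can write to:

```python
# settings.py -- filesystem cache backend
# (the /var/tmp path is an assumption; pick any directory writable by the server)
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.filebased.FileBasedCache',
        'LOCATION': '/var/tmp/django_cache',
    }
}
```

With this in place, `@cache_page` stores each rendered response as a file under `LOCATION`, so the sitemap is only regenerated once the 15-minute timeout expires.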