One of the problems I ran into early with The Beer Stash was Google not indexing the site because /robots.txt was 503-ing. Researching solutions led me to django-robots, which led me back to Django’s sitemaps. The sitemaps do not seem to be cached, which could be a bad thing. I think. I am throwing Knuth’s caution to the wind and optimizing now. This is a little view I wrote that caches the sitemap for 15 minutes.
from django.views.decorators.cache import cache_page
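Only the import survived in the snippet above, so here is a minimal sketch of the pattern it implies: wrapping Django’s stock `django.contrib.sitemaps.views.sitemap` view in `cache_page` for 15 minutes. The `myapp.sitemaps` module and its `sitemaps` dict are assumptions for illustration, not the original code.

```python
# urls.py -- a sketch, not the original view: cache_page(60 * 15) wraps the
# stock sitemap view so the rendered XML is cached for 15 minutes.
from django.contrib.sitemaps.views import sitemap
from django.urls import path
from django.views.decorators.cache import cache_page

from myapp.sitemaps import sitemaps  # hypothetical {name: Sitemap} dict

urlpatterns = [
    path(
        "sitemap.xml",
        cache_page(60 * 15)(sitemap),  # cache hits skip regenerating the XML
        {"sitemaps": sitemaps},
        name="django.contrib.sitemaps.views.sitemap",
    ),
]
```

Because `cache_page` is just a decorator applied at URL-conf time, the sitemap classes themselves stay untouched.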
I get the feeling that this may not be the best way, but it’s the best I could come up with as a quick solution. I am using the filesystem cache backend; nothing else is being cached at the moment.
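For context, the filesystem cache mentioned above is configured in settings. This is a sketch of that configuration, not the site’s actual settings; the cache directory path is an assumption.

```python
# settings.py -- sketch of Django's file-based cache backend.
# cache_page stores rendered responses here; LOCATION is an assumed path.
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.filebased.FileBasedCache",
        "LOCATION": "/var/tmp/django_cache",
    }
}
```

Any backend (memcached, database, local memory) would work the same way with `cache_page`; the filesystem backend just needs no extra services running.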