Stop. Caching. Static. Files. In. Varnish.



Mattias Geniar, November 27, 2012


I maintain a Varnish Configuration Repository that everyone can commit to. It’s meant to be a good base for starting any kind of Varnish implementation, and I use it as my personal Varnish Boilerplate. I’ve always disliked configurations such as this one:

# Remove all cookies for static files, force a cache hit
if (req.url ~ "^[^?]*\.(css|jpg|js|gif|png|xml|flv|gz|txt|...)(\?.*)?$") {
  unset req.http.Cookie;
  return (lookup);
}

The snippet above strips all cookies from requests for static files, causing them to be cached by default. Why do I dislike this? Because it makes people cheer at their 99% cache hit rate, while their limited memory is suffering more cache evictions than it should. The only time you should ever cache static files is when you have memory to spare.
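
If the goal is to keep Varnish’s memory free for the expensive pages, a minimal sketch of the opposite approach (using the same kind of extension list and the same Varnish 3 VCL style as above) would be to pass static requests straight through to the backend:

# Don't cache static files at all: hand them straight to the backend,
# so they never take up memory that cached dynamic pages could use
if (req.url ~ "^[^?]*\.(css|jpg|js|gif|png|xml|flv|gz|txt)(\?.*)?$") {
  return (pass);
}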

Static files do not cause load.

Sure, they cause disk access. Not just for reading the file, but for logging the request in the webserver logs as well (if you have not excluded static content from your server logs). And your webserver needs to send the file back to the client.

But in all likelihood, your OS already has your most frequently requested static files in its buffer cache, meaning no disk access is required to serve them. And your webserver should simply be good at serving static files; if it isn’t, consider switching (to, say, lighttpd or nginx).

You’re staring blindly at the cache hit rate

If a default config caches all static files without exception, you’ll see high cache hit rates. You (and your client) will be happy. And you’ll be misled into thinking that the performance bottlenecks have been solved. It’s not the static files that cause your webserver load, it’s the PHP/Ruby/Perl/… scripts hitting your database and calling external APIs. Those are the requests that need to be cached, not a static file that causes hardly any CPU load.
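
As a rough sketch of what that could look like, you could strip cookies only on dynamic URLs you know are safe to cache; the paths and the cookie name below are purely hypothetical and will differ per application:

# Hypothetical example: cache anonymous requests to expensive dynamic pages
# (here /blog and /news), as long as no session cookie is present
if (req.url ~ "^/(blog|news)(/|$)" && req.http.Cookie !~ "PHPSESSID") {
  unset req.http.Cookie;
  return (lookup);
}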

Proceed with spare memory

Focus, therefore, on the dynamic pages that cause server load: caching those requests actually makes a difference. Then, if you have spare memory, allow some static file requests to be cached as well. And keep monitoring your cache size, because if static files cause heavy PHP pages to be evicted from memory, you will only hurt your performance further.
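
If you’re on Varnish 3 (which the return (lookup) syntax above implies), one way to keep an eye on this is the n_lru_nuked counter in varnishstat: if it keeps climbing, Varnish is evicting objects to make room for new ones, and your cache is too small for what you’re asking it to hold.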

Nearly no one has the capacity to store all their static content in a Varnish memory cache. Don’t sacrifice the cached result of a 500ms PHP script for a 5ms static file request. In the boilerplate I’ve added comments explaining this to every rule that causes static files to be cached, and I hope people don’t blindly copy the config templates but read through them and think about this.


