I’ve been using the Caddy webserver for all my projects lately. Here’s my current default config for a Laravel project.
First, I want to separate all my configs into different config files, to keep things tidy. My Caddyfile is simply an import of all my vhosts.
$ cat /etc/caddy/Caddyfile
import /etc/caddy/conf.d/*.conf
As a result, all configs live in /etc/caddy/conf.d/.
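Adding another site later just means dropping a new .conf file in that directory and reloading Caddy. A rough sketch, assuming Caddy runs as a systemd service named caddy (the vhost filename here is made up):
$ vim /etc/caddy/conf.d/example.be.conf   # example.be is a made-up name
$ systemctl reload caddy                  # graceful reload, picks up the new vhost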
Here’s an example for my own company, robotstudios.be (which is a Laravel app just to receive Stripe payments via credit card; damn, international wire transfers suck).
$ cat /etc/caddy/conf.d/robotstudios.be.conf
www.robotstudios.be {
    redir / https://robotstudios.be 301
}

# The actual vhost, on a single domain
robotstudios.be {
    root /var/www/html/robotstudios.be/public
    gzip

    # Point to the upstream PHP-FPM socket
    fastcgi / unix:/run/php/robotstudios.be-fpm.sock php

    # This rewrite is to prevent access to dot files and folders such
    # as .htaccess, .git, etc.
    rewrite {
        # This regex catches everything that contains "/." in the URL
        r \/\.
        if {path} not_starts_with .well-known
        to /index.php{uri}
    }

    # Rewrite non-existent URLs to our index.php controller
    rewrite {
        if {file} not favicon.ico
        to {path} {path}/ /index.php?{query}
    }

    header / {
        Strict-Transport-Security "max-age=30758400"
        X-Content-Type-Options "nosniff"
        X-Frame-Options "deny"
        X-XSS-Protection "1; mode=block"
        Referrer-Policy "same-origin"
    }

    # Access logging in the combined format
    log / /var/www/html/robotstudios.be/logs/access.log "{combined}" {
        rotate_size 50
        rotate_age 14
        rotate_keep 14
        rotate_compress
    }

    # Error logs in a separate file
    errors /var/www/html/robotstudios.be/logs/error.log {
        rotate_size 50
        rotate_age 14
        rotate_keep 14
        rotate_compress
    }
}
This config serves a couple of purposes:
- Redirect all versions of the domain to robotstudios.be, without the www-prefix
- Enable gzip compression
- Store logs in the home dir of the site, rotate them at 50 MB or after 14 days, and keep 14 compressed versions
- Point all PHP requests to an upstream socket, served by php-fpm (a matching pool sketch follows this list)
- Rewrite non-existent URLs to index.php (for pretty URLs)
- Prevent access to files like .htaccess, .git, … (like robotstudios.be/.htaccess)
- Set a sane set of default security headers, like HSTS (force HTTPS)
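For reference, the PHP-FPM side of that socket is nothing special: just a pool listening on the same unix socket the fastcgi directive points to. A minimal sketch, with assumed user, pool name and PHP version (the file path differs per distro and PHP release):
; /etc/php/7.2/fpm/pool.d/robotstudios.be.conf
[robotstudios.be]
user = www-data
group = www-data
listen = /run/php/robotstudios.be-fpm.sock
; make sure the user Caddy runs as can read/write this socket
listen.owner = www-data
listen.group = www-data
pm = dynamic
pm.max_children = 5
pm.start_servers = 2
pm.min_spare_servers = 1
pm.max_spare_servers = 3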
The logs use the combined format, so I get the Referer header & user agents too.
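{combined} is Caddy's shorthand for the classic Apache combined log format; as far as I remember, it expands to roughly this:
{remote} - {user} [{when}] "{method} {uri} {proto}" {status} {size} "{>Referer}" "{>User-Agent}"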
The whole config is readable and pretty short; I like that.
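One last habit: after every config change I run a quick curl against the site to make sure the redirect and the headers behave (-I only fetches the response headers):
$ curl -I https://www.robotstudios.be/
# expect a 301 with Location: https://robotstudios.be
$ curl -I https://robotstudios.be/
# expect Strict-Transport-Security & friends in the response headers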